Monday, June 25, 2007

The Jurassic Park of languages

In a sociolinguistics course, I almost used that title for my final paper about the revival of Hebrew. Common sense made me reconsider. Although my teacher wasn't Jewish, I knew that some Jews take offense at the idea that Hebrew was ever an extinct language. Their attitude, ironically, leads them to overlook a remarkable Jewish achievement.

Hebrew is, put simply, the only language in history that has ever been successfully revived. This becomes clear once we understand what "revival" means. People tend to use the term somewhat loosely, applying it to movements aimed at preserving languages such as Irish, where there have always been at least a few native speakers. And when the term is used correctly, as in the revival of Sanskrit, the movement has never come anywhere near the success of Modern Hebrew.

By the nineteenth century, Hebrew was not endangered but extinct, and it had been so for almost two millennia. By calling it "extinct," in no way am I trying to denigrate its central role in Judaism. On the contrary, it is a language that I myself use every day in formal prayer, as Jews have been doing throughout their entire history. But that's just the point: a purely religious language is not a living language, not in the sense that English or Spanish is.

Many people would disagree with me. William Chomsky writes, in Hebrew: The Eternal Language, "it may be safely assumed that there were always somewhere in the world, especially in Eretz Yisrael, individuals or even groups, who could and did employ the Hebrew language effectively in oral usage" (p. 218). There are various anecdotes of Jews conversing in Hebrew before the nineteenth century, such as when two Jews from faraway lands wanted to communicate and had no common vernacular. But the extent of these stories is disputed, and in any case they don't prove that Hebrew was a living language. Even today, there are people who can converse in Latin.

While Jews in the Middle Ages were trained from a young age to use Hebrew to a degree, it was nobody's native language. There's something special about native languages. Think about your native language. You probably can't remember ever not having spoken the language. It is so ingrained in your consciousness that it's a part of your very being. And all other languages seem like artificial systems of arbitrary sounds until you habituate yourself to them--and even then, they never feel quite as natural to you as your native tongue.

The simple fact is that Hebrew lacked that natural quality for almost the entire Diaspora. Jews studied Hebrew, prayed in Hebrew, and wrote books in Hebrew, but they did not truly speak the language except in very artificial, strained situations that rarely occurred. The revival turned it into a language that millions spoke in, thought in, and breathed in--to this day an unparalleled feat in the history of languages.

In my paper, I pondered what made this feat possible. I concluded that it depended on a whole range of factors happening simultaneously. It depended on the uncommon occurrence of a people who maintained a sense of unity for thousands of years while being scattered across the globe. It depended on their desire for a homeland, and their finding a place fertile for the creation of a new national tongue. It depended on Hebrew being their only common language. It depended on the dedication of a particular man who called himself Eliezer Ben Yehuda, and who was probably a little nuts.

What he did to bring Modern Hebrew to fruition has entered the lore of Jewish culture. He and his wife raised their son in total isolation, so that the child would be exposed to no language except Hebrew. If a non-Hebrew-speaking visitor arrived, Ben Yehuda would send the child to bed. When he came home one day to find his wife singing in Russian, he lost his temper. He even avoided having the child hear bird chirps and other animal sounds! Out of all this lunacy, the child became the world's first native speaker of Modern Hebrew.

One of the main sources I used for this information was Jack Fellman's 1973 book The Revival of a Classical Tongue. Fellman argued that Ben Yehuda's role in the revival has been overstated in popular treatments. Personally, I think Fellman's account proved just the opposite. It's true that Ben Yehuda couldn't have done it all on his own. Even after the experiment with his child, much work remained to turn Hebrew into a full modern language. But just the example he set had a huge impact on the movement. Of course, his experiment raises significant ethical questions about language revival.

One of the problems facing the new language was that it lacked words for modern concepts. According to Fellman, Ben Yehuda sometimes had to rely on gestures and vague statements like "Take such and such...and bring me this and this, and I will drink" (p. 38). When Modern Hebrew took off in the populace, it borrowed wholesale from Arabic, English, and several other sources to enrich its vocabulary.

Nowadays, around five million people use Hebrew as their main language. As an American Jew who was taught Hebrew as a formal, religious language, I always get a weird feeling listening to Israelis use it so casually. To me, the word bitachon (ביטחון) refers to a spiritual concept meaning "trust"; it was odd to visit Israel and see that word printed on the backs of security personnel. I still can't wrap my mind around the idea that even criminals and street kids speak this language that a small group of scholars reconstructed from a holy tongue less than two centuries ago. As Robert St. John put it in Tongue of the Prophets, Ben Yehuda "made it possible for several million people to order groceries, drive cattle, make love, and curse out their neighbors in a language which until his day had been fit only for Talmudic argument and prayer" (pp. 11-12). Whether you consider the feat good or bad, it certainly is incredible.

Tuesday, June 19, 2007

Language observant

In 2000, Larry King asked Joe Lieberman which denomination of Judaism he followed: Orthodox, Conservative, or Reform. Lieberman replied, "I like to think of myself as an observant Jew, because it is broader and it's inclusive." This rather mild and good-natured remark sparked a torrent of criticism, providing fuel for those who felt Lieberman was selling out in his bid for the vice presidency. Binyamin Jolkovsky of The Jerusalem Post complained that Lieberman in this interview "changed his long-time self-description from 'Orthodox' to 'observant.'"

Jolkovsky's complaint ignored a couple of facts. Lieberman described himself as observant in his book In Praise of Public Life, released several months before the Larry King interview. What's more, in a separate interview just three days later, Lieberman began a sentence with the words "The fact that I'm Orthodox...." Nowhere did he change his self-description. He simply expressed a preference for one label over another.

Jolkovsky seemed to assume that adopting the term "observant" was tantamount to denying being Orthodox. I would expect non-Jews to be scratching their heads when listening to this squabble over terminology. Why would Orthodox Jews of all people be offended by the term "observant"? And why did Lieberman prefer the term?

Understanding what's going on here requires some historical background. The division of Judaism into its Orthodox and Reform branches occurred in the nineteenth century. As a new movement, Reform Judaism enacted changes to traditional Jewish practice. Jews who rejected the reforms and maintained the traditional ways came to be called Orthodox Jews.

In common parlance, Orthodox Judaism isn't really a single movement but rather an umbrella term for several Jewish groups that remained relatively traditional amidst the emergence of Reform Judaism. Sephardic Jews, who never even encountered the original Reform movement, are usually classed with the Orthodox today. But not everyone accepts this blanket use of the term Orthodox. There are those who restrict the term to the movement that arose as a direct reaction against Reform Judaism.

I tend to think of Orthodox Judaism as a retronym, or a new term for an old concept. Retronyms happen when a new version of something comes along, causing the old version to require a new name. For example, after microwave ovens were invented, older-style ovens came to be called "conventional ovens." (For those who think I'm implying that new is automatically better, I have two words: New Coke.)

In any case, it was Reform Jews who came up with the term Orthodox. Early Orthodox opponents of Reform, like Rabbi Samson Raphael Hirsch, resisted the term. (The opposite is true of Christianity: The Eastern Orthodox Church gave itself the term orthodox, meaning "correct belief." But by the nineteenth century the word had acquired some negative connotations.)

Among Orthodox Jews themselves, the popular term is frum (pronounced with the vowel sound in wood). This Yiddish word literally means "pious," occasionally carrying negative overtones but most of the time used by Orthodox Jews as a respectful, informal alternative to "Orthodox."

The English word "observant" is not quite as popular. The problem is that many non-Orthodox Jews call themselves observant, and there's a perception that they use the word much more often than Orthodox Jews do. Because Orthodox Judaism tends to consider itself the only legitimate expression of Judaism, some people interpret a Jew's refusal to specify a denomination as contrary to the spirit of Orthodoxy.

As an Orthodox Jew myself, I have never accepted this reasoning. I like to do away with labels as much as I can. When I first set up an account with the dating site Frumster, I was required to describe what type of Orthodox Jew I was. I couldn't just say I was Orthodox; the site made me choose from the following subcategories: "Modern Orthodox liberal," "Modern Orthodox machmir," "Yeshivish Black Hat," "Hasidic," and "Carlebachian." I didn't feel comfortable with any of those, but I settled on "Modern Orthodox machmir," which seemed the least problematic to me.

Eventually, the site expanded its categories. Conservative and Reform Jews could now join the site, and everyone was given a wide range of choices for self-identification. I selected a new category called "Shomer Mitzvot," which literally means "watchful of the commandments"--in other words, observant. It was exactly the type of generic self-description I had been searching for all along.

A friend of mine recently told me that his daughter thinks one should never select that category. He did not remember why she felt this way, but I had little trouble guessing. She probably believes that someone who identifies as "Shomer Mitzvot" is in effect not calling himself Orthodox. Or, at least, she thinks that people might perceive it that way, and so it's best to avoid it if you want to increase your chances of finding a prospective match in the Orthodox community.

You know what? I don't care. I feel comfortable calling myself "Shomer Mitzvot," and that's all that matters. The last thing I'm going to do is bend to someone else's standards. I'm not that desperate. Besides, it all adds up in the end. If a woman assumes I'm not suited to her simply because I call myself "Shomer Mitzvot," then she's probably right.

Sunday, June 17, 2007

Are video games a form of art?

I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art. To my knowledge, no one in or out of the field has ever been able to cite a game worthy of comparison with the great dramatists, poets, filmmakers, novelists and composers. That a game can aspire to artistic importance as a visual experience, I accept. But for most gamers, video games represent a loss of those precious hours we have available to make ourselves more cultured, civilized and empathetic.
Those words are from Roger Ebert in 2005, when he caused a firestorm with his assertion that video games are inherently not a form of art. Whether you agree with him or not, the ensuing debate was interesting. Unfortunately, Ebert's credibility was suspect, since by his own admission he lacked familiarity with modern video games. And his statement sounds just like the sort of narrow-minded declaration that's almost asking to be discredited. A couple of generations ago, most people would have scoffed at the idea that comic books are art; nowadays, that idea has gained increasingly wide acceptance (though the respected comic books carry a new name, graphic novels).

Still, I understand where Ebert is coming from. And I say this as someone who knows even less about modern video games than he does. I largely stopped playing them when I was a teenager. I felt that I wasn't getting out of them anywhere near as much as I was putting into them. They were an enjoyable diversion, but left me feeling drained when I spent too much time with them.

I'm aware that video games have increased in sophistication since the days of Nintendo, by huge orders of magnitude. And apparently some gamers think these are works of the human imagination that deserve comparison with great works of literature--or at least that they have that potential, even if the genre is in its infancy right now.

If you think that video games will never be Shakespeare, I should remind you that most people in Shakespeare's time would have laughed at the idea that his plays would be studied centuries later. Most of his plays weren't even published during his lifetime. And it took a long time before critics viewed them as serious works of literature, much less ranked him as the greatest writer in the English language.

Nowadays, people study the heavily footnoted texts of Shakespeare's plays in a manner that the Bard himself would never have envisioned. Is it possible that a similar process will happen with computer games? Will students of the future be studying games in the classroom, in a format that would perplex the original designers?

I debated these issues with someone on a message board a few years ago. He took the position that some video games have achieved an artistic level comparable to great literature and film; I was skeptical, even while admitting my ignorance. It was a nice debate, and I think we both ended up learning something. I learned from him that some modern video games--notably adventure games--have fairly complex narratives, with even the rudiments of character development. But I managed to persuade him that the very nature of video games cuts against the kind of complexity found in literature and film.

Traditionally, games have nothing to do with art. Chess may be a high intellectual activity, but it isn't really a form of art. What all art forms have in common, whether they be paintings, sculptures, poems, novels, plays, films, or comic books, is that the viewer contributes nothing to them beyond his own imagination. Attempts to make the audience a participant, such as interactive fiction (the Choose Your Own Adventure series for kids, for example), have always remained a minor phenomenon.

Computers have the potential to blur that line, however, creating games with a high degree of artistic content in terms of both graphics and narrative. But there is a limit. The only way they could achieve full artistic status is if they stopped being games.

I'll give an example from an old game I used to play, Infocom's text adventure The Hitchhiker's Guide to the Galaxy, based on the Douglas Adams novel. The novel contains a scene where Arthur is at a party trying to pick up a girl, when another guy comes along and catches her attention by saying he's from another planet. Your goal in the game, as a player, is to make sure the scene happens exactly as it did in the book. You can have no effect on Arthur's chances of getting the girl. Arthur is a loser. That's part of the script. Even if in real life you're God's gift to women, as computer game addicts are known to be, you're not going to change what Arthur's like. The game has a lot of puzzles that require brain power, but your personality doesn't affect the outcome.

Would it be possible to create a game where your personality does make a difference? I can imagine it now. It will be called SimFlirt. Your objective is to go to a party and pick up a girl. (Or, if you are a girl, then you pick up a guy. Or, if you're gay...never mind.) Whether you succeed depends on what you do or say.
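
Here is a toy sketch, in Python, of how the core of my imaginary SimFlirt might work. Every opening line and every outcome below is made up, and each one would have to be scripted in advance:

# Hypothetical SimFlirt: the player picks an opening line, the game returns
# a canned outcome. Nothing here exists outside this sketch.
OUTCOMES = {
    "compliment her shoes": "She smiles politely and drifts toward the snack table.",
    "tell a joke": "She laughs. You exchange phone numbers.",
    "brag about your car": "She suddenly spots an old friend across the room.",
}

def party_scene(opening_line):
    # Any choice the designers didn't anticipate falls through to a default.
    return OUTCOMES.get(opening_line, "You say nothing. The party goes on without you.")

for choice in ["compliment her shoes", "tell a joke", "brag about your car", "stand in the corner"]:
    print(choice + " -> " + party_scene(choice))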

Of course, such a game would never compare to a novel or a film. The range of possible outcomes would still be relatively limited. If a character in a computer game can have any level of depth, then it must be written into the game's narrative, without the player having much influence. The problem is that the player's very presence dilutes this process. Games simply are not a good medium for exploring the nuances of human behavior, at least not to the degree found in literature.

The point here is that games are different, not inferior. They serve a different purpose. I'm reminded of something Orson Scott Card said in his 1990 book How to Write Science Fiction and Fantasy: "In a fantasy, if magic has no limitations, the characters are omnipotent gods; anything can happen, and so there's no story. There have to be strict limits on magic. Dungeons and Dragons uses a seniority system that may work well for games, but for stories it is truly stupid: The longer you manage to stay alive, the more spells you know and the more power you have" (p. 31).

Note Card's implication that what works well in a game isn't believable in fiction. I would go further and suggest that what works in fiction isn't suited to a game. Even games with narratives, like D&D and computer adventure games, do not require the same suspension of disbelief as fiction does. That's because the narrative is only a means to an end, whereas in fiction the narrative is central. Perhaps the computer adventure games of today put a greater focus on narrative than ever before. But there comes a point when the integrity of the narrative must bow to the integrity of the game. Everything in a game falls back on the player's choices, and once the player is given choices, the narrative inevitably suffers.

If video games are blurring the distinction by taking on many artistic qualities, the question is how far they can go while still remaining games. And while there will always be people who devote their precious hours to these games, their impact may remain marginal simply because they attempt to forge an impossible middle ground between the artistic and the recreational.

Friday, June 01, 2007

The Jewish cab test

Shortly after Tiger Woods became the first black to win the Masters Tournament, he insisted that he was not black but "Cablinasian," a word he coined to describe the different groups in his ancestry: Caucasian, Black, Indian, and Asian. Sarcastic African-American columnist Gregory Kane retorted that Woods should be given "the cab test": "Stand him on a street corner in any large American city and have him hail a cab. If he gets one, he's Cablinasian. If he doesn't, he's definitely black" (The Baltimore Sun, Apr. 27, 1997, pg. 1B).

I wonder if a similar test could be applied to Jews. Arguably, the Holocaust was a grotesque version of this test, as Jews who abandoned their heritage and became atheists or Christians discovered that they were just as likely to be gassed as the bearded shtetl Jew. Hitler justified this innovation on classical anti-Semitism by arguing that Jews who assimilated took Jewish ideas with them. I can't say he was totally wrong.

These examples highlight one of the most basic questions about ethnic identity: is it defined by members of the group themselves, or by outsiders? For us Jews, this dilemma is even more perplexing, because we haven't even settled the "Who is a Jew?" question amongst ourselves. Why should we expect others to fare any better?

The traditional definition of a Jew is one whose mother is Jewish, or one who converts. (Computer scientists would call that a recursive definition.) But Orthodox Jews do not accept conversions done by Conservative or Reform rabbis, and Reform Judaism has expanded the definition to include those born to a Jewish father. Depending on one's perspective, individuals in many U.S. synagogues may not be Jewish.
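
For any programmers reading this, here is a small sketch of what I mean, in Python. The function and the toy records below are hypothetical, and the actual halachic rules are of course more involved than a few lines of code:

def is_jewish(person):
    # Traditional definition: a convert is Jewish, and so is anyone whose mother is Jewish.
    if person.get("converted"):
        return True
    mother = person.get("mother")
    return mother is not None and is_jewish(mother)

# The recursion has to bottom out somewhere: either the chain of mothers
# reaches a convert, or the available records simply run out.
convert = {"converted": True}
daughter = {"mother": convert}
granddaughter = {"mother": daughter}
print(is_jewish(granddaughter))   # True
print(is_jewish({"mother": None}))  # False

The definition refers to itself: to decide whether someone is Jewish, you may have to ask the same question about her mother, and her mother's mother, and so on.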

No matter how strongly Orthodox Jews insist that their definition is the only legitimate one, non-Jews cannot be bothered to take sides on this in-the-family dispute. They have enough trouble dealing with a group that even by their standards defies all normal classifications. I have seen confused people on message boards write "Is Judaism a race or a religion?" as if it must be one or the other. In recent times, the trend has been to think of Jews as purely a religious group and not to recognize their ethnic character. I increasingly see articles that describe celebrities as having been "born to Jewish parents." Some younger stars like Natalie Portman openly identify as Jewish, but there's a sense that it would be rude to describe someone as Jewish without their permission.

To people with this outlook, a phrase like "Jewish atheist" sounds as oxymoronic as "Catholic atheist," even though many older Jews identify as one. And what about that quaint phrase "the Jewish nation" which shows up in our prayerbooks? How can Jews be a nation? Doesn't that require a country? Of course, now Jews have a country, but those who never set foot there are still Jews.

Our unconventional classification arises from our long and complex history over 4,000 years. Few groups in the world have retained a sense of shared identity for that long, and so no matter how much we attempt to adapt to current norms, there lurks in our existence an element of the ancient that relatively modern categories like "race," "ethnic group," "religion," and "nation" can never quite capture.

The ancient Israelites could possibly be called a "tribe," though that term is rarely used, reserved instead for the twelve tribes within ancient Israel. Eventually, Israel did constitute a true nation. But after the Jews were exiled, they continued to think of themselves as Jews. In this respect, they were unusual. Most religions that spread outward from a single land retained religious but not national or ethnic identity. Partly this was because religions like Catholicism and Islam had a proselytizing mission which Judaism lacked. Thus the people of Turkey, Pakistan, and Iran are Muslims but not Arabs. Because Jewish conversions never happened on a large scale (with possible exceptions like the Khazars), the converts became part of the Jewish people, losing their previous cultural identity. I have heard rabbis compare Jews to a family, where the converts are like adopted children. It's not a perfect analogy (since adopted children do not choose their parents), but it does give a sense of how Jews can think of themselves as having blood ties even while accepting converts.

The problem is that Gentiles could hardly be expected to pay attention to how Jews defined themselves. What ultimately bound Jews together mirrored what bound blacks together: namely, persecution. It is worth asking whether there would be a concept such as "black" today if racism had never existed. It is similarly worth asking if Jews would have outlasted their ancient Middle-Eastern origins if anti-Semitism had never existed. Nowadays, many secular Jews admit that their Jewish identity is often driven by a desire to stick it to the anti-Semites. As Ilya Ehrenburg said, "so long as there is a single anti-Semite in the world, I shall declare with pride that I am a Jew" (qtd. in Alan Dershowitz's book Chutzpah, p. 14). Likewise, as anti-Semitism declines, or at least fades into the background, the concept of a secular Jew becomes harder to maintain.

Of course, if you define a Jew as anyone who may be a victim of anti-Semitism, then the definition becomes as arbitrary as bigotry is senseless. Plessy v. Ferguson sanctioned discrimination against a man who was black on the basis of one great-grandparent; many people with more African ancestry have passed for white. In a similar way, Barry Goldwater was subject to anti-Semitism even though he was a practicing Episcopalian with a Gentile mother; he probably would have been safe if his name had been Anderson. The cab test may be a sad reality for blacks, but for Jews it is something we must actively resist if we are to make sense of our lives.

Wednesday, May 30, 2007

Critics of homeschooling need to do their homework

Polls suggest that a slim majority of Americans oppose homeschooling, the method of choice for approximately two percent of the population. Ever since I took this educational route in high school, I have been stunned by the negative reactions it provokes. Though the opposition has declined significantly in the last decade, millions of Americans continue to find fault with this unusual mode of education, eager to offer their opinion on a subject they know nothing about.

It's no wonder that the arguments against homeschooling frequently contradict one another. Some critics allege that parents lack the qualifications to teach their children properly; others suggest that homeschooled children will be so hopelessly ahead they will be unable to relate to other kids their age. Some people imagine the prototypical homeschooled kid as shy and withdrawn; others imagine such a kid as loud and obnoxious. Whatever the argument, the critics base their views on very little if any personal knowledge of homeschooling. They haven't got a clue what homeschoolers actually do during the day, yet they seem to have endless confidence in their ability to guess.

One recent example of this attitude is a piece by blogger Russell Shaw for The Huffington Post. Shaw concedes that "home schooling works in some cases" (a mountain of research would suggest that this is an understatement), but he nonetheless thinks it should be restricted to those with an education degree, teaching children who are unable to attend school for physical reasons such as paralysis. Shaw, who assumes that homeschoolers learn through "rote recitation," worries that too many of the parents "want to keep their students at home in the service of simplicity and protectiveness," a situation that will make them ill-prepared for living in the real world.

Shaw's essay is typical of anti-homeschooling pieces: not only does he fail to offer the slightest factual support for his positions, he makes provably false assertions of his own, such as the claim that homeschoolers consist primarily of fundamentalist Christians who reject evolution. (See here for the actual demographics.) Had Shaw bothered to look into the history of the movement he opposes, he would have learned that its godfather was a rather secular fellow named John Holt, who advocated homeschooling as an alternative to the "rote recitation" and lack of real-world preparation he observed as an instructor in traditional schools.

It's true that some homeschooling parents, like some private schools, teach creationism. Without explaining what he thinks should happen to private schools, Shaw denounces the situation: "as to the home schooler subjected to beliefs that run counter to scientific inquiry...I say send them to school and let the parents devote some of their off-hours to teaching what they feel their kids should know." Shaw implies here that it is the task of schools to expose kids to what is true, against those parents who will teach them what is false. But who decides what is true and what is false? The government? Shaw's point may resonate with those who envision homeschooling parents as extremists, but his larger implication is, frankly, scary.

If Shaw truly values scientific inquiry, then he should base his conclusions on facts, not hunches. Stephen Colbert coined the word truthiness to describe conservatives who rely on gut feelings as a substitute for evidence. If there is any issue on which some liberals exhibit this quality in abundance, it is this one.

Tuesday, May 22, 2007

The democracy of encyclopedias

One time when I worked as a college tutor, a student referenced Wikipedia in his paper on plants. I asked him if he knew what Wikipedia was. He said no. I explained that it was a user-created encyclopedia, that anyone can alter the contents at any time, and that I could take my laptop right there and change the article to say, "Plants are little green men secretly plotting to take over Earth." The student looked at me in surprise, but I assured him I was dead serious.

At this point you might be expecting me to launch into an anti-Wikipedia rant. I refuse to jump on that bandwagon, however. Wikipedia is the perfect example of a new development that traditional people just don't "get." Not that these critics are wrong exactly. Wikipedia does often provide inaccurate information and should not be cited in an academic paper. But the critics assume that once they make this observation, the issue is closed: Wikipedia is virtually worthless as a resource. I have had a very hard time talking to people who take this attitude. When I try to defend Wikipedia, I am frequently greeted by a dismissive snort, as if to imply that giving Wikipedia any credit would be to demonstrate massive gullibility. ("What, you actually trust Wikipedia?") This reaction, in my view, reveals a somewhat one-dimensional perspective on what makes something a valuable research tool.

What is Wikipedia? It's an online encyclopedia in which anyone with Internet access may write an article or modify existing ones. You can make grammatical corrections, contribute a sentence, provide a citation, or add a new section. Of course, you can also put in something ridiculous or offensive. Many users do just that, but the site keeps a record of all previous versions of every article, and so as soon as anyone changes anything, people check. Outlandish changes usually get quickly reverted--but they still occur quite often. I once visited an article on John Ritter only to be informed that the late actor had risen from the grave. Editors may temporarily close off articles that get swamped by "vandals."
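
You can peek at that revision history yourself. Here is a rough sketch in Python using what I understand to be the standard MediaWiki query API; the article title is just the example from above, so treat this as a sketch rather than gospel:

import json
import urllib.parse
import urllib.request

# Fetch the five most recent revisions of an article, newest first.
params = urllib.parse.urlencode({
    "action": "query",
    "prop": "revisions",
    "titles": "John Ritter",
    "rvprop": "timestamp|user|comment",
    "rvlimit": 5,
    "format": "json",
})
url = "https://en.wikipedia.org/w/api.php?" + params
request = urllib.request.Request(url, headers={"User-Agent": "revision-history-sketch/0.1"})

with urllib.request.urlopen(request) as response:
    data = json.load(response)

# Every edit, good or bad, leaves a timestamped fingerprint.
for page in data["query"]["pages"].values():
    for revision in page.get("revisions", []):
        print(revision["timestamp"], revision.get("user", "?"), revision.get("comment", ""))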

Wikipedia has many standards that users are expected to uphold. Articles must have citations, "no original research," and a "neutral point of view." Every article has a discussion page in which users can work out conflicts or disagreements. Articles that fail to meet these standards will receive a tag pointing out this fact.

Forget about the accuracy question for a moment. I want to make a point that often gets overlooked in these discussions: Wikipedia is quite possibly the most extensive encyclopedia ever compiled. It is for sure one of the largest. (See here for comparisons.) I'm not sure the term "encyclopedia" does the site justice. The sheer number of topics covered is mind-boggling. To wit, you will find lengthy articles on all of the following:

1. Books, movies, TV shows (even individual episodes!), and music groups (even individual songs!). This includes not just classics, but also numerous obscure modern works. The selection is constantly expanding; I contributed the article on the novel Somewhere in Time just a couple of weeks ago.

2. In-depth information about small, specialized subjects that get only the most cursory treatment in standard encyclopedias. What are your hobbies? One of mine is juggling. On Wikipedia, not only are there articles on Enrico Rastelli, Francis Brunn, and other names not well-known outside the juggling community, there is also an extensive history of juggling, a thorough examination of each piece of equipment, and a detailed look at a wide range of techniques and tricks.

3. Obscure concepts from technical fields, like evolutionary biology's "population bottleneck" or computer science's "self-balancing binary search tree."

4. Almanac-like lists of the major events in any particular year.

5. A huge amount of information about cities. Not only is there a lengthy article about Baltimore, there are individual articles devoted to all the local universities, libraries, cemeteries, and even major streets! (My brother contributed an article about the local bus routes, which some editors considered deleting for being "unencyclopedic.")

At this point, you might be asking, "What's the point of all this information if it isn't reliable?" Now hold on just one second. Is Wikipedia unreliable? A controversial and hotly contested 2005 study in the journal Nature compared Wikipedia's scientific articles with those in Encyclopedia Britannica and found that Wikipedia on average has four errors per article, whereas Britannica has on average three. This fact becomes especially astonishing when you consider that Wikipedia's articles are typically much longer than Britannica's.

How can that be? How can an encyclopedia to which any twelve-year-old may contribute even begin to approach the accuracy of one compiled by a panel of experts? Therein lies the paradox of Wikipedia: even though it has an endless capacity for error, it doesn't necessarily have a much greater tendency toward error than traditional encyclopedias. It's true that any idiot can write an article, but it will then be subject to what amounts to a gigantic peer-review process.

There are some advantages to this format. The information tends to stay very up to date. (I have found Wikipedia's pages on celebrities updated within hours of their deaths.) And the editors seem to be very good at staying on top of urban legends. My 1993 edition of Compton's, in contrast, repeats under the topic of "Language" the old urban legend that Eskimos have many words for snow. That kind of nonsense would never last long in Wikipedia, where there may be a lot more ignoramuses, but there are also a lot more fact-checkers.

Still, the errors are there. They might be even worse in the foreign-language editions, which I've noticed are often simply amateur translations of the English edition. I corrected an article in the Hebrew edition that identified Connecticut as a city in Maryland. (I later figured out the cause of the error: the translator misunderstood a sentence in which the two states were listed one after the other, separated by a comma.)

Thus, Wikipedia should not be viewed as authoritative. Any information you get from it needs to be corroborated. That, however, is very different from saying Wikipedia lacks value as a resource. I will mention one example from my experience to illustrate my point.

When I was doing a school paper on Silence of the Lambs (the basis for this post), I looked up the term "psychopath" on Wikipedia, because a character in the film had applied the term to Hannibal Lecter. Wikipedia brought me to the page on "anti-social personality disorder," the clinical term, and listed the seven symptoms associated with the disorder, which I subsequently mentioned in my paper.

There was no problem of verification here: the article had a direct link to DSM-IV-TR, the diagnostic manual from which this information came. You might now ask why I needed Wikipedia--why didn't I just go directly to the manual? But how would I, a layperson who hasn't studied psychiatry, know in advance to go there? That's the beauty of Wikipedia. It gathers together an enormous amount of resources that might otherwise be hard to locate.

As it stands, corroborating Wikipedia's information is not difficult, because the good articles provide links and citations. Some of the less developed articles do not, but so what? You are free to dismiss any unverified information you find. It's true that some folks, like the student I tutored, may fall prey to the misinformation. But that's their problem. If not for Wikipedia, these same people would be getting their information from "Bob's Webpage." Wikipedia is very open about its process and should not be blamed if some people misuse the site.

Not only can misinformation be found in respected encyclopedias like Britannica, it can be found even in very scholarly texts. In other words, no resource should be viewed as 100% reliable. Corroboration is a standard procedure of research, and the proper use of Wikipedia is really no different than the way we approach any other source.

Certainly, Wikipedia is both imperfect and incomplete. That's a given that not even Wikipedia's staunchest defenders will deny. The site is a massive organic entity, constantly being tinkered with, constantly being updated, and much work remains to be done. In a way, I feel bad for the critics. They're in a Catch-22 situation, since the more they complain about Wikipedia's faults, the better Wikipedia becomes.

Monday, May 21, 2007

Curses

Even though profanity is commonplace in the movies, I've never quite gotten used to hearing it in music. Though I rarely swear myself, I'm not intrinsically opposed to hearing others do it. After all, one of my favorite movies is Pulp Fiction, and one of my favorite comedians is Chris Rock. But whenever I hear it in songs, it almost invariably seems coarse to me. Why the double standard?

It may have something to do with my age and generation. Much of the music I heard growing up came through the radio and MTV, both of which censor offensive language. Movies, on the other hand, I was most likely to see through theaters, video, and premium cable stations, none of which are known to edit for content. In any case, before the 1990s swearing was not remotely as commonplace in popular music as it was in movies. When Ozzy Osbourne reinvented himself as a reality TV star in 2002, he quickly gained a reputation as a foul mouth. Yet I cannot recall ever hearing profanity in any of his songs, from Black Sabbath to his present solo records. He came from a generation of musicians where swearing was rare in any music that got wide radio play. That tendency continued well into the '80s, despite the prevalence of strong language even in family movies like Back to the Future.

Having never followed hip hop, I first noticed the change during the alternative rock boom of the early '90s. Pearl Jam used the f-word in two early hits, "Even Flow" and "Jeremy," though it was so mumbled it often got past the censors. In the mid-'90s, Alanis Morissette brought profanity into the mainstream with songs like "You Oughta Know" and "Hand in My Pocket." Bleeped out words became increasingly common on adult contemporary stations.

Although I am a fan of certain artists who use profanity in their music, I have rarely found that the practice adds anything of value to a song. In Johnny Cash's wonderful cover of Nine Inch Nails' "Hurt," for example, the line "I wear this crown of shit" is changed to "I wear this crown of thorns." Now doesn't the altered version sound so much nicer? Hey, I know it's a dark song, but that doesn't mean I want to be reminded of poop.

I realize that what I'm saying might just be a cultural prejudice. Swearing itself is a curious phenomenon, if you stop to think about it. There's nothing intrinsic to the meaning of swear words that makes people take offense at them. The way we designate them as out-of-bounds, while tolerating other words with the same meanings, is almost superstitious. Sociolinguists, in fact, liken both profanity and racial epithets to the magical words deemed unutterable in certain tribal societies.

There are times when it would almost be perverse not to swear. Even the normally wholesome Bill Cosby couldn't help indulging himself one time during his classic performance Bill Cosby Himself:
I said to a guy, "Tell me, what is it about cocaine that makes it so wonderful?" and he said, "Because it intensifies your personality." I said, "Yes, but what if you're an asshole?"
If you replaced the word "asshole" with a more polite alternative, the joke would simply not work. This suggests that swear words occasionally convey nuances that milder language cannot achieve. Most of the time, however, people resort to swearing as a way of avoiding more descriptive language. In that sense, the real problem with swear words is not so much that they're crude as that they're clichéd. When overused, they begin to take on the quality of the word "smurf" in those old Smurf cartoons, just all-purpose expressions that make the language less varied.

Since movies aim to capture the dialogue of real people, swearing has a well-established place in them, even though it can be overdone--and often is. I have more trouble justifying the practice in music, because song lyrics, much more than dialogue, thrive on indirectness. That's one of the reasons that "Blowin' in the Wind" is so much better an antiwar song than, say, "Eve of Destruction." In music, it seems, the last thing you want to do is get to the point. Or it could be that I'm just getting old.

Tuesday, May 15, 2007

A paranormal romance

I recently discussed my love of the Richard Matheson novel What Dreams May Come. Now I'm going to talk about another Matheson novel, Somewhere in Time (originally titled Bid Time Return but later changed to fit the movie). I cannot exactly recommend this novel. In fact, I thought the 1980 film version improved on it considerably. Matheson, however, considers this book and What Dreams May Come to be his two greatest novels. He may even have conceived them as one book. The similarities between the two are striking. The protagonists in both books are blond 6'2" screenwriters who live in Hidden Hills, California and love classical music. They each have a brother named Robert who acquires the protagonist's memoirs but cannot bring himself to accept the otherworldly events described in them. Both books are paranormal love stories, but they emphasize different phenomena.

In Somewhere in Time, a 30-something man named Richard Collier has been diagnosed with an inoperable brain tumor and has decided, upon a coin flip, to spend his last days hanging around the Hotel del Coronado, a famous California hotel. There he grows obsessed with the photograph of a nineteenth-century stage actress, Elise McKenna, who once performed there. Through research, he learns that she never married, that she had an overprotective manager, and that she may have had a brief affair with a mysterious man while staying at the hotel. The more Richard learns, the more convinced he becomes that it is his destiny to travel back in time and become that mysterious man.

This is an interesting setup. But the execution is shaky. I could barely get through the first fifty pages, which consist of Richard's rambling journal entries in which nothing happens. I realize that Matheson was attempting realism, but I don't consider that a good excuse. A novel must involve the reader or it isn't worth our time. The book's deepest flaw, however, is the love story itself. I simply did not like the character of Elise. The development of her relationship with Richard feels artificial and forced. It's definitely a case of the journey being more interesting than the destination.

The novel's most striking feature is its depiction of time travel. It's probably the only novel I've ever encountered that proposes a step-by-step method that does not require any futuristic accessories or special abilities. The method is presented in such detail that it almost tempts readers to try it themselves. I bet that some have, though I have my doubts that any have succeeded.

Richard bases his method on the theories of a real book, J.B. Priestley's Man and Time. The basic idea is that he uses self-hypnosis to convince his mind that he's in the past. He listens over and over to a tape recording of his own voice declaring that the year is 1896 and listing many details of how his surroundings would look in that year. After discovering that his voice is distracting him, he writes the hypnotic suggestions out on paper, over and over and over. The historical roots of the hotel help reinforce his purpose, as does an 1890s suit he buys for himself.

Of course, skeptical readers may suspect that the time-traveling experience occurs only in Richard's mind. Matheson leaves open that possibility. So if you think the love story is unconvincing and seems more like a lonely man's pathetic fantasy, that may be just what Matheson intended.

The movie does a better job of handling these themes. The plot is clearer and more focused. There's no mention of Richard having a brain tumor, and there seems to be external evidence that his journey really took place, such as an early scene where an old woman approaches him in a crowd, hands him a pocket watch, and says, "Come back to me." He later takes the watch with him into the past, where he gives it to the woman when she's young. This generates one of the most famous time-travel paradoxes in the movies: the watch seems to have no point of origin, existing eternally in an endless loop. The book contains paradoxes of this kind too, though it handles them more subtly.

While the movie did poorly in theaters and received mostly negative reviews, it went on to become a cult classic, with an actual fan club. Being less than enamored of the love story, I was only able to admire the film's craft. It felt to me like an episode of The Twilight Zone, a show to which Matheson contributed heavily. But neither the book nor the film enthralled me the way the novel What Dreams May Come did, perhaps because this time I wasn't truly permitted entry into a new world.

Wednesday, May 09, 2007

The science of fruits, nuts, and flakes

Some years ago, not long after reading parts of Carl Sagan's The Demon-Haunted World, a book about the remarkable persistence of superstitions among educated people, I discovered that a highly intelligent woman I know, who has a science degree and has always seemed rationalist in her outlook, believes in astrology. She described the traits of my sign, Aquarius, and they did seem to fit me fairly well. I don't remember her description now, so instead I'll post what Wikipedia says about the sign:
Individuals born under this sign are thought to have a modest, creative, challenging, inquisitive, entertaining, progressive, stimulating, nocturnal, and independent character, but one which is also prone to rebelliousness, coldness, erraticism, indecisiveness, and impracticality.
I have to admit that that does kind of sound like me. I like to think of myself as creative, challenging, and independent. I know I am impractical and indecisive, and unquestionably I am nocturnal. On the other hand, the article also claims that male Aquarians "are often said to tend to be effeminate in appearance." Hey, I may not be the most macho of sorts, but I sure as hell ain't effeminate looking.

But therein lies the problem. One of the most obvious features of pseudoscience is allowing subjectivity to influence the testing of predictions. Having people examine their own personalities is notoriously unreliable, since people tend not to have accurate perceptions of themselves. What exactly would constitute a "bad" astrological reading? If you didn't feel the description sounded like you? If your friends didn't think so? And how closely does the description have to match your personality in order to prove that its alleged accuracy isn't merely coincidental? (For example, I suspect that there are many creative and independent types not born under Aquarius.) Astrologers never set up any quantifiable boundaries by which their "predictions" can succeed or fail. It's all left up to the whim of the person reading the horoscope.

Still, I can imagine ways in which the claims of astrologers might be tested. Gather a group of people together, and give only some of them their true astrological readings. Give the others a deliberately false reading. For example, the Gemini gets the Scorpio's reading. None of the people know whether they are receiving a correct reading or not. Perhaps none of them know there even are any incorrect readings. Even the person doing the testing doesn't know which ones are correct and which ones aren't. The subjects are then asked to rate their readings, on a scale of one to ten, by how closely they feel it describes them.

Now comes the fun part: compare the reactions of the people who got false readings and the people who got true readings. If the claims of astrology are valid, then the people who got true readings should be substantially more likely to think the readings accurately describe them. If, on the other hand, there isn't much difference between the reactions of the two groups, then the claims of astrology would seem to be bogus.
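
To make the comparison concrete, here is a minimal sketch of the analysis in Python. The ratings below are random placeholders standing in for the subjects' one-to-ten scores, not data from any actual study:

import random
import statistics

random.seed(0)

# Fifty subjects got their own sign's reading, fifty got another sign's.
# Each rates how well the reading describes them, from 1 to 10.
true_ratings = [random.randint(1, 10) for _ in range(50)]
false_ratings = [random.randint(1, 10) for _ in range(50)]

mean_true = statistics.mean(true_ratings)
mean_false = statistics.mean(false_ratings)

print("Mean rating, correct readings: %.2f" % mean_true)
print("Mean rating, swapped readings: %.2f" % mean_false)
print("Difference:                    %.2f" % (mean_true - mean_false))

# If astrology had real predictive power, the first mean should come out
# reliably higher; with placeholder ratings the two groups look alike.

A real study would need a proper significance test and far more care in how the readings are written and administered, but the principle is the same: either the two groups' ratings diverge or they don't.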

Has anyone ever tried such an experiment? Do believers in astrology even care? Hey, I know that even this test probably wouldn't pass muster with the scientific community. It still has the problem that people are examining their own personalities, rather than being objectively evaluated by a disinterested outsider.

Of course, many scientists will dismiss astrology out of hand simply because of its premise that celestial bodies many light years away can have a perceptible effect on human behavior. But I'm willing to entertain this premise for the sake of argument, because astrology in principle does make real predictions about observable facts. If those predictions were scientifically confirmed, we'd have to concede that there's something to the system, no matter how absurd its philosophical underpinnings may sound. Skeptics can take comfort from the fact that we're a long way from ever having to confront that possibility.

Thursday, May 03, 2007

Those other Jews

I will now quote an anecdote from Alan Dershowitz's wonderful book Chutzpah, because I think it provides considerable insight into the way a bigot's mind works:

"I related the story of a slightly eccentric Brahmin woman from Boston who came to see me about a legal problem. I concluded that her problem was not within the area of my expertise, so I recommended another lawyer. She asked whether he was Jewish, and I responded, 'What difference does that make?' She said that she didn't 'get along very well with Jews' and didn't know whether she could 'trust them.' I asked her why she had come to me, since I was obviously Jewish. I'll never forget her answer: 'The Jews I know are all fine. I have a Jewish doctor and a Jewish pharmacist whom I trust with my life. It's those other Jews--the money-grubbing ones, the dishonest ones--that I'm not comfortable with.'

"I pressed the Brahmin woman about whether she had actually ever encountered one of 'those' Jews, and she responded, 'Heavens, no. I would never allow myself to have any contact with such a person.' The lawyer I recommended happened to be Jewish, and the two of them got along famously." (p. 99)

Wednesday, May 02, 2007

The most overlooked novel

We generally expect books to be better than their film adaptations. The most extreme example from my experience may be What Dreams May Come. I strongly disliked the 1998 film, and I probably would have quickly forgotten about it, except that I noticed it was based on a 1978 novel by Richard Matheson, whose fiction I had enjoyed previously. So I picked up a copy from the library. It has since become one of my favorite books.

I am not alone in this reaction, but I am in a minority. Most people haven't read the book, and most people won't even try. There are just too many preconceptions standing in their way. And, unfortunately, the movie has helped further those preconceptions. People assume it isn't the type of book they'd like to read. They think it is too "New Agey" for their tastes, or too sentimental. On the first point, they are probably correct; on the second, they couldn't be more wrong.

Why do I like the book so much? Put simply, it is the most vivid, complex, and surprisingly convincing depiction of the afterlife I have ever encountered in a work of fiction. Nothing else I have seen on the subject, in literature or in film, comes close--certainly not the movie version of the book. Before I read the novel, I had no idea that a story about Heaven and Hell could have such a profound effect on me. I have always believed in an afterlife as a matter of faith, but I would never before have thought it could be convincingly described in human terms.

The story involves a middle-aged man named Chris who dies and goes to Heaven, and who ultimately descends into Hell to rescue his wife. It's basically a modern-day variation on The Divine Comedy, as a man gets a tour of the afterlife. In the metaphysics of the film and the book, dying involves shedding your physical body and entering a mental environment shaped by thoughts. Your fate in such an environment is largely self-imposed. If you're a decent, pleasant person, your afterlife experience will be pleasant; if you're someone with moral problems, you'll naturally have a more difficult experience. As the novel puts it, "People are not punished for their deeds but by them" (p. 265).

That much of the movie intrigued me, the first time I saw it. The problem was the schmaltz. I mean real schmaltz, piled on in large mounds, in place of strong narrative. The beginning gives us scene after scene of two lovers (Robin Williams and Annabella Sciorra) doing little more than stare at each other and giggle, with solemn music in the background. The film is so inept at fleshing out their lives that it shows their wedding, the tragic death of their children, and fifteen years passing before it reveals out of the blue that Chris is a doctor! The performances by Williams and Cuba Gooding, Jr. are surprisingly atrocious, maybe because they could scarcely believe their own lines.

For those who have only seen the movie, it's hard for me to convey just how very different the novel is. Of course there are major differences in the plot. One such difference is the ending. (Even Roger Ebert, who heaped high praise on the film, was disappointed by the ending.) Another is the beginning, where the film has Chris's children also die and go to Heaven. In doing this, the movie (1) makes the early scenes so depressing they become surreal, (2) needlessly clutters the story with extra characters, and (3) introduces a silly and confusing subplot about Chris's attempts to find his children, who are in disguise.

In the book, Chris's children are adults, not youngsters, and they're minor characters. The details of Chris's life on Earth differ so greatly between the book and the film that it's like reading about a completely different person. Even though I saw the movie first, the image of Robin Williams completely vanished from my mind as I read, because he was so unlike the character described in the book.

The entire feel of the book is different, telling a touching love story that uses real characterization, not cheap manipulation, to move the audience. And Matheson's vision of the afterlife truly comes alive on the page. The Hell scenes are actually terrifying, reminding us, as the movie does not, why Matheson is primarily famous as a horror writer.

I won't overlook the movie's gorgeous visual effects, which earned the film a well-deserved Academy Award. They just aren't put to good purpose. The movie envisions the afterlife as something like being inside giant paintings, an approach that fails to evoke a sense of reality. The book, in contrast, bases its afterlife imagery (vividly brought to life by Matheson's skillful prose) much more on Earth-like scenery. This approach ironically leads to far more exotic ideas, such as architects who build things using their minds, and a library containing history books more objective than those on Earth. Matheson puts the reader right inside this setting, as the following passage illustrates:
I noticed, then, there were no shadows on the ground. I sat beneath a tree yet not in shade. I didn't understand that, and looked for the sun.

There wasn't any.... There was light without a sun. I looked around in confusion. As my eyes grew more accustomed to the light, I saw further into the countryside. I had never seen such scenery: a stunning vista of green-clad meadows, flowers, and trees. Ann would love this, I thought.

I remembered then. Ann was still alive. And I? I stood and pressed both palms against the solid tree trunk. Stamped on solid ground with my shoe. I was dead; there could be no question about it any longer. Yet here I was, possessed of a body that felt the same and looked the same, was even dressed the same. Standing on this very real ground in this most tangible of landscapes....

I turned my hands over and noticed that their skin and nails were pink. There was blood inside me. I had to shake myself to make certain I wasn't dreaming. I held my right hand over my nose and mouth and felt breath pulsing warmly from my lungs.... (pp. 55-6)
And here is a passage from one of the book's Hell scenes: "Now I saw that, interspersed throughout the area we crossed, were pools of dark and filthy-looking liquid; I hesitate to call it water. A loathsome stench beyond that which I had ever been exposed to rose from these pools. And I was horrified to see movement in them as though unfortunates had slipped beneath the surface and were unable to rise" (p. 183).

Matheson, as I mentioned, is a famous horror writer. One of his unique qualities is his almost scientific approach to the supernatural. As Roger Ebert explains in his review of the Matheson-penned Legend of Hell House:
Matheson labored for years in the elusive territory between straight science fiction and the supernatural horror genre, developing a kind of novel in which vampires, ghouls, and the occult are treated as if they came under ordinary scientific classifications.

There was, for example, the Matheson classic I Am Legend.... In that one, a single normal man held hundreds of vampires...at bay by figuring out the scientific reasons for old medieval antivampire measures like mirrors, crucifixes, and garlic. The Matheson novels of the 1950s and early 1960s anticipated pseudorealistic fantasy novels like Rosemary's Baby and The Exorcist.
In What Dreams May Come, Matheson makes Heaven and Hell seem like a scientific, natural process, and one of the joys of the book is discerning all the intricate "rules" of how everything works. (That's another area where the movie falls short.) What needs to be kept in mind, however, is that Matheson doesn't do this just for entertainment purposes. In the novel's introduction, he tells his readers that the characters are the only fictional component of the novel, and that almost everything else is based on research. The book even includes a lengthy bibliography. Thus, the afterlife that Matheson describes isn't some fantasy world he concocted from his own head, but something he believes to be an accurate description of reality.

Some people may wonder, at this point, about Matheson's religious background. He was raised a Christian Scientist, but gradually developed what he calls his own religion, taking elements from many sources. One of the book's main influences, I believe, is eighteenth-century Christian mystic Emanuel Swedenborg.

The book avoids seeming "religious" in any way other than its setting. No deity appears as a character, and there is no mention of Christ or of anything specific to the theology of a particular religion. The characters occasionally talk about God, but it is left up to the reader to decide whether the Creator in charge of this system is at all like the Judeo-Christian God or like something more pantheistic. The book even implies that no single religion has the whole truth:
"For instance, you'll find, in the hereafter, the particular heaven of each theology."

"Which is right then?" I asked, completely baffled now.

"All of them," he said, "and none. Buddhist, Hindu, Moslem, Christian, Jew--each has an after-life experience which reflects his own beliefs. The Viking had his Valhalla, the American Indian his Happy Hunting Ground, the zealot his City of Gold. All are real. Each is a portion of the overall reality.

"You'll even find, here, those who claim that survival is nonsense," he said. "They bang their nonmaterial tables with their nonmaterial fists and sneer at any suggestion of a life beyond matter. It's the ultimate irony of delusion." (pp. 90-1)
From what I've seen, people react negatively to this book based on how far it departs from their personal beliefs. Christians complain about the absence of Jesus, while those who don't believe in any afterlife consider the story too nonsensical to accept. Most readers, it seems, are put off by the New Age terminology and concepts scattered throughout the book.

These reactions are puzzling, if you stop to think about it. Books about elves, fairies, dragons, and wizards remain popular even though nobody believes in any of those things. Why should people be bothered by a fiction book portraying a Heaven and Hell that conflicts with what they believe? The book is perfectly enjoyable whether or not you accept Matheson's metaphysics.

Of course, I personally do think Matheson provides insight into the subject--though I admit I'm a little wary of his acceptance of paranormal phenomena. But it amazes me how so many people refuse to even touch the book, thinking that any story with such a plot must automatically be hokey. In most cases, they'd be right. What Dreams May Come is a big exception. It suggests the endless possibilities in a subject that normally is dead weight for fiction. And it really makes you think.

Wednesday, April 25, 2007

Ultra-Beautiful

When I was a child, I knew of only two divisions in Judaism: frum and not frum. Frum, from a Yiddish word meaning "pious," is the Orthodox Jewish way of designating observant Jews. The word "Orthodox" itself seemed fairly alien to me, used mostly as a formality. Later, I became aware of a distinct sub-group called "Modern Orthodox," which I first conceptualized as frum Jews who ignore strictures against mixed dancing. Later still, I began hearing the term "ultra-Orthodox," which my friends and family perceived as a vague slur applied by ignorant outsiders to any Orthodox Jews they considered too extreme. We were irritated, and a bit perplexed, by the media's increasing use of the term as though it were an objective, neutral description of a distinct group.

By now, the term "ultra-Orthodox" has become so standard in the media that people use it without blinking an eye. I'm torn on the subject, wondering if I should still fight the trend, or just give in. There are two primary issues here. The first is whether the term is inherently pejorative. The second is whether such a group as "ultra-Orthodox" really exists.

To answer the first question, we need only look at the history of the prefix ultra. According to the Online Etymology Dictionary, the term originally meant "beyond" (as in ultraviolet) but came to mean "extremist" when applied to political movements.1 The connotation is that such movements are "beyond the pale," which is obviously a value judgment applied by outsiders, not a term people would normally apply to themselves. Occasionally you will find people today who proudly identify as ultraconservative or ultraliberal, but there's no question that those terms were originally intended as insults. It's like when some blacks call themselves by the N-word.

One blogger told me that he doesn't mind being called ultra-Orthodox, just as he wouldn't mind being called ultra-beautiful or ultra-smart. That's an interesting argument, but I think it proves my very point: people rarely use phrases like "ultra-beautiful," because the prefix ultra is generally reserved for insults--which is almost certainly how "ultra-Orthodox" was originally intended.

As an experiment, I googled the phrase "ultra-Orthodox." Of the first ten hits, two are Wikipedia articles, two are allegedly neutral news articles, one site complains about the term, and the remaining five are sites bashing ultra-Orthodox Jews. You may consider this result too small a sample to draw a conclusion, but I invite anyone to try the experiment on a larger scale. You will likely find what many of us have sensed all along, which is that "ultra-Orthodox" is widely used as an insulting term, and almost never used in a complimentary sense.

Of course, it is possible to take a pejorative expression and wear it defiantly, as a badge of pride. But so-called ultra-Orthodox Jews have made no collective attempts to do so. Those rare few who self-identify by the term are, I suspect, surrendering themselves to a trend they feel powerless against, rather than eagerly embracing the term.

Because the secular press regularly treats the term as a neutral expression, and because the term simultaneously exists as an insult, the people who use the term insultingly have gained a significant rhetorical advantage. It has become one of those words, like "fundamentalist," where you can pretend to be neutral while actually invoking a stereotype. I'm reminded of an article by Rabbi Shmuley Boteach in which he describes religious fundamentalism as immoral and destructive.2 He never bothers to provide a precise definition of "fundamentalism," and he seems unaware that the standard definition would apply to his own religious practice. He unblinkingly defines it according to the popular negative stereotypes associated with it. Once you start incorporating stereotypes about a group into the very definition of that group, you've won the argument before you've even started. The term "ultra-Orthodox" has that same two-faced quality.

That brings me to the second question. Does "ultra-Orthodox," insulting or not, refer to an actual group? It is supposedly the English equivalent of Haredi. There are certainly many Jews who self-identify as Haredi. Though there are also people who bash Haredim, nobody considers the term in itself to be insulting. It is fairly neutral, neither positive nor negative. (Repeating the Google experiment with "Haredi," I found three web definitions, one actual Haredi site, one site bashing the non-mainstream Neturei Karta sect rather than Haredim in general, and five neutral articles. The articles, I should mention, are on the whole more respectful than the ones I found in the "ultra-Orthodox" search.) There probably ought to be a campaign to have the secular press adopt the term, but for now it is rather obscure, known only to Orthodox Jews and occasional outsiders.

The problem, which few people acknowledge, is that Haredi is vague and imprecise. It presupposes that Orthodoxy can be neatly divided into two groups, those who reject the outside world and those who embrace it. The former are Haredi or "ultra-Orthodox," the latter are Modern Orthodox. This classification has been widely reported in the media, but it would raise the eyebrows of most Jews in my native Baltimore. Baltimore's Orthodox community is largely "yeshivish" or "black hat," two insider terms referring to non-Hasidic Jews who are stricter in their observance than the Modern Orthodox. By the two-camp classification, that would make them Haredi. But most Baltimore frummies do not fit the standard definition of Haredim. Most people here have a strong work ethic, for example, and there has been no community ban on using the Internet in one's home. Anti-secular attitudes exist here but do not generally prevail.

Modern Orthodox, for that matter, covers a wide range of attitudes and practices. Some people have attempted to recognize a third group, "centrist," represented most prominently by the Orthodox Union and Yeshiva University. This would cover Jews who are strict in their observance but who embrace the outside world. In fact, it is pretty common to hear Orthodox Jews using phrases like "left-of-center," "right-of-center," "far left," etc., as though Orthodoxy resembled the left-right political spectrum in the secular world. While still a simplification, this outlook is a vast improvement over conceptualizing Orthodoxy as two distinct "camps."

Thus, it's important to understand that when Orthodox Jews use a term like "Haredi," they usually recognize how blurry the dividing line is. But what about people outside the Orthodox community, especially those with little knowledge of Orthodoxy? Those people are likely to be considerably less understanding--and they're also more likely to use a term like "ultra-Orthodox" instead of "Haredi." It's no wonder, therefore, that most people who use the term "ultra-Orthodox" use it thoughtlessly, without a clear picture of what they're referring to. For many people, it's just a code word for "Jewish religious nut." Hence, it's not uncommon to see the term applied ignorantly to Religious Zionists, even though that group is usually distinct from the Haredim, at least in Israel.

If we were to run a successful campaign and the media were to stop saying "ultra-Orthodox" and to start saying "Haredi" instead, it wouldn't solve everything. Outsiders would continue to oversimplify the dynamics of the Orthodox community. But it would be a start. A few people might think twice before applying such an exotic term with such a broad brush to people they don't know.

Works Cited
1 The Online Etymology Dictionary, "ultra-," http://www.etymonline.com/index.php?term=ultra-
2 Rabbi Shmuley Boteach, "How Religion Leads to Fundamentalism," http://www.shmuley.com/articles.php?id=183

Friday, November 17, 2006

In defense of Orthodox liberalism

Cross-posted at DovBear's blog

R. Harry Maryles writes in this post, "It is a fact that the conservative principles are generally more in line with Orthodox Judaism than are liberal principles. Although that isn’t 100% the case, I think it is true most of the time."

I care to disagree. But I should note that if Harry had begun the sentence with "It is my opinion..." rather than "It is a fact..." I would not have objected. He is entitled to his views, but they are debatable. Still, I have heard similar sentiments from many other frum people, and it is a topic worth discussing.

A large part of what has inspired the rightward shift among frum voters in recent decades parallels the influences on evangelical Christians: the "traditional values" of which the Republican Party has appointed itself the sole bearer. While those values have nothing to do with the conservative philosophy of unfettered capitalism, Republican politicians created a marriage between these two meanings of conservatism. It is an unhappy marriage. Religious conservatives were duped by Reagan, and many of them have recently woken up to the fact that they've also been duped by Bush.

I've always been amazed at the mental acrobatics of those who argue that Judaism fits the philosophy behind economic conservatism. Their rationale depends partly on the standard but inaccurate translation of tzedakah as "charity." In modern American society, charity is simply a praiseworthy act. In ancient Israel, however, tzedakah was the law of the land. The conservative tenet that we must encourage volunteerism in place of government aid runs contrary to much traditional Jewish thought.

When I raised this point on Harry's blog, Bari noted differences between the ancient Jewish system and modern liberal programs. For example, in halacha a person gets to decide which poor people to give to. When I pointed out that one of the highest forms of tzedakah is giving to someone unknown, Bari replied, "And it's theft if you take it from me to give it to someone else who I don't know. When the govt. does it, maybe it's not theft, but it's not right Al Pi Din Torah."

Bari is walking on thin ice here. Either you think that it's okay to have the government enforce donations to the poor, or you don't. If you don't, but you make an exception for Judaism's specific mandates, and you declare anything else to be "theft" or something close to it, then you're not being philosophically consistent.

Having said that, I should point out that there is a good deal more to politics than philosophy. I don't fault any frum person for taking conservative positions on particular issues. There is room in Yiddishkeit for a variety of political perspectives, once we move past ideology and get into specifics. The problem is that many of us have a hard time stepping outside our own political perspectives and acknowledging that other viewpoints have legitimacy. When we feel strongly about an issue, it is easy to fall into the trap of ascribing simplistic motives to the other side and of not recognizing how complex the issue really is. I'm sure I have been guilty of this before, but I definitely see it in frum conservatives. It is implicit in Harry's statement that "conservative principles are generally more in line with Orthodox Judaism," which almost makes it sound like we can just do a head-count of political positions and declare this one as being more in line with Torah values, that one as being less, and so on.

So let me be clear: On almost any major issue in American politics today, a case could be made for both sides without sacrificing one's commitment to Torah principles. There are possible exceptions, like gay marriage or opposition to stem-cell research. But most issues fall into one of the following three categories:

1) Issues where the Torah's view is irrelevant. One example is gun control. Occasionally I have heard Orthodox rabbis on both sides of this debate attempt to "spin" their favored position as more Torah-based, but their arguments are unconvincing, for the disagreement (properly understood) does not stem from any fundamental difference of values and has no real bearing on halacha. So too with the vast majority of American political issues.

2) Issues where the Torah's view is relevant, but where there is still rabbinic support for both sides. An excellent example is the death penalty. Harry's mentor R. Ahron Soloveichik not only opposed the death penalty but believed that every Jew should.

3) Issues where Jewish law may seem more in line with one side, but where pragmatic considerations might tilt it the other way. This category includes many "social issues" that religious conservatives focus upon, such as abortion.

In sum, I welcome debate on the specifics of any issue. At the same time, I believe that there is much in common between traditional Judaism and many core liberal ideals. It's not absolute, but then neither is the pact that R. Lapin and co. have attempted to make with the Christian Right. And frankly I think the latter poses a greater danger to our freedom as Jews than the fuzzy liberal tolerance that so many frum people claim to despise. Christian conservatives may play nicey-nice to us, but in the long run they're being disingenuous, as becomes clear in the slip-ups by the less shrewd among them (e.g. Katherine Harris). You have to be extremely deluded to believe that the Christian Right views us as an equal partner. No doubt we should stand up for what we believe in, whether economic or social, but we must also be careful not to be so blinded by ideology that we enter into an unhealthy relationship.

Saturday, November 11, 2006

Too stupid for chess


It's one thing to know that someone is smarter than you; it's quite another to be reminded of that fact week after week after week.

From my childhood onward, I used to play chess regularly with a friend of mine. He beat me a good majority of the time. This made the game a tiring experience for me. I could have viewed my losses as a challenge, an incentive to work harder. But these were times when all I wanted to do was relax. The mental effort needed to keep track of a chess game just didn't inspire me.

Occasionally, we played other games, where our skills were more even. We even invented a new game we called "losing chess." While we weren't the first to come up with either the idea or the name, our version was somewhat original. In the "standard" form of losing chess, the object is to force the opponent to capture all your pieces, and the king holds no special importance. But in our variant, the object was to force a checkmate on your own king--in other words, to expose your king to capture when the opponent is threatening no other piece. This game turned out to be rather interesting and unpredictable. I tended to win the game (or "lose," if you will) slightly more often than he did.

I still tried to improve my skills in regular chess. I read books about chess strategy. I downloaded a fairly decent chess program to examine the strategies of a computer player. That actually kept me busy for some time, but as with every other computer game I've played on my own, I grew bored of it. In any case, my friend continued to beat me.

Only in the last few years did my interest in checkers get revitalized. Windows XP comes with a game called Internet Checkers. The computer sets me up with another actual player. The only information I get about the other player is his language and skill level. Players get a choice of three skill levels: Beginner, Intermediate, and Expert. I am set up with a player of the same skill level as long as one is available. Since the program rarely takes much time in setting up a game, it seems that numerous people around the world are constantly using this software. The only other possibility is that I'm unknowingly being set up with a computer player at least part of the time, though the instructions give no indication that the program ever does such a thing, and I believe I can tell the difference between a computer player and a human.

I can send the other player a message from the following pre-set list: "Nice try," "Good job," "Good game," "Good luck," "It's your turn," "I'm thinking," "Play again?," "Yes," "No," "Hello," "Goodbye," "Thank you," "You're welcome," "It was luck," "Be right back," "Okay, I'm back," "Are you still there?," "Sorry, I have to go now," "I'm going to play at zone.com," ":-)," ":-(," "Uh-oh," "Oops!," "Ouch!," "Nice move," "Good jump," "Good double-jump," "King me!" I assume that the comments get translated into whatever language the other player speaks. While I'm usually set up with another English speaker, I have also frequently been set up with players who speak German, French, Turkish, Arabic, Hebrew, Thai, and many other languages. No wonder there seem to be players available at all times of the day, and the night as well.

When I first started playing, I had very little knowledge of the game. I had played checkers before, and I was familiar with the rules. But I knew no strategies or techniques, except for a belief that I should avoid moving pieces in the back row. The first strategy I devised was a simple copycat routine: as long as I was the player who moved second, I could simply imitate the other player, doing a mirror image of his every move. Of course, the game always reached a point where I could no longer do this. Sometimes the opponent's opening move made it impossible for me to follow the copycat routine. But this routine usually got me to a point where I could find an advantage, and I did often win the game when playing as a Beginner.
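
For the curious, here is a rough sketch of what that mirroring amounts to. It assumes the standard checkers numbering of the 32 playable squares (1 through 32); the function names and the sample opening are my own choices for illustration, and the routine in my head was never anything this formal.

# Illustration only: the "copycat" reply. Rotating the board 180 degrees
# sends square n to square 33 - n, so mirroring a move just means applying
# that transformation to both of its endpoints.

def mirror_square(n):
    return 33 - n

def mirror_move(move):
    src, dst = move
    return (mirror_square(src), mirror_square(dst))

# Example: if my opponent (moving first) opens with 11-15,
# the copycat reply is 22-18.
print(mirror_move((11, 15)))  # prints (22, 18)

Of course, the symmetry only lasts until a capture or some other feature of the position makes the mirrored reply illegal, which is exactly when I had to start thinking for myself.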

I began to learn some tricks. One rule that casual checkers players frequently ignore is that when you can capture, you must capture. I think people avoid this rule because they feel that it limits their choices. But the computer versions of checkers require you to play by this rule. I began to discover that this rule is what makes the game so fascinating and unpredictable. Because you can force your opponent to capture a piece, you can make him do things he didn't expect to be doing, and then gain a sudden advantage. The simplest form of this technique is when you force your opponent to capture one of your pieces, and you end up taking two in return. I discovered this technique on my own, in a situation that frequently occurs toward the beginning of the game, at the side of the board. I became quite adept at making my opponent fall for this trick. But the opponent must be gullible enough to put himself into this vulnerable position. Moreover, I had to learn to avoid putting myself into this position. Because this is one of the simplest tricks, even players with modest experience usually know better than to allow it to happen. But they remain prepared should the opponent make himself vulnerable to this move. It's one of the litmus tests early in the game that makes it easy to tell the experienced players from the novices.

After I discovered that I was beating Beginner players the majority of the time, I decided to move up my skill level to Intermediate. Soon I moved it to Expert. Of course there was no guarantee that I was playing someone who actually was on that skill level. All it meant was that the player identified himself as being on that skill level. But I did beat Expert players less often than Beginner players, and while the challenge intrigued me, I sometimes went back to the Beginner level just to relax. I had given up the copycat routine and started to learn more sophisticated strategies.

Finally, I got a book out from the library on checkers techniques. Reading this book greatly refined my skills, teaching me techniques that I still use to this day. For one thing, I radically changed my opening strategy. I used to open the games most often by moving my side pieces first. Apparently, this is a common error that beginners make. They move the side pieces because the side pieces are less vulnerable. Unfortunately, this strategy is weak in the long run, because it doesn't help break through the opponent's defenses. It is best to start by moving the pieces in the center of the board, and to keep one's pieces close together. I also learned that moving the back row is not necessarily to be avoided. What I should avoid is moving the second and fourth pieces in the back row, but the first and third often can safely be moved early in the game. I also learned to keep my own double-corner well-protected, and to work on attacking the opponent's double-corner.

Getting used to these new techniques took some time. At first, I experienced some difficulties, and it appeared that I was getting worse, not better. But I soon realized that I was simply taking time to get accustomed to using the techniques properly. This new opening strategy made it easy for me to fall into a devastating trap, where the opponent would get a two-for-one and a king. But as I learned more caution, I began to see the great advantages of this strategy. I began to win games without using any tricks, simply by having a strong defense and by either making the opponent's back row vulnerable or putting him into a position where he couldn't move except by exposing his pieces to attack.

Of course, I also learned some more advanced tricks, not just from the book, but from a checkers program I downloaded where I got to examine a computer player's strategies. I learned complicated moves that involved making my opponent capture one piece, then another, and then finally launching a devastating attack that he had no idea was coming. To play checkers well, you have to be able to think four-dimensionally, to anticipate future moves by visualizing the board in other configurations.

The strategies for handling game endings are just as complex. For starters, if you have two pieces left and your opponent has only one, you can definitely beat him--but it requires some practice to learn how. If you have three pieces and your opponent has two, your best bet is to force him to take one of your pieces in exchange for one of his. It is possible for him to prevent you from doing this, using one or both of the double corners, and such a game will end in a draw even though you have more pieces.

I wanted a way to track my progress. Internet Checkers does not record wins or losses. So I created my own file where I kept track of that information. Next to each skill level, I wrote how many games on that level I won, how many I lost, and how many turned out as draws. Here are my current stats, which are still ever-changing:

Expert: 2697/964/365
Intermediate: 422/141/38
Beginner: 830/101/278

According to this record, I win approximately two-thirds of the time at any skill level. But I suspect my Beginner and Intermediate win rates would be much higher if not for the fact that most of the games I played at those levels occurred long ago, before I improved my skills. I regularly play the Expert level now, only rarely venturing to lower levels.
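
To show where that two-thirds figure comes from, here is a small Python sketch of the arithmetic. It is purely illustrative--the record itself is just a file I update by hand, and the variable names are arbitrary--but it tallies the numbers listed above.

# Win rates implied by the tallies above (recorded as wins/losses/draws).
stats = {
    "Expert": (2697, 964, 365),
    "Intermediate": (422, 141, 38),
    "Beginner": (830, 101, 278),
}

for level, (wins, losses, draws) in stats.items():
    total = wins + losses + draws
    print(f"{level}: {wins} of {total} games won ({wins / total:.0%})")

Run on those numbers, it works out to roughly 67 percent at Expert, 70 percent at Intermediate, and 69 percent at Beginner--close enough to two-thirds across the board.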

I have my own rules for determining whether I have won, because Internet Checkers has an annoying but understandable loophole: any player can abandon a game at any time. Thus, if a player is losing, he may simply quit without selecting the "resign" option. Sometimes this is not mere rudeness: computer and Internet glitches can cause a game to be ended prematurely. But I made a personal rule that if a player quits and I am clearly ahead, with more pieces, I record that in my personal file as a win. Similarly, if a computer glitch causes the game to be terminated and the opponent is ahead, I record it as a loss. If I'm at the end of the game and the opponent refuses to draw even though he clearly can establish no advantage, I quit the game and record it as a draw. I have recently adopted a 40-move rule used in official competitions, which says that if a player can gain no advantage in 40 moves, the game is automatically a draw.
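
Spelled out as a decision rule, my bookkeeping looks roughly like the sketch below. This is purely illustrative: I have never written these rules down as code, and every flag here is a judgment I make myself, not anything Internet Checkers actually reports.

# Illustration only: my informal rules for recording games that end
# without a normal result. All inputs are hypothetical flags I judge
# for myself.

def record_abnormal_ending(opponent_quit, glitch_ended_game,
                           i_am_clearly_ahead, opponent_is_ahead,
                           opponent_refuses_obvious_draw,
                           moves_without_progress):
    if opponent_quit and i_am_clearly_ahead:
        return "win"    # quitting instead of resigning doesn't erase the result
    if glitch_ended_game and opponent_is_ahead:
        return "loss"   # an aborted game I was losing still counts against me
    if opponent_refuses_obvious_draw:
        return "draw"   # I quit and call it a draw when no advantage is possible
    if moves_without_progress >= 40:
        return "draw"   # the 40-move rule borrowed from official competition
    return None         # the rules above don't cover the remaining cases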

These personal rules which I have concocted are relevant only to myself. The opponent doesn't know that I play by them, because I have no way of communicating them to him, given the limited pre-set list of comments we can pass between each other. I am somewhat amazed at the lame tricks that some of my opponents attempt to pull. For example, if they are losing, many of them will ask for a draw, hoping I will accidentally click "yes" when the pop-up window appears. On rare occasion I have even fallen for this trick, but so what? It makes no difference as to the truth of who won, and the truth is all that matters to me when I keep my personal record. I view these games as practice and recreation, and the recognition of having the computer say "You won!" means nothing to me.

Apparently, all this practice has paid off. After coaxing my friend to play checkers with me, I discovered that I was suddenly much better at the game than he was. He's still good--he has a strategic mind. But he's nowhere near my level, and he hasn't come close to reaching it. I've actually become a sort of tutor to him, showing him some of the techniques I've learned and giving him corny advice, like "The best offense is a good defense."

Why did I discover such skill at checkers, when I was always so hopelessly bad at chess? Part of the reason is that I stumbled upon the software that allowed me to compete with real players. This not only has kept me from growing bored of the game, but has been enormously good practice. Most of what I know now, I learned simply from the experience of playing, not from the strategy books. Perhaps if I were to start playing Internet versions of chess, my skill at that game would improve as well. But so far I lack the interest. Checkers just seems more suited to me than chess. It's a much simpler game, with far fewer rules. You have basically only two types of pieces, and you use only half of the board space that you use in chess. I'm the type of person who has trouble multitasking, processing many different things at once, and that may be the key to why I find checkers easier to deal with. I'm not going to admit that it's simply because I'm too stupid for chess.

Wednesday, October 25, 2006

Moral philosophy

I used to believe that atheists have no basis for accepting the idea of morality. I no longer believe this is true.

I have recently been discussing the issue of morality and religion on David Guttmann's blog. I admit that my views on this issue have significantly evolved in the last ten years. For my more recent views on the matter, see this post, which is based on an essay I wrote at the end of college. An essay I wrote at the beginning of college, however, presents quite a different perspective. I present an abridged version of the essay here:
Why must we act morally in the first place? An ethical system founded upon a religion that worships God is, in principle, more rational than an ethical system that denies this basis.

Personal feelings are too subjective a foundation for an entire moral code. The conclusion that "murder is immoral," for example, is not a direct implication of the premise that "murder does not feel good (to some people)." A person may be firmly ethical without following his or her emotions, while a person who follows his or her emotions may be morally lacking.

The argument that morality is a necessary part of society is also insufficient, because it is possible to behave contemptibly without harming society as a whole. While all cultures retain values at their foundation, they are free to adopt new tenets and discard old ones as time goes by. Morality, in contrast, is a permanent value system. It is not a product of Western society but rather an ideal toward which most societies strive. To say that murder is immoral is to imply that there is no possibility of it becoming moral anytime in the future, even if society would begin to approve of it. This is particularly evident in such topics as abortion or euthanasia, where people do not agree on the definition of an ethical concept. Their goal in debating the issue is to make society acquire a better understanding of morality. Since no society is infallible, no society provides us with the ultimate philosophical basis for ethics.

Another possibility is that disregarding proper ethical standards is self-destructive. It is certainly true that many people's moral actions are motivated by a desire for self-preservation. Obviously, most of us will follow the law of the land so as not to be punished. More to the point, all our actions have natural consequences, and for some moral actions, the consequences for the person who performs them are favorable. For example, the case for environmental protection is strengthened by the fact that problems in the environment can have hazardous effects on all life on earth, including ourselves. On the other hand, the principle does not accurately describe all situations. Throughout history, evil societies and people have thrived while innocent individuals suffered. How does the self-preservation theory account for cases when criminals escape justice, even if such cases are uncommon? In any case, the suggestion that the goal of morality is self-preservation trivializes the concept, which has nearly always been understood to go beyond self-interest.

That last point exemplifies the problem with all these theories, which is their inconsistency with the morality understood to exist in our daily lives. If morality were merely a matter of personal taste or choice, as some philosophers have suggested, that would fail to explain most people's passionate hostility toward opposing moral philosophies. The passion implies that in most real-life situations, morality is assumed to occupy an objective reality of its own. In contrast, the previously mentioned theories invoke a conception of ethics that is more narrow and vague than the conception most people apply in practice.

Despite the general perception that morality is objectively true, people tend to relegate it to a separate realm of reality. For example, most people assume it to be an objective fact that advocating murder is "wrong," but would probably treat a comment that advocating murder is "inaccurate" with bewilderment. Moral judgments cannot be measured by accuracy, they would say. On the other hand, the same people would probably agree that the immorality in a doctrine of racial superiority is intrinsically related to the fact that it is false. The identification of racism as morally wrong in addition to being false is, like the first example, unprovable. Nevertheless, in the second example, we notice a logical connection between the realms of fact and value. It is inherently impossible to "prove" statements about how people should behave, yet such statements still constitute a kind of philosophical knowledge. This is clear because we so frequently use demonstrable facts to back up moral propositions.

If we assume that morality implies a system occupying a sphere of reality, the question presented at the beginning of this essay--Why must we act morally in the first place?--should be rephrased: How do we know that morality exists to begin with? That is, what knowledge confirms our general belief that all people must behave in certain ways and not others? From a rational standpoint, the assumption that morality exists is not self-evident, even though many people--including many religious people--treat it as such. It is rather a logical implication of the fact that it is God's will. Because God created and is in control of the world, He wants us to behave in specific ways, to satisfy the purpose of creation. This also provides the strongest rationale against the destruction of nature and society--because they are all part of God's creation.

Without God, there is no standard by which any statement about how we should behave can be judged true. All we can say is that some people are motivated to behave that way, yet that does not tell us whether people should behave that way. Religion allows us to view moral propositions as truths rather than simply as preferences, instincts, or rules designed to maintain social order.

Accepting God's will as the ultimate basis for ethics does not preclude the previously mentioned motivating factors behind moral behavior; it simply gives them a unified point of reference. Compassion still plays an important role by enforcing moral rules deep within our psyche. Furthermore, even religiously based ethical systems use the standard of what is beneficial to society to decide on specific moral issues. Finally, religion embraces a version of the self-preservation argument, by its conviction that God punishes all evildoers. This belief, though, is based on faith rather than observation, whereas the secular version of the argument is an attempt to explain morality without reference to God.

I am not implying that those who do not recognize the ultimate basis for ethics are necessarily amoral. Divine authority is not the only cause of moral behavior, although it is the ultimate rational basis for ethics. The bottom line is that in the absence of religion, there are no ultimate grounds for condemning a person who chooses to behave immorally. Why is the evildoer's choice inferior to anyone else’s choice? None of the non-religious explanations for ethics answer that question; they merely clarify why some people prefer certain ways of behaving over others. Only when we recognize that the ultimate moral authority is God do we have a universal explanation for morality that applies equally to all people in all cultures, regardless of what the people or the cultures themselves may believe.
I actually still accept some of the ideas expressed in the above essay, even if I reject the general thesis. The philosophers who have attempted to root morality in social pragmatism or personal preference have not provided a convincing case for ethics. But I recognize now that basic morality is deducible without believing in God. I believe that the purpose of religion is to move us beyond this basic level in an attempt to perfect the world.