Wednesday, December 26, 2007

Funny, you don't look Semitic

On Wikipedia, there's a long-running debate over whether the term anti-Semitism should include a hyphen or not. Last I checked, the less common form antisemitism won. It's a contentious issue. In his book Jewish Literacy, Rabbi Joseph Telushkin argues that the hyphenated version "only fosters the impression that there is a wider ethnic entity against which 'anti-Semitism' is leveled" (p. 467).

On occasion, people will use that impression to deflect the charge. The editor of an Egyptian newspaper wrote that "the Israeli and Zionist media have manipulated the concept of Semitic ethnicity so as to apply to Jews alone." (Al-Ahram Weekly, Nov. 2003). The Jewish-born chess champion Bobby Fischer, who maintains that the Holocaust didn't happen, claims not to be an anti-Semite since "the Arabs are also Semites and I'm definitely not anti-Arab." (The New York Times, Sep. 2, 1992, pg. A1). An Arab American defending Jesse Jackson shortly after the "hymietown" incident told a reporter: "In my mind, I don't consider him an anti-Semite. Arabs are Semites, too, you know." (The New York Times, Apr. 28, 1984, pg. 8). Perhaps the most disturbing instance of this reasoning came in a New York Times editorial:
American Jews have reason to be particularly sensitive about demonizing a Semitic people. In unthinking caricature, Arabs are portrayed as demented terrorists or greedy oil sheiks. This is a variation of the hateful depiction of Jews as rapacious bankers or sinister revolutionaries. Anti-Semitism is anti-Semitism in both forms. (Sep. 9, 1990, pg. E24)
I have seen websites that take this argument a step further. By combining it with the claim that modern-day Jews are the descendants of converted Khazars, these sites allege that Arabs are really the "true" Semites, the implication being that the term "anti-Semitism" should only be used to describe bigotry against Arabs and not bigotry against Jews! It is as if redefining the term suddenly changes the reality of which group deserves more sympathy.

When I hear arguments of this sort, I have my doubts that the presence or absence of a hyphen is likely to make much of a difference. These aren't innocent errors; they are deliberate attempts to muddle the issue. Most Americans understand perfectly well what the term anti-Semitism means, and serious people do not go around dissecting everyday words to make overly literal interpretations of their constituent parts. If you ask to be shown to a bathroom, and someone brings you to a room with a bathtub but no toilet, you'd rightly think the person was either crazy or a very bad jokester. That probably wouldn't change even if "bath-room" were written with a hyphen.

To play that game with "anti-Semite" is even stranger, since Semite scarcely exists as an independent word in modern English. We talk about Semitic languages, but as an ethnic term, Semite is simply a relic of an old racial category that no one today takes seriously. Even white supremacists don't generally think of Jews and Arabs as one entity. ("I was walking down the street, and there was this dumb Semite standing there....")

Wikipedia also has an article on what it calls "anti-Arabism." I suppose that's a reasonable coinage. But I can almost hear the tone of resentment: "You guys have your term, so we should be allowed our own." Of course, Jews didn't invent the term anti-Semitism. It was invented by anti-Semites themselves, as a term of pride. It is usually credited to the nineteenth-century German racialist Wilhelm Marr, who used the term as a substitute for the more vulgar-sounding Judenhass, or Jew hatred. Appearing at a time when racial theories were in vogue, Semite was a convenient euphemism for "Jew," the only Semitic type with a significant presence in Eastern and Central Europe.

Marr and others actually started organizations with names like the Anti-Semitic People's Party, the French National Anti-Semitic League, and the Universal Anti-Semitic League. With a few exceptions, these bigots were not known to extend their hatred to other Semitic peoples. The most famous anti-Semitic regime in history made a pact with Arab leaders, dubbing them "honorary Aryans" (also a euphemism). It was only after WWII, following the West's official rejection of the concept of racism, that anti-Semite lost its respectability and became the type of word that very few people adopt as a self-description.

To most Americans today, it sounds pretty bizarre that there were ever organizations so devoted to contempt for Jews that they made it part of their name. Until recent times, indeed, anti-Semitism was the only common word in English denoting opposition to a specific minority. English doesn't have a word for hatred of blacks (except for the obscure and problematic Negrophobia). We got through the entire civil rights era with the more general term racism.

In theory, we don't need the term anti-Semitism either. We could get by with a less official phrase, such as "anti-Jewish bigotry," which would actually be clearer and less confusing. But having a term like anti-Semitism around makes the phenomenon stand out and seem distinct from other prejudices. That's what bothers people who think that Jews receive too much attention, for better or worse.

It's not surprising that anti-Zionists on the left become indignant when they get accused of anti-Semitism. Their whole philosophy, after all, depends on opposing racism, and to them anti-Semitism is simply another form of racism that has perhaps received more attention than it deserves. But a lot of animosity toward Jews these days seems aimed at rendering Jews insignificant. The notion that Jews are important enough to merit an entire ideology devoted to hating them, complete with its own term, is something that could only have been dreamed up by a Jewish conspiracy.

Friday, October 19, 2007

How Roy Orbison changed my life

The ability to compose music is, I'm convinced, something you're born with. The tricky part is learning what to do with the compositions. That takes knowledge and practice.

I remember composing songs as a child without even trying. For example, I once had a nightmare about some creature trying to push me down the steps, and it sang, in a tune I distinctly remember, "So you have to fall downstairs." Later, I made up tunes on a Casio synthesizer I still have.

But my songs were lacking in a quality I didn't then understand. Each one was just an isolated sequence of notes, without being organized into anything larger. Aside from the dream, only one of my songs had lyrics, but it consisted of nothing more than one verse and a chorus. I somehow didn't realize that it needed something more to be complete.

The day that all changed was around the time Roy Orbison's posthumous hit "You Got It" was released. I'm still not quite sure what it was about this seemingly ordinary pop song that changed my entire musical outlook.

I first heard the song on VH1. I wanted to hear it again but didn't get a chance for several months. When it finally aired on the radio while I was listening one night in my bedroom, I was extremely excited, and I listened to the song closely.

For the first time in my life, I became conscious of song structure. The song had three distinct sections, which I later learned were called verse, pre-chorus, and chorus. It went through these sections twice, then it had a side part I later learned was called a bridge, then it returned to the pre-chorus. This arrangement wasn't unique or original, but having never paid attention to structure before, I found it clever.

I began to dissect other songs. I discovered that most rock and pop songs fall into one of just a few common structures. "You Got It" has one of the more complex ones, originating perhaps in the 1960s.
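
For anyone who likes to see this laid out, here is a minimal sketch in Python of the idea that a song structure is nothing more than a sequence of section labels. The specific layouts are my own rough transcriptions of the patterns discussed in this post (the final chorus after the bridge in "You Got It" is my assumption), so treat this as an illustration, not a definitive analysis.

# Illustrative only: a song structure written down as a sequence of section labels.
# The layout of "You Got It" below follows the description in this post; the final
# chorus after the bridge is an assumption about how the song resolves.

YOU_GOT_IT = (
    ["verse", "pre-chorus", "chorus"] * 2      # two passes through the main cycle
    + ["bridge", "pre-chorus", "chorus"]       # the side part, then back around
)

# Two of the simple structures common in early rock 'n' roll, discussed later in this post.
VERSE_CHORUS = ["verse", "chorus"] * 3                     # no bridge
VERSE_BRIDGE = ["verse", "verse", "bridge", "verse"]       # no chorus


def summarize(structure):
    """Return (number of distinct section types, total number of sections)."""
    return len(set(structure)), len(structure)


for name, song in [("You Got It", YOU_GOT_IT),
                   ("verse-chorus", VERSE_CHORUS),
                   ("verse-bridge", VERSE_BRIDGE)]:
    kinds, total = summarize(song)
    print(f"{name}: {total} sections, {kinds} distinct kinds")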

I went back to the Casio and created an instrumental piece with three sections, plus a bridge. It was my first composition that felt in any way complete. Over the next year, I composed numerous other songs, most of which I can still play.

I was twelve back then. I would have expected my interest in dissecting songs to fade after a while. But to this day whenever I hear a new song, determining its structure is part of the listening experience. Songs with unconventional structures always intrigue me.

Early rock 'n' roll generally stuck to very simple structures--either verse-chorus with no bridge, or verse-bridge with no chorus. The songs tended to be around two minutes long, and when a song was longer, it was usually because of more verses or a slower tempo, not because of greater complexity. The increase in structural complexity in rock music was gradual, probably peaking around the late 1960s when popular rock bands like the Beatles and the Moody Blues experimented and broke out of the radio format that demanded songs of no longer than three minutes. This was the era of long songs, sometimes a result of arbitrary repetition (as in "Hey Jude"), other times due to unusually intricate structures (as in "Stairway to Heaven").

Still, rock and pop have continued to confine themselves to familiar formulas, with only particular genres such as heavy metal and progressive rock seeking to emulate the complexity of classical music and jazz. On rare occasion, a pop song will attempt a more ambitious structure, as in Madonna's surprisingly multi-layered hit "Like a Prayer."

Structure has a significant effect on the feel of a song, even if most listeners aren't conscious of it. The wrong structure can hurt a song even if the music is good. A case in point is Rush's "Roll the Bones" (the video of which can be seen here). It is the perfect example of a good musical idea ruined by structural overkill.

The chorus is great. But the song takes nearly a minute and a half to reach it, and the verse and pre-chorus that precede it sound like they come from a totally different song. It's as if the songwriter felt he needed to fill something into that time slot, so he grafted on the leftovers from an earlier composition.

But that's only the first problem. The song came out in 1991, when it was common for a pop song to include a "rap" verse, and Rush bowed to this trend. In the video, the rapping is done by a silly-looking animated skeleton, which appears after three minutes of live action. The rapping ends and the song cuts to a solo, where we think it's going to return to the chorus. But no--it goes back to the rapping skeleton for another minute! Finally, it returns to the chorus, and the song ends, clocking in at five-and-a-half minutes, pretty much ensuring its exclusion from mainstream pop radio.

The skeleton's monologue includes lines like "Gonna kick some gluteus max," along with a series of similarly cheesy puns and rhymes. This section seems especially incongruous next to the song's earlier lyrics, which were serious and contemplative. The song lacks cohesion: it's like a bunch of spare parts thrown together. Maybe they were rushing to meet a deadline, no pun intended.

What Rush probably should have done is make the chorus the entire song and abandon the verse and pre-chorus sections, not to mention the rapping. One-part songs aren't common in progressive rock, but even Rush has done them, as in their 1977 song "Closer to the Heart." There's nothing wrong with a song having many sections, but they should complement each other rather than sounding like appendages. It's hard to explain why some songs do this correctly while others don't. But you know it when you hear it.

I can only speculate why an understanding of structure helped me learn to compose properly. Deciding which structure to use for a particular song is not an exact science. I suspect that many musicians just pick one that "feels right," without giving the matter much thought. But for me the music itself always came easily, and all I needed to learn was how to organize it into something coherent. And for that I have Roy Orbison to thank.

Sunday, October 07, 2007

From amoebas to elephants

Panati's Extraordinary Endings of Practically Everything and Everybody, a book whose accuracy I have sometimes found wanting, has an interesting take on a familiar fact. According to Panati, sex and death are two aspects of the same phenomenon. Organisms that reproduce asexually do not have finite lifespans like we do. The amoeba simply splits into two amoebas, which in turn exist until they themselves split. As long as it isn't killed, it multiplies endlessly without ever dying.

But in sexual reproduction, two individuals contribute genetic material that grows into a new organism, then they die. Death is simply the logical corollary to a process that uses a mere fraction of an organism to produce a new creature. In humans, it may happen decades after the sex act itself, but it's inevitable--and linked to the fact that we are sexual beings.

"In one sense," writes Panati, "we possess immortality, but not where we want it. We have it in our generational genes. We'd prefer it in our body, in the form we cherish, in the face that gazes reassuringly back from the mirror" (p. 6).

Despite Panati's argument, there isn't much to envy about amoebas. They don't care about their immortality any more than your sex cells care about theirs. Lacking consciousness, they are unable to care.

For your part, cloning wouldn't satisfy your desire to live forever. You would think of the clone as a new person, just as you think of identical twins as two separate individuals despite their genetic sameness. What you really want is a continuance of your soul, not your body. Even if you don't believe in the soul, you recognize a distinct inner self which perishes at death. Immortality, to you, means the permanent existence of this self.

Previously, I raised the question of whether you'd be willing to die if a clone with all your memories were created in your place. That sort of experiment could be the key to immortality, assuming that merely copying the information in your brain to another vessel would effectively move your consciousness there, like a transfer of data between computers.

There are problems with that assumption, however. In his book How to Write Science Fiction and Fantasy, Orson Scott Card discusses a fictional species that communicates by transferring memories. According to Card, "individual identity would be much less important to them than to us. And death would be almost meaningless. As long as you passed memories before you died, then everything you thought and experienced would continue to live on, so that even though you might cease to take part, everyone in the community would clearly remember having done everything you did!" (p. 50)

Card here implies that there is some intangible "you" existing independently of your memories, so that even if those memories are passed to someone else before your bodily death occurs, "you" perish. But if "you" are nothing more than your memories, as some philosophers have argued, then how is a memory transfer from one body to another any different than what happens in life from each moment to the next?

I believe it is different. No matter what the philosophers say, there is something intangible inside us. Consider the attempt to create an android with human emotions. You might program it to have a distinct state called "sadness," where it would display symptoms such as frowning, downcast eyes, and broken concentration. But it's doubtful that any of this would cause the machine to experience sadness, any more than an actor experiences the emotions he performs.

Inner experience, one of the most mysterious features of human life, is not just information in the brain. It is the part of yourself which experiences the information. You can't prove it exists. You're directly aware of your own, but you can only infer that other people are something more than machines programmed to behave as though they have inner experience.

Philosophical materialists typically attempt to ignore or downplay the mystery of inner experience, viewing it as merely another illusion to be swept away by the inevitable march of scientific progress. The problem with that argument is that since all illusions are perceived through a person's inner experience, to call inner experience itself illusory borders on the tautological. Inner experience is the elephant in the room, the one thing in existence that absolutely defeats a materialistic explanation.

Wednesday, August 22, 2007

Interpreting dreams

According to popular dream interpretation manuals, if you dream about a cat, that means you are thinking about female sexuality. Parrots in dreams, on the other hand, symbolize gossip, or that "a message is being conveyed to you," or that "someone is being repetitive or even mocking you." I mention all this because I had lots of cat dreams as a child, and plenty of parrot dreams as a teenager. Of course, the fact that I've been around cats since I was four, and acquired a cockatiel when I was older, has nothing to do with it.

Never do these manuals stop to consider that dreams might be based on memories of daytime experiences. They always start from the assumption that all concrete objects are metaphors for suppressed emotions and anxieties, often of a sexual nature. Where they get these explanations I have no idea. It all sounds to me like a rather clichéd attempt to apply literary analysis (with Freudian overtones) to dream narratives.

Still, it's curious that some dreams seem so widespread. For example, lots of people report dreaming about accidentally going outside without clothing. Other common dreams include flying, or falling, or having one's teeth fall out, or facing an exam after having forgotten to come to class the entire semester. If dreams are particular to individuals, why are some of the most absurd ones nearly universal?

My personal theory is that dreams depict events that we expect might happen based on our life experiences. Many dreams consist of little more than distorted memories mixed together. Whenever I dream about an experience I've never had, I get the sense that my dreaming mind believes it could happen. There are things I desire which I never dream about, because I don't expect them to happen. Thus, I'm inclined to reject Freud's theory of dreams as wish fulfillment, even though I know he would argue that my subconscious mind suppresses my true desires.

An important mechanism in dreams is exaggeration. I once dreamt I was visiting a friend who in real life has one cat, but in the dream had five. I've also had dreams in which my own pets spoke English--not just my parrot, but my cats as well. I believe this reflects the way I anthropomorphize animals, thinking of them in such human terms that I practically expect them to talk. Only common sense (which I lack during sleep) makes me know better.

What about nightmares? How do they fit my theory? The answer is that when we're anxious about something, some part of us expects it to happen. That explains the feeling of inevitability in nightmares, nicely captured in The Wizard of Oz when the lion appears as soon as Dorothy begins worrying about "lions and tigers and bears." Those dreams in which we triumph over scary things, as Dorothy ultimately does, reflect our confidence.

As for the dreams where we walk outside without clothes, I suspect that they stem from our anxiety over relying on habit to get through daily routines. Most of the time when we dress ourselves, we aren't thinking about the activity at all. Just as we're occasionally forgetful with other habitual activities, it is semi-plausible (and I believe it has happened) that someone might forget to put on a vital piece of clothing. In the embellished world of a dream, that fear translates into reality.

As it stands, scientists now believe that REM sleep (the period in which most of our dreams occur) serves a memory-related function. It almost appears as if we're not meant to remember any of our dreams. We experience a kind of amnesia upon awakening, in which we quickly forget what we were dreaming about, unless we make a special effort to remember.

There are people who claim never to dream, but scientists assert that everybody dreams. Some people just don't remember. How do scientists know? In the sleep experiments of the 1950s which revolutionized the field, all the subjects, even those who initially denied ever dreaming, ended up reporting dreams at the moment the researchers awoke them from REM sleep.

The study of dreams presents many scientific difficulties. If a person says, "I dreamt I was a dog last night," we have to take his word for it. It's like relying on the testimony of an amnesia patient who was the only witness to an event. We know that dreams don't occur in physical reality, but how can we verify that they occur at all? How do we know, as scientists currently believe, that they are hallucinatory experiences that occur in real time during REM sleep?

The 1950s researchers would awaken sleepers who had just gone through a REM cycle (detected with an electroencephalogram) and ask them to approximate how long their dream had lasted. The scientists found a positive correlation between the length of the dream reported and the length of the REM cycle observed. They also studied the effects of external stimuli on dreams. In one case, a scientist dripped water down a subject's back thirty seconds before waking him. The subject reported having dreamt that he was singing at an opera when molten wax suddenly began dripping from the ceiling.

The best book on dreaming I've read is J. Allan Hobson's 1988 The Dreaming Brain, which includes a remarkable amount of scientific information (much of which went over my head) as well as a theory that ties these observations together with stunning simplicity. The book is one of a kind: most books on dreaming focus either on the subjective aspects (and usually wind up sounding hokey and unscientific) or purely on the scientific observations (which are too limited to tell us much about the subjective state of dreaming). Hobson finds a middle ground where he is able to quantify the characteristics of dreaming and correlate them with the findings of neurobiology and sleep research.

I learned from this book that the portions of our brain governing movement and sight are activated during REM sleep. Hobson proposes that the ultimate source of dream sensation--which seems to be predominantly visual and motor--is physical rather than psychological. It is no wonder that dream sensations feel largely involuntary. As Hobson puts it, "we seldom have the experience of willing the movements that occur in dreams, but instead experience a sense of compelled or involuntary motor activity over which we have little or sometimes no control. This contrast is particularly and strikingly sharp in those dreams in which motor activity becomes a central part of the plot: for example, when one attempts to escape from a pursuer" (p. 171). This also helps explain the sensation of rapid movement we frequently experience during dreams. If we're anxious, we perceive that we're falling; otherwise, we may perceive that we're flying.

Since the sensations are physiological in origin, Freud was wrong: dreams are not messages from our subconscious (or from anywhere else, for that matter). But here's the catch: our thought reflections, which help organize the sensations into a narrative, infuse the dream with meaning. This meaning, according to Hobson, is usually transparent rather than oblique:
The activated brain-mind does its best to attribute meaning to the internally generated signals. It is this synthetic effort that gives our dreams their impressive thematic coherence: dream plots remain remarkably intact despite their orientational disorganization. And it may be that their symbolic, prophetic character arises from the integrative strain of this synthetic effort. The brain-mind may need to call upon its deepest myths to find a narrative frame that can contain the data. Hence, one can continue to interpret dreams metaphorically, and even in terms of the dynamically repressed unconscious, if one so chooses. But such a practice is no longer either necessary or sufficient as an explanation of either the origin or the nature of dreaming. (p. 214)
A recent dream of mine exemplifies this discrepancy. I dreamt that people's faces kept turning into non-digital clocks. Interpreting this dream reveals the different possible ways of answering the question "Why did I have this dream?" It could have been that I had been walking around in the early morning and glanced at a non-digital clock on the wall; the image stayed in my mind so that when I conjured up images of people's faces, my memory of the clock came back to me. Dreams are good at finding analogies. But the dream may also have reflected my feelings over my grandfather's recent death, an event that has made me think lately about how life is finite. In the dream, I expressed this feeling by seeing people as ticking clocks.

Symbolic dreams of this kind are probably a function of the individual dreamer, rather than an expression of universal mythic archetypes. People who have symbolic dreams are probably symbolic thinkers; concrete individuals will tend to dream in concrete terms. Still, human dreams might have a lot to do with why we exhibit more creativity than other species. The biochemist Otto Loewi claimed that the experiment which led him to postulate chemical neurotransmission--and earned him a Nobel Prize--occurred to him in a dream, and Robert Louis Stevenson based his story "Dr. Jekyll and Mr. Hyde" on a dream.

The important point is that our reactions to dream sensations prove just as important as the sensations themselves in establishing the dream "plot." This is not to suggest that dreams have no meaning. They certainly do, but according to Hobson, the meaning is usually more transparent than both Freudians and mystics would believe. Don't waste your money on a pseudoscientific dream manual; most likely, you are your own best dream manual.

Saturday, August 18, 2007

Jewish rainbow

When I walk into a room and say to people I meet "I'm Jewish" often I will get the response "But you're Black." I often want to say "no kidding," but the usual response I give is "Yes, my family has been practicing Judaism for at least three generations, now." The point that I aim to make is that it would make it easier to just "BE" as a Jewish person of color if "black" and "Jewish" identity were not so commonly assumed to be mutually exclusive. Historically, Jews have been multiple skin colors and it's unfortunate that the passive internalization of color consciousness that happens so easily in American society, helped us to forget the freedom from identifying around color that is a part of our Jewish history. (p. 27)
The above quote comes from Yavilah McCoy, as recorded in a fascinating book I just read, Melanie Kaye Kantrowitz's The Colors of Jews: Racial Politics and Radical Diasporism. Designed to raise awareness about Jews of color, the book presents numerous anecdotes about the experiences of nonwhite Jews, followed by stimulating discussions on the implications of this research. The book is marred for me by its anti-Zionist standpoint, which culminates in a final chapter that has very little to do with the rest of the book. The back cover sports endorsements by Tony Kushner, Chandra Talpade Mohanty, and Adrienne Rich, all left-wing thinkers squarely in Kantrowitz's ideological camp. I get the sense that Kantrowitz underestimates her audience, thinking that her discussions about racial diversity in the Jewish community either will appeal only to those with her positions or will inevitably move readers toward those positions. Instead, her dogmatic advocacy of ideas that most Jews find offensive will likely turn away many readers who would otherwise find much value in the information she presents.

What types of nonwhite Jews are there? The question is not as easy to answer as one might expect, because it depends on how one defines "white" and "Jewish," both highly contested categories. Most Americans today assume that the prototypical Jew (which usually means Ashkenazic Jew) is white, but that was not always the common perception in this country. The very act of designating Jews as white or nonwhite can be a political statement, because it is taken to suggest something about their status and position in society. (I have known Jews who mark themselves as "other" in forms asking for their race.)

With these precautions in mind, Kantrowitz considers several types of people: (1) African American and Asian American converts to Judaism; (2) nonwhite children adopted by Jewish families; (3) children of mixed marriages; (4) Ethiopian Jews and other black African communities that have practiced Judaism for centuries or more; and (5) the most ambiguous category, Sephardic and Mizrahi Jews, some of them quite dark-skinned, others scarcely distinguishable from their Ashkenazic brethren.

One reason this information has value is that Jews, despite a long history of crucial involvement in the civil rights movement, also have a history of racism that persists today in some religious communities. The historical tension between Ashkenazim and Sephardim can take on racial overtones. More pertinently, many Jews have trouble accepting the very concept of a black Jew--it seems an impossibility, a contradiction in terms.

One anecdote really struck home for me, reflecting the type of compartmentalized behavior I've witnessed. McCoy, the woman I quoted before, attended a Hasidic school as a child. On one occasion when she complained to classmates who were denigrating non-Jewish blacks, they assured her they weren't talking about her. The inconsistent thinking required to sustain this kind of attitude demonstrates why Jews of color may help stem the tides of bigotry coming from both Jews and non-Jews.

Kantrowitz wishes to uproot the perception that "black" and "Jewish" are mutually exclusive categories, and to reveal the Jewish people as a racially diverse group. She feels that recognizing this reality will ease the tension between Jews and blacks. Her point is that if the two categories can overlap, people will be less inclined to view the two groups as enemies of each other.

She weakens her argument, however, by trying to downplay the well-documented fact that a disproportionate level of anti-Semitism exists among African Americans. She fails to discuss any of the official studies, such as Harris Polls taken over the last several decades. As Charles Silberman noted in his book A Certain People, black anti-Semitism is not a mirror image of Jewish racism. The latter is far more marginalized in the Jewish community than the former is in the African American community. The problem is not that most blacks agree with Louis Farrakhan's anti-Semitic pronouncements, but that leaders like him exhibit far greater influence than any comparably racist figure in the Jewish community.

She also wishes to blur the line between "Jew" and "Arab" by emphasizing the strong Arab element in Jews who are products of Arab lands, culture, and language as surely as American Jews are Americans. She eagerly embraces the term "Arab Jew" to highlight this dual identity.

She correctly observes that Islamic countries in the Middle Ages generally treated Jews far better than Christian countries from the same era did. The height of this relatively peaceful coexistence occurred in the Golden Age of Spain from the eighth to twelfth centuries. But the picture is more complicated than she would like to believe. This period, when Jews enjoyed more rights and privileges than at any time until the modern age, ended not because of the fifteenth-century Christian rulers who instigated the Inquisition, but because of violent Muslim invaders more than three centuries earlier.

In addressing this fact, Kantrowitz performs a remarkable sleight of hand. She quotes Victor Perera saying, "[u]ntil the arrival of bloody-minded Almohade Berbers in 1146, bent on implanting Islam in all of Europe, Spain's Jews generally lived at peace with Muslim rulers and their Christian subjects; and they thrived culturally and commercially as never before or since" (p. 81). She immediately comments, "This peace persisted until the Christian conquest of Iberia and the Inquisition," which is not only false, but directly contradicts what she just quoted!

The final chapter of the book is little more than an essay on forging a Jewish identity apart from Zionism. Too bad. There is considerable merit to her thesis that recognizing racial diversity among Jews will help improve relations with other people. She doesn't seem to accept, or even consider, that Jews can support Israel and still be fully committed to peaceful relations with non-Jews. I hope that intelligent readers will be able to overlook the book's flaws, because beneath the rhetoric lies some valuable material about forgotten portions of world Jewry.

Thursday, August 09, 2007

Ethnic time

One night when I worried I would show up late for a Talmud class, I joked to a friend, "I wouldn't want them to think I'm Jewish."

That quip was just a variation on a common Jewish theme. If a meeting will take place at "8:00, Jewish time," what this means is that it's scheduled for eight but will likely begin at least fifteen minutes later. Only a Yekke (an affectionately derogatory term for a German Jew) would arrive on the dot.

I assumed that this concept was uniquely Jewish. It seemed to fit my image of the absentminded Jewish thinker, best exemplified in a novelty item I once saw in a catalog, the "relative time watch," featuring a picture of Einstein surrounded by the numbers "1ish, 2ish, 3ish...." (Wait, wasn't Einstein a Yekke?!)

But then one day I was talking to a black friend and learned about a similar concept called C.P. time. "What does C.P. stand for?" I asked stupidly. Uh...colored people. I was a little taken aback. I later saw Dave Barry talk about a Cuban sense of time (because his wife and in-laws are Cuban). According to Barry:
If a WASP wedding is scheduled to start at 2 p.m. Saturday, the wedding march will start at 2 p.m. sharp, and the bride will come down the aisle at 2:03 p.m., no matter what, even if the originally scheduled groom has bailed out and the bride has to use an emergency backup groom taken right off the street.

Whereas in a typical Cuban wedding, the phrase "2 p.m." is translated as "possibly this weekend." (True fact: I once went to a wedding at a Cuban home; I arrived 20 minutes before the scheduled start, and was greeted at the door by the bride, who was still in curlers.) I believe that the Cuban community will not be affected by the Millennium Bug until the year 2004 at the earliest.
I was a little surprised by this column, since Barry usually avoids edgy, politically incorrect humor. It turns out that his wife is actually a Cuban Jew. Put together, that means she won't be experiencing the Millennium Bug until 2008.

You'd think that ethnic stereotypes would have no place in modern discourse, but many people seem more than eager to embrace stereotypes of their own group (or their spouse's). And not just positive stereotypes, like "Jews are smart," but seemingly negative ones, like having a loose sense of time. It gives minorities a warm, intimate feeling that sets them apart from their more fastidious WASP neighbors.

Of course, whether a stereotype is positive or negative depends on perspective. Republican presidential candidate Tommy Thompson got himself in a little trouble a few months ago when he characterized great business acumen as one of the "accomplishments of the Jewish religion." Jews didn't take to that remark very well, but he meant it as a compliment.

Thompson was confusing the religious tradition with the sociological reality. The Jewish religion has much to say about philanthropy and ethics, but very little about financial know-how. The stereotype of the financially astute Jew goes back to the Middle Ages, when Jews became moneylenders after the Church barred them from most other occupations.

Because this stereotype has been the source of so many slanderous beliefs, such as the claim that Jews are greedy, most Jews feel uncomfortable at any reference to a connection between Jewishness and money. But they seem not to mind other stereotypes. The expression "For every two Jews, there are three opinions" was almost certainly invented by a Jew. Then there are all the traits associated with Jewish mothers.

People say that stereotypes usually have a basis in truth, but that's a dangerous observation, easy to misunderstand. Without strong inside knowledge of a group's history, one can have a hard time telling truth apart from myth. A good synonym for stereotype is "caricature." Maybe that's why people are more comfortable poking fun at their own group than someone else's: the closer you are to the target, the more you understand how to attack it sensitively.

Wednesday, August 01, 2007

The cage of language

One of the labels I frequently attach to my posts is "power of words." I believe that words do have power. Our whole life, after all, is shaped by language. Few if any of us remember a time before we learned to speak, and we probably cannot even conceive of what that experience was like. If language is a vessel for thoughts, it is one that transforms what is inside of it.

One person who truly understood this point was George Orwell, who explained his views most forcefully in "Politics and the English Language." Many people have read this wonderful essay, and many more are at least passingly familiar with Orwell's ideas. But not everyone fully grasps what he was talking about. Ironically, people today who throw around the word Orwellian are usually falling into the very trap Orwell warned against: the use of hackneyed but politically charged terms to mask lazy thinking.

The beginning of Orwell's essay, bemoaning the decline of the English language, almost sounds like it's going to be a tired old trope about the misuse of grammar. But those who read further will discover a far more distinctive argument. Indeed, one of the examples of bad writing that Orwell cites is itself a critique of grammar-related sins, an issue to which Orwell seems indifferent. The writing issues that concern Orwell are wordiness, triteness, and vagueness.

I have observed that people have three levels for understanding Orwell's critique. Level One readers interpret it simply as a call to communicate more effectively. Strunk & White's popular style manual, for example, eagerly seizes upon Orwell's humorous "translation" of Ecclesiastes 9:11 into dry academic prose: "Objective considerations of contemporary phenomena compel the conclusion that success or failure in competitive activities exhibits no tendency to be commensurate with innate capacity, but that a considerable element of the unpredictable must invariably be taken into account."

Level Two readers understand that the bad communication cited by Orwell serves a definite purpose, as the people in power try to confuse the masses. This is the most popular conception of Orwell's ideas, the kind that shows up whenever someone rails against "doublespeak," a term that Orwell did not actually invent.

Level Three readers correctly identify the core of Orwell's message: if we don't exercise control over the language, then the language will exercise control over us. Bad communication is not merely a conscious process designed to keep the masses in line. It is something that we blind ourselves with.

What would Orwell say about the current political climate if he were alive today? Conservatives speculate that he would attack political correctness. There is definitely an element of doublespeak in PC terminology, not just because of oversensitivity but also because of the way it implicitly excludes some views from discussion.

But what many conservatives fail to acknowledge is that overuse of the phrase politically correct has itself become an Orwellian tactic. In the culture at large, the phrase has practically lost its political implications. It is simply a synonym for "polite," but with negative connotations.

I once was reading a blog discussion where a guy referred to the author of some book as an idiot. Another guy said he agreed with the criticism but added that there was no need to launch ad hominem attacks against the author. The first guy came back and retorted, "Oh, don't be so PC." This discussion, I should note, had nothing to do with politics.

What PC means, to most people, is "avoiding saying what you mean for fear of offending someone." People use the expression so that they don't have to take responsibility for their words. It gives people the license to be as offensive as they want and then make it sound as if anyone who disagrees is being namby-pamby. The elder Bush once described political correctness as something that began as "a crusade for civility" and turned into "Orwellian...crusades that demand correct behavior." I would describe the backlash against political correctness as something that began as a crusade against censorship and turned into an all-purpose excuse for poor decorum.

Orwell did not direct his criticisms against any one party or philosophy. He realized that the problem was almost universal: "Political language--and with variations this is true of all political parties, from Conservatives to Anarchists--is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind." Properly, Orwell's critique should be used for reflection, not just condemnation. We need to consider how we communicate, not just how others do. No one escapes the cage of language; the best we can do is be conscious of how it surrounds us.

Thursday, July 19, 2007

The myth of the fantasy genre

Sometimes viewpoints that are dead wrong can provide a starting point for insightful discussions. I feel that way about Christian opponents of the Harry Potter series. They think it's bad to expose children to stories depicting witchcraft in a positive light. Fans respond that Harry Potter is fantasy, and that these books are a healthy tool for stimulating a child's imagination.

"Fantasy" is a funny name for a genre. The word suggests make-believe. All fiction is make-believe, but fantasy deals specifically with events that not only didn't happen, but couldn't happen. We, the readers, allow our minds to enter a universe that we know could never exist. The books tap into some part of our subconscious where rationality has not penetrated, and for a brief period of time we "believe" in magic. The genre is not about exploring possibilities, as science fiction does, but about losing ourselves in impossibilities. As Orson Scott Card puts it in his 1990 book How to Write Science Fiction and Fantasy, "science fiction is about what could be but isn't; fantasy is about what couldn't be" (p. 22).

But that raises a problem. If we designate all supernatural stories as fantasy, as is the common practice, we're implying that there's no such thing as the supernatural. I presume that Card, as a Mormon, would resist that implication. Yet he doesn't appear bothered by it: "As rational people, we know that magic doesn't work and superstitions are meaningless" (p. 22). True, but what about the miracles of the Bible? What about God and the afterlife?

The fantasy genre avoids this dilemma because it rarely deals with religion. Magic may be rooted in pagan belief, but most fantasies do not feature pagan deities. Two important pioneers of the genre--J.R.R. Tolkien and C.S. Lewis--were devout Christians. (J.K. Rowling is, too, but let that pass.) I once saw someone make the following clueless remark about Harry Potter: "Why do the kids celebrate Christmas if they practice witchcraft?" But that's just the point. The magic in Harry Potter isn't a religion; it is an alternative series of natural laws. It has about as much to do with ancient occult practice as Westerns have to do with the real Old West. And it has about as much to do with Satan worship as armadillos have to do with Swiss cheese.

The fantasy genre, in any case, centers on things that almost everyone agrees are imaginary, like elves and dragons. Religiously rooted supernatural fiction, like The Exorcist, usually ends up in the horror section of a bookstore. People may call it fantasy, but that's not the publishing category. Fantasy, despite its religious origins, is essentially a very secular genre. Individual works like C.S. Lewis's Narnia series may express religious ideas, but mostly through metaphor.

Does a book even need magic to be considered fantasy? The Princess Bride has no obvious magical elements, unless you consider a volcanic swamp populated by ferocious capybaras to be magical. As for Miracle Max, he's an herbalist, not a sorcerer, and he doesn't show up until quite late in the story. Yet everybody thinks of The Princess Bride as a fantasy, largely because it has all the trappings of one.

The matter gets even fuzzier with books that deal with the afterlife, like Richard Matheson's What Dreams May Come, which I discussed in a previous post. Wikipedia classifies such novels as "Bangsian fantasy," after an author named John Kendrick Bangs. But the thing is, Matheson actually believed in what he wrote. He based his depiction of the afterlife on extensive research into near-death experiences and the visions of mystics. To him, and to many readers, his book is surely not "fantasy."

Card hints that the genre designation does have something to do with what people actually believe about the world. For example, are The Iliad and The Odyssey fantasies? No, says Card, because they were written at a time when most people believed in such stuff. What about the Bible or Paradise Lost? Card prudently remarks that even today many people "would be outraged to hear of either being classified as fantasy" (p. 18).

Even science fiction isn't immune to this dilemma. I have heard George Lucas described as a "modern mythmaker." But myths are what a culture actually believes in. People do not go to Star Wars to see a world they believe is real. Rather, Lucas takes the myths of the past and spins them into entertainment which modern audiences accept on a purely symbolic level. In a sense, that's what all fantasy writers do. If we could see what future societies will think of our own, we might be surprised at what they consider our myths and our fantasies.

Tuesday, July 17, 2007

Clever translations

When visiting Israel a few years ago, I purchased a Modern Hebrew translation of Harry Potter and the Order of the Phoenix (the fifth in the series, and the source of the latest movie). My Hebrew skills are at a level where I can understand snippets of dialogue, while having much more trouble with the general text. But what interested me was comparing it with the English version. I quickly learned that the translator must exercise some imagination in conveying ideas that can't be understood through literal translation.

The first question that occurred to me was how the translator handled puns. Because puns exploit word pairs that sound alike but have different meanings, they usually are language-specific. Therefore, they cannot be translated directly. Take the following passage from Chapter Seven:
"And don't take too long, Weasley, the delay on that firelegs report held our investigation up for a month."

"If you had read my report you would know that the term is 'firearms,'" said Mr. Weasley coolly.
This exchange depends on the double meaning of arms. Unfortunately, Hebrew has no word that means both weapons and limbs. What the translation does is make Mr. Weasley's report about ekdichay yad (אקדחי יד), or handguns, and the confused wizard calls them ekdichay regel (אקדחי רגל): "footguns." This gets the same point across as in the original--the wizard's ignorance of technology--and creates an equally outlandish image. But it doesn't involve the same level of wordplay.

In other instances, the translation does manage to retain the wordplay of the original book. For example, in the English version Hermione starts a club called the Society for the Promotion of Elfish Welfare, or S.P.E.W. The Hebrew translation gets lucky on this one, rendering the club's name almost word for word, with the resulting acronym sounding very close to the Hebrew word for "allergy"!

At least the Hebrew version preserves most of the character names from the original. Many other Harry Potter translations don't. When I posted the "footguns" example on a language list, someone wrote back to me that the Norwegian version changes Dumbledore's name to Humlesnurr. The reason given is that the name Dumbledore comes from a British dialect word for "bumblebee," and humle is Norwegian for "bumblebee," while snurr means "to whirr."

I have no idea how often translators find sensible solutions to these kinds of problems. Puns and wordplay are only the beginning of the challenges. Good translations stand as works in their own right. But they may overstep their boundaries by improving on the original work. One Israeli teenager told Haaretz that he considers the Hebrew version of Order of the Phoenix superior to the English version. That is not necessarily a compliment.

Friday, July 06, 2007

I post, therefore I am

You're walking down the street, and a scrawny young man approaches you. He has acne and wears braces. He can't be more than seventeen. He says to you, "I'm George, a 52-year-old surgeon." A moment later, he says, "And I'm Clara, a 26-year-old law student."

As absurd as this scenario sounds, many people are acting this way for real, not because they are psychotic, but because the Internet enables them to get away with this sort of behavior. In most online discussion areas, users can adopt whatever name they want and say just about anything without facing the consequences that would result from similar behavior in the real world. Many use this anonymity to their advantage, making up facts about themselves to intimidate others and empower themselves. They also may pretend to be more than one person by posting under more than one name. In extreme circumstances people can be traced to their real identities, but this rarely happens except when law enforcement is tracking criminals. There are other, limited ways of counteracting the problem, none of them foolproof.

It is therefore important to be cautious before accepting claims that people online make about themselves. I'm not saying that you should go around calling people liars, but you should be cognizant of the way people in Internet discussions can use their anonymity to manipulate the situation.

I have had several apparent encounters with such individuals. I say "apparent" because I never was able to prove my suspicions. The earliest was in 1997, on the Excite message boards, which happened to be my first experience with message boards. A fellow started a board entitled "HOMESCHOOLERS ARE NOT QUALIFIED TO TEACH!!!!!!"

He took every opportunity to insult anyone who challenged his views. His main argument was based on personal experience: he wrote that he knew several homeschooled children who were "all lacking in social behavior, adjusting to their surroundings and general knowledge." I asked him to clarify what exactly he had seen, pointing out that his observations may have been too limited to draw general conclusions. He replied that he had encountered numerous homeschooled kids in his jobs as an interviewer and a college instructor, and they were "all blown away by others who had attended formal education."

A fairly well-known homeschooling advocate named Karl Bunday, who has made a side career interviewing homeschooling families, joined the board and asked for permission to interview some of those kids. The person replied, "you expect me to just cough up names of my students so that some nutcase can come to their house to evaluate them?!? That is hardly confidential or professional." The question, then, was why he would base his arguments on information that others could not verify. For all we knew, he could have made it all up. Even if he hadn't, his judgment of those kids may have been less than fair. That was a distinct possibility considering how quick he was to insult complete strangers.

When I raised these points, he said that questioning his claims was unreasonable since the purpose of these boards was to "share experiences." He then attempted to put his experiences on the same level as the documented evidence of homeschooling success I had presented: "I guess I could compile my studies and call it research refuting homeschooling as being effective."

People observing the debate told me I handled myself well, but I felt I was being suckered too easily by this person's antics. He was clearly exploiting his anonymity to win the argument, by relying on "data" that could never be investigated. I acted flustered at times, and he seized on that weakness. (The good news is that I successfully changed the mind of someone else on the board, a person who actually listened to my arguments instead of just trying to fight me.)

Years later, I got a new opportunity to deal with this sort of situation. In 2005, after taking a course in which my final paper examined the movie Fight Club, I went to the Internet Movie Database's message board for that film, eager to share my insights. I soon found myself in a heated argument with two posters. If I took what they said at face value, they were a middle-aged psychology professor and a young female fan. Personally, I suspect both of them were in fact one person. How do I know? I don't. I found clues, to be sure, like the fact that they frequently posted just minutes apart from each other, and that their writing styles (particularly spelling errors) were similar. But I mostly based my suspicions on my familiarity with this sort of situation.

The argument started when the "professor" posted an analysis of the protagonist's mental condition. He ridiculed all the laymen (which he spelled "laimen") who had addressed the issue. Having researched this topic, I found many errors in his post, and I pointed out that the filmmakers were themselves laymen. I hinted not too subtly that I doubted he was a real professor.

He and the "girl" erupted into insults. Here is a sample (pardon the strong language, but this is what they said): "Stop being a little bitch crying because you don't like that someone here has experience and truly knows what they're talking about." "If you ever have something to say, do yourself a favor and just shut up." "YOU were the only retard unable to comprehend me, you think it's my fault. That's your mom's fault." "Being the arrogant asshole you are, I'm sure you're not done yet, so go ahead and prove me right again by posting another whiney bitch post."

Does any of this sound like the words of a typical professor? I asked him for proof of his credentials. He replied that they had been "proven to TRUSTED people on this board, aka NOT YOU." Of course I respected his privacy; however, as long as he was flaunting his credentials, it was ridiculous that he should expect everyone to believe him without verification. Any anonymous user can claim to be a professor. It doesn't prove anything.

For a working professor, he sure seemed to spend a lot of time on the board. He had established a club called the "Space Monkeys" (modeled after the film), and he would initiate into it any board members who wanted to join. He started a new thread inviting his Space Monkeys to attack me. There, they each provided me with brilliant kernels of wit, such as the following (I'm not making this up): "Kylopod is stupid and his logic is stupid and the reason he is stupid is because the opposite of smart is Kylopod." Afterwards, the "professor" praised the intelligence of these comments.

I had a chance to get angry and join in the abuse. But I decided to restrain myself from such an emotional reaction. That's one of the advantages of message boards as opposed to real-life encounters. You get some time to think before posting a reply. Here is a passage from one of my posts:
I want to thank you.

Now, I'm sure you're thinking I mean that only in a sarcastic way, and maybe you're right. But I'm actually trying to be quite honest now. I really am thankful for the insight you've provided me with your latest posts, though I realize my gratitude is of a kind that you're not likely to appreciate.

You see, I've always been fascinated by multiple-identity trolls such as yourself. What makes you tick? I've had my theories, but it's largely impossible to confirm any of them. Trolls are elusive almost by definition. All I can deduce about you is that you're pretending to be a professor, and that you are assuming at least two identities on this forum. Other than that, I know very little about you. I assume you are young, but I don't know how young--you could be anywhere from teens to twenties. Perhaps you actually are older; I really can't say for sure.

But your two latest posts have helped me understand the troll mindset better than I ever have before.
Despite his/her/their attempts to portray me as an "attacker," I adopted a cool, detached tone when replying to their rants. I acted completely unruffled by their insults. I gave the impression that I was coldly analyzing them and reacting to their attacks only with quiet amusement. As one observer put it to me in an email, "you calmly and coolly handed them their asses time and again, and you did it without swearing at them. I am humbled by your patience."

I could have simply walked away as soon as they started attacking me. That would probably have been the most sensible approach. But I've always had a weakness for deflating bullies. In this case, I was being passive-aggressive, a quality I don't normally display in the real world. I'm usually very direct with people. But because this was a message board, I had the opportunity to try a different strategy, and I was pleased by the results.

I may have improved in my ability to handle this sort of situation, but I'm still perplexed by what it all means. I have made actual friends online. But the lack of accountability remains a problem with Internet communication. I know I'm for real, but I often cannot be sure that someone else is. There's something positively solipsistic about this situation, where the reality of everyone else's life can only be accepted on trust.

Monday, June 25, 2007

The Jurassic Park of languages

In a sociolinguistics course, I almost used that title for my final paper about the revival of Hebrew. Common sense made me reconsider. Although my teacher wasn't Jewish, I knew that some Jews take offense at the idea that Hebrew was ever an extinct language. Their attitude, ironically, leads them to overlook a remarkable Jewish achievement.

Hebrew is, put simply, the only language in history that has ever been successfully revived. This becomes clear once we understand what "revival" means. People tend to use the term loosely, applying it to movements aimed at preserving languages such as Irish, which have always retained at least a few native speakers. And when the term is used correctly, as with attempts to revive Sanskrit, the movement has never come anywhere near the success of Modern Hebrew.

By the nineteenth century, Hebrew was not endangered but extinct, and it had been so for almost two millennia. By calling it "extinct," in no way am I trying to denigrate its central role in Judaism. On the contrary, it is a language that I myself use every day in formal prayer, as Jews have been doing throughout their entire history. But that's just the point: a purely religious language is not a living language, not in the sense that English or Spanish is.

Many people would disagree with me. William Chomsky writes, in Hebrew: The Eternal Language, "it may be safely assumed that there were always somewhere in the world, especially in Eretz Yisrael, individuals or even groups, who could and did employ the Hebrew language effectively in oral usage" (p. 218). There are various anecdotes of Jews conversing in Hebrew before the nineteenth century, such as when two Jews from faraway lands wanted to communicate and had no common vernacular. But the extent of these stories is disputed, and in any case it doesn't prove that Hebrew was a living language. Even today, there are people who can converse in Latin.

While Jews in the Middle Ages were trained from a young age to use Hebrew to a degree, it was nobody's native language. There's something special about native languages. Think about your native language. You probably can't remember ever not having spoken the language. It is so ingrained in your consciousness that it's a part of your very being. And all other languages seem like artificial systems of arbitrary sounds until you habituate yourself to them--and even then, they never feel quite as natural to you as your native tongue.

The simple fact is that Hebrew lacked that natural quality for almost the entire Diaspora. Jews studied Hebrew, prayed in Hebrew, and wrote books in Hebrew, but they did not truly speak the language except in very artificial, strained situations that rarely occurred. The revival turned it into a language that millions spoke in, thought in, and breathed in--to this day an unparalleled feat in the history of languages.

In my paper, I pondered what made this feat possible. I concluded that it depended on a whole range of factors happening simultaneously. It depended on the uncommon occurrence of a people who maintained a sense of unity for thousands of years while being scattered across the globe. It depended on their desire for a homeland, and their finding a place fertile for the creation of a new national tongue. It depended on Hebrew being their only common language. It depended on the dedication of a particular man who called himself Eliezer Ben Yehuda, and who was probably a little nuts.

What he did to bring Modern Hebrew to fruition has entered the lore of Jewish culture. He and his wife raised their son in total isolation, so that the child would be exposed to no language except Hebrew. If a non-Hebrew-speaking visitor arrived, Ben Yehuda would send the child to bed. When he came home one day to find his wife singing in Russian, he lost his temper. He even avoided having the child hear bird chirps and other animal sounds! Out of all this lunacy, the child became the world's first native speaker of Modern Hebrew.

One of the main sources I used for this information was Jack Fellman's 1973 book The Revival of a Classical Tongue. Fellman argued that Ben Yehuda's role in the revival has been overstated in popular treatments. Personally, I think Fellman's account proved just the opposite. It's true that Ben Yehuda couldn't have done it all on his own. Even after the experiment with his child, much work remained to turn Hebrew into a full modern language. But just the example he set had a huge impact on the movement. Of course, the way he treated his son raises significant ethical questions about language revival.

One of the problems facing the new language was that it lacked words for modern concepts. According to Fellman, Ben Yehuda sometimes had to rely on gestures and vague statements like "Take such and such...and bring me this and this, and I will drink" (p. 38). When Modern Hebrew took off in the populace, it borrowed wholesale from Arabic, English, and several other sources to enrich its vocabulary.

Nowadays, around five million people use Hebrew as their main language. As an American Jew who was taught Hebrew as a formal, religious language, I always get a weird feeling listening to Israelis use it so casually. To me, the word bitachon (ביטחון) refers to a spiritual concept meaning "trust"; it was odd to visit Israel and see that word printed on the backs of security personnel. I still can't wrap my mind around the idea that even criminals and street kids speak this language that a small group of scholars reconstructed from a holy tongue less than two centuries ago. As Robert St. John put it in Tongue of the Prophets, Ben Yehuda "made it possible for several million people to order groceries, drive cattle, make love, and curse out their neighbors in a language which until his day had been fit only for Talmudic argument and prayer" (pp. 11-12). Whether you consider the feat good or bad, it certainly is incredible.

Tuesday, June 19, 2007

Language observant

In 2000, Larry King asked Joe Lieberman which denomination of Judaism he followed: Orthodox, Conservative, or Reform. Lieberman replied, "I like to think of myself as an observant Jew, because it is broader and it's inclusive." This rather mild and good-natured remark sparked a torrent of criticism, providing fuel for those who felt Lieberman was selling out in his bid for the vice presidency. Binyamin Jolkovsky of The Jerusalem Post complained that Lieberman in this interview "changed his long-time self-description from 'Orthodox' to 'observant.'"

Jolkovsky's complaint ignored a couple of facts. Lieberman described himself as observant in his book In Praise of Public Life, released several months before the Larry King interview. What's more, in a separate interview just three days later, Lieberman began a sentence with the words "The fact that I'm Orthodox...." Nowhere did he change his self-description. He simply expressed a preference for one label over another.

Jolkovsky seemed to assume that adopting the term "observant" was tantamount to denying being Orthodox. I would expect non-Jews to be scratching their heads when listening to this squabble over terminology. Why would Orthodox Jews of all people be offended by the term "observant"? And why did Lieberman prefer the term?

Understanding what's going on here requires some historical background. The division of Judaism into its Orthodox and Reform branches occurred in the nineteenth century. As a new movement, Reform Judaism enacted changes to traditional Jewish practice. Jews who rejected the reforms and maintained the traditional ways came to be called Orthodox Jews.

In common parlance, Orthodox Judaism isn't really a single movement but rather an umbrella term for several Jewish groups that remained relatively traditional amidst the emergence of Reform Judaism. Sephardic Jews, who never even encountered the original Reform movement, are usually classed with the Orthodox today. But not everyone accepts this blanket use of the term Orthodox. There are those who restrict the term to the movement that arose as a direct reaction against Reform Judaism.

I tend to think of Orthodox Judaism as a retronym, or a new term for an old concept. Retronyms happen when a new version of something comes along, causing the old version to require a new name. For example, after microwave ovens were invented, older-style ovens came to be called "conventional ovens." (For those who think I'm implying that new is automatically better, I have two words: New Coke.)

In any case, it was Reform Jews who came up with the term Orthodox. Early Orthodox opponents of Reform, like Rabbi Samson Raphael Hirsch, resisted the term. (The opposite is true of Christianity: The Eastern Orthodox Church gave itself the term orthodox, meaning "correct belief." But by the nineteenth century the word had acquired some negative connotations.)

Among Orthodox Jews themselves, the popular term is frum (pronounced with the vowel sound in wood). This Yiddish word literally means "pious," occasionally carrying negative overtones but most of the time used by Orthodox Jews as a respectful, informal alternative to "Orthodox."

The English word "observant" is not quite as popular. The problem is that many non-Orthodox Jews call themselves observant, and there's a perception that they use the word much more often than Orthodox Jews do. Because Orthodox Judaism tends to consider itself the only legitimate expression of Judaism, some people interpret a Jew's refusal to specify a denomination as contrary to the spirit of Orthodoxy.

As an Orthodox Jew myself, I have never accepted this reasoning. I like to do away with labels as much as I can. When I first set up an account with the dating site Frumster, I was required to describe what type of Orthodox Jew I was. I couldn't just say I was Orthodox; the site made me choose from the following subcategories: "Modern Orthodox liberal," "Modern Orthodox machmir," "Yeshivish Black Hat," "Hasidic," and "Carlebachian." I didn't feel comfortable with any of those, but I settled on "Modern Orthodox machmir," which seemed the least problematic to me.

Eventually, the site expanded its categories. Conservative and Reform Jews could now join the site, and everyone was given a wide range of choices for self-identification. I selected a new category called "Shomer Mitzvot," which literally means "watchful of the commandments"--in other words, observant. It was exactly the type of generic self-description I had been searching for all along.

A friend of mine recently told me that his daughter thinks one should never select that category. He did not remember why she felt this way, but I had little trouble guessing. She probably believes that someone who identifies as "Shomer Mitzvot" is in effect not calling himself Orthodox. Or, at least, she thinks that people might perceive it that way, and so it's best to avoid it if you want to increase your chances of finding a prospective match in the Orthodox community.

You know what? I don't care. I feel comfortable calling myself "Shomer Mitzvot," and that's all that matters. The last thing I'm going to do is bend to someone else's standards. I'm not that desperate. Besides, it all adds up in the end. If a woman assumes I'm not suited to her simply because I call myself "Shomer Mitzvot," then she's probably right.

Sunday, June 17, 2007

Are video games a form of art?

I am prepared to believe that video games can be elegant, subtle, sophisticated, challenging and visually wonderful. But I believe the nature of the medium prevents it from moving beyond craftsmanship to the stature of art. To my knowledge, no one in or out of the field has ever been able to cite a game worthy of comparison with the great dramatists, poets, filmmakers, novelists and composers. That a game can aspire to artistic importance as a visual experience, I accept. But for most gamers, video games represent a loss of those precious hours we have available to make ourselves more cultured, civilized and empathetic.
Those words are from Roger Ebert in 2005, when he caused a firestorm with his assertion that video games are inherently not a form of art. Whether you agree with him or not, the ensuing debate was interesting. Unfortunately, Ebert's credibility was suspect, since by his own admission he lacked familiarity with modern video games. And his statement sounds just like the sort of narrow-minded declaration that's almost asking to be discredited. A couple of generations ago, most people would have scoffed at the idea that comic books are art; nowadays, that idea has gained increasingly wide acceptance (though the respected comic books carry a new name, graphic novels).

Still, I understand where Ebert is coming from. And I say this as someone who knows even less about modern video games than he does. I largely stopped playing them when I was a teenager. I felt that I wasn't getting out of them anywhere near as much as I was putting into them. They were an enjoyable diversion, but left me feeling drained when I spent too much time with them.

I'm aware that video games have increased in sophistication since the days of Nintendo, by huge orders of magnitude. And apparently some gamers think these are works of the human imagination that deserve comparison with great works of literature--or at least that they have that potential, even if the genre is in its infancy right now.

If you think that video games will never be Shakespeare, I should remind you that most people in Shakespeare's time would have laughed at the idea that his plays would be studied centuries later. Most of his plays weren't even published during his lifetime. And it took a long time before critics viewed them as serious works of literature, much less ranked him as the greatest writer in the English language.

Nowadays, people study the heavily footnoted texts of Shakespeare's plays in a manner that the Bard himself would never have envisioned. Is it possible that a similar process will happen with computer games? Will students of the future be studying games in the classroom, in a format that would perplex the original designers?

I debated these issues with someone on a message board a few years ago. He took the position that some video games have achieved an artistic level comparable to great literature and film; I was skeptical, even while admitting my ignorance. It was a nice debate, and I think we both ended up learning something. I learned from him that some modern video games--notably adventure games--have fairly complex narratives, with even the rudiments of character development. But I managed to persuade him that the very nature of video games cuts against the kind of complexity found in literature and film.

Traditionally, games have nothing to do with art. Chess may be a high intellectual activity, but it isn't really a form of art. What all art forms have in common, whether they be paintings, sculptures, poems, novels, plays, films, or comic books, is that the viewer contributes nothing to them beyond his own imagination. Interactive fiction (like the Choose Your Own Adventure series for kids) has always remained a minor phenomenon.

Computers have the potential to blur that line, however, creating games with a high degree of artistic content in terms of both graphics and narrative. But there is a limit. The only way they could achieve full artistic status is if they stopped being games.

I'll give an example from an old game I used to play, Infocom's text adventure The Hitchhiker's Guide to the Galaxy, based on the Douglas Adams novel. The novel contains a scene where Arthur is at a party trying to pick up a girl, when another guy comes along and catches her attention by saying he's from another planet. Your goal in the game, as a player, is to make sure the scene happens exactly as it did in the book. You can have no effect on Arthur's chances of getting the girl. Arthur is a loser. That's part of the script. Even if in real life you're God's gift to women, as computer game addicts are known to be, you're not going to change what Arthur's like. The game has a lot of puzzles that require brain power, but your personality doesn't affect the outcome.

Would it be possible to create a game where your personality does make a difference? I can imagine it now. It will be called SimFlirt. Your objective is to go to a party and pick up a girl. (Or, if you are a girl, then you pick up a guy. Or, if you're gay...never mind.) Whether you succeed depends on what you do or say.
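Just to make the idea concrete, here's a toy sketch in Python of what the core of such a game might look like. (Everything in it--the choices, the outcomes, even the title--is my own invented example, not a description of any real game.)

# A toy version of the imagined "SimFlirt": the outcome depends on what
# the player chooses to say, but every branch still has to be scripted
# in advance by the designer.
OPENERS = {
    "1": ("Mention, casually, that you're from another planet.", True),
    "2": ("Describe your stamp collection in loving detail.", False),
    "3": ("Ask if she's read any good towel reviews lately.", False),
}

def play() -> None:
    print("You spot her across the room. What do you say?")
    for key, (line, _) in OPENERS.items():
        print(f"  {key}. {line}")
    choice = input("> ").strip()
    line, success = OPENERS.get(choice, ("You freeze up entirely.", False))
    print(line)
    if success:
        print("She smiles and asks your name.")
    else:
        print("She drifts off toward the punch bowl.")

if __name__ == "__main__":
    play()

Note that every branch, however witty, still has to be written out in advance by the designer.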

Of course, such a game would never compare to a novel or a film. The range of possible outcomes would still be relatively limited. If a character in a computer game can have any level of depth, then it must be written into the game's narrative, without the player having much influence. The problem is that the player's very presence dilutes this process. Games simply are not a good medium for exploring the nuances of human behavior, at least not to the degree found in literature.

The point here is that games are different, not inferior. They serve a different purpose. I'm reminded of something Orson Scott Card said in his 1990 book How to Write Science Fiction and Fantasy: "In a fantasy, if magic has no limitations, the characters are omnipotent gods; anything can happen, and so there's no story. There have to be strict limits on magic. Dungeons and Dragons uses a seniority system that may work well for games, but for stories it is truly stupid: The longer you manage to stay alive, the more spells you know and the more power you have" (p. 31).

Note Card's implication that what works well in a game isn't believable in fiction. I would go further and suggest that what works in fiction isn't suited to a game. Even games with narratives, like D&D and computer adventure games, do not require the same suspension of disbelief as fiction does. That's because the narrative is only a means to an end, whereas in fiction the narrative is central. Perhaps the computer adventure games of today put a greater focus on narrative than ever before. But there comes a point when the integrity of the narrative must bow to the integrity of the game. Everything in a game falls back on the player's choices, and by giving the player choices, the narrative inevitably suffers.

If video games are blurring the distinction by taking on many artistic qualities, the question is how far they can go while still remaining games. And while there will always be people who devote their precious hours to these games, the games' impact may remain marginal simply because they occupy that impossible middle ground between the artistic and the recreational.

Friday, June 01, 2007

The Jewish cab test

Shortly after Tiger Woods became the first black to win the Masters Tournament, he insisted that he was not black but "Cablinasian," a word he coined to describe the different groups in his ancestry: Caucasian, Black, Indian, and Asian. African-American columnist Gregory Kane sarcastically retorted that Woods should be given "the cab test": "Stand him on a street corner in any large American city and have him hail a cab. If he gets one, he's Cablinasian. If he doesn't, he's definitely black" (The Baltimore Sun, Apr. 27, 1997, pg. 1B).

I wonder if a similar test could be applied to Jews. Arguably, the Holocaust was a grotesque version of this test, as Jews who abandoned their heritage and became atheists or Christians discovered that they were just as likely to be gassed as the bearded shtetl Jew. Hitler justified this departure from classical anti-Semitism by arguing that Jews who assimilated took Jewish ideas with them. I can't say he was totally wrong.

These examples highlight one of the most basic questions about ethnic identity: is it defined by members of the group themselves, or by outsiders? For us Jews, this dilemma is even more perplexing, because we haven't even settled the "Who is a Jew?" question amongst ourselves. Why should we expect others to fare any better?

The traditional definition of a Jew is one whose mother is Jewish, or one who converts. (Computer scientists would call that a recursive definition.) But Orthodox Jews do not accept conversions performed by Conservative or Reform rabbis, and Reform Judaism has expanded the definition to include those born to a Jewish father. Depending on one's perspective, then, many people sitting in U.S. synagogues today either are or are not Jewish.
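For the programmers in the audience, here is a minimal sketch of that recursive definition in Python. (The Person class, its field names, and the sample family are my own inventions for illustration; they obviously don't capture every halachic nuance.)

from dataclasses import dataclass
from typing import Optional

@dataclass
class Person:
    name: str
    mother: Optional["Person"] = None  # None = no further known ancestry
    converted: bool = False            # True if this person converted

def is_jewish(p: Person) -> bool:
    """Traditional definition: a convert, or anyone whose mother is Jewish."""
    if p.converted:
        return True             # base case 1: conversion
    if p.mother is None:
        return False            # base case 2: nothing left to check
    return is_jewish(p.mother)  # recursive step: the mother's status decides

# A convert grandmother makes her daughter and grandson Jewish.
grandma = Person("Rut", converted=True)
mom = Person("Miriam", mother=grandma)
son = Person("David", mother=mom)
print(is_jewish(son))  # True

The recursion bottoms out either at a convert or at an ancestor whose status is unknown--which is, roughly, how the traditional definition works.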

No matter how strongly Orthodox Jews insist that their definition is the only legitimate one, non-Jews cannot be bothered to take sides on this in-the-family dispute. They have enough trouble dealing with a group that even by their standards defies all normal classifications. I have seen confused people on message boards write "Is Judaism a race or a religion?" as if it must be one or the other. In recent times, the trend has been to think of Jews as purely a religion and not to recognize their ethnic character. I increasingly see articles that describe celebrities as having been "born to Jewish parents." Some younger stars like Natalie Portman openly identify as Jewish, but there's a sense that it would be rude to describe someone as Jewish without their permission.

To people with this outlook, a phrase like "Jewish atheist" sounds as oxymoronic as "Catholic atheist," even though many older Jews identify themselves that way. And what about that quaint phrase "the Jewish nation" which shows up in our prayerbooks? How can Jews be a nation? Doesn't that require a country? Of course, now Jews have a country, but those who never set foot there are still Jews.

Our unconventional classification arises from our long and complex history, stretching back some 4,000 years. Few groups in the world have retained a sense of shared identity for that long, and so no matter how much we attempt to adapt to current norms, there lurks in our existence an element of the ancient that relatively modern categories like "race," "ethnic group," "religion," and "nation" can never quite capture.

The ancient Israelites could possibly be called a "tribe," though that term is rarely used, reserved instead for the twelve tribes within ancient Israel. Eventually, Israel did constitute a true nation. But after the Jews were exiled, they continued to think of themselves as Jews. In this respect, they were unusual. Most religions that spread outward from a single land retained a religious identity but not a national or ethnic one. Partly this was because religions like Catholicism and Islam had a proselytizing mission which Judaism lacked. Thus the people of Turkey, Pakistan, and Iran are Muslims but not Arabs. Because Jewish conversions never happened on a large scale (with possible exceptions like the Khazars), the converts became part of the Jewish people, losing their previous cultural identity. I have heard rabbis compare Jews to a family, where the converts are like adopted children. It's not a perfect analogy (since adopted children do not choose their parents), but it does give a sense of how Jews can think of themselves as having blood ties even while accepting converts.

The problem is that Gentiles can hardly be expected to pay attention to how Jews define themselves. What ultimately bound Jews together mirrored what bound blacks together: namely, persecution. It is worth asking whether there would be a concept such as "black" today if racism had never existed. It is similarly worth asking whether Jews would have outlasted their ancient Middle-Eastern origins if anti-Semitism had never existed. Nowadays, many secular Jews admit that their Jewish identity is often driven by a desire to stick it to the anti-Semites. As Ilya Ehrenburg said, "so long as there is a single anti-Semite in the world, I shall declare with pride that I am a Jew" (qtd. in Alan Dershowitz's book Chutzpah, p. 14). Likewise, as anti-Semitism declines, or at least fades into the background, the concept of a secular Jew becomes harder to maintain.

Of course, if you define a Jew as anyone who may be a victim of anti-Semitism, then the definition becomes as arbitrary as bigotry is senseless. Plessy v. Ferguson sanctioned discrimination against a man who was black on the basis of one great-grandparent; many people with more African ancestry have passed for white. In a similar way, Barry Goldwater was subject to anti-Semitism even though he was a practicing Episcopalian with a Gentile mother; he probably would have been safe if his name had been Anderson. The cab test may be a sad reality for blacks, but for Jews it is something we must actively resist if we are to make sense of our lives.

Wednesday, May 30, 2007

Critics of homeschooling need to do their homework

Polls suggest that a slim majority of Americans oppose homeschooling, the method of choice for approximately two percent of the population. Ever since I took this educational route in high school, I have been stunned by the negative reactions it provokes. Though the opposition has declined significantly in the last decade, millions of Americans continue to find fault with this unusual mode of education, eager to offer their opinion on a subject they know nothing about.

It's no wonder that the arguments against homeschooling frequently contradict one another. Some critics allege that parents lack the qualifications to teach their children properly; others suggest that homeschooled children will be so hopelessly ahead they will be unable to relate to other kids their age. Some people imagine the prototypical homeschooled kid as shy and withdrawn; others imagine such a kid as loud and obnoxious. Whatever the argument, the critics base their views on very little if any personal knowledge of homeschooling. They haven't got a clue what homeschoolers actually do during the day, yet they seem to have endless confidence in their ability to guess.

One recent example of this attitude is a piece by blogger Russell Shaw for The Huffington Post. Shaw concedes that "home schooling works in some cases" (a mountain of research would suggest that this is an understatement), but he nonetheless thinks it should be restricted to those with an education degree, teaching children who are unable to attend school for physical reasons such as paralysis. Shaw, who assumes that homeschoolers learn through "rote recitation," worries that too many of the parents "want to keep their students at home in the service of simplicity and protectiveness," a situation that will make them ill-prepared for living in the real world.

Shaw's essay is typical of anti-homeschooling pieces: not only does he offer no factual support for his positions, he makes provably false assertions of his own, such as the claim that homeschoolers consist primarily of fundamentalist Christians who reject evolution. (See here for the actual demographics.) Had Shaw bothered to look into the history of the movement he opposes, he would have learned that its godfather was a rather secular fellow named John Holt, who advocated homeschooling as an alternative to the "rote recitation" and lack of real-world preparation he observed as an instructor in traditional schools.

It's true that some homeschooling parents, like some private schools, teach creationism. Without explaining what he thinks should happen to private schools, Shaw denounces the situation: "as to the home schooler subjected to beliefs that run counter to scientific inquiry...I say send them to school and let the parents devote some of their off-hours to teaching what they feel their kids should know." Shaw implies here that it is the task of schools to expose kids to what is true, against those parents who will teach them what is false. But who decides what is true and what is false? The government? Shaw's point may resonate with those who envision homeschooling parents as extremists, but his larger implication is, frankly, scary.

If Shaw truly values scientific inquiry, then he should base his conclusions on facts, not hunches. Stephen Colbert coined the word truthiness to describe conservatives who rely on gut feelings as a substitute for evidence. If there is any issue on which some liberals exhibit this quality in abundance, it is this one.

Tuesday, May 22, 2007

The democracy of encyclopedias

One time when I worked as a college tutor, a student referenced Wikipedia in his paper on plants. I asked him if he knew what Wikipedia was. He said no. I explained that it was a user-created encyclopedia, that anyone can alter the contents at any time, and that I could take my laptop right there and change the article to say, "Plants are little green men secretly plotting to take over Earth." The student looked at me in surprise, but I assured him I was dead serious.

At this point you might be expecting me to launch into an anti-Wikipedia rant. I refuse to jump on that bandwagon, however. Wikipedia is the perfect example of a new development that traditional people just don't "get." Not that these critics are wrong exactly. Wikipedia does often provide inaccurate information and should not be cited in an academic paper. But the critics assume that once they make this observation, the issue is closed: Wikipedia is virtually worthless as a resource. I have had a very hard time talking to people who take this attitude. When I try to defend Wikipedia, I am frequently greeted by a dismissive snort, as if to imply that giving Wikipedia any credit would be to demonstrate massive gullibility. ("What, you actually trust Wikipedia?") This reaction, in my view, reveals a somewhat one-dimensional perspective on what makes something a valuable research tool.

What is Wikipedia? It's an online encyclopedia in which anyone with Internet access may write an article or modify existing ones. You can make grammatical corrections, contribute a sentence, provide a citation, or add a new section. Of course, you can also put in something ridiculous or offensive. Many users do just that, but the site keeps a record of all previous versions of every article, so whenever anyone changes anything, other users can check. Outlandish changes usually get reverted quickly--but they still occur quite often. I once visited an article on John Ritter only to be informed that the late actor had risen from the grave. Editors may temporarily lock articles that get swamped by "vandals."
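Incidentally, that revision log is publicly accessible through the MediaWiki API. Here is a minimal sketch in Python of how one might pull an article's most recent edits--I'm assuming the standard query/revisions parameters, and the article title is just an example; check the API documentation before relying on it.

import requests

# Fetch the five most recent revisions of a Wikipedia article, to show that
# every change is logged with a timestamp, user, and edit summary.
API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "format": "json",
    "prop": "revisions",
    "titles": "John Ritter",
    "rvlimit": 5,
    "rvprop": "timestamp|user|comment",
}
data = requests.get(API, params=params).json()
for page in data["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", ""))

Each revision comes back with a timestamp, a username (or IP address), and an edit summary, which is what makes the quick reverting I described above possible.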

Wikipedia has many standards that users are expected to uphold. Articles must have citations, "no original research," and a "neutral point of view." Every article has a discussion page in which users can work out conflicts or disagreements. Articles that fail to meet these standards will receive a tag pointing out this fact.

Forget about the accuracy question for a moment. I want to make a point that often gets overlooked in these discussions: Wikipedia is quite possibly the most extensive encyclopedia ever compiled. It is certainly one of the largest. (See here for comparisons.) I'm not sure the term "encyclopedia" even does the site justice. The sheer number of topics covered is mind-boggling. To wit, you will find lengthy articles on all of the following:

1. Books, movies, TV shows (even individual episodes!), and music groups (even individual songs!). This includes not just classics, but also numerous obscure modern works. The selection is constantly expanding; I contributed the article on the novel Somewhere in Time just a couple of weeks ago.

2. In-depth information about small, specialized subjects that get only the most cursory treatment in standard encyclopedias. What are your hobbies? One of mine is juggling. On Wikipedia, not only are there articles on Enrico Rastelli, Francis Brunn, and other names not well-known outside the juggling community, there is also an extensive history of juggling, a thorough examination of each piece of equipment, and a detailed look at a wide range of techniques and tricks.

3. Obscure concepts from technical fields, like evolutionary biology's "population bottleneck" or computer science's "self-balancing binary search tree."

4. Almanac-like lists of the major events in any particular year.

5. Vast amounts of information about cities. Not only is there a lengthy article about Baltimore, there are individual articles devoted to all the local universities, libraries, cemeteries, and even major streets! (My brother contributed an article about the local bus routes, which some editors considered deleting for being "unencyclopedic.")

At this point, you might be asking, "What's the point of all this information if it isn't reliable?" Now hold on just one second. Is Wikipedia unreliable? A hotly contested 2005 study in the journal Nature compared Wikipedia's scientific articles with those in Encyclopedia Britannica and found that Wikipedia averaged four errors per article to Britannica's three. That finding becomes especially astonishing when you consider that Wikipedia's articles are typically much longer than Britannica's.

How can that be? How can an encyclopedia to which any twelve-year-old may contribute even begin to approach the accuracy of one compiled by a panel of experts? Therein lies the paradox of Wikipedia: even though it has an endless capacity for error, it doesn't necessarily have a much greater tendency toward error than traditional encyclopedias. It's true that any idiot can write an article, but it will then be subject to what amounts to a gigantic peer-review process.

There are some advantages to this format. The information tends to stay very up to date. (I have found pages on celebrities updated within hours of the celebrity's death.) And Wikipedia seems to be very good at staying on top of urban legends. My 1993 edition of Compton's, in contrast, repeats under the topic of "Language" the old urban legend that Eskimos have many words for snow. That kind of nonsense would never last long on Wikipedia, where there may be a lot more ignoramuses, but there are also a lot more fact-checkers.

Still, the errors are there. They might be even worse in the foreign-language editions, which I've noticed are often simply amateur translations of the English edition. I corrected an Israeli-edition article that identified Connecticut as a city in Maryland. (I later figured out the cause of the error: the translator misunderstood a sentence in which the two states were listed one after the other, separated by a comma.)

Thus, Wikipedia should not be viewed as authoritative. Any information you get from it needs to be corroborated. That, however, is very different from saying Wikipedia lacks value as a resource. I will mention one example from my experience to illustrate my point.

When I was doing a school paper on Silence of the Lambs (the basis for this post), I looked up the term "psychopath" on Wikipedia, because a character in the film had applied the term to Hannibal Lecter. Wikipedia brought me to the page on "anti-social personality disorder," the clinical term, and listed the seven symptoms associated with the disorder, which I subsequently mentioned in my paper.

There was no problem of verification here: the article had a direct link to DSM-IV-TR, the diagnostic manual from which this information came. You might now ask why I needed Wikipedia--why didn't I just go directly to the manual? But how would I, a layperson who hasn't studied psychiatry, know in advance to go there? That's the beauty of Wikipedia. It gathers together an enormous amount of resources that might otherwise be hard to locate.

As it stands, corroborating Wikipedia's information is not difficult, because the good articles provide links and citations. Some of the less developed articles do not, but so what? You are free to dismiss any unverified information you find. It's true that some folks, like the student I tutored, may fall prey to the misinformation. But that's their problem. If not for Wikipedia, these same people would be getting their information from "Bob's Webpage." Wikipedia is very open about its process and should not be blamed if some people misuse the site.

Not only can misinformation be found in respected encyclopedias like Britannica, it can be found even in very scholarly texts. In other words, no resource should be viewed as 100% reliable. Corroboration is a standard procedure of research, and the proper use of Wikipedia is really no different from the way we approach any other source.

Certainly, Wikipedia is both imperfect and incomplete. That's a given that not even Wikipedia's staunchest defenders will deny. The site is a massive organic entity, constantly being tinkered with, constantly being updated, and much work remains to be done. In a way, I feel bad for the critics. They're in a Catch-22 situation, since the more they complain about Wikipedia's faults, the better Wikipedia becomes.