Wednesday, June 28, 2006

The self-created monster

Although the great psychodrama Silence of the Lambs has enjoyed tremendous popularity and acclaim, many viewers have overlooked its most provocative insight: Hannibal Lecter, though a fearsome killer, is not truly crazy. This is a radical interpretation, I admit. The conventional view is that he can't help being the way he is. As Roger Ebert writes, Hannibal "bears comparison...with such other movie monsters as Nosferatu, Frankenstein (especially in Bride of Frankenstein), King Kong and Norman Bates. They have two things in common: They behave according to their natures, and they are misunderstood. Nothing that these monsters do is 'evil' in any conventional moral sense, because they lack any moral sense. They are hard-wired to do what they do. They have no choice."

I believe that this interpretation is mistaken. But I admit that there is superficial evidence to support it. There is no doubt that all the characters in the movie, aside from Hannibal himself, consider Hannibal crazy. That's why he's in an institution for the criminally insane. That's why Anthony Hopkins, on the DVD, describes Lecter as a good man trapped in a madman's body. Who am I to disagree with the actor who brought the character to life?

But I have observed that people tend to apply the word "madman" indiscriminately to anyone whose actions fall outside the boundaries of civilized behavior. Only in that sense is Hannibal "mad"; by any other criteria, he exhibits none of the usual signs of madness. He is not delusional in the least, and he has full control over his behavior. Everything he does is a carefully considered choice, based on a personal value system that permits him to perform grisly acts when he believes the circumstances justify it.

Dr. Chilton describes Hannibal as "a monster, a pure psychopath," but Hannibal in many ways does not fit the clinical profile. According to the diagnostic manual DSM-IV-TR, a person must exhibit three or more of the following behaviors to be diagnosed with antisocial personality disorder, the formal diagnosis that comes closest to psychopathy:
(1) failure to conform to social norms with respect to lawful behaviors as indicated by repeatedly performing acts that are grounds for arrest
(2) deceitfulness, as indicated by repeated lying, use of aliases, or conning others for personal profit or pleasure
(3) impulsivity or failure to plan ahead
(4) irritability and aggressiveness, as indicated by repeated physical fights or assaults
(5) reckless disregard for safety of self or others
(6) consistent irresponsibility, as indicated by repeated failure to sustain consistent work behavior or honor financial obligations
(7) lack of remorse, as indicated by being indifferent to or rationalizing having hurt, mistreated, or stolen from another
Hannibal clearly is not reckless, irresponsible, or impulsive. His lack of impulsivity is notable, since the usual image of a psychopath is someone who lives in the present and doesn't think ahead. Hannibal seems to have everything intricately planned--including his escape, which he carries out while listening to classical music, as if he had choreographed the attack note for note.

One might assume that he's deceitful, but actually he lies only once in the entire movie, when he deliberately gives the FBI incorrect information about the name and whereabouts of the serial killer on the loose. Yet he does this in retaliation after he is lied to, hardly an indication of habitual lying. On the contrary, most of the time he uses his brutal honesty as a weapon, to wound others.

That leaves three categories that arguably apply to Hannibal: "failure to conform to social norms," "irritability and aggressiveness," and "lack of remorse." If those three traits truly describe Hannibal, then he may qualify as a psychopath. However, there is a good case for saying that he doesn't fit the second category. While he is certainly aggressive, I wouldn't describe him as irritable. His aggression is not haphazard but methodical. Whatever drives him, it isn't anger or rage. He is willing to hurt or kill those who stand in his way, but there is usually an element of moral judgment in his choice of victims. He tells Clarice that he has no intention of coming after her, because "the world is more interesting with you in it." He has firmly held beliefs about how people ought to behave, and they influence his decisions on how to act. For example, when he causes Miggs's death, Dr. Chilton claims that Hannibal did it "to amuse himself," but Hannibal has his own explanation: "Discourtesy is unspeakably ugly to me." That is an ethical belief he repeatedly follows throughout the film.

What about his cannibalism? Doesn't this greatly undermine my argument? How could any sane man eat people? But there's nothing compulsive about his behavior. He performs none of the elaborate rituals typical of standard serial killers. His cannibalism seems to reflect, rather, his contempt for much of the human race. He doesn't value human life, but he is capable of being kind to those he feels have earned his respect, like Clarice.

Hannibal is neither a psychopath nor a madman. Then how, you might ask, can we explain his monstrous behavior? Here is a telling exchange from the novel:
"You can't reduce me to a set of influences. You've given up good and evil for behaviorism...nothing is ever anybody's fault. Look at me, Officer Starling. Can you stand to say I'm evil? Am I evil, Officer Starling?"

"I think you've been destructive. For me it's the same thing."

"Evil's just destructive? Then storms are evil, if it's that simple." (p. 19)
Hannibal here is criticizing both the psychiatric profession and society as a whole. There is a common temptation to explain all human behavior in terms of mental states. We seek to distance ourselves from our horror by labeling anyone who commits horrifying crimes as "sick," as though that person is somehow the product of forces beyond his control rather than someone who has made a conscious choice to be the way he is. Hannibal Lecter represents our worst nightmare, a living proof that brutality and rationality do not necessarily conflict.

The promise of a sound resolution

Most people accept the concept of objective truth. If someone says that ice cream is a health food, that person is simply wrong. But if someone says, "ice cream is delicious," that statement is neither true nor false; it is simply a matter of opinion. A lot of people today place morality in the latter category. I hear this all the time: "Morality is subjective," they say. As Bertrand Russell asserts, "in a question as to whether this or that is ultimately Good, there is no evidence either way; each disputant can only appeal to his own emotions, and employ such rhetorical devices as shall rouse similar emotions in others." I disagree. Although people's emotions do often influence their views on morality (or, for that matter, on any other subject), it is possible to objectively assess a moral view based on the quality of the reasoning used to support it and on the weight of the evidence.

All societies share certain core principles. Killing would be a crime even in Hitler's ideal society. What Hitler claimed was not that killing in general was acceptable, but that the only way to create an ideal society was by first destroying or enslaving certain races. That claim rested on demonstrably false assumptions about reality, such as his pseudoscientific notions about race and the mortal threat that Jews allegedly posed for the rest of mankind.

Morality and truth are more closely linked than subjectivists admit. According to the subjectivist, if one culture practices cannibalism, and a second culture considers cannibalism immoral, there's no objective way of determining which side is right. If we investigate how the cannibals justify their actions, however, we are likely to find that they hold mistaken beliefs. They may believe, for example, that eating human flesh gives a person great powers, or that the people they are eating are less than human, coming as they do from outside the tribe. To suggest that those beliefs are rooted in superstition and ignorance is hardly a matter of subjective opinion.

By identifying core beliefs that all societies accept, we can determine through reason which moral views come closer to meeting those core beliefs. We can also determine when moral laws have exceptions, such as killing in self-defense. Since the goal of the law against killing is to protect human life, occasionally we must violate this law to reach the same goal. The reasoning here is similar to why people undergo surgery: they allow their body to be damaged in the short run to improve their health in the long run.

Moral ambiguity arises from the conflict between short-term and long-term consequences. If the United States government had learned that hijacked planes were heading toward the World Trade Center, it might have chosen to shoot down the planes, killing all the passengers, because failing to do so would have led to even more deaths. As a rule, long-term consequences take priority over short-term consequences. The problem is that they are harder to determine. The Nazi worldview perhaps represented the extreme of reasoning on the basis of long-term consequences, in its suggestion that enormous destruction of human life was needed to create a peaceful world. The primary danger of utopian visions is that people who seek to transform society so radically may ignore the harm they are driven to inflict on society in its current state. Sound moral reasoning involves a balance between what one knows to be true in the present and what one can reasonably infer about the future.

In real life, of course, many of the situations we face are not as clear-cut as the previous examples. The fallacy that subjectivists commit is in thinking that lack of clarity automatically implies subjectivity. Objective reality is not always accessible to human knowledge. For example, nobody knows whether life exists on other planets, but either it does or it does not; the answer does not depend on what humans believe or know. Of course, we may disagree on how to define life. But everyone agrees that humans, lions, trees, and bacteria are alive. The concept does not collapse into subjectivity simply because people aren't sure how far to extend the definition. By that logic, all concepts would turn out to be subjective.

Similarly, the fact that two people in full knowledge of the facts reach opposite conclusions on a moral question does not imply that the issue is subjective. One person may err in his reasoning, or both views may rest on assumptions that are hard to prove. Uncertainty is not subjectivity. While a person's emotions may influence where he stands on the issue, a rational person recognizes that any attempt to resolve the issue is ultimately a search for truth, not an appeal to emotions. Just as unsolved mathematical problems do not shake people's faith that one plus one equals two, complex moral issues do not refute the existence of simple moral truths.

After all, anyone who enters a moral debate hopes that society will eventually resolve the issue once most people decide which side's arguments are more cogent. If neither side is ultimately right or wrong, however, then the view that triumphs in the end will do so simply because its proponents have enough political power. All moral controversies are ultimately power struggles, if one follows moral subjectivism to its logical conclusions. Only moral objectivity can offer the promise of a sound resolution.

Tuesday, June 27, 2006

The issue that will not die



I think this cartoon perfectly captures the flag burning debate. It's human nature that as soon as someone says you can't do something, that's when everyone wants to do it. The current amendment seeks to enshrine in our Constitution the ability to outlaw a very specific form of political protest, one scarcely more common than machete juggling. But I believe this is an important symbolic issue with much larger implications, because the proposed amendment cuts back on the First Amendment's right to free speech.

How can I possibly say that, you ask? How can I possibly equate the act of burning a flag with "speech"? Well, I'm certainly in good company with this position. The Supreme Court has ruled repeatedly that nonverbal forms of communication do have some protection under the First Amendment. In the 1931 case Stromberg v. California, the Court struck down a statute that prevented people from displaying red flags in support of Communism, and in the 1969 case Tinker v. Des Moines Independent School District, the Court ruled against a school that prohibited its students from wearing an armband in protest against the Vietnam War. It’s true that lawmakers can put more restrictions on nonverbal communication than on actual speech and writing, but one thing they cannot do is outlaw an activity purely because they oppose the message that activity is intended to express.

This point is lost on Robert H. Bork. According to Bork, banning flag burning isn't outlawing an offensive idea; it's outlawing an offensive "method of expression." For example, says Bork, we are certainly entitled to "stop a political speech made from a sound truck at 2:00 AM, or prosecute a protest against sodomy laws where demonstrators engage in the practice in public" ("Waiving the Flag," Omni, Oct. 1990, p. 10).

But in both of those examples, the reason both actions are prohibited has nothing to do with the messages being expressed. If you go into a quiet neighborhood in the middle of the night and give a speech blaring from a sound truck, you'll be arrested no matter what you say. You don't even have to be saying anything; you could howl at the moon and you'd still be violating the same law.

The laws prohibiting flag burning are clearly in a different category. For example, the Texas statute which the Supreme Court struck down in 1989 actually permitted the burning of an American flag if the purpose was only to dispose of a torn or dirty flag. Obviously, the law was directed specifically at people who used flag burning to express a message of anger and contempt toward the American government.

Similarly, the proposed amendment isn't against the destruction of an American flag; it's against the "physical desecration" of the flag. What does it mean to desecrate a flag? Dictionaries are no help here; they define the word desecrate strictly as a religious term, one you might apply to the destruction of objects found in a church or synagogue, certainly not to a secular object like an American flag. Congress, however, has defined the word as "deface, damage, or otherwise physically mistreat in a way that the actor knows will seriously offend one or more persons likely to observe or discover his action."

In other words, the amendment, like the Texas law, is not directed toward the act of flag burning. It is directed toward people who use the act to express a message of disrespect. Not that this should be surprising. There's nothing about the act of setting a flag on fire that's inherently offensive. Once you start talking about the person's intent--what's in his mind--it becomes abundantly clear that it's not what he's doing that offends, it's what he's communicating.

Sunday, June 25, 2006

The neverending series

Everyone knows that sequels tend to suck. But it's particularly distressing to see a favorite childhood movie ruined by one or more inferior sequels. The Neverending Story is an excellent example of this problem. The original was far from perfect, but it was fun and engaging, with a sense of wonder. Unfortunately, it depended greatly on its high production values and the skill of its director, Wolfgang Petersen, who apparently had no intention of making a sequel even though his film covered only half of Michael Ende's novel. The sequels that did get made were low-budget enterprises with little chance of doing justice to the original. Even so, they managed to hit rock bottom; I cannot think of another series that has fallen so far.

It's not just an issue of budgets. The people who made the sequels seemed clueless as to what made the original special. No fantasy film I've seen has tapped more successfully into the kinds of philosophical thoughts that kids have. Think of Rockbiter's speech describing the Nothing: "A hole would be something. Nah, this was nothing. And it got bigger, and bigger, and bigger...." It's the type of film that greatly appeals to introspective kids who think about things like infinity and the end of the universe. Do children really think about such things? I did. People who find that surprising have forgotten how profound children can sometimes be.

The whole of Fantasia, indeed, seems to be built out of children's dreams and fears. Some of it is about exhilaration, as when Atreyu rides Falkor. Others reflect anxiety, as in Atreyu's trek through the Swamps of Sadness. What appealed to me most as a kid was how an imaginative but passive child, sort of a young Walter Mitty, opens up a book in which an older, braver version of himself goes on adventures. But The Neverending Story isn't so much escapism as it is about escapism. It's essentially a fable about the destruction of a child's fantasy world as he grows older and adapts to the modern world.

The special effects are good for their day. Although they occasionally look phony, the film's distinct visual look, from the shimmering Ivory Tower to the assortment of weird creatures, holds up well today. What makes the film work especially well is that the two child stars--Barret Oliver and Noah Hathaway--prove themselves capable actors. I use the word "capable" because almost everyone in the film overacts in an annoying way, which I blame primarily on the director. But there's a wonderful cameo by Gerald McRaney as Bastian's father. He has the perfect tone for the scene, appearing loving but distant, unable to fathom Bastian's mind. I wish the film had followed through by returning to their relationship at the end and exploring how Bastian changes as a result of his experiences in Fantasia.

The reason the ending doesn't work is obvious to anyone who's read the book. Simply put, the movie shows only the first half of the book! While this isn't entirely the movie's fault--there was no way the entire story could have fit into one movie--it could have been handled better. The Wizard of Oz faced precisely the same problem yet managed not only to become one of the greatest fantasy movies of all time but to surpass its source material in some ways. The Neverending Story doesn't accomplish that feat. The story feels unresolved at the end while at the same time failing to clearly set up a sequel. It attempts to wrap everything up with a sequence in which Bastian takes revenge on his old bullies. I enjoyed this scene when I was a kid, but in retrospect it creates a clash between the real world and the fantasy world. Bastian never grows as a character; he never learns to put his feet on the ground, something the early scenes suggest will happen.

There's one other problem, and that's that Wolfgang Petersen never really figured out the proper tone for a children's movie. He must not have had a clear idea what age he was shooting for. Some of the scenes are quite scary and violent, making this film inappropriate for younger children. Yet the muppet-like characters are presented in a cloying way that I doubt older kids (not to mention teens and adults) would appreciate. For example, the first scene in Fantasia plays like a revival of Sesame Street, with Rockbiter filling the Cookie Monster role. By the time I was old enough to appreciate the deeper aspects of the story, I cringed at the film's cutesy moments.

This sort of approach is never justified, in my view. The best children's movies do not condescend to their audience. Films like The Wizard of Oz, Mary Poppins, or any of the great Disney animated films, are easy to enjoy and appreciate as an adult. This is a lesson that does not seem to have rubbed off on Petersen. Had he shot for a wider age group, the result would have been fresher and more authentic for everyone.

The movie went on to become a box office hit and a minor classic, and the people who made the sequels appear to have learned more from the film's bad points than its good points. I cannot give a detailed description of the second film, because I saw it just once about fifteen years ago, and I have no desire to see it again. What I do remember is that it was painfully bad, one of the worst movies I had ever seen--maybe on the bottom thirty. It attempted to tell the second half of the novel. Unfortunately, the plot had continuity problems and ended up not making much sense. And it fell back on cliches that didn't belong there, like Bastian trying to overcome a fear of water, and a fight between him and Atreyu due to an improbable coincidence. The actor who played Bastian's father this time around was in no way in the Gerald McRaney league, and he came off generic and nondescript. Overall, the film was just poorly done.

Not more than a couple of years ago, I discovered a third Neverending Story movie being played on cable. Intensely curious, I decided to watch it. I did not have high hopes for it. But I knew that, at least, it could not possibly be worse than the second film.

Boy, was I wrong.

Released in 1994, exactly one decade after the original, it is unquestionably one of the worst movies I have ever seen--easily on the bottom ten, maybe bottom five. It is so bad that I risk making it sound like it's worth watching. Trust me, it's not that type of "bad": it's not the Ed Wood variety of movie so incompetently made that it becomes fun to watch. Those moviegoers who take pleasure in seeing cinematic disasters should be forewarned about this one, lest they never again be able to erase from their memory Rockbiter's gravelly-voiced version of "Born to be Wild," played in a video sequence early in the film and again during the end credits.

No, I am not joking.

The second film does have its admirers, and in a weird way I understand where they're coming from. At least that film had a legitimate purpose, to finish the story from Michael Ende's novel. But the third film has to make up its own reason for being, with a shabbier budget than ever before. So it concocts a story that allows us to see as little of Fantasia as possible. Here, Bastian is a little older, attending a new school. A gang of bullies chases him into the school library. The librarian just happens to be Mr. Koreander, the bookstore owner from the first film. Bastian hides from the bullies by finding the magic book and slipping into Fantasia. But the bullies also find the book, and they use it to wreak havoc on Fantasia. Through a series of magical mishaps, a bunch of creatures from Fantasia end up being transported into the real world along with Bastian. These include Falkor the luck dragon, a baby rockbiter about the size of a fountain statue, and a talking tree. Falkor, who must have gotten a lobotomy sometime between the second and third film, will later chase after a "dragon" at a Chinese festival.

What we do see of Fantasia makes the place seem a lot smaller than ever before. Almost all of the scenes there take place in the empress's chamber in the Ivory Tower, though there is also one sequence where we get to see Rockbiter's home (just what I've always wanted to see!) with Mama Rockbiter and of course the previously mentioned Baby Rockbiter sitting in front of a large stone TV set. Needless to say, the Fantasians seem to possess quite a bit more knowledge of Earth than they did in the first two films. When the gnome describes Bastian as "not exactly Arnold Schwarzenegger in the muscle department," we're reminded how much more enjoyable the film would probably be if Schwarzenegger were actually in it.

Curiously, the bullies never seem surprised to learn that magic exists. Think how long it took in the first film for even imaginative, ten-year-old Bastian to become convinced of the book's supernatural qualities. These bullies, much older and more concrete, never go through such a skeptical period. And later, when the Auryn falls into the hands of a teenage girl, she treats it with about the same level of awe as if she got hold of her parents' credit card.

The creatures bear scant physical resemblance to their counterparts from the earlier movies. They look like people parading around in bad Halloween costumes. And Falkor (who in the original was voiced by an accomplished and prolific voice actor, Alan Oppenheimer) now sounds like Goofy.

There are actually some familiar actors in this mess. Mr. Koreander is played by British character actor Freddie Jones, Bastian is played by the kid from Free Willy, and the main bully is played by a relatively young Jack Black, who now probably would like to do with this film what George Lucas wants to do with the Star Wars Holiday Special.

Thursday, June 22, 2006

The fine line between love and hate

The 2001 film The Believer contains rare insights into Jewish identity, and it's unfortunate that the film was withheld from mainstream audiences due to ongoing controversy. But it deals with an ugly subject, and it handles that subject in an ambiguous way that makes many people, including many Jews, uncomfortable. Make no mistake about it, though: the film is uncompromisingly pro-Jewish, and the director, himself a Jew, has said that he became more religious because of his work on the film. Ironically, the film is likely to resonate the most with Jews, though it also contains universal themes familiar to anyone who has ever struggled with faith.

The idea of a white supremacist who's secretly Jewish is not new to me. I've long known about Frank Collin, who caused a national controversy in the 1970s when he planned to have his neo-Nazi group march in Skokie, Illinois, a predominantly Jewish suburb of Chicago. It was later discovered that Collin's father was not only Jewish but a Holocaust survivor. This case is so bizarre that it leads one to assume the guy was simply insane. While there may be some truth to that assumption, it isn't a satisfactory explanation. What would possibly lead a Jew to join a group that believes in the inherent evil of all Jews? What is such a person thinking? How does such a person live with himself, rationalize his own actions?

What The Believer accomplishes is to go inside the head of one such person and provide a compelling, believable explanation for how such a person could exist. The film is based loosely on a 1960s incident in which a high-ranking member of the KKK was discovered to be Jewish. The movie updates the story to modern times and depicts the young man, Danny, as a skinhead rather than a Klansman. His characterization is speculative but reveals a deep understanding of human nature.

What's truly bizarre about this story is that Danny never abandons his Jewish roots entirely. After attending a neo-fascist meeting, he goes home to his family, whom he treats with respect. He even performs Jewish rituals in private. Yet he terrorizes a Jewish kid on the subway, tells his neo-Nazi buddies that he wants to assassinate a prominent Jewish diplomat, and spouts what sounds on the surface like typical white supremacist ideology. But he's not, as we might suspect, a hypocrite saying things he doesn't believe, or a two-faced lunatic. His philosophy is surprisingly coherent. Sure, he's a walking contradiction, but so are many other people who have a love-hate relationship with their religious background.

His anti-Semitic beliefs all revolve around a single idea: he thinks Jews are too weak and passive. Sometimes he adopts a "macho" outlook, since he doesn't want to be associated with a people stereotyped as brainy intellectuals. On a deeper level, he dislikes the persecution theme in Jewish history and culture. But is this theme a sign of weakness or strength? Danny isn't sure. He eventually decides that Jews gain strength from their persecution; they seem to grow stronger the worse they're treated, and the biggest threat to their survival is not those who want to destroy them but those who don't care. This is a far more Jewish idea than an anti-Semitic one. Several Jewish holidays, including Passover, Purim, and Chanukah, commemorate events where Jews grew strong after periods of persecution. Many Jews today believe that assimilation into the culture is a greater danger than genocide, because it could signal the disappearance of Jews as a distinct people. As Irving Kristol once remarked, "The problem is that they don't want to persecute us, they want to marry us."

The implication is that Danny actually admires Judaism, and that his anti-Semitism is his own warped way of affirming his Jewish identity in a world where, he fears, Jews are increasingly seen as irrelevant--not loved or hated but simply ignored. His ambivalent feelings escalate as the movie progresses. When he has his neo-Nazi buddies deface a synagogue, he can't bring himself to damage the Torah scroll, and he secretly takes it home with him. His intimate knowledge of Jewish beliefs and practices looks strange to his fellow skinheads, to say the least. He tells them that he studies these things in order to know the enemy, pointing out that Eichmann did the same thing. Do they buy this explanation? Apparently they do, but Danny's girlfriend is a little smarter than that, and she finds herself strangely drawn to the religion he's running away from.

Like American History X, this movie contains disturbing scenes where the protagonist articulately expresses his bigoted ideas. There are other intelligent characters who argue back, but not everything he spouts gets answered, so I can understand why this movie makes some viewers uncomfortable. In one particularly distasteful scene, Danny mocks Holocaust survivors, and while they do answer him eloquently for the most part, his raising of the old "sheep to the slaughter" canard is left open.

Nevertheless, this is a powerful and compelling film, with a lead performance by Ryan Gosling that manages to rival Ed Norton's Oscar-nominated performance in American History X. We see early on that Danny is capable of doing appalling things, but his moral conflicts are then presented so persuasively that we cannot help but sympathize with him. The climax is painfully ambiguous. Those who are looking for easy answers may want to skip this film. But they will be missing out on what is easily the most authentic and profound exploration of Jewish self-hatred ever portrayed on screen.

Tuesday, June 20, 2006

Linguistic creationism

I have recently been discussing Torah-science conflicts with other bloggers. These issues include, but are not limited to, the age of the universe, Darwinian evolution, and the history of mankind. I have examined this subject on my own for over ten years. One area that has been sorely neglected, but which interests me, is the evolution of languages. The traditional Jewish view holds that Biblical Hebrew is historically the first language of mankind. Yet that notion does not seem tenable in light of modern linguistics.

Hebrew, along with Aramaic and Arabic, is classed as a Semitic language. Medieval rabbis recognized the similarities between those three languages. In so doing, they became among the first people to notice, and thoroughly document, systematic sound shifts between languages. For example, they noted that Hebrew words with the letter zayin often resembled Aramaic words with the letter dalet: in Hebrew zachar means "to remember," whereas the Aramaic equivalent is dechar. These observations came hundreds of years before linguists began noticing systematic sound shifts between Indo-European languages, comparing, for example, English fire with Greek pyra.

It should be noted, however, that the medieval rabbis tended to assume that Aramaic and Arabic had sprung from Hebrew. Modern linguists would say that all three languages are descended from an extinct tongue they call Proto-Semitic. The existence of this tongue is purely hypothetical, of course, but it's not unreasonable to think that languages existed which left no written evidence. Most languages in the world today were not written down until recent times, because the populations who spoke them were illiterate. These include the languages of dwindling indigenous tribes in America, Australia, New Guinea, and elsewhere. English itself did not have a regular writing system, apart from occasional inscriptions in an old runic alphabet, until missionaries traveled to the British Isles sometime around the seventh century and gave us the Roman alphabet that we use today.

For those who accept the possibility that Adam had ancestors, the language issue shouldn't be much of a problem. Adam spoke Hebrew, but earlier human beings spoke other languages. When Genesis describes the rise of mankind, it is primarily talking about the rise of human civilization, not the rise of the human species. Hebrew may not have been historically the first language, but the Old Hebrew alphabet, which through the Phoenicians gave rise to the Greek and then the Roman alphabet, is widely recognized as historically the first alphabet, or at least the earliest one known.

Curiously, I have not seen many Orthodox Jews address this issue, even when talking broadly about biological evolution and human history. I have encountered one book which could be described as a work of linguistic creationism: Isaac Mozeson's The Word: The Dictionary that Reveals the Hebrew Roots of the English Language. It is, I'm afraid, a pretty shoddy job that invites ridicule. Mozeson's approach is to look for superficial similarities in sound and meaning between Hebrew and English words, to claim them as proof of a direct ancestral relationship between the two languages, and to ignore all the historical evidence contradicting his thesis. He establishes no systematic rules of sound change, and he seems unfamiliar with what the mainstream theories say, even though he is quick to dismiss them.

We can do better than that. I am not learned enough at this time to provide a more detailed response to the language question, but I have always held that we have nothing to fear from scientific knowledge, even if we cannot always explain a particular Biblical passage in light of a particular scientific theory. We should all be willing to admit at some point that we don't have all the answers.

Skeptics would say that I am being selective in what theories I accept. They would be correct. For example, there is no way that I will accept the idea that Exodus didn't happen. My rejection of this "theory," however, in no way implies that I must reject the scientific method of inquiry, or the many true discoveries that have resulted from application of this method. Not everything that falls under the banner of accepted scientific or historical knowledge is as firmly established as its adherents claim. The goal of synthesizing Torah and science should not be conformity to accepted opinion, or avoidance of ridicule. It should be a willingness to examine what the scientists have to say, and then make a judgment on our own.

Thursday, June 15, 2006

How chaos can be fun

Alan Dean Foster's Parallelities is a very funny book, but it's also a creepy and unnerving book, one that aims to shatter our sense of stability in the world around us. While parallel universes are a staple of contemporary science fiction, this novel does an exceptional job of conveying how disturbing the concept is, using it to explore philosophical questions about knowledge, identity, and randomness. But the book also has a sly sense of humor. Foster seeks to mess with our heads, and he has a fun time doing it.

The story centers on Max Parker, a slimy Los Angeles tabloid reporter sent to interview a rich man, Barrington Boles, who claims to have invented a machine that can break through the barrier between parallel worlds, dubbed "paras" in this novel. Max naturally assumes that the man is a typical loony, but then the machine not only works, but has a side effect that not even Boles anticipated: it "zaps" the reporter (the scientific nature of what happens is never explained), afflicting him with a bizarre condition. At first he doesn't notice anything different, but as soon as he returns home, strange things start to happen, in several absurdly hilarious scenes. I don't want to give away too many of the surprises here, but let's just say Max has become a sort of cosmic magnet, pulling people and things from parallel universes into his world, and eventually drifting on his own into other worlds. He has no control over the process, which seems to intensify as the story continues.

That's the basic setup to what has become one of my prime book obsessions, a truly special novel that I have read cover-to-cover numerous times. I should note that I'm virtually alone in this reaction. The novel is still little-known, even among Foster fans. I have seen only one web reviewer who seems to hint at the book's greatness: "I didn't expect to sympathize with a shallow, arrogant tabloid reporter, but the unfolding of his inner self as he reacts to the wildly variable parallelities around him reveals a complex character study not promised in the opening chapters."

The slow opening chapters are, I believe, the main reason why the book isn't more famous. The entire first chapter, showing Max's regular life before it goes awry, is unnecessary and distracting. I do not exaggerate when I say that you could skip it and have no difficulty following the rest of the story. The chapter lacks the tension needed to engage us, it introduces characters who never appear again in the novel, and it depicts events that are entirely tangential to the later plot. This is the book's single greatest flaw, and I'm sure that many readers have tossed the book aside before they had a chance to reach the good parts.

In most books and films of this genre, parallel universes are entirely distinct from our world. Foster's own Spellsinger series, for example, depicts a parallel world full of magic and talking animals. Parallelities uses a less common approach, depicting the worlds that Max visits as virtually identical to his own world, to the point that Max has great difficulty telling whether he's in his original world or in a slightly different para. Each of the "para" versions of Los Angeles has the same buildings, the same streets, and even the same people, including another version of Max! But subtle differences abound, and one of the joys of the book is seeing how the paras get progressively weirder, even as they continue to resemble Max's original world, at least superficially.

Every para he visits functions as a story in its own right, and in the process the book catalogs several genres. One of the paras, for example, is directly inspired by the works of H.P. Lovecraft, but I will say no more because I don't want to give away one of the book's great shock moments. The running joke is that every time Max thinks that his experience has reached the height of madness, and that it couldn't possibly get any weirder, it then proceeds to do just that by several orders of magnitude. We, the readers, are in a constant guessing game to see how far Foster will take the story into the realm of the absurd.

One of Foster's charms is his wry sense of humor. His prose has a delightfully smart-alecky tone, which permeates even the most mundane of lines, as in the following sentence: "While he was waiting for the deep fryer to perform its task of inserting cholesterol and fat into otherwise healthy fish, Max examined his surroundings" (p. 153). At other times, Foster's playfulness is employed for shock value: "He...sucked in a mouthful of water. Fresh, not salt. Not that it mattered much under the present circumstances. He could not breathe either one" (p. 247).

Foster's vivid prose, which constantly pushes the limits of what's possible to put into words, brings the parallel universes alive. For example, here is one passage describing a futuristic, utopia-like version of Los Angeles:
A much larger hover vehicle appeared, traveling from north to south. As it turned up Pico, it bent in the middle to make the corner, flexible as a snake. The people within were not affected. Overhead, the sky shone a deep, untrammeled blue. There was not a hint, not a suggestion, of smog, much less the gray-white ash of total devastation. (p. 204)
Foster's writing is so intricate and detailed that it further allows the surprises to creep up on us without warning. It also includes much introspection, largely because Max is so isolated by his experience. Max is shown to be dishonest, unethical, and insensitive, but he has enough real-life traits that we can relate to him as a human being. There is a scene where he sits on a diving board and gazes up at the stars, contemplating the vastness of the universe, and how much vaster it must be now that he knows about parallel universes.

We have the feeling that his experience as an unwilling cosmic traveler is causing him to become more reflective and considerate. When he encounters the previously mentioned utopian para, he is impressed by people's courtesy, especially when contrasted with the behavior of those in his L.A. And even the more negative paras are giving Max a deeper perspective on life. But Parallelities isn't a sci-fi version of Groundhog Day, where circumstances inspire a jerk to become nicer. Max never really gets an opportunity to change. All he wants is to return to his original, normal life, but we're never sure whether his experiences will ever make him want to move beyond his immoral lifestyle.

The novel primarily dwells on the negative effects that the experience is having on Max. Because he keeps meeting different versions of himself, his whole sense of individual identity is coming apart, a situation that undermines the very logic of self-preservation. As he ponders in one of the book's eeriest lines, "What would happen to him if he died here? To his real self? Probably, his paras would live on, including no doubt the one who occupied his life position here.... But he, him, the one Max that was Max to the Max, he would perish, permanently and forever" (p. 186). For most science fiction books and films, parallel universes are just an excuse to bring us to exotic new places. Parallelities stands as a unique example of the genre, by examining this concept more closely than usual, while at the same time never forgetting to be entertaining.

Tuesday, June 13, 2006

Modern mythology

Urban legends are so much a part of our culture, they force us to ask ourselves how we acquire knowledge about the world. So many people get attached to these legendary stories that I hesitate to discuss the topic, for fear of coming off as a party-pooper. Yet in recent years, debunking urban legends has become popular, with several TV series and websites devoted to the sport. By now, most people who have heard the old alligators-in-the-sewer tale probably know it to be false. I suppose this is all a good sign, but the cynic in me suggests that the general public is no less gullible now than it was twenty years ago. Most people still believe anything they see on television (the continuing influence of political ads makes this all too clear), even if that information includes skepticism.

It's not just the dumb or uneducated who have fallen prey to these tales. Cokie Roberts, a respectable journalist who should have known better, uncritically repeated the Internet rumor that the phrase rule of thumb derives from an old English law condoning spousal abuse. If the very people responsible for telling us what's happening in the world can't keep the truth straight, what hope is there for the rest of us?

Some of these tales come from outright hoaxes, as in the picnic etymology that proposes a shockingly racist origin to an everyday word. But urban legends aren't always the result of intentional deception. The old game of telephone may be at work, where a basically true story gets embellished beyond recognition. That's evidently how the alligators-in-the-sewers business started. In other cases, one writer's offhand speculation is mistaken for fact. That's how people came to believe, for example, that "Ring Around the Rosie" was about the Black Death. And sometimes a joke gets misinterpreted as a true story, as when the parody newspaper The Onion published an article entitled "Harry Potter Books Spark Rise in Satanism Among Children." While most readers understood the parody, some critics of J.K. Rowling's fantasy series--the very people whom the article was lampooning--cited the article as fact.

Fear and paranoia are prime mechanisms in the spread of urban legends. Indeed, many horror movies are based on old urban legends, the Urban Legends series being only one example. And today's legends often sound like horror stories of their own: the AIDS needles that pop out of trash cans, the man who wakes up in the bathtub and finds that his kidney has been stolen, the black gangs with sinister murder rituals. Many of the legends prey on racial fear in much the same way that the blood libel myth of the Middle Ages exploited anti-Semitic prejudice.

Are urban legends really anything new? A great deal of what passes for common historical knowledge is in fact folkloric. Take the stories about Columbus being the first to prove the earth round, or George Washington chopping down the cherry tree, or the Pilgrims landing on Plymouth Rock. Even the revised accounts often lack historical basis, and there is scarcely a major event in American history that doesn't have a legendary version existing alongside a more accurate account.

When we look back at all the myths and legends from the Ancient Greeks and other cultures of antiquity, we tend to assume that the people back then must somehow have been stupider than us to have believed such things. In reality, very little has changed. We've got mythologies right here in the present day, and they show no signs of waning in our age of Internet literacy. Discerning the valid information from among the junk is a far more elusive skill than most people realize. There simply isn't enough time in our short lifespans to look at everything with a critical eye. At some point, we have to trust our instincts.

Thursday, June 08, 2006

Running away from one's shadow

For reasons I've never fully understood, Americans have an aversion to naming the dominant racial group in East Asia. When I was growing up, the accepted term was Oriental. It was no more controversial a term than Asian is today. For example, The New York Times in 1985 described Haing S. Ngor as "the first Oriental actor to win an Oscar." But around the early '90s, the term fell out of use and came to be regarded as an anachronism if not a slur, much like the word Negro had done a generation earlier. Suddenly, everyone was expected to say Asian when referring to people of Chinese, Korean, or Japanese descent.

In this new scheme, Arabs, Iranians, and Turks are not Asian, even if they live on the Asian continent, and even if their ancestors lived there for thousands of years. It's true that other continental terms have also acquired a racial sense; after all, people often use European to mean "white" and African to mean "black." But Asian is the only continental adjective that has been narrowed so drastically that it now refers to only a portion of the territory within the continent's traditional boundaries.

How did this happen? I'm not sure. The narrow sense of Asian goes back a long time. For example, a 1979 journal article notes that "Most literature on ethnic studies has...narrowed the term 'Asian Americans' to refer mainly to those people coming from the Far East and Southeast Asia." But this very same article unhesitatingly uses the phrase "Oriental children." It took a while before the media decided that Asian was not simply an alternative, but the only acceptable term.

The most frequent explanation offered is that Oriental is too Eurocentric. The term comes from a Latin word meaning "to rise," and it was first applied to the area now called the Middle East, because that area lay in the direction where Europeans observed the sunrise. (This meaning is still used in the phrase "Oriental Jew," though because of its association with the Arab world, it is even applied to Jews from Morocco, which ironically is farther west than almost all of Europe!) Eventually the term was transferred to the Far East, and that's when it acquired its racial connotations.

But if Eurocentric terminology is inherently offensive, why do we still say Middle East, or for that matter Western civilization? Besides, according to the Online Etymology Dictionary, the word Asia itself may come from an Akkadian word meaning "to rise," and, if so, it came to refer to that part of the world for precisely the same reason that Oriental did.

In fact, etymology has very little to do with why we regard certain words as offensive. For example, Negro is simply the Spanish word for "black." In the mouths of English speakers, it acquired a derogatory sense over time. That's apparently also what has happened with the word Oriental.

The term Asian American gained popularity around the same time that African American did. The apparent purpose of these coinages was to replace the language of race with the language of geography, on the assumption that if people stop talking about race, they'll stop thinking that way as well. Unfortunately, this assumption reflects a naive understanding of how language works. When people say Asian today, they mean exactly what people from a generation ago meant when they said Oriental. The thinking hasn't changed one bit; only the label has. All we've accomplished is to take a previously race-neutral term (Asian), destroy its original geographic meaning, and give it a new racial meaning.

Not that I'm advocating going back to saying Oriental. The shift is here to stay, and those who object are fighting a lost cause. Hopefully, as time goes by, we will no longer need racial terms at all. But we haven't reached that point today. America remains a race-conscious society, and that's not going to disappear just because we change the way we talk. The notion that it will is the linguistic equivalent of running away from one's shadow. As long as race continues to play a role in society, racial thinking will follow us wherever we go, no matter how many changes we make to our speech. Only by working directly on people's attitudes can we make a real difference.

Wednesday, June 07, 2006

The genius of J.K. Rowling

While I always enjoyed the Harry Potter books, my admiration for them grew over time. I first heard about them in 1998, when my mother described them to me as the adventures of a group of nerdy kids who are also wizards. It sounded like the typical book my mother likes to read, but as the series increased in popularity and a fourth book awaited publication in 2000, I decided to read the first three.

I remember my first night reading Harry Potter and the Sorcerer's Stone. I was reminded of Roald Dahl, one of my favorite authors as a kid. All the ingredients were there: the child who lives a cruel life and is then granted a journey into a wonderful magical world; the wretched characters who eventually get their comeuppance; and the bizarre types of candy that are right out of Charlie and the Chocolate Factory.

But I also noticed some differences. Not only was the sparsely illustrated book at a much higher reading level than Dahl's fiction, but the story itself was far more sophisticated than the modern fairy tales for which Dahl is famous. Rowling's writing is never condescending; it never explains straight out the rules of the magical world. Like an adult fantasy book, it allows readers to figure out these rules for themselves as the story progresses. The first chapter may even confuse children with its bewildering sequence of strange events that only gradually become understandable.

The surprise ending caught me off guard. In hindsight, it didn't seem like an original idea. But I simply wasn't expecting such a twist. Having associated the novel in my mind with Roald Dahl's fiction, I took all the characterization to be black and white. I thought I had the characters pegged, but their motives turned out to be more complex than I had initially assumed.

The first book remains my favorite of the series, with the next three close in line. But the fifth and sixth have not pleased me as much. They get bogged down in plot mechanics and have less of the delightful humor that characterized the earlier books. I know there are many fans who prefer the later books, but personally I think this reflects people's attachment to the unfolding storyline. Everyone wants to know what will happen next.

At first, I did not accept the popular idea that the Harry Potter books are classics. I considered them good entertainment, but hardly groundbreaking. The premise of the series, in which kids attend a school for witches and wizards, seems to lack the uniqueness of true fantasy classics like The Wizard of Oz and The Lord of the Rings.

Though the broad sketch of Harry Potter is fairly conventional, the details are incredible in their scope. They bring to life a magical society that exists within our own "muggle" world and that is kept secret by a bureaucracy with its own rules, history, and politics. Rowling's approach differs from that of traditional fantasy, which typically takes place in a medieval type of setting. Even when fantasies are set in modern times, as in Charles De Lint's "urban fantasies," the magic itself is still rooted in the past. Rowling's magical world resembles an advanced technological society. But because that society is fueled by magic and not by science, she has more freedom than a science fiction writer, who is less likely to envision a future where people communicate through owls rather than through telephones. This flexibility makes it easier for us to believe in the story's assortment of creatures from classical mythology and medieval folklore. Rowling, who studied classical literature in college, gives these beings a feeling of reality that is missing from most fantasy books.

But the intricacy of Rowling's world cannot fully explain the popularity of the series. If people love the books, it is in part because they love the characters, who are drawn with a surprising level of realism. Harry, Ron, and Hermione are not high school stereotypes but fully fleshed out characters with many distinctive traits. Harry may look like a nerd, but he's brave, quick-thinking, and altruistic. He also has several flaws. Hagrid may seem on the surface like the standard "dumb giant," but he's more nuanced than that. We get to know these characters inside out as the series progresses. Not all the characters are well drawn. The main villain, Voldemort, always seems vague and generic. But some of the other menacing figures are fascinating, like the ever-elusive Snape, whose true nature has yet to be explained.

The books also have a strong satirical undercurrent aimed at schools, the media, and politicians. The first book contains the following amusing line: "Professor McGonagall watched [her students] turn a mouse into a snuffbox--points were given for how pretty the snuffbox was, but taken away if it had whiskers." The later books lampoon tabloid reporters and obtuse educators, and the sixth book seems to comment indirectly on the War on Terror, though I'm sure that wasn't a part of Rowling's original plan for the series.

One of the most overlooked qualities of the series is the clever plotting. The plots are better crafted than most cinematic thrillers. Rowling never cheats by creating a surprise that doesn't fit the earlier events. I suspect that she had worked out the outline of the entire series before she published even the first book. As each new layer of the story gets uncovered, I am astonished at how well it meshes with everything that came before.

In this regard, the sixth (and most recent) book is particularly interesting. I have found that readers have two entirely different interpretations of the ending. I suppose we will find out which one is correct as soon as Rowling releases the seventh (and presumably final) novel. This is one of several plot mysteries in the series that explore the theme of how people are too judgmental, making assumptions about other people before knowing all the facts. Harry is unfairly judged by people who put too much stock in rumors, yet he himself misjudges other characters. The plot twists which emphasize this theme are not simply manipulation; they contain lessons about human nature.

Will Rowling provide a satisfying conclusion to the series? I don't see any reason why not. She has shown an absolute mastery over this genre. That's what I failed to grasp when I first thought her books unoriginal. I've since come to the conclusion that it matters much less where artists get their ideas than how they handle them. I'll take Rowling's finely executed novels any day over an original but sloppily written work.

Tuesday, June 06, 2006

Introducing myself

I am pretty new to the world of blogging, and I consider this an experiment. I don't know how regular my posts will be. In this post, I will summarize some of my views that I may elaborate on in future posts.

I am a single, 29-year-old Jewish male from Baltimore, Maryland. My interests are broad and varied. I have an interest in blogging because I love to write and correspond with people, and I have posted on Internet discussion boards for years. Essay writing is a prime interest of mine, and I would do it even if I had no audience, but I do enjoy feedback.

I am opinionated, while at the same time open-minded. People may find that my opinions seem rather unusual in combination. I am a Democrat who opposes President Bush and thought the Iraq war was a bad idea from day one.

I am an Orthodox Jew who refuses to identify as either Haredi or Modern. I think these labels greatly oversimplify the differences among Orthodox Jews. Orthodoxy is a spectrum, not a choice between two distinct camps, and I've known numerous people who do not fit neatly into either category.

I am a movie buff and a prospective reviewer. I am an avid reader of Roger Ebert. I have submitted 40+ comments to the Internet Movie Database, and here they are.

My interest in literature is more selective. The novels I love tend to have the following characteristics: (1) they transport me to another world; (2) they are full of vivid sensory prose that puts me right in the moment; (3) they appeal to my fears and anxieties; and (4) they expand my perspective. You can view my profile for further information on my reading interests.

For the last few years, I have had a fascination with the history of languages. I have read in depth about the history of the English language. This interest changed the way I look at language: I have become more cognizant of the fact that all languages are in a continual process of change, and this observation has affected my outlook on contemporary language issues. I strongly considered switching my college major to linguistics. But I decided that I was more interested in the art of language than the science of language.

I recently joined a Toastmasters club to help improve my public speaking skills. Although I'm an introvert and I do have some stage fright, I love speaking in front of people. I have tremendously enjoyed my year in this club, and it has given me a new perspective on the art of communication.

I think that about covers most of the points I want to address. I hope to greatly expand on these ideas in future posts.