Friday, November 17, 2006

In defense of Orthodox liberalism

Cross-posted at DovBear's blog

R. Harry Maryles writes in this post, "It is a fact that the conservative principles are generally more in line with Orthodox Judaism than are liberal principles. Although that isn’t 100% the case, I think it is true most of the time."

I beg to differ. But I should note that if Harry had begun the sentence with "It is my opinion..." rather than "It is a fact..." I would not have objected. He is entitled to his views, but they are debatable. Still, I have heard similar sentiments from many other frum people, and it is a topic worth discussing.

A large part of what has inspired the rightward shift among frum voters in recent decades parallels the influences on evangelical Christians: the "traditional values" of which the Republican Party has appointed itself the sole bearer. While those values have nothing to do with the conservative philosophy of unfettered capitalism, Republican politicians created a marriage between these two meanings of conservatism. It is an unhappy marriage. Religious conservatives were duped by Reagan, and many of them have recently woken up to the fact that they've also been duped by Bush.

I've always been amazed at the mental acrobatics of those who argue that Judaism fits the philosophy behind economic conservatism. Their rationale depends partly on the standard but inaccurate translation of tzedakah as "charity." In modern American society, charity is simply a praiseworthy act. In ancient Israel, however, tzedakah was the law of the land. The conservative tenet that we must encourage volunteerism in place of government aid runs contrary to much traditional Jewish thought.

When I raised this point on Harry's blog, Bari noted differences between the ancient Jewish system and modern liberal programs. For example, in halacha a person gets to decide which poor people to give to. When I pointed out that one of the highest forms of tzedakah is giving to someone unknown, Bari replied, "And it's theft if you take it from me to give it to someone else who I don't know. When the govt. does it, maybe it's not theft, but it's not right Al Pi Din Torah."

Bari is walking on thin ice here. Either you think that it's okay to have the government enforce donations to the poor, or you don't. If you don't, but you make an exception for Judaism's specific mandates, and you declare anything else to be "theft" or something close to it, then you're not being philosophically consistent.

Having said that, I should point out that there is a good deal more to politics than philosophy. I don't fault any frum person for taking conservative positions on particular issues. There is room in Yiddishkeit for a variety of political perspectives, once we move past ideology and get into specifics. The problem is that many of us have a hard time stepping outside our own political perspectives and acknowledging that other viewpoints have legitimacy. When we feel strongly about an issue, it is easy to fall into the trap of ascribing simplistic motives to the other side and of not recognizing how complex the issue really is. I'm sure I have been guilty of this before, but I definitely see it in frum conservatives. It is implicit in Harry's statement that "conservative principles are generally more in line with Orthodox Judaism," which almost makes it sound like we can just do a head-count of political positions and declare this one as being more in line with Torah values, that one as being less, and so on.

So let me be clear: On almost any major issue in American politics today, a case could be made for both sides without sacrificing one's commitment to Torah principles. There are possible exceptions, like gay marriage or opposition to stem-cell research. But most issues fall into one of the following three categories:

1) Issues where the Torah's view is irrelevant. One example is gun control. Occasionally I have heard Orthodox rabbis on both sides of this debate attempt to "spin" their favored position as more Torah-based, but their arguments are unconvincing, for the disagreement (properly understood) does not stem from any fundamental difference of values and has no real bearing on halacha. So too with the vast majority of American political issues.

2) Issues where the Torah's view is relevant, but where there is still rabbinic support for both sides. An excellent example is the death penalty. Harry's mentor R. Ahron Soloveichik not only opposed the death penalty but believed that every Jew should oppose it.

3) Issues where Jewish law may seem more in line with one side, but where pragmatic considerations might tilt it the other way. This category includes many "social issues" that religious conservatives focus upon, such as abortion.

In sum, I welcome debate on the specifics of any issue. At the same time, I believe that there is much in common between traditional Judaism and many core liberal ideals. It's not absolute, but then neither is the pact that R. Lapin and co. have attempted to make with the Christian Right. And frankly I think the latter poses a greater danger to our freedom as Jews than the fuzzy liberal tolerance that so many frum people claim to despise. Christian conservatives may play nicey-nice to us, but in the long run they're being disingenuous, as becomes clear in the slip-ups by the less shrewd among them (e.g. Katherine Harris). You have to be extremely deluded to believe that the Christian Right views us as an equal partner. No doubt we should stand up for what we believe in, whether economic or social, but we must also be careful not to be so blinded by ideology that we enter into an unhealthy relationship.

Saturday, November 11, 2006

Too stupid for chess


It's one thing to know that someone is smarter than you; it's quite another to be reminded of that fact week after week after week.

From my childhood onward, I used to play chess regularly with a friend of mine. He beat me a good majority of the time. This made the game a tiring experience for me. I could have viewed my losses as a challenge, an incentive to work harder. But these were times when all I wanted to do was relax. The mental effort needed to keep track of a chess game just didn't inspire me.

Occasionally, we played other games, where our skills were more even. We even invented a new game we called "losing chess." While we weren't the first to come up with either the idea or the name, our version was somewhat original. In the "standard" form of losing chess, the object is to force the opponent to capture all your pieces, and the king holds no special importance. But in our variant, the object was to force a checkmate on your own king--in other words, to expose your king to capture when the opponent is threatening no other piece. This game turned out to be rather interesting and unpredictable. I tended to win the game (or "lose," if you will) slightly more often than he did.

I still tried to improve my skills in regular chess. I read books about chess strategy. I downloaded a fairly decent chess program to examine the strategies of a computer player. That actually kept me busy for some time, but as with all other computer games I played by myself, I grew bored of it. In any case, my friend continued to beat me.

Only in the last few years was my interest in checkers revived. Windows XP comes with a game called Internet Checkers. The computer sets me up with another actual player. The only information I get about the other player is his language and skill level. Players get a choice of three skill levels: Beginner, Intermediate, and Expert. I am set up with a player of the same skill level as long as one is available. Since the program rarely takes much time in setting up a game, it seems that numerous people around the world are constantly using this software. The only other possibility is that I'm unknowingly being set up with a computer player at least part of the time, though the instructions give no indication that the program ever does such a thing, and I believe I can tell the difference between a computer player and a human.

I can send the other player a message from the following pre-set list: "Nice try," "Good job," "Good game," "Good luck," "It's your turn," "I'm thinking," "Play again?," "Yes," "No," "Hello," "Goodbye," "Thank you," "You're welcome," "It was luck," "Be right back," "Okay, I'm back," "Are you still there?," "Sorry, I have to go now," "I'm going to play at zone.com," ":-)," ":-(," "Uh-oh," "Oops!," "Ouch!," "Nice move," "Good jump," "Good double-jump," "King me!" I assume that the comments get translated into whatever language the other player speaks. While I'm usually set up with another English speaker, I have also frequently been set up with players who speak German, French, Turkish, Arabic, Hebrew, Thai, and many other languages. No wonder there seem to be players available at all times of the day, and the night as well.

When I first started playing, I had very little knowledge of the game. I had played checkers before, and I was familiar with the rules. But I knew no strategies or techniques, except for a belief that I should avoid moving pieces in the back row. The first strategy I devised was a simple copycat routine: as long as I was the player who moved second, I could simply imitate the other player, doing a mirror image of his every move. Of course, the game always reached a point where I could no longer do this. Sometimes the opponent's opening move made it impossible for me to follow the copycat routine. But this routine usually got me to a point where I could find an advantage, and I did often win the game when playing as a Beginner.
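
For anyone who wants the mirroring spelled out, here is a minimal sketch in Python (the coordinates and function names are hypothetical, invented purely for illustration; Internet Checkers exposes nothing like this): number the rows and columns 0 through 7, and the mirror image of a move is its 180-degree rotation through the center of the board.

```python
# A hypothetical sketch of the copycat routine. Squares are (row, col)
# pairs numbered 0-7; the mirror image of a move is its 180-degree
# rotation through the center of the board. Dark squares map to dark
# squares, so the mirrored move always lands on playable squares.

def mirror_square(square):
    row, col = square
    return (7 - row, 7 - col)

def mirror_move(move):
    src, dst = move  # a move is a (from, to) pair of squares
    return (mirror_square(src), mirror_square(dst))

# If the opponent opens by moving from (5, 2) to (4, 3),
# the copycat reply is the move from (2, 5) to (3, 4).
print(mirror_move(((5, 2), (4, 3))))
```

The routine fails once the position stops being symmetrical--for instance, when the forced-capture rule (discussed below) intervenes.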

I began to learn some tricks. One rule that casual checkers players frequently ignore is that when you can capture, you must capture. I think people avoid this rule because they feel that it limits their choices. But the computer versions of checkers require you to play by this rule. I began to discover that this rule is what makes the game so fascinating and unpredictable. Because you can force your opponent to capture a piece, you can make him do things he didn't expect to be doing, and then gain a sudden advantage. The simplest form of this technique is when you force your opponent to capture one of your pieces, and you end up taking two in return. I discovered this technique on my own, in a situation that frequently occurs toward the beginning of the game, at the side of the board. I became quite adept at making my opponent fall for this trick. But the opponent must be gullible enough to put himself into this vulnerable position. Moreover, I had to learn to avoid putting myself into this position. Because this is one of the simplest tricks, even players with modest experience usually know better than to allow it to happen. But they remain prepared should the opponent make himself vulnerable to this move. It's one of the litmus tests early in the game that makes it easy to tell the experienced players from the novices.

After I discovered that I was beating Beginner players the majority of the time, I decided to move my skill level up to Intermediate. Soon I moved it to Expert. Of course there was no guarantee that I was playing someone who actually was on that skill level. All it meant was that the player identified himself as being on that skill level. But I did beat Expert players less often than Beginner players, and while the challenge intrigued me, I sometimes went back to the Beginner level for relaxation. I had given up the copycat routine and started to learn more sophisticated strategies.

Finally, I got a book out from the library on checkers techniques. Reading this book greatly refined my skills, teaching me techniques that I still use to this day. For one thing, I radically changed my opening strategy. I used to open the games most often by moving my side pieces first. Apparently, this is a common error that beginners make. They move the side pieces because the side pieces are less vulnerable. Unfortunately, this strategy is weak in the long run, because it doesn't help break through the opponent's defenses. It is best to start by moving the pieces in the center of the board, and to keep one's pieces close together. I also learned that moving the back row is not necessarily to be avoided. What I should avoid is moving the second and fourth pieces in the back row, but the first and third often can safely be moved early in the game. I also learned to keep my own double-corner well-protected, and to work on attacking the opponent's double-corner.

Getting used to these new techniques took some time. At first, I experienced some difficulties, and it appeared that I was getting worse, not better. But I soon realized that I was simply taking time to get accustomed to using the techniques properly. This new opening strategy made it easy for me to fall into a devastating trap, where the opponent would get a two-for-one and a king. But as I learned more caution, I began to see the great advantages of this strategy. I began to win games without using any tricks, simply by having a strong defense and by either making the opponent's back row vulnerable or putting him into a position where he couldn't move except by exposing his pieces to attack.

Of course, I also learned some more advanced tricks, not just from the book, but from a checkers program I downloaded where I got to examine a computer player's strategies. I learned complicated moves that involved making my opponent capture one piece, then another, and then finally launching a devastating attack that he had no idea was coming. To play checkers well, you have to be able to think four-dimensionally, to anticipate future moves by visualizing the board in other configurations.

The strategies for handling game endings are just as complex. For starters, if you have two pieces left and your opponent has only one, you can definitely beat him--but it requires some practice to learn how. If you have three pieces and your opponent has two, your best bet is to force him to take one of your pieces in exchange for one of his. It is possible for him to prevent you from doing this, using one or both of the double corners, and such a game will end in a draw even though you have more pieces.

I wanted a way to track my progress. Internet Checkers does not record wins or losses. So I created my own file where I kept track of that information. Next to each skill level, I wrote how many games on that level I won, how many I lost, and how many turned out as draws. Here are my current stats, which are still ever-changing:

Expert: 2697/964/365
Intermediate: 422/141/38
Beginner: 830/101/278

According to this record, I win approximately two-thirds of the time at any skill level. But I suspect my winning percentage at the Beginner and Intermediate levels would be much higher if not for the fact that most of the games I played at those levels occurred long ago, before I improved my skills. I regularly play the Expert level now, only rarely venturing to lower levels.
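
For anyone checking the arithmetic, the percentages fall straight out of those tallies. Here is a quick sketch (the dictionary below is just a transcription of my own record file, in the won/lost/drawn format described above; Internet Checkers itself keeps no such statistics):

```python
# A quick sketch of the win-rate arithmetic, using the tallies from my
# record file (format: wins, losses, draws).
records = {
    "Expert": (2697, 964, 365),
    "Intermediate": (422, 141, 38),
    "Beginner": (830, 101, 278),
}

for level, (wins, losses, draws) in records.items():
    total = wins + losses + draws
    print(f"{level}: {wins / total:.0%} won, {draws / total:.0%} drawn")

# Expert: 67% won, 9% drawn
# Intermediate: 70% won, 6% drawn
# Beginner: 69% won, 23% drawn
```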

I have my own rules for determining whether I have won, because Internet Checkers has an annoying but understandable loophole: any player can abandon a game at any time. Thus, if a player is losing, he may simply quit without selecting the "resign" option. Sometimes this is not mere rudeness: computer and Internet glitches can cause a game to be ended prematurely. But I made a personal rule that if a player quits and I am clearly ahead, with more pieces, I record that in my personal file as a win. Similarly, if a computer glitch causes the game to be terminated and the opponent is ahead, I record it as a loss. If I'm at the end of the game and the opponent refuses to draw even though he clearly can establish no advantage, I quit the game and record it as a draw. I have recently adopted a 40-move rule used in official competitions, which says that if a player can gain no advantage in 40 moves, the game is automatically a draw.
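
Codified, my bookkeeping amounts to something like the sketch below (the function names and categories are entirely my own invention; what to do when neither side is clearly ahead is a case my rules leave open):

```python
# A sketch of my personal scoring rules for abandoned or stalled games.
# "Ahead" means clearly having more pieces on the board.

def score_abandoned_game(i_am_ahead, opponent_is_ahead):
    # The opponent quits, or a glitch ends the game, before a real finish.
    if i_am_ahead:
        return "win"
    if opponent_is_ahead:
        return "loss"
    return None  # neither side clearly ahead--a case my rules leave open

def score_stalled_game(moves_without_advantage):
    # The 40-move rule borrowed from official competition:
    # no advantage gained in 40 moves means an automatic draw.
    return "draw" if moves_without_advantage >= 40 else None
```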

These personal rules, which I have concocted, are relevant only to me. The opponent doesn't know that I play by them, because I have no way of communicating them to him, given the limited pre-set list of comments we can pass between each other. I am somewhat amazed at the lame tricks that some of my opponents attempt to pull. For example, if they are losing, many of them will ask for a draw, hoping I will accidentally click "yes" when the pop-up window appears. On rare occasions I have even fallen for this trick, but so what? It makes no difference as to the truth of who won, and the truth is all that matters to me when I keep my personal record. I view these games as practice and recreation, and the recognition of having the computer say "You won!" means nothing to me.

Apparently, all this practice has paid off. After coaxing my friend to play checkers with me, I discovered that I was suddenly much better at the game than he is. He's still good--he has a strategic mind. But he's nowhere near my level. I've actually become a sort of tutor to him, showing him some of the techniques I've learned and giving him corny advice, like "The best offense is a good defense."

Why did I discover such skill at checkers, when I was always so hopelessly bad at chess? Part of the reason is that I stumbled upon the software that allowed me to compete with real players. This not only has kept me from growing bored of the game, but has been enormously good practice. Most of what I know now, I learned simply from the experience of playing, not from the strategy books. Perhaps if I were to start playing Internet versions of chess, my skill at that game would improve as well. But so far I lack the interest. Checkers just seems more suited to me than chess. It's a much simpler game, with far fewer rules. You have basically only two types of pieces, and you use only half of the board space that you use in chess. I'm the type of person who has trouble multitasking, processing many different things at once, and that may be the key to why I find checkers easier to deal with. I'm not going to admit that it's simply because I'm too stupid for chess.

Wednesday, October 25, 2006

Moral philosophy

I used to believe that atheists have no basis for accepting the idea of morality. I no longer believe this is true.

I have recently been discussing the issue of morality and religion on David Guttmann's blog. I admit that my views on this issue have significantly evolved in the last ten years. For my more recent views on the matter, see this post, which is based on an essay I wrote at the end of college. An essay I wrote at the beginning of college, however, presents quite a different perspective. I present an abridged version of the essay here:
Why must we act morally in the first place? An ethical system founded upon a religion that worships God is, in principle, more rational than an ethical system that denies this basis.

Personal feelings are too subjective to base an entire moral code upon. The conclusion that "murder is immoral," for example, is not a direct implication of the premise that "murder does not feel good (to some people)." A person may be firmly ethical without following his or her emotions, while a person who follows his or her emotions may be morally lacking.

The argument that morality is a necessary part of society is also insufficient, because it is possible to behave contemptibly without harming society as a whole. While all cultures retain values at their foundation, they are free to adopt new tenets and discard old ones as time goes by. Morality, in contrast, is a permanent value system. It is not a product of Western society but rather an ideal toward which most societies strive. To say that murder is immoral is to imply that there is no possibility of it becoming moral anytime in the future, even if society would begin to approve of it. This is particularly evident in such topics as abortion or euthanasia, where people do not agree on the definition of an ethical concept. Their goal in debating the issue is to make society acquire a better understanding of morality. Since no society is infallible, no society provides us with the ultimate philosophical basis for ethics.

Another possibility is that disregarding proper ethical standards is self-destructive. It is certainly true that many people's moral actions are motivated by a desire for self-preservation. Obviously, most of us will follow the law of the land so as not to be punished. More to the point, all our actions have natural consequences, and for some moral actions, the consequences for the person who performs them are favorable. For example, the case for environmental protection is strengthened by the fact that problems in the environment can have hazardous effects for all life on earth, including ourselves. On the other hand, the principle does not accurately describe all situations. Throughout history, evil societies and people have thrived while innocent individuals suffered. How does the self-preservation theory account for cases when criminals escape justice, even if such cases are uncommon? In any case, the suggestion that the goal of morality is self-preservation trivializes the concept, which has nearly always been understood to go beyond self-interest.

That last point exemplifies the problem with all these theories, which is their inconsistency with the morality understood to exist in our daily lives. If morality were merely a matter of personal taste or choice, as some philosophers have suggested, that would fail to explain most people's passionate hostility toward opposing moral philosophies. The passion implies that in most real-life situations, morality is assumed to occupy an objective reality of its own. In contrast, the previously mentioned theories invoke a conception of ethics that is more narrow and vague than the conception most people apply in practice.

Despite the general perception that morality is objectively true, people tend to relegate it to a separate realm of reality. For example, most people assume it to be an objective fact that advocating murder is "wrong," but would probably treat a comment that advocating murder is "inaccurate" with bewilderment. Moral judgments cannot be measured by accuracy, they would say. On the other hand, the same people would probably agree that the immorality in a doctrine of racial superiority is intrinsically related to the fact that it is false. The identification of racism as morally wrong in addition to being false is, like the first example, unprovable. Nevertheless, in the second example, we notice a logical connection between the realms of fact and value. It is inherently impossible to "prove" statements about how people should behave, yet such statements still constitute a kind of philosophical knowledge. This is clear because we so frequently use demonstrable facts to back up moral propositions.

If we assume that morality implies a system occupying a sphere of reality, the question presented at the beginning of this essay--Why must we act morally in the first place?--should be rephrased: How do we know that morality exists to begin with? That is, what knowledge confirms our general belief that all people must behave in certain ways and not others? From a rational standpoint, the assumption that morality exists is not self-evident, even though many people--including many religious people--treat it as such. It is rather a logical implication of the fact that it is God's will. Because God created and is in control of the world, He wants us to behave in specific ways, to satisfy the purpose of creation. This also provides the strongest rationale against destructiveness toward nature and society--because it is all part of God's creation.

Without God, there is no source by which to judge any statement on how we should behave as true. All we can say is that some people are motivated to behave that way, yet that does not tell us whether people should behave that way. Religion allows us to view moral propositions as truths rather than simply as preferences, instincts, or rules designed to maintain social order.

Accepting God's will as the ultimate basis for ethics does not preclude the previously mentioned motivating factors behind moral behavior; it simply gives them a unified point of reference. Compassion still plays an important role by enforcing moral rules deep within our psyche. Furthermore, even religiously based ethical systems use the standard of what is beneficial to society to decide on specific moral issues. Finally, religion embraces a version of the self-preservation argument, by its conviction that God punishes all evildoers. This belief, though, is based on faith rather than observation, whereas the secular version of the argument is an attempt to explain morality without reference to God.

I am not implying that those who do not recognize the ultimate basis for ethics are necessarily amoral. Divine authority is not the only cause of moral behavior, although it is the ultimate rational basis for ethics. The bottom line is that in the absence of religion, there are no ultimate grounds for condemning a person who chooses to behave immorally. Why is the evildoer's choice inferior to anyone else’s choice? None of the non-religious explanations for ethics answer that question; they merely clarify why some people prefer certain ways of behaving over others. Only when we recognize that the ultimate moral authority is God do we have a universal explanation for morality that applies equally to all people in all cultures, regardless of what the people or the cultures themselves may believe.
I actually still accept some of the ideas expressed in the above essay, even if I reject the general thesis. The philosophers who have attempted to root morality in social pragmatism or personal preference have not provided a convincing case for ethics. But I recognize now that basic morality is deducible without believing in God. I believe that the purpose of religion is to move us beyond this basic level in an attempt to perfect the world.

Friday, October 13, 2006

Teleportation and the Clone Test

According to recent news, physicists have succeeded in teleporting a combination of light and matter, transporting the information over a distance. The news reports have hyped this achievement as the next step in a progression that will end in "Beam me up, Scotty!" transportation of human beings, the kind where you get "zapped" and reappear someplace else.

It's an intriguing possibility, but one that has always disturbed me. Wouldn't you be a little scared to go into such a machine, even if you'd seen it run successfully on hundreds of previous subjects? I'm not talking about the possibility of a disastrous malfunction. I'm saying that the whole idea of teleportation presents some curious philosophical problems, even if the process itself is foolproof.

I wouldn't have so much of a problem if I were assured that the machine was merely moving all the particles of my body to a different location. But not all science fiction writers have conceived of teleportation in that way. For many of them, teleportation means actually destroying all the particles, all the atoms, all the cells and flesh and tissues in your body, and reconstructing it using different material in another location. For anyone who isn't disturbed by this idea, I propose a simple test: would you be willing to be killed, if you were assured that a clone with all your memories would be created in your place?

My intuitive repulsion at this idea stems from my belief that there's an essential "me" contained within my body that can't be reduced to the sum of my body's material. I'm perfectly aware that, due to growth and regeneration of cells, I'm not actually composed of the same material as I was a decade ago. But I carry with me a sense of self from every moment to the next, no matter how much my body changes.

Strangely, many reductionist scientists think that this "me" is an illusion. In the words of the late Francis Crick, from his book The Astonishing Hypothesis, "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules." My response to Crick, or to anyone else who holds such a belief, would be to subject him to the Clone Test.

Friday, September 22, 2006

Of Mountains and Planets

A couple of days ago I happened to watch the 1995 comedy The Englishman Who Went Up a Hill But Came Down a Mountain, a film that naturally brings to mind a recent controversy. I'd intended to see this film for quite some time. I've long been an admirer of the actor Hugh Grant even though I have disliked many of his films, including the massively overrated Four Weddings and a Funeral. Only in recent years have I warmed up to his work, most notably with About a Boy and Love Actually. He has a particular presence that shines through even his lesser roles.

I got a little worried by the film's opening sequence depicting an old man telling a story to his grandson. Movies about adults telling stories to kids tend to have an artificial feel (though there are exceptions, like The Princess Bride). Fortunately, the film doesn't dwell on this contrived story device, and it quickly becomes an engaging British comedy that is hard to look at today without being reminded of the Pluto controversy.

The film takes place during World War I. Two English mapmakers (the younger one played by Grant) are sent to a Welsh village to survey a local mountain to see if it's really a mountain. Their readings show it to have only a 984-foot elevation, which means it has to be demoted to a hill, because it falls short of the 1,000-foot minimum needed to be considered a mountain. The villagers are tremendously upset by this revelation. As the grandfather narrator explains, "The Egyptians built pyramids, the Greeks built temples, but we did none of that, because we had mountains. Yes, the Welsh were created by mountains: where the mountain starts, there starts Wales. If this isn't a mountain...then [Grant's character] might just as well redraw the border and put us all in England, God forbid." (A great deal of the movie's humor comes from the cultural pride of the Welsh villagers and their antagonism toward these English outsiders.) Grant insists that he's only a scientist, out to discover the truth, and that the mountain, hill, or whatever is still a wonderful landmark regardless of its height. The villagers won't have any of it. They quickly craft a plan to fill in the missing 16 feet, while devising ways to keep the mapmakers from leaving town. The film manages to take this premise and stretch it to 90 minutes, with even a love story thrown in to boot.

For those who've been following recent news, does any of this sound awfully familiar? The film is from 1995. I've heard that the controversy over Pluto's status goes back to 1992, though I personally didn't hear about it until a few years ago. I doubt that the people who made this film had it in mind. But it's hard not to notice the parallels.

The controversy was provoked by the discovery that, well, Pluto is too small to be a planet. But what exactly defines a planet? Given that almost all our knowledge of planets comes from our solar system, there's very limited information to work with. You can call Pluto a planet, if you like. The problem is that, to be consistent, you then have to include in your definition hundreds of other objects in the solar system that have not previously been considered planets. (Actually, they've previously been known as minor planets, or dwarf planets.) Not only are some of those objects larger than Pluto, but Pluto itself lacks many of the characteristics that the other "official" planets possess, such as a nearly circular orbit lying close to the plane in which the other planets travel.

But the demotion of Pluto was met with outrage by some, depressed resignation by others. Part of the problem is that it's the only "planet" discovered by an American. When you listen to people's reaction to the demotion, you hear echoes of an emotional plea. As one curator of the American Museum of Natural History put it, "We had enormous numbers of telephone calls and I would say things that verged on hate mail from second-graders--very angry children who said, 'What have you done? This is the cutest, most Disney-esque of the planets. How could you possibly demote it?'"

Of course, unlike in the film, there isn't going to be any campaign to add piles of dirt to Pluto so that it qualifies as a planet once again. With no way to reach Pluto, much less change its appearance, the best we can do is argue about definitions. Still, the clash of science with culture must be something of a universal theme. Someone now ought to write a sci-fi parody titled The Astronaut Who Landed on a Planet But Left an Asteroid.

Wednesday, September 13, 2006

Religious tabloids

(Cross-posted at DovBear's blog.)

I see that Krum has posted about Yated Ne'eman's biases. I don't know if I've ever heard a bigger understatement. Yated Ne'eman doesn't just slant. It lies.

This became abundantly clear to me a few years ago when the paper did an article on Rabbi Yechiel Eckstein. Rabbi Eckstein is controversial because of his attempts to build bridges between Orthodox Jews and evangelical Christians. The Atlanta Jewish Times did a good profile of him in this article.

One day, Yated attacked Rabbi Eckstein. That in itself did not surprise me. It was the manner of the attack that caught my attention. According to Yated, Rabbi Eckstein had recently converted to Messianic Judaism:
In ads and books, [Eckstein's organization] has made numerous alarming remarks over the years, including Eckstein's declaration in one of his books that he had become a Jew for J. Eckstein has denied that he is a Jew for J.
Now, that's a pretty serious charge. But what was particularly confusing is that the two above sentences border on contradicting each other. One sentence says that R. Eckstein announced in a book that he'd become a Jew for Jesus, the next sentence claims that he has denied the charges. The conflation of the two sentences makes the paragraph come off rather like a Wikipedia article.

But this is one of the strange things I've noticed about Yated. It's not just that the paper lies. The paper lies, but unconvincingly. Even if I'd known nothing about R. Eckstein, I still would have been scratching my head after reading this article.

So, what is the truth of the matter? In 2001, R. Eckstein released a novel called The Journey Home. In that novel, a fictional version of R. Eckstein travels with a fictional version of a real-life Christian friend of his in the Holy Land. At one point, the rabbi says, "While I still don't believe in Jesus as the Christ as Jamie does, and view him instead as a Jew who brought salvation to the gentiles, in some respects, that is exactly what I have become--a Jew for Jesus."

Now, I can understand why some Orthodox Jews were alarmed by this statement. But that doesn't give anyone the right to lie about R. Eckstein. If Yated had clarified that this was a fictional story, and that even the fictional version of Eckstein was not embracing Messianic Judaism, the attack on Eckstein would have been more credible.

Of course, the article does quote someone defending R. Eckstein by pointing out that the claim against him was based on "words taken out of context from a story that was totally fictitious." But the article never explains what this remark means. It leaves readers with the impression that the rabbi really did become a Messianic Jew. Who cares if he claims that his comment was taken out of context? That's what they all say!

A couple of months ago, Rabbi Harry Maryles wrote on his blog about an article in Yated written by Dr. David Berger against Lubavitch. I objected to Harry's source, both because Dr. Berger is a known anti-Chabad zealot and because Yated is not a reliable source. Harry agreed with me, admitting that Yated was biased and even dishonest. But he insisted that they lie not overtly but "by omission." I remembered that Harry had on another occasion mentioned being friends with Rabbi Eckstein. Knowing this, I showed him the Yated article on Eckstein. This was Harry's response:
OK. I admit this stretches the outer reaches of truth, but although they are obviously wrong, I do not think they think they were deliberately lying. They were presenting the views of their misinformed Gedolim as fact. This is not the same as a deliberate lie.
I find the above statement disturbing to the max. So it's supposed to be better if Gedolim came up with the false information rather than the paper itself? And where did the Gedolim get the false information? At some point, somebody had to be lying--either that, or they were so reckless that they simply didn't care whether what they were writing was true. The article didn't just print a false rumor. It printed the rumor, but also printed the fact that R. Eckstein disputed the charges, and it vaguely hinted as to why the charges were disputed. But it still stated the false claim as fact.

Harry asked Dr. Berger, who is Modern Orthodox, why he had chosen to print his article in Yated. Dr. Berger contributed a lengthy explanation. He said that he was actually impressed by Yated's standards, because the editor censored a few sentences from his article. In Dr. Berger's words,
I argued that this additional information is critically important, but the editor felt that it was not important enough to overcome the larger editorial policy. I did not draw a line in the sand and allowed the deletion. While I think the editor's decision was mistaken, I admire the commitment to avoiding what he sees as unseemly content, a commitment that overrode any desire to add additional unfavorable information about Lubavitch. I ask myself if I can think of any other forum that would be so fastidious, and I come up empty.
Sarcastically, I replied, "Yeah, they think it's unseemly to mention Jesus by name, but they don't have a problem with falsely accusing someone of worshipping him."

Anyone familiar with Yated knows that distortions of this magnitude are hardly uncommon. The article on R. Eckstein appeared at least a year before the Slifkin controversy erupted, with all the lies and false rumors that went along with that account. Yated is essentially a mouthpiece for the forces responsible for the Slifkin fiasco.

Not too long ago, an article in the Baltimore Jewish News (the Orthodox spinoff of the Baltimore Jewish Times) talked about how Orthodox families in Baltimore handled exposure to secular media. A couple of the families interviewed were uncomfortable getting newspapers like The Baltimore Sun and The New York Times because of their perceived liberal and/or anti-Israel slant. One family preferred The Wall Street Journal, while another preferred, er, Yated Ne'eman.

There's nothing wrong with getting your news from the WSJ, because that publication, like The New York Times, is a legitimate newspaper, ideological slant or no. Sure, they may have occasional lapses from their fairly high standards, but at least they have standards. To prefer Yated, on the other hand, is laughable. Yated isn't a real newspaper; it's a frum tabloid rag. It's amazing to me that the same people who accuse others of being brainwashed are the most eager to brainwash themselves.

Bush is like Chauncey Gardiner

This is a post I wrote long ago, and recently posted to DovBear's blog.

Being There is one of the best films I have ever seen. It came out in 1979 but seems remarkably relevant today. I'm not the first person to have noticed similarities between President Bush and Chauncey Gardiner, but I did come to this conclusion independently.

Peter Sellers stars as a mentally retarded man, Chauncey, who, through a series of accidents, gets mistaken for a great thinker. His actual understanding of the world is so limited that he thinks a television set is people in a box. His only area of expertise is gardening. But nobody seems to notice this, and they interpret his literal statements about TV-watching and gardening (e.g., "As long as the roots are not severed, all is well, and all will be well, in the garden") as profound metaphors about the world. When he tells someone he can't read, the person thinks he means that he doesn't have the time in this busy world. (Sound familiar?) Gradually, he becomes famous, appearing on talk shows and meeting with public officials. All the while, nobody seems to notice that he doesn't have a clue what's going on! Everyone assumes he's this sophisticated, high-class thinker and misinterprets the simple, mundane things he says as brilliant kernels of wisdom. The film ends with the suggestion that the people around him might have him run for president.

None of this is meant to be taken literally, of course. The story is a satire designed to skewer the vapidity of television culture. I don't think the author of this tale, Jerzy Kosinski, ever foresaw that the situation he was describing would actually come true one day.

You might call me a Bush hater, but that would be a mistake. The Bush haters greatly overestimate Dubya's intelligence. Sure, they all say "Bush is a moron," reciting those words like a mantra, but they don't act like they really believe it. They give the man an awful lot of credit for the actions and policies of the Bush Administration, as though he's somehow in charge of everything rather than (as I see it) a puppet being controlled by others. When a reporter asked him for his opinion following the revelation of Deep Throat's identity, this was his response: "I don't have an opinion yet." Open-minded, huh? I'm sure that's how his admirers have spun it. I have a more sinister explanation: he hadn't yet discussed the matter with his advisors, who would tell him what his opinion should be.

He's like that a lot. You think he's the one who came up with those words about the sacrifice in Iraq being worth it? He never writes his speeches. I'm not saying there's anything wrong with that; most politicians have speechwriters. But the thing about Bush is that, in everything he does, he seems to rely heavily on the efforts of other people--Dick Cheney, Karl Rove ("Bush's Brain"), Condoleezza Rice. Remember Fahrenheit 9/11 and all the vacation time he spends? It often seems like Cheney's really the acting president, while Bush goes off to play golf, or jog, or relax somewhere. He has had this reputation ever since he was governor of Texas, a position, I should point out, that has so little power it's almost symbolic.

But what's truly amazing is how little the public notices this, even when they disagree with him. Take the aftermath of 9/11. He looks noble, delivers a nice speech he didn't write, and suddenly he's the most popular president ever. His handling of 9/11 was no more impressive than I would expect from any other president. He was just fortunate enough to be around when this great tragedy happened, and he has continually exploited its political value ever since. When he arrives with Bin Laden in chains, I'll give him credit. Until then, he should shut his trap.

I may be making the same mistake I mentioned before, of crediting Bush with the actions of others in the administration. It's often hard to tell who's really making the decisions, since most of Bush's public appearances are scripted. When he's forced to make off-the-cuff remarks every now and then, he ends up saying stupid things that reveal a startling lack of understanding. Rove is actually on record as having instructed Bush to make as few public appearances as possible during his 1998 gubernatorial run. They wanted to keep him out of the limelight as much as possible; otherwise, people were bound to notice that the emperor has no clothes.

What about his handling of debates? Wasn't there a consensus that he won all those debates against Ann Richards and Al Gore? What we have to realize is that ever since the first televised presidential debate in 1960, the press has had a strange tendency to judge a candidate's performance based on criteria that have absolutely nothing to do with the content of what the person is saying. In 2000, Al Gore was said to have "lost" the debates because of his body language and subtle behavior--he rolled his eyes a lot, stepped into Bush's space--making him come off, supposedly, as arrogant and rude. Then there was the matter of Gore's alleged "exaggerations," like saying he went to a Texas fire site with James Lee Witt, when in reality he went with another official, and traveled with Witt during another incident. The press jumped on Gore for this minor blunder, acting like he was a liar, all the while ignoring Bush's misstatements during the same debates (and there were several). The emphasis on all this trivial stuff ended up overshadowing the fact that the polls taken right after two of the debates showed that the initial consensus was that Gore had won. Only after the press started focusing on these irrelevant details did people change their minds.

If you actually listen to what Bush says during the debates, a different picture emerges. He frequently doesn't answer the questions given to him, sometimes completely changing the subject to talk about something else (i.e., something he's rehearsed). When he does come up with answers of his own, they are startlingly simplistic. A lot of what he does is just repeat key themes over and over, a technique that has proven effective. And his admirers mistake the simplicity for clarity.

As Roger Ebert wrote in his book The Great Movies II, Bush has never said anything Chauncey Gardiner couldn't have said. This is not to suggest that Bush is actually retarded--I'm not prepared to back that up, and it isn't true. He seems to have certain kinds of smarts. But he's intellectually vapid, and proud of it. (He actually boasted about having been a C student.) He's like Chauncey in the sense that he's a know-nothing who's being controlled by the people around him.

Thursday, August 17, 2006

The will to deny one's will

In Kurt Vonnegut's novel Slaughterhouse-Five, Billy Pilgrim is caught in a time warp, making him shift back and forth without warning to different points in his life. (Those who want to read my thoughts on the 1972 film adaptation can go here.) Vonnegut uses this plot device in two ways: to tell a semi-autobiographical account of his experience as a POW in World War II, and to express his mechanistic outlook on life.

This isn't the kind of time travel where you can change your own past. Billy occasionally tells people around him about future events he has seen, and he appears to take this information into account when making choices. But there's a sense that he never tries to change anything. Thus, he gets on a plane he knows will crash, without even making a fuss about it, because "he didn't want to make a fool of himself" (p. 133). What would have happened if he had avoided the flight? The book never answers that question, and there's an underlying implication that he has no ability to avoid what he knows will happen.

As in most time travel stories, the reader had best not scrutinize the logic of the situation. If it is possible to possess information about one's future, then there is a potential for the events to turn out differently. Books like this may try to ignore that fact, but it is inescapable.

Vonnegut wants to argue that everything which happens in life is inevitable, including the choices one makes, and thus that even someone who sees the future cannot control his own choices. But this notion overlooks the crucial consideration that a part of decision-making involves the information that one possesses. How many decisions might you have made differently had you possessed more information about the outcome?

Later in the book, Billy finds himself inside a dome on Planet Tralfamadore, where he becomes an exhibit at some alien zoo. This situation is not meant to elicit horror. He has conversations with the alien scientists, and he isn't trapped, since he's still constantly moving back and forth to other points in his life. The aliens themselves are even more detached from the human time frame than Billy is, living as they do in the fourth dimension. Because they view life from outside of time itself, all events to them look like one big simultaneous blob. They "don't see humans as two-legged creatures.... They see them as great millipedes--'with babies' legs at one end and old people's legs at the other'" (p. 75). This situation naturally affects their entire outlook. As one Tralfamadorian explains to Billy, "Today we [have a peaceful planet]. On other days we have wars as horrible as any you've ever seen or read about. There isn't anything we can do about them, so we simply don't look at them. We ignore them. We spend eternity looking at pleasant moments" (p. 101). Vonnegut here seems to be hinting at the real message of his book: that our purpose in life should not be to control what happens (that being impossible), but to cherish the good moments.

Of course, this philosophy involves a denial of the concept of free will. Vonnegut has the alien insist, "I've visited thirty-one inhabited planets in the universe, and I have studied reports on one hundred more. Only on earth is there any talk of free will" (p. 74). The assumption that free will is so unnatural that an alien would likely be perplexed by it is a strange one, considering how vital the concept is in human society.

If you don't believe me, imagine the following scenario: a stranger begins hurling nasty insults at you. Naturally, you think, "What a jerk!" But how would your judgment of that person change if you knew he had Tourette's Syndrome and his insults were involuntary? In that case, you wouldn't fault him for his behavior. And why not? Because he couldn't help it. In other words, he didn't choose to do what he did. We measure human behavior by the choices people make, which is why we do not count behavior stemming from a brain disease. That's why our justice system has a concept called "not guilty by reason of insanity." If we exempt a crazy person from guilt simply because he doesn't know right from wrong and hence cannot choose between the two, we're implying that sane people do possess such an ability.

The concept of good and evil goes out the window if there's no free will. Sure, everyone agrees that some people are destructive, but as Hannibal Lecter reminds us in Thomas Harris's novel Silence of the Lambs, a storm is also destructive, and we do not refer to a storm as evil. Our society makes a strong distinction between who a person is and what that person does. We detest any system that places strong limitations on a person based on birth; that's why we have such antipathy toward the Indian caste system and doctrines of racial superiority. We hold that anyone, regardless of background, has the right to be judged by behavior, and not to be judged disfavorably until behaving disfavorably. Among other things, this belief forms a large part of the basis for rejecting preventative detention, the practice of locking someone up in order to prevent a crime. Our society would begin to look draconian if we truly followed the principle that free will doesn't exist. It would hardly lead to the "let's just enjoy life" philosophy that Vonnegut seems to favor.

But people who deny the concept of free will never seem to think through the consequences of their belief. Like those who claim that morality is purely subjective, they make a philosophical claim but don't live life as though they really believe what they're saying. In speaking against injustice throughout the world, Bertrand Russell certainly didn't act like he truly considered the difference between his own positions and those of, say, Hitler, a mere matter of personal taste. And Vonnegut's portrayal of the horrors of war doesn't seem to stem from the position that the Nazis couldn't help the way they were. Undermining a basic principle of society is easy enough in the abstract, but few such people truly live by their words.

Tuesday, August 01, 2006

A time to reflect

What's especially striking to me about the breaking news of Mel Gibson's recent anti-Semitic outburst while drunk is how rare this kind of situation is. Many celebrities and public figures have been accused of anti-Semitism, sometimes fairly, sometimes not, but in very few cases is there explicit proof. Even Vanessa Redgrave tried to make it sound like she opposed only Zionists, not Jews, in her notorious Oscar acceptance speech, and Marlon Brando framed his attack on Jews as a criticism of Hollywood's alleged insensitivity toward "other" groups. In the last few years people have argued passionately over whether Mel Gibson is an anti-Semite, closet or otherwise. Until now, the proof was far more ambiguous than it was for Redgrave or Brando, and even some prominent Jews like Michael Medved rushed to his defense.

Now, what do his former defenders have to say? It looks like they really have egg on their faces. But the gist I've gotten from them is, okay, so it's true, Mel is an anti-Semite, but that doesn't mean we were wrong to defend him before. As Michael Medved argues in his blog, "Gibson's comments...remain particularly perplexing in the light of a previous record free of personal, anti-Semitic incident." I find this reaction naive. Gibson's former defenders should be considering that maybe they weren't quite as sensitive as they could have been to the presence of closet anti-Semitism in a man whose career could have been damaged by this information. Bigotry is a complex phenomenon, and it still amazes me how many people are blind to how subtle it can be.

Medved considers it perverse that the press focused on Hutton Gibson, Mel's father, who is an out-and-out Holocaust denier, instead of on Mel himself, who supposedly renounced the views of his father. But a closer look reveals that not to be the case. Mel was quite vague about the details regarding his own belief in the Holocaust ("chillingly ambiguous," as Charles Krauthammer put it). These are his words: "Atrocities happened. War is horrible. The Second World War killed tens of millions of people. Some of them were Jews in concentration camps. Many people lost their lives. In the Ukraine, several million starved to death between 1932 and 1933. During the last century, 20 million people died in the Soviet Union." Anyone who's familiar with the rhetoric of Holocaust deniers and Holocaust minimizers will recognize the similarities here. They all admit that some Jews were murdered. What they dispute is the numbers, and they deny there was any systematic attempt at genocide. Nothing that Mel said here contradicts that outlook.

I have not, by the way, seen The Passion. Still, even without getting into the debate over that particular film, there was plenty of evidence to support the claim that Gibson was an anti-Semite, long before this drunk-driving incident. I was once willing to give him the benefit of the doubt, on the grounds that maybe he was trying hard not to insult his father. That isn't an excuse, but it did leave open the possibility that he wasn't anti-Semitic in his heart. Now, we've gotten a rare glimpse into his heart, so maybe we need to look into our own.

Wednesday, June 28, 2006

The self-created monster

Although the great psychodrama Silence of the Lambs has enjoyed tremendous popularity and acclaim, many viewers have overlooked its most provocative insight: Hannibal Lecter, though a fearsome killer, is not truly crazy. This is a radical interpretation, I admit. The conventional view is that he can't help being the way he is. As Roger Ebert writes, Hannibal "bears comparison...with such other movie monsters as Nosferatu, Frankenstein (especially in Bride of Frankenstein), King Kong and Norman Bates. They have two things in common: They behave according to their natures, and they are misunderstood. Nothing that these monsters do is 'evil' in any conventional moral sense, because they lack any moral sense. They are hard-wired to do what they do. They have no choice."

I believe that this interpretation is mistaken. But I admit that there is superficial evidence to support it. There is no doubt that all the characters in the movie, aside from Hannibal himself, consider Hannibal crazy. That's why he's in an institution for the criminally insane. That's why Anthony Hopkins, on the DVD, describes Lecter as a good man trapped in a madman's body. Who am I to disagree with the actor who brought the character to life?

But I have observed that people tend to apply the word "madman" indiscriminately to anyone whose actions fall outside the boundaries of civilized behavior. Only in that sense is Hannibal "mad"; by any other criterion, he exhibits none of the usual signs of madness. He is not delusional in the least, and he has full control over his behavior. Everything he does is a carefully considered choice, based on a personal value system that permits him to perform grisly acts when he believes the circumstances justify it.

Dr. Chilton describes Hannibal as "a monster, a pure psychopath," but Hannibal in many ways does not fit the traditional definition of a psychopath. The diagnostic manual DSM-IV-TR has no diagnosis called "psychopathy"; its nearest category, antisocial personality disorder, requires (among other criteria) that a person exhibit three or more of the following behaviors:
(1) failure to conform to social norms with respect to lawful behaviors as indicated by repeatedly performing acts that are grounds for arrest
(2) deceitfulness, as indicated by repeated lying, use of aliases, or conning others for personal profit or pleasure
(3) impulsivity or failure to plan ahead
(4) irritability and aggressiveness, as indicated by repeated physical fights or assaults
(5) reckless disregard for safety of self or others
(6) consistent irresponsibility, as indicated by repeated failure to sustain consistent work behavior or honor financial obligations
(7) lack of remorse, as indicated by being indifferent to or rationalizing having hurt, mistreated, or stolen from another
Hannibal clearly is not reckless, irresponsible, or impulsive. His lack of impulsivity is notable, since the usual image of a psychopath is someone who lives in the present and doesn't think ahead. Hannibal seems to have everything intricately planned--including his escape, which he carries out while listening to classical music, as if he had choreographed the attack to the music.

One might assume that he's deceitful, but actually he lies only once in the entire movie, when he deliberately gives the FBI incorrect information about the name and whereabouts of the serial killer on the loose. Yet he does this in retaliation after he is lied to, hardly an indication of habitual lying. On the contrary, most of the time he uses his brutal honesty as a weapon, to wound others.

That leaves three categories that arguably apply to Hannibal: "failure to conform to social norms," "irritability and aggressiveness," and "lack of remorse." If those three traits truly describe Hannibal, then he may qualify as a psychopath. However, there is a good case for saying that he doesn't fit the second category. While he is certainly aggressive, I wouldn't describe him as irritable. His aggression is not haphazard but methodical. Whatever drives him, it isn't anger or rage. He is willing to hurt or kill those who stand in his way, but there is usually an element of moral judgment in his choice of victims. He tells Clarice that he has no intention of coming after her, because "the world is more interesting with you in it." He has firmly held beliefs about how people ought to behave, and they influence his decisions on how to act. For example, when he causes Miggs's death, Dr. Chilton claims that Hannibal did it "to amuse himself," but Hannibal has his own explanation: "Discourtesy is unspeakably ugly to me." That is an ethical belief he repeatedly follows throughout the film.

What about his cannibalism? Doesn't this greatly undermine my argument? How could any sane man eat people? But there's nothing compulsive about his behavior. He performs none of the elaborate rituals typical of serial killers. His cannibalism seems to reflect, rather, his contempt for much of the human race. He doesn't value human life, but he is capable of being kind to those he feels have earned his respect, like Clarice.

Hannibal is neither a psychopath nor a madman. Then how, you might ask, can we explain his monstrous behavior? Here is a telling exchange from the novel:
"You can't reduce me to a set of influences. You've given up good and evil for behaviorism...nothing is ever anybody's fault. Look at me, Officer Starling. Can you stand to say I'm evil? Am I evil, Officer Starling?"

"I think you've been destructive. For me it's the same thing."

"Evil's just destructive? Then storms are evil, if it's that simple." (p. 19)
Hannibal here is criticizing both the psychiatric profession and society as a whole. There is a common temptation to explain all human behavior in terms of mental states. We seek to distance ourselves from our horror by labeling anyone who commits horrifying crimes as "sick," as though that person is somehow the product of forces beyond his control rather than someone who has made a conscious choice to be the way he is. Hannibal Lecter represents our worst nightmare, living proof that brutality and rationality do not necessarily conflict.

The promise of a sound resolution

Most people accept the concept of objective truth. If someone says that ice cream is a health food, that person is simply wrong. But if someone says, "ice cream is delicious," that statement is neither true nor false; it is simply a matter of opinion. A lot of people today place morality in the latter category. I hear this all the time: "Morality is subjective," they say. As Bertrand Russell asserts, "in a question as to whether this or that is ultimately Good, there is no evidence either way; each disputant can only appeal to his own emotions, and employ such rhetorical devices as shall rouse similar emotions in others." I disagree. Although people's emotions do often influence their views on morality (or, for that matter, on any other subject), it is possible to objectively assess a moral view based on the quality of the reasoning used to support it and on the weight of the evidence.

All societies share certain core principles. Killing would be a crime even in Hitler's ideal society. What Hitler claimed was not that killing in general was acceptable, but that the only way to create an ideal society was by first destroying or enslaving certain races. That claim rested on demonstrably false assumptions about reality, such as his pseudoscientific notions about race and the mortal threat that Jews allegedly posed for the rest of mankind.

Morality and truth are more closely linked than subjectivists admit. According to the subjectivist, if one culture practices cannibalism, and a second culture considers cannibalism immoral, there's no objective way of determining which side is right. If we investigate how the cannibals justify their actions, however, we are likely to find that they hold mistaken beliefs. They may believe, for example, that eating human flesh gives a person great powers, or that the people they are eating are less than human, coming as they do from outside the tribe. To suggest that those beliefs are rooted in superstition and ignorance is hardly a matter of subjective opinion.

By identifying core beliefs that all societies accept, we can determine through reason which moral views come closer to meeting those core beliefs. We can also determine when moral laws have exceptions, such as killing in self-defense. Since the goal of the law against killing is to protect human life, occasionally we must violate this law to reach the same goal. The reasoning here is similar to why people undergo surgery: they allow their body to be damaged in the short run to improve their health in the long run.

Moral ambiguity arises from the conflict between short-term and long-term consequences. If the United States government had learned that hijacked planes were heading toward the World Trade Center, it might have chosen to shoot down the planes, killing all the passengers, because failing to do so would have led to even more deaths. As a rule, long-term consequences take priority over short-term consequences. The problem is that they are harder to determine. The Nazi worldview perhaps represented the extreme of reasoning on the basis of long-term consequences, in the suggestion that enormous destruction of human life was needed to create a peaceful world. The primary danger of utopian visions is that people who seek to transform society to such an extent may ignore the harm they are driven to inflict on society in its current state. Sound moral reasoning involves a balance between what one knows to be true in the present and what one can reasonably infer about the future.

In real life, of course, many of the situations we face are not as clear-cut as the previous examples. The fallacy that subjectivists commit is in thinking that lack of clarity automatically implies subjectivity. Objective reality is not always accessible to human knowledge. For example, nobody knows whether life exists on other planets, but either it does or it does not; the answer does not depend on what humans believe or know. Of course, we may disagree on how to define life. But everyone agrees that humans, lions, trees, and bacteria are alive. The concept does not collapse into subjectivity simply because people aren't sure how far to extend the definition. By that logic, all concepts would turn out to be subjective.

Similarly, the fact that two people in full knowledge of the facts reach opposite conclusions on a moral question does not imply that the issue is subjective. One person may err in his reasoning, or both views may rest on assumptions that are hard to prove. Uncertainty is not subjectivity. While a person's emotions may influence where he stands on the issue, a rational person recognizes that any attempt to resolve the issue is ultimately a search for truth, not an appeal to emotions. Just as unsolved mathematical problems do not shake people's faith that one plus one equals two, complex moral issues do not refute the existence of simple moral truths.

After all, anyone who enters a moral debate hopes that society will eventually resolve the issue when most people decide which side's arguments are the most cogent. If neither side is ultimately right or wrong, however, then the view that triumphs in the end will do so simply because its proponents have enough political power. All moral controversies are ultimately power struggles, if one follows moral subjectivism to its logical conclusions. Only moral objectivity can offer the promise of a sound resolution.

Tuesday, June 27, 2006

The issue that will not die

[cartoon on the flag-burning debate]

I think this cartoon perfectly captures the flag burning debate. It's human nature that as soon as someone says you can't do something, that's when everyone wants to do it. The current amendment seeks to enshrine in our Constitution the ability to outlaw a very specific form of political protest, an activity scarcely more common than machete juggling. But I believe this is an important symbolic issue with much larger implications, because the proposed amendment cuts back on the First Amendment's right to free speech.

How can I possibly say that, you ask? How can I possibly equate the act of burning a flag with "speech"? Well, I'm certainly in good company with this position. The Supreme Court has ruled repeatedly that nonverbal forms of communication do have some protection under the First Amendment. In the 1931 case Stromberg v. California, the Court struck down a statute that prevented people from displaying red flags in support of Communism, and in the 1969 case Tinker v. Des Moines Independent School District, the Court ruled against a school that prohibited its students from wearing an armband in protest against the Vietnam War. It’s true that lawmakers can put more restrictions on nonverbal communication than on actual speech and writing, but one thing they cannot do is outlaw an activity purely because they oppose the message that activity is intended to express.

This point is lost on Robert H. Bork. According to Bork, banning flag burning isn't outlawing an offensive idea; it's outlawing an offensive "method of expression." For example, says Bork, we are certainly entitled to "stop a political speech made from a sound truck at 2:00 AM, or prosecute a protest against sodomy laws where demonstrators engage in the practice in public" ("Waiving the Flag," Omni, Oct. 1990, p. 10).

But in both of those examples, the prohibition has nothing to do with the message being expressed. If you go into a quiet neighborhood in the middle of the night and give a speech blaring from a sound truck, you'll be arrested no matter what you say. You don't even have to be saying anything; you could howl at the moon and you'd still be violating the same law.

The laws prohibiting flag burning are clearly in a different category. For example, the Texas statute which the Supreme Court struck down in 1989 actually permitted the burning of an American flag if the purpose was only to dispose of a torn or dirty flag. Obviously, the law was directed specifically at people who used flag burning to express a message of anger and contempt toward the American government.

Similarly, the proposed amendment isn't against the destruction of an American flag; it's against the "physical desecration" of the flag. What does it mean to desecrate a flag? The dictionaries won't help me on this; they all define the word desecrate as strictly a religious term, one you might apply to the destruction of objects found in a church or synagogue, certainly not to a secular object like an American flag. Congress, however, has defined the word as "deface, damage, or otherwise physically mistreat in a way that the actor knows will seriously offend one or more persons likely to observe or discover his action."

In other words, the amendment, like the Texas law, is not directed toward the act of flag burning. It is directed toward people who use the act to express a message of disrespect. Not that this should be surprising. There's nothing about the act of setting a flag on fire that's inherently offensive. Once you start talking about the person's intent--what's in his mind--it becomes abundantly clear that it's not what he's doing that offends, it's what he's communicating.

Sunday, June 25, 2006

The neverending series

Everyone knows that sequels tend to suck. But it's particularly distressing to see a favorite childhood movie ruined by one or more inferior sequels. The Neverending Story is an excellent example of this problem. The original was far from perfect, but it was a fun and engaging film with a sense of wonder. Unfortunately, it depended greatly on its high production values and the skill of its director, Wolfgang Petersen, who apparently had no intention of making a sequel even though his film covered only half of Michael Ende's novel. The sequels that did get made were low-budget enterprises with little chance of doing justice to the original. Even so, they managed to hit rock bottom; I cannot think of another series that has fallen so far.

It's not just an issue of budgets. The people who made the sequels seemed clueless as to what made the original special. No fantasy film I've seen has tapped more successfully into the kinds of philosophical thoughts that kids have. Think of Rockbiter's speech describing the Nothing: "A hole would be something. Nah, this was nothing. And it got bigger, and bigger, and bigger...." It's the type of film that greatly appeals to introspective kids who think about things like infinity and the end of the universe. Do children really think about such things? I did. People who find that surprising have forgotten how profound children can sometimes be.

The whole of Fantasia, indeed, seems to be built out of children's dreams and fears. Some of it is about exhilaration, as when Atreyu rides Falkor. Other parts reflect anxiety, as in Atreyu's trek through the Swamps of Sadness. What appealed to me most as a kid was how an imaginative but passive child, sort of a young Walter Mitty, opens up a book in which an older, braver version of himself goes on adventures. But The Neverending Story isn't so much escapism as it is about escapism. It's essentially a fable about the destruction of a child's fantasy world as he grows older and adapts to the modern world.

The special effects are good for their day. Although they occasionally look phony, the film's distinct visual look, from the shimmering Ivory Tower to the assortment of weird creatures, holds up well today. What makes the film work especially well is that the two child stars--Barret Oliver and Noah Hathaway--prove themselves capable actors. I use the word "capable" because almost everyone in the film overacts in an annoying way, which I blame primarily on the director. But there's a wonderful cameo by Gerald McRaney as Bastian's father. He has the perfect tone for the scene, appearing loving but distant, unable to fathom Bastian's mind. I wish the film had followed through by returning to their relationship at the end and exploring how Bastian changes as a result of his experiences in Fantasia.

The reason the ending doesn't work is obvious to anyone who's read the book. Simply put, the movie shows only the first half of the book! While this isn't entirely the movie's fault--there was no way the entire story could have fit into one movie--it could have been handled better. The Wizard of Oz faced precisely the same problem yet managed not only to become one of the greatest fantasy movies of all time but to surpass its source material in some ways. The Neverending Story doesn't accomplish that feat. The story feels unresolved at the end while at the same time failing to clearly set up a sequel. It attempts to wrap everything up with a sequence in which Bastian takes revenge on his old bullies. I enjoyed this scene when I was a kid, but in retrospect it creates a clash between the real world and the fantasy world. Bastian never grows as a character; he never learns to put his feet on the ground, something the early scenes suggest will happen.

There's one other problem, and that's that Wolfgang Petersen never really figured out the proper tone for a children's movie. He must not have had a clear idea what age he was shooting for. Some of the scenes are quite scary and violent, making this film inappropriate for younger children. Yet the muppet-like characters are presented in a cloying way that I doubt older kids (not to mention teens and adults) would appreciate. For example, the first scene in Fantasia plays like a revival of Sesame Street, with Rockbiter filling the Cookie Monster role. By the time I was old enough to appreciate the deeper aspects of the story, I cringed at the film's cutesy moments.

This sort of approach is never justified, in my view. The best children's movies do not condescend to their audience. Films like The Wizard of Oz, Mary Poppins, or any of the great Disney animated films, are easy to enjoy and appreciate as an adult. This is a lesson that does not seem to have rubbed off on Petersen. Had he shot for a wider age group, the result would have been fresher and more authentic for everyone.

The movie went on to become a box office hit and a minor classic, but the people who made the sequels appear to have learned more from the film's bad points than its good points. I cannot give a detailed description of the second film, because I saw it just once about fifteen years ago, and I have no desire to see it again. What I do remember is that it was painfully bad, one of the worst movies I had ever seen--maybe in the bottom thirty. It attempted to tell the second half of the novel, but the plot had continuity problems and ended up not making much sense. And it fell back on cliches that didn't belong there, like Bastian trying to overcome a fear of water, and a fight between him and Atreyu arising from an improbable coincidence. The actor who played Bastian's father this time around was nowhere near the Gerald McRaney league, coming off generic and nondescript. Overall, the film was just poorly done.

Not more than a couple of years ago, I discovered a third Neverending Story movie being played on cable. Intensely curious, I decided to watch it. I did not have high hopes for it. But I knew that, at least, it could not possibly be worse than the second film.

Boy, was I wrong.

Released in 1994, exactly one decade after the original, it is unquestionably one of the worst movies I have ever seen--easily in the bottom ten, maybe the bottom five. It is so bad that I risk making it sound like it's worth watching. Trust me, it's not that type of "bad"--not the Ed Wood variety of films so incompetently made that they become fun to watch. Those moviegoers who take pleasure in seeing cinematic disasters should be forewarned about this one, lest they never again be able to erase from their memory Rockbiter's gravelly-voiced version of "Born to be Wild," played in a video sequence early in the film and again during the end credits.

No, I am not joking.

The second film does have its admirers, and in a weird way I understand where they're coming from. At least that film had a legitimate purpose, to finish the story from Michael Ende's novel. But the third film has to make up its own reason for being, with a shabbier budget than ever before. So it concocts a story that allows us to see as little of Fantasia as possible. Here, Bastian is a little older, attending a new school. A gang of bullies chases him into the school library. The librarian just happens to be Mr. Koreander, the bookstore owner from the first film. Bastian hides from the bullies by finding the magic book and slipping into Fantasia. But the bullies also find the book, and they use it to wreak havoc on Fantasia. Through a series of magical mishaps, a bunch of creatures from Fantasia end up being transported into the real world along with Bastian. These include Falkor the luck dragon, a baby rockbiter about the size of a fountain statue, and a talking tree. Falkor, who must have gotten a lobotomy sometime between the second and third film, will later chase after a "dragon" at a Chinese festival.

What we do see of Fantasia makes the place seem a lot smaller than ever before. Almost all of the scenes there take place in the empress's chamber in the Ivory Tower, though there is also one sequence where we get to see Rockbiter's home (just what I've always wanted to see!) with Mama Rockbiter and of course the previously mentioned Baby Rockbiter sitting in front of a large stone TV set. Needless to say, the Fantasians seem to possess quite a bit more knowledge of Earth than they did in the first two films. When the gnome describes Bastian as "not exactly Arnold Schwarzenegger in the muscle department," we're reminded how much more enjoyable the film would probably be if Schwarzenegger were actually in it.

Curiously, the bullies never seem surprised to learn that magic exists. Think how long it took in the first film for even imaginative, ten-year-old Bastian to become convinced of the book's supernatural qualities. These bullies, much older and more literal-minded, never go through such a skeptical period. And later, when the Auryn falls into the hands of a teenage girl, she treats it with about the same level of awe as if she had gotten hold of her parents' credit card.

The creatures bear scant physical resemblance to their counterparts from the earlier movies. They look like people parading around in bad Halloween costumes. And Falkor (who in the original was voiced by an accomplished and prolific voice actor, Alan Oppenheimer) now sounds like Goofy.

There are actually some familiar actors in this mess. Mr. Koreander is played by British character actor Freddie Jones, Bastian is played by the kid from Free Willy, and the main bully is played by a relatively young Jack Black, who now probably would like to do with this film what George Lucas wants to do with the Star Wars Holiday Special.

Thursday, June 22, 2006

The fine line between love and hate

The 2001 film The Believer contains rare insights into Jewish identity, and it's unfortunate that the film was withheld from mainstream audiences due to ongoing controversy. But it deals with an ugly subject, and it handles that subject in an ambiguous way that makes many people, including many Jews, uncomfortable. Make no mistake about it, though: the film is uncompromisingly pro-Jewish, and the director, himself a Jew, has said that he became more religious because of his work on the film. Ironically, the film is likely to resonate the most with Jews, though it also contains universal themes familiar to anyone who has ever struggled with faith.

The idea of a white supremacist who's secretly Jewish is not new to me. I've long known about Frank Collin, who caused a national controversy in the 1970s when he planned to have his neo-Nazi group march in Skokie, Illinois, a predominantly Jewish suburb of Chicago. It was later discovered that Collin's father was not only Jewish but a Holocaust survivor. This case is so bizarre that it leads one to assume the guy was simply insane. While there may be some truth to that assumption, it isn't a satisfactory explanation. What would possibly lead a Jew to join a group that believes in the inherent evil of all Jews? What is such a person thinking? How does such a person live with himself, rationalize his own actions?

What The Believer accomplishes is to go inside the head of one such person and provide a compelling, believable explanation for how such a person could exist. The film is based loosely on a 1960s incident in which a high-ranking member of the KKK was discovered to be Jewish. The movie updates the story to modern times and depicts the young man, Danny, as a skinhead rather than a Klansman. His characterization is speculative but reveals a deep understanding of human nature.

What's truly bizarre about this story is that Danny never abandons his Jewish roots entirely. After attending a neo-fascist meeting, he goes home to his family, whom he treats with respect. He even performs Jewish rituals in private. Yet he terrorizes a Jewish kid on the subway, tells his neo-Nazi buddies that he wants to assassinate a prominent Jewish diplomat, and spouts what sounds on the surface like typical white supremacist ideology. But he's not, as we might suspect, a hypocrite saying things he doesn't believe, or a two-faced lunatic. His philosophy is surprisingly coherent. Sure, he's a walking contradiction, but so are many other people who have a love-hate relationship with their religious background.

His anti-Semitic beliefs all revolve around a single idea: he thinks Jews are too weak and passive. Sometimes he adopts a "macho" outlook, since he doesn't want to be associated with a people stereotyped as brainy intellectuals. On a deeper level, he dislikes the persecution theme in Jewish history and culture. But is this theme a sign of weakness or strength? Danny isn't sure. He eventually decides that Jews gain strength from their persecution; they seem to grow stronger the worse they're treated, and the biggest threat to their survival is not those who want to destroy them but those who don't care. This is a far more Jewish idea than an anti-Semitic one. Several Jewish holidays, including Passover, Purim, and Chanukah, commemorate events where Jews grew strong after periods of persecution. Many Jews today believe that assimilation into the culture is a greater danger than genocide, because it could signal the disappearance of Jews as a distinct people. As Irving Kristol once remarked, "The problem is that they don't want to persecute us, they want to marry us."

The implication is that Danny actually admires Judaism, and that his anti-Semitism is his own warped way of affirming his Jewish identity in a world where, he fears, Jews are increasingly seen as irrelevant--not loved or hated but simply ignored. His ambivalent feelings escalate as the movie progresses. When he has his neo-Nazi buddies deface a synagogue, he can't bring himself to damage the Torah scroll, and he secretly takes it home with him. His intimate knowledge of Jewish beliefs and practices looks strange to his fellow skinheads, to say the least. He tells them that he studies these things in order to know the enemy, pointing out that Eichmann did the same thing. Do they buy this explanation? Apparently they do, but Danny's girlfriend is a little smarter than that, and she finds herself strangely drawn to the religion he's running away from.

Like American History X, this movie contains disturbing scenes where the protagonist articulately expresses his bigoted ideas. There are other intelligent characters who argue back, but not everything he spouts gets answered, so I can understand why this movie makes some viewers uncomfortable. In one particularly distasteful scene, Danny mocks Holocaust survivors, and while they answer him eloquently for the most part, his raising of the old "sheep to the slaughter" canard goes unanswered.

Nevertheless, this is a powerful and compelling film, with a lead performance by Ryan Gosling that manages to rival Edward Norton's Oscar-nominated performance in American History X. We see early on that Danny is capable of doing appalling things, but his moral conflicts are then presented so persuasively that we cannot help but sympathize with him. The climax is painfully ambiguous. Those who are looking for easy answers may want to skip this film. But they will be missing out on what is easily the most authentic and profound exploration of Jewish self-hatred ever portrayed on screen.

Tuesday, June 20, 2006

Linguistic creationism

I have recently been discussing Torah-science conflicts with other bloggers. These issues include, but are not limited to, the age of the universe, Darwinian evolution, and the history of mankind. I have examined this subject on my own for over ten years. One area that has been sorely neglected, but which interests me, is the evolution of languages. The traditional Jewish view holds that Biblical Hebrew was historically the first language of mankind. Yet that notion does not seem tenable in light of modern linguistics.

Hebrew, along with Aramaic and Arabic, is classed as a Semitic language. Medieval rabbis recognized the similarities between those three languages. In so doing, they became among the first people to notice, and thoroughly document, systematic sound shifts between languages. For example, they noted that Hebrew words with the letter zayin often resembled Aramaic words with the letter dalet: in Hebrew zachar means "to remember," whereas the Aramaic equivalent is dechar. These observations came hundreds of years before linguists began noticing systematic sound shifts between Indo-European languages, comparing, for example, English fire with Greek pyra.
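To make concrete what "systematic" means here, consider a toy sketch in Python (purely illustrative and my own invention, not anything the rabbis or linguists actually use; real sound laws are conditioned by phonetic environment and operate on reconstructed forms, not on English transliterations):

    # Toy model of a systematic sound correspondence:
    # Hebrew zayin often matches Aramaic dalet (zachar ~ dechar),
    # so model the shift as a simple lookup table.
    CORRESPONDENCES = {"z": "d"}  # grossly simplified; vowels ignored

    def predict_aramaic(hebrew_word: str) -> str:
        """Substitute each letter according to the correspondence table."""
        return "".join(CORRESPONDENCES.get(c, c) for c in hebrew_word)

    print(predict_aramaic("zachar"))  # -> "dachar" (attested form: dechar)

The point is the regularity: it is not that one or two words happen to sound alike, but that whole classes of words shift in the same predictable way, which is what allows a correspondence to be stated as a rule at all.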

It should be noted, however, that the medieval rabbis tended to assume that Aramaic and Arabic had sprung from Hebrew. Modern linguists would say that all three languages are descended from an extinct tongue they call Proto-Semitic. The existence of this tongue is purely hypothetical, of course, but it's not unreasonable to think that languages existed which left no written evidence. Most languages in the world today were not written down until recent times, because the populations who spoke them were illiterate. These include the languages of dwindling indigenous tribes in America, Australia, New Guinea, and elsewhere. English itself did not have a regular writing system, apart from occasional inscriptions in an old runic alphabet, until missionaries traveled to the British Isles sometime around the seventh century and gave us the Roman alphabet that we use today.

For those who accept the possibility that Adam had ancestors, the language issue shouldn't be much of a problem. Adam spoke Hebrew, but earlier human beings spoke other languages. When Genesis describes the rise of mankind, it is primarily talking about the rise of human civilization, not the rise of the human species. Hebrew may not have been historically the first language, but the Old Hebrew script belongs to the earliest alphabetic tradition known, the Canaanite family of scripts that, through the Phoenicians, gave rise to the Greek and then the Roman alphabet.

Curiously, I have not seen many Orthodox Jews address this issue, even when talking broadly about biological evolution and human history. I have encountered one book which could be described as a work of linguistic creationism: Isaac Mozeson's The Word: The Dictionary that Reveals the Hebrew Roots of the English Language. It is, I'm afraid, a pretty shoddy job that invites ridicule. Mozeson's approach is to look for superficial similarities in sound and meaning between Hebrew and English words, to claim them as proof of a direct ancestral relationship between the two languages, and to ignore all the historical evidence contradicting his thesis. He establishes no systematic rules of sound change, and he seems unfamiliar with what the mainstream theories say, even though he is quick to dismiss them.

We can do better than that. I am not learned enough at this time to provide a more detailed response to the language question, but I have always held that we have nothing to fear from scientific knowledge, even if we cannot always explain a particular Biblical passage in light of a particular scientific theory. We should all be willing to admit at some point that we don't have all the answers.

Skeptics would say that I am being selective in what theories I accept. They would be correct. For example, there is no way that I will accept the idea that Exodus didn't happen. My rejection of this "theory," however, in no way implies that I must reject the scientific method of inquiry, or the many true discoveries that have resulted from application of this method. Not everything that falls under the banner of accepted scientific or historical knowledge is as firmly established as its adherents claim. The goal of synthesizing Torah and science should not be conformity to accepted opinion, or avoidance of ridicule. It should be a willingness to examine what the scientists have to say, and then make a judgment on our own.

Thursday, June 15, 2006

How chaos can be fun

Alan Dean Foster's Parallelities is a very funny book, but it's also a creepy and unnerving book, one that aims to shatter our sense of stability in the world around us. While parallel universes are a staple of contemporary science fiction, this novel does an exceptional job of conveying how disturbing the concept is, using it to explore philosophical questions about knowledge, identity, and randomness. But the book also has a sly sense of humor. Foster seeks to mess with our heads, and he has a fun time doing it.

The story centers on Max Parker, a slimy Los Angeles tabloid reporter sent to interview a rich man, Barrington Boles, who claims to have invented a machine that can break through the barrier between parallel worlds, dubbed "paras" in this novel. Max naturally assumes that the man is a typical loony, but then the machine not only works but has a side effect that not even Boles anticipated: it "zaps" the reporter (the scientific nature of what happens is never explained), afflicting him with a bizarre condition. At first he doesn't notice anything different, but as soon as he returns home, strange things start to happen, in several absurdly hilarious scenes. I don't want to give away too many of the surprises here, but let's just say Max has become a sort of cosmic magnet, pulling people and things from parallel universes into his world, and eventually drifting on his own into other worlds. He has no control over the process, which seems to intensify as the story continues.

That's the basic setup to what has become one of my prime book obsessions, a truly special novel that I have read cover-to-cover numerous times. I should note that I'm virtually alone in this reaction. The novel is still little-known, even among Foster fans. I have seen only one web reviewer who seems to hint at the book's greatness: "I didn't expect to sympathize with a shallow, arrogant tabloid reporter, but the unfolding of his inner self as he reacts to the wildly variable parallelities around him reveals a complex character study not promised in the opening chapters."

The slow opening chapters are, I believe, the main reason why the book isn't more famous. The entire first chapter, showing Max's regular life before it goes awry, is unnecessary and distracting. I do not exaggerate when I say that you could skip it and have no difficulty following the rest of the story. The chapter lacks the tension needed to engage us; it introduces characters who never appear again in the novel, and it depicts events that are entirely tangential to the later plot. This is the book's single greatest flaw, and I'm sure that many readers have tossed the book aside before they had a chance to reach the good parts.

In most books and films of this genre, parallel universes are entirely distinct from our world. Foster's own Spellsinger series, for example, depicts a parallel world full of magic and talking animals. Parallelities uses a less common approach, depicting the worlds that Max visits as virtually identical to his own world, to the point that Max has great difficulty telling whether he's in his original world or in a slightly different para. Each of the "para" versions of Los Angeles has the same buildings, the same streets, and even the same people, including another version of Max! But subtle differences abound, and one of the joys of the book is seeing how the paras get progressively weirder, even as they continue to resemble Max's original world, at least superficially.

Every para he visits functions as a story in its own right, and in the process the book catalogs several genres. One of the paras, for example, is directly inspired by the works of H.P. Lovecraft, but I will say no more because I don't want to give away one of the book's great shock moments. The running joke is that every time Max thinks that his experience has reached the height of madness, and that it couldn't possibly get any weirder, it then proceeds to do just that by several orders of magnitude. We, the readers, are in a constant guessing game to see how far Foster will take the story into the realm of the absurd.

One of Foster's charms is his wry sense of humor. His prose has a delightfully smart-alecky tone, which permeates even the most mundane of lines, as in the following sentence: "While he was waiting for the deep fryer to perform its task of inserting cholesterol and fat into otherwise healthy fish, Max examined his surroundings" (p. 153). At other times, Foster's playfulness is employed for shock value: "He...sucked in a mouthful of water. Fresh, not salt. Not that it mattered much under the present circumstances. He could not breathe either one" (p. 247).

Foster's vivid prose, which constantly pushes the limits of what's possible to put into words, brings the parallel universes alive. For example, here is one passage describing a futuristic, utopia-like version of Los Angeles:
A much larger hover vehicle appeared, traveling from north to south. As it turned up Pico, it bent in the middle to make the corner, flexible as a snake. The people within were not affected. Overhead, the sky shone a deep, untrammeled blue. There was not a hint, not a suggestion, of smog, much less the gray-white ash of total devastation. (p. 204)
Foster's writing is so intricate and detailed that it allows the surprises to creep up on us without warning. It also includes much introspection, largely because Max is so isolated by his experience. Max is shown to be dishonest, unethical, and insensitive, but he has enough real-life traits that we can relate to him as a human being. There is a scene where he sits on a diving board and gazes up at the stars, contemplating the vastness of the universe, and how much vaster it must be now that he knows about parallel universes.

We have the feeling that his experience as an unwilling cosmic traveler is causing him to become more reflective and considerate. When he encounters the previously mentioned utopian para, he is impressed by people's courtesy, especially when contrasted with the behavior of those in his L.A. And even the more negative paras are giving Max a deeper perspective on life. But Parallelities isn't a sci-fi version of Groundhog Day, where circumstances inspire a jerk to become nicer. Max never really gets an opportunity to change. All he wants is to return to his original, normal life, but we're never sure whether his experiences will ever make him want to move beyond his immoral lifestyle.

The novel primarily dwells on the negative effects that the experience is having on Max. Because he keeps meeting different versions of himself, his whole sense of individual identity is coming apart, a situation that makes even the instinct for self-preservation begin to seem arbitrary. As he ponders in one of the book's eeriest lines, "What would happen to him if he died here? To his real self? Probably, his paras would live on, including no doubt the one who occupied his life position here.... But he, him, the one Max that was Max to the Max, he would perish, permanently and forever" (p. 186). For most science fiction books and films, parallel universes are just an excuse to bring us to exotic new places. Parallelities stands as a unique example of the genre, examining this concept more closely than usual while never forgetting to be entertaining.