Tuesday, July 17, 2012

The challenge of old movies

Want to know how certifiable a movie fanatic I am? I actually keep an Excel spreadsheet noting every movie I see and the date on which I first see it. About a year ago, realizing I'd been keeping this list for literally half my life (since 1994, just before my 17th birthday), I attempted to identify the movies I'd seen during the first half. I got some help from Wikipedia, which has articles listing the films released every year (most of the major ones, at any rate). One thing I've determined is that I've seen well over a thousand films in my life--but perhaps three-fourths of them have been ones made within my lifetime, starting in the late 1970s.

This was a bit of a surprise to me, since I remember watching lots of old movies as a kid. But when I think about it, there are indeed an astonishing number of classics I still have not seen. And when I do get around to seeing them, the experience isn't always as satisfying as it's supposed to be. Part of the problem is a feeling of being intimidated by a movie's reputation. It's tricky trying to sit back and enjoy a Great Movie when I'm conscious of how I'm supposed to be feeling the weight of its Greatness at every moment. This is a big reason why I still have never watched Citizen Kane, and why I had the DVD of Lawrence of Arabia for a long time before I gathered up the courage to stick it in the drive.

The genre I find easiest to appreciate regardless of period is comedy. I was raised on classic comedy--the Marx Bros., Laurel & Hardy, Chaplin (who remains one of my favorite filmmakers to this day), Danny Kaye, the Three Stooges. I believe comedy is essentially timeless as long as it avoids topical humor, as these old movies generally did. Good 21st-century comedies like 40-Year-Old Virgin or Borat may be more profane than their predecessors, but the underlying principles of humor haven't changed. I can't say the same for dramas, westerns, romances, or horror films.

There are four basic challenges to becoming engaged in older movies. One I am not dealing with here is advancement in special effects and other technical matters. Everyone agrees movies have improved over time on that score. Instead, I wish to focus on those areas that pose significant and non-superficial barriers between modern viewers and even the best films of the past:

1. Changes in decency standards

While I find many of today's movies overly coarse, those made at the height of the Hays Code era had the opposite problem. They couldn't talk about sex in anything approaching a candid manner and were forced to employ ridiculous euphemisms, which can be hard for a modern viewer to adjust to. When I watched His Girl Friday, I had already seen the 1974 version of The Front Page with Jack Lemmon and Walter Matthau, based on the same play. Being a big Matthau/Lemmon fan, I loved the '74 version, and while I enjoyed the older movie as well, I was conscious of its limited ability to depict certain plot points. This actually led to a couple of good lines, as when Hildy reports that a character got shot in the "classified ads." At other times I felt the movie suffered from the constraints, as when it presented Mollie Malloy as an old maid rather than (as in the original play) a prostitute. And I'm sorry, but I just can't have as much affection for a film that omits the play's hilarious closing line, "The son of a bitch stole my watch!"

Some old movies have an innocence that looks laughable today. I saw the Oscar-winning 1938 film Boys Town when Newt Gingrich hosted a showing of it on TNT in 1994. Gingrich felt people should watch this movie to learn how to help today's troubled youth. The movie tells the story of Father Flanagan (Spencer Tracy) and his heroic efforts running an orphanage. His motto is that there's "no such thing as a bad boy." And indeed, most of the boys we see in the orphanage behave like perfect angels, except for one played by Mickey Rooney as a juvenile delinquent so terrible he actually smokes, plays cards, and acts sassy toward the grownups. (As a neighbor of mine at the time put it, "Sounds like the typical yeshiva bochur.") Despite these immeasurable crimes, Father Flanagan somehow manages to get through to him in the end and make him into a good kid, a message of great relevance for today's crack babies.

2. Changes in moral sensibilities

This category covers a lot of ground, but it's most notable in attitudes about race. Movies from the '30s and '40s are often shockingly racist, and when they are, I'm thrown right out of the picture. Duck Soup is one of my favorite comedies, but when Groucho utters the line--"My father was a little headstrong, my mother was a little armstrong. The Headstrongs married the Armstrongs, and that's why the darkies were born"--the movie for me just stops dead. (Few people today are aware that Groucho was actually referencing a hit song from the time titled "That's Why the Darkies Were Born" that was supposedly satirizing racism, but it still sounds pretty offensive to modern ears.) And that's just dialogue. The stereotyping of nonwhite characters in films from this period is so awful that it leads to a sadly ironic result: these movies tend to be more watchable when they feature all-white casts.

3. Lack of freshness

It may not be fair, but movie ideas that were once highly original can come to seem banal if they get imitated enough. Hitchcock went to great lengths to keep the surprise ending to Psycho from getting leaked, but by today's standards it seems almost trite. (As Nicolas Cage declares in Adaptation, "The only idea more overused than serial killers is multiple personalities.") I recently saw Casablanca for the first time, but it feels like I've seen it my whole life. Watching it was like déjà vu, as memories of things from my childhood that referenced the film kept coming to the surface of my mind: an episode of Moonlighting, a scene from one of the Naked Gun films, parts of When Harry Met Sally, you name it. There's also something surpassing strange about hearing lines like "Here's looking at you, kid," "I think this is the beginning of a beautiful friendship," "I'm shocked, shocked to find that gambling is going on here," uttered in earnest. While I did enjoy the movie, it was certainly not the same experience moviegoers in the '40s had. It was like viewing some grand antique.

4. Differences in filmmaking style

This section will be much longer than the previous ones because it deals with a characteristic of movies from the '30s and '40s that has always been obvious to me but which, for reasons that escape me, I have rarely seen discussed: they look and sound a great deal like plays. Watching a movie from today doesn't usually feel like seeing a group of actors up on a stage; it's more like looking into a window at a real-life scene. I don't just mean that the sets look more convincing, but even more that when the actors talk, they tend to sound a lot more like real people having a conversation. To show what I'm talking about, let's examine two clips, one from the 1939 version of Of Mice and Men, the other from the same scene in the '92 version:

Did you notice what I noticed? Not only is the first movie clearly filmed on a stage while the second is filmed in actual woods (or at least provides the illusion of it); the differences in acting style are also striking. In the older movie, the actors deliver their lines loudly and in an almost sing-song manner; in the later film the actors speak practically under their breath, with minimal intonation. The acting in the first clip is more stylized, in the second more naturalistic. In short, the actors in the original film seem to be acting, whereas in the remake they're behaving. The second clip therefore has a more lifelike feel (despite the fact that it's the only one of the two to feature background music, a point to which I'll return shortly).

I am not cherry-picking here; this is something that has consistently stood out for me whenever I've watched movies from the '30s and '40s and compared them with later films. It is most noticeable in dramas, but it exists to varying degrees in all genres. And of course there are exceptions. Jimmy Stewart was always more naturalistic than Nicolas Cage has ever been, but in any of their films they are surrounded by actors whose style contrasts with theirs.

None of this should be surprising. When talkies were invented, movie actors naturally adopted the conventions of what had previously been the only dramatic art form involving speech. They imitated the way stage actors spoke because that was all they knew. Over time, as the technology improved and as film came more into its own as a respectable medium, the styles of stage and screen diverged and naturalism gradually became the norm on screen; I believe the transition was complete in American cinema by the early 1970s.

What do I think of the change? It depends. The two Of Mice and Men adaptations are similar overall, but as an admirer of Steinbeck's novel I always preferred the '92 version. I had an easier time connecting emotionally with characters who sounded like real people when they spoke. This standard of judging movies is surprisingly rare, from what I've seen; people just don't want to admit that the naturalism of modern film has advantages.

It also has disadvantages. Among other things, I believe it contributed to the disappearance of musicals in the 1970s. The musical is, after all, very much a genre of the stage, and to have today's movie characters burst into song can seem odd and inappropriate in a way that it never did in previous eras. Music is still important in today's movies, but most of it is in the background: the scores (one of the few non-naturalistic aspects of movies to have increased over time) and the video interludes (a form that gradually replaced musical numbers in the '60s and '70s, and which in my opinion is one of the most annoying features of modern cinema). If a modern movie character sings, usually there's a rationale within the story, such as if the character is a professional singer.

All of this has led to a looser definition of the word "musical," which nowadays often means simply "movie with lots of songs in it," even when there are no numbers. The Golden Globes, for example, have applied the term to films like Walk the Line which are only "musicals" by virtue of concert scenes, video interludes, and the like. When modern movies do feature traditional numbers, the effect is often curiously artificial.

A lot of people like to ignore this fact and pretend nothing's changed. There's not much acknowledgment that musicals didn't just happen to fall out of fashion (the way, say, westerns did), but that the whole underlying approach to filmmaking changed in a way that made the conventions of musicals seem out of place. In the olden days, making a film as a musical was such a normal and natural choice it could even be fairly peripheral to the film itself. For example, most of the Marx Bros. and Danny Kaye films, remembered primarily as comedies, happened also to be musicals. Today's movies don't have that freedom.

The resurgence of movie musicals following the success of Chicago happened in part, I think, because the 2002 film found a unique way to reconcile the conflicting conventions. In this film, a woman played by Renée Zellweger dreams of one day becoming a vaudeville star, and most of the song-and-dance numbers are presented as fantasy sequences where she imagines herself and other people performing on stage. As a result, the distinction in this movie between a musical number and a music video is blurred to the point of irrelevance. One IMDb commenter suggested that the film was "ashamed to be a musical," but I'm not sure it would have been as successful if it had simply ignored the problem.

Alas, many of the movie musicals since then have done just that. When I first saw Dreamgirls, I noticed it wasn't until about thirty minutes into the film that a character started singing on the street (as opposed to on a stage or in a studio). Up to that point, the movie had seemed like a low-key, serious drama, and I have to admit I found the sudden break in realism that late in the story rather jarring. I thought to myself, "Wait a second...this is a musical?!" I've had that sort of experience with at least a couple of today's musicals, but I've never had it with the musicals of old. They don't have anything to apologize for.


My point here isn't that modern movies are intrinsically "better" than older ones, or vice versa. I just think there needs to be more recognition of the effect that the evolving conventions have on different generations of moviegoers. For sure, younger people who consider older movies boring or incomprehensible are missing out on something. But people who celebrate the old stuff as some kind of gold standard that nothing today could match up to, and imply that anyone who disagrees is simply lacking in culture or taste, aren't exactly helping matters either. Speaking personally, as I continue to enrich my knowledge of films of the past, I've had the best experiences when I've understood the movies in the context of their time and been prepared to adjust as needed. Holding them in godlike esteem doesn't do the trick for me.

Thursday, February 16, 2012

America's liberalism and GOP propaganda

Recently Sen. Marco Rubio repeated a common right-wing talking point: "The majority of Americans are conservative." In response, PolitiFact noted that only a plurality, not a majority, of the public identifies with the term "conservative" in polls, and that slightly more Americans identify as Democrats than as Republicans, while the largest group, independents, are evenly divided in their partisan leanings.

Rachel Maddow took PolitiFact to task for rating Rubio's statement Mostly True in light of these facts, but I think she misses the point. She is right to attack the PolitiFact article, but she attacks it for the wrong reasons. If Rubio had said, "The majority of Americans think of themselves as conservative," PolitiFact's rating would have been appropriate. The majority/plurality distinction is often ignored in colloquial speech, and partisan identification doesn't always match ideology or even voting tendency. What makes Rubio's statement misleading is that it implies Americans tend to fit his definition of conservatism, when the evidence strongly suggests otherwise.

He explains why he thinks Americans are conservative with his ridiculous, incendiary remark, "They believe in things like the Constitution. I know that's weird to some people." Of course few Americans of any political stripe would say they don't believe in the Constitution, but Rubio isn't basing his judgment on what they claim about themselves. President Obama may be a former professor of constitutional law, but according to conservatives like Rubio, his liberal policies prove he doesn't "believe" in his area of specialty, no matter what he says in public.

In other words, Rubio's logic implies he doesn't necessarily take people's self-descriptions at face value. I doubt he would accept, for example, the claimed conservatism of Andrew Sullivan, a staunch Obama supporter. How do we know the 40% of Americans who call themselves "conservative" are the Rubio type of conservative, as opposed to the Sullivan type, or some other type entirely?

As a matter of fact, Americans tend to prefer the policies of the Democratic Party to those of the GOP. According to polls, Americans overwhelmingly favor an increase in the minimum wage, higher taxes on the rich, and leaving Social Security and Medicare alone. The Affordable Care Act remains unpopular, but the public option that was discarded from the bill polled well.

Public opinion on social issues such as abortion is somewhat cloudier, though the public becomes increasingly accepting of gay rights with each passing year. If there's one area of liberal thought that is continually unpopular, it is civil liberties. But Democrats know that, which is why they shy away from implementing such policies. That makes Rubio's statement about the Constitution ironic, because it seems to me that Americans as a whole don't have a great deal of respect for many of the rights outlined in the Constitution, yet that's the one area in which their views are more in line with those of Rubio's party!

Given all these facts, why do twice as many Americans identify by the term "conservative" as by the term "liberal"? Partly it's because over the past several decades conservatives have successfully turned "liberal" into a dirty word, so that even people who hold liberal policy positions are reluctant to embrace the term. (That's the main reason the American left started calling itself "progressive.") It's a total flip from the past. Traditionally the word "liberal" had positive connotations, evoking someone open-minded and forward-looking, while "conservative" was often a slightly negative word, suggesting joyless, old-fashioned squareness. (That was presumably the meaning Rush Limbaugh had in mind when he complained about a reporter who allegedly described his neckties as conservative.) The rise of a vigorous conservative movement in the 1970s at a time when liberalism was flailing helped to change these perceptions.

I'm not saying the public is, in fact, "liberal." For one thing, the American system is itself to the right of most modern, developed nations, so that even the Democratic Party would look conservative in other countries. "I have no more intention of dismantling the National Health Service than I have of dismantling Britain's defenses." Who said those words? It was that Bolshevik Maggie Thatcher.

Conservatives point to polls showing that Americans tend to say they favor "limited government." But when you examine their views more closely, you find they oppose just about anything that would actually lead to a smaller government. As a 2010 Washington Post poll found, "most Americans who would like to see a more limited government also call Medicare and Social Security 'very important' programs...[and] want the federal government to remain involved in education, poverty reduction and health care regulation."

The American appetite for shrinking the government in theory but not in practice is a big part of why it's so hard to combat the country's fiscal problems. Serious cuts will almost invariably cause pain to many voters. The only category of federal spending that a majority of Americans wants to see cut is foreign aid, and most Americans are unaware it constitutes a minuscule portion of the budget. It's comforting, I suppose, to pretend the source of our problems lies an ocean away.

However Americans may describe their political philosophy in the abstract, it's obvious from any serious examination of the polls that the economic policies of the Republican Party are, for the most part, highly unpopular. So why do Republicans continue to win office? For starters, elections tend to be driven by factors other than voter agreement with policy. According to political scientists, the biggest influence on national elections is the state of the economy. The 2010 Republican sweep demonstrated this to a tee: exit polls showed that 52% of the voters wanted to see the Bush tax cuts on the rich ended, and that although 48% wanted to see Obamacare repealed, 47% wanted to see it either left alone or expanded. Thus, the electorate that brought us this "Tea Party revolution" didn't clearly agree with two of the Tea Party's key policy priorities.

Of course Republicans don't have to wait till the economy goes south to gain power. They've developed a wealth of propaganda to make their views sound more palatable to the average voter. They'll point out that Democrats want to raise taxes, and once you mention that the proposed tax hikes will only fall on the richest of Americans, a mere 1% of the country, the Republicans then accuse the Democrats of "class warfare" and hostility to "job creators." All these rhetorical devices are attempts to give voters the impression that Republicans are defending the interests of the middle class, a delusion they can only maintain by not describing their policies in plain English.

It is when Republican politicians get to entitlement talk that their propaganda descends into complete incoherence. "Keep your government hands off my Medicare" was a line seen and heard at a few Tea Party rallies, and although these were isolated incidents, they reflected a message that's pervasive in a party that calls Obamacare a "government takeover of the health-care system" while at the same time attacking it for its cuts to America's actual government health-care system.

The contradiction is sometimes laughably transparent. In 2010, a Republican Congressional candidate ran an ad blasting the Democratic incumbent for the following two sins: "Government run health care. Medicare cuts." In 2011, Michele Bachmann warned that Obama would try to turn Medicare into Obamacare (which would actually mean privatizing it, but never mind). Then there's Romney's recent statement simultaneously assailing the president for failing to make entitlement cuts and, well, making entitlement cuts: "This week, President Obama will release a budget that won't take any meaningful steps toward solving our entitlement crisis.... The president has failed to offer a single serious idea to save Social Security and is the only president in modern history to cut Medicare benefits for seniors."

Republican messaging reached its ultimate absurdity after the passage of Paul Ryan's bill (which Romney has endorsed) that not only includes the same Medicare cuts that Romney and other Republicans have attacked Democrats for implementing, but plans eventually to eliminate Medicare in all but name--whatever PolitiFact may tell you to the contrary. Ryan's plan isn't popular, but that hasn't stopped Republicans from claiming to be saving Medicare while accusing Democrats of trying to destroy it.

These obfuscations are necessary because Republicans truly seek reductions in America's safety net but must contend with a public deeply opposed to that project, including one of their core constituencies, the elderly. They have no choice but to lie about their positions, because describing them truthfully would prevent the GOP from getting into power. As Jonathan Chait explained in his 2007 book The Big Con, published before the rise of Obama, the Tea Party, or the mythical death panels:
There is also a natural--and, in many ways, commendable--skepticism about one-sided accusations of dishonesty. Those who confine their accusations to one side are usually partisans best taken with a grain of salt. Lying and spinning have always been a part of politics, and it is the rare elected official who prevails by offering the voters an objective and unvarnished assessment of his plans. Moreover, since we tend to think of lying as an idiosyncratic personal trait, there's no reason to think that one side has more liars than the other any more than there's reason to think one side has more drunks or adulterers.

Yet, as will become clear, the fact remains that dishonesty has become integral to the Republican economic agenda in a way that it is not to the Democratic agenda. The reason is not that Republicans are individually less honest than Democrats. Far from it. It is simply that the GOP, and the conservative movement, have embraced an economic agenda far out of step with the majority of the voting public. Republicans simply can't win office or get their plans enacted into law, without fundamentally misleading the public. Lying has become a systematic necessity. (pp. 118-9)

Thursday, February 02, 2012

The film, not the holiday

From the first time I watched it when it hit the theaters back in '93, Groundhog Day has held a special place in my mind. It isn't that it's based on a clever idea (Bill Murray stuck in a time warp forced to relive one of America's silliest holidays over and over). Lots of movies have clever ideas. It isn't even that it executes this idea flawlessly from start to finish--which is almost unheard of for a high-concept comedy. It's that it does all that, and then manages to tell a smart, perceptive tale about the human condition.

It's one of those movies where you're not even sure the filmmakers understood how good it was, because it has the trappings of a more frivolous endeavor. It belongs to a tradition of farces I call Cursed Schlub movies, which revolve around a character who becomes victim to some otherworldly fate with its own bizarre rules. Examples include The Nutty Professor, All of Me, Liar Liar, and Shallow Hal. Typically they're not much more than comic exercises, with plots that function as clotheslines for a string of elaborate gags.

While just as funny as the best of those pictures, Groundhog Day transcends the genre, fleshing out the story and supplying character development that doesn't feel the least bit contrived. It was something of a transitional role for Murray, who up to that point had been known merely as a brilliant comedian. His performance, where he uses subtle facial mannerisms to great effect, paved the way for his more serious turns in movies like Rushmore and Lost in Translation.

Even its style of humor is uncommon for this type of film. It mostly avoids slapstick in favor of witty dialogue that showcases Murray's gift for understatement, as when he laconically remarks, "My years are not advancing as fast as you might think." Not only does his deadpanning make it easier for us to believe in the character of Phil, it enhances the laughs. (Part of the hilarity of the early scenes comes from seeing the mounting panic in a man who keeps his emotions so tightly bottled up.) I have my doubts that a broader approach, like one Jim Carrey might have given, would have worked as well on any level.

The film also wisely never reveals the cause of the time loop. It's customary in Cursed Schlub movies to invent some harebrained rationale for the central plot device (the birthday wish in Liar Liar, the hypnotic suggestion in Shallow Hal, the cartoon mysticism in countless body-swap comedies). An early version of the script did just that, explaining Phil's predicament as--I kid you not--a voodoo spell cast by a former lover. With no explanation offered, viewers tend to assume it is something more along the lines of a trial from God, and the movie acquires a certain poetic and even spiritual quality normally absent from this sort of material.

Its biggest divergence from other movies in the genre lies in its plot construction. Instead of the usual strategy of cobbling together a series of comic sketches and gluing on a formula ending, the plot develops as the step-by-step process by which Phil comes to terms with his strange condition. There's a surprisingly smooth progression to the story that never gets thrown off course by the jokes. In the entire movie only the Jeopardy scene feels like a skit that could have appeared just about anywhere in the proceedings (though its placement in the section where Phil becomes lethargic and depressed makes sense). The rest of the time the events fit together like clockwork, the end flowing naturally from the beginning, all of it focused on Phil's growth as a person. Along the way, it does a splendid job exploring a universal human trait.

Everyone on the planet, I'm convinced, sometimes imagines redoing past experiences. This can range from thinking up a snappy retort hours after an argument ended to harboring deep regrets over a life decision. Yet if you were somehow given the ability to go back in time and alter past events until you got them exactly as you wanted, you'd eventually go mad, because you'd be depriving yourself of the unpredictability that makes life worthwhile.

That's one of the movie's insights. I think of the sequence where Phil uses trial-and-error to determine Rita's likes and dislikes so that after she's forgotten divulging all that information he'll present himself to her as Mr. Right. From his perspective the project could be taking weeks or months or longer (the movie never says), but to her it's always their first date, and he is never able to seduce her on what to him is a day with no end. She always feels he is pushing too hard on their relationship, and even though she doesn't know the supernatural part, she senses he's manipulating the situation and concealing his true self from her.

In one crucial bit of dialogue after he thinks he has gotten through to her, she tells him, "It's a perfect day. You couldn't plan a day like this." He replies, "Well, you can. It just takes an awful lot of work"--once again making an honest observation secure in the knowledge she won't possibly take it literally. Yet one of their most romantic moments, where they fall on the snow together, is unplanned. We see that he is unable to repeat that moment in later iterations of Groundhog Day: his words get increasingly stilted, his movements increasingly clumsy. Even for a man given thousands of do-overs, the moment is gone forever, and only he will remember it. Our lives are filled with moments like that, but some of us lose sight of their significance when we're overwhelmed by the things we almost got right.

Thursday, January 05, 2012

Ivory tower crusaders

According to Ron Paul, "Libertarians are incapable of being racist, because racism is a collectivist idea, you see people in groups."

That remark reminds me of Pat Buchanan's response to charges of anti-Semitism: "I am as aware as any other Christian that our Savior was Jewish, His mother was Jewish, the Apostles were Jewish, the first martyrs were Jewish.... So no true Christian, in my judgment, can be an anti-Semite."

Not only do these statements both demonstrate the No True Scotsman fallacy, they raise some intriguing points about how the concept of prejudice is commonly misunderstood.

Let's start with the claim that a true Christian cannot be an anti-Semite. Somehow I doubt that assertion would much impress the Jewish victims of the Crusades, the Inquisition, and the numerous expulsions and pogroms and massacres committed in the name of Christ throughout the centuries. Presumably, Buchanan would respond that none of those attackers were "true" Christians. (I'm being charitable here, because I know there's a distinct possibility that he would defend the Crusades, as some on the right have.) It's a seductive argument because you can't possibly disprove it. Anytime a Christian assaults a Jew, you can either deny that person is a "true Christian" or deny that what that person did was anti-Semitic. It's one of those airtight defenses lawyers love.

It also shows a poor understanding of the historical roots of anti-Semitism. The simple fact is that most of the themes of modern-day anti-Semitism first emerged in a medieval Christian context. This happened not in spite of the fact that Christianity began as a rival Jewish sect, but in many ways because of it. Medieval Christians saw the continued existence of Judaism as an insult to their own faith which was supposed to have supplanted it. In theory this was a religious rather than racial prejudice, with the goal of converting Jews rather than killing them. And when it took on a racial character, as in 15th-century Spain, pointing out the Jewishness of the early Christians would probably not have swayed the persecutors.

Buchanan seems to be implicitly defining anti-Semitism as "the doctrine of hating all Jews who ever walked the face of the earth"--which is not how medieval Christians, even the Spanish, ever framed the issue--and then suggesting that this doctrine is logically incompatible with the theological claims of Christianity. And so it is--but only very mildly. The fact that his religion is founded upon worship of a long-deceased Jewish man does not automatically imply acceptance of the vast majority of Jews. History makes this all too clear. Centuries of persecution and bigotry can't be swept aside by one tiny, possible logical inconsistency.

That brings us to Ron Paul and his argument that libertarians can't be racist because racism is a form of collectivism, the opposite of libertarianism. If that's the case, then it's a funny coincidence how closely many of his policy views match those of the people he calls collectivist. As Stormfront founder Don Black said after endorsing his 2008 presidential bid, "We know that he's not a white nationalist...but on the issues, there's only one choice." What issues? Black mentions the Iraq War and immigration, but maybe there are a few other things Paul has said that might appeal to white nationalists--say, his long-standing opposition to the Civil Rights Act of 1964. He insists he takes this position not because he harbors any animosity toward blacks (or "the blacks," as he phrases it in the earlier clip) but merely because he values freedom. Consider the following statements:
When you invade and violate the Constitution, you attack the personal liberties of the citizens of California and Maine, as well as the liberties of the people of South Carolina and Virginia. You cannot create new rights for one group by taking them away from another.

I am deeply concerned over the efforts of opposing groups to smear our effort with the false trappings of race hatred. We are interested solely in protecting the rights of states to manage their own internal affairs, which is a fundamental guarantee of the Constitution.
Actually, those aren't the words of Ron Paul. They're the words of Strom Thurmond during his 1948 segregationist campaign. (The first paragraph is from The Washington Post, Oct. 12, 1948, the second from The Baltimore Sun, Jul. 20, 1948--both obtained from my library's archive.) But if you read what Paul has actually said on the subject, you'll find that the above quotes wouldn't sound at all out of place.

Of course, Thurmond also once said, "there's not enough troops in the army to force the Southern people to break down segregation and admit the nigger race into our theaters, into our swimming pools, into our homes, and into our churches." Admittedly, it's hard to imagine a remark like that escaping Paul's lips (though not so hard to imagine it appearing in a newsletter under his name). And Paul does talk favorably, as Thurmond would not have, about creating a "color-blind society."

But Paul's argument about collectivism is doubly flawed. First, it conflates a philosophy of government with a philosophy of human differences: a person can, with perfect consistency, believe that blacks and whites should be treated equally under the law while also believing whites will naturally come out on top. Second, it's exactly the sort of rationalization that white supremacists have used for centuries to justify keeping racist institutions alive. They too talked about states' rights; they too depicted civil-rights legislation as an assault on freedom; they too claimed their preferred policies would benefit blacks; and they too repudiated certain manifestations of bigotry. (Thurmond, for example, opposed the poll tax and distanced himself from the racist, anti-Semitic preacher Gerald L. K. Smith.) Even if Paul's motives are entirely honorable, rooted only in his fealty to federalist principles and not in prejudice, it doesn't change the fact that racism has a long history of coming cloaked in such principles.

Paul and Buchanan both think they can refute charges of bigotry simply by identifying themselves with a favored belief system and defining that belief system in logical opposition to the charges. Their use of this defense reveals a cartoonish understanding of bigotry, and the philosophical basis on which they reject that bigotry is hopelessly feeble. They are men living in ivory towers, too attached to the elegant simplicity of their logic to appreciate its real-world implications.