Saturday, September 28, 2013

In (partial) defense of nonliteral literally

Gene Weingarten has written a piece decrying the Oxford English Dictionary's recent decision to include in its definitions the use of literally for nonliteral expressions (as in "I literally died of laughter"). While I share some of Weingarten's distaste for this usage and find it to be a fun topic, I cannot agree with his complaint. The disagreement comes down to what you consider a dictionary's purpose. Weingarten apparently believes it is to serve as an authority on how people ought to speak and write. This school of thought, known as prescriptivism, once dominated lexicography. But over the past century most dictionaries moved toward descriptivism, the idea that their purpose is simply to describe the language as its speakers currently use it. According to this view, if enough people use a word in a certain way, it deserves inclusion in a dictionary. Weingarten thinks this is simply "rewarding vapidity."

Weingarten's harangue is typical of prescriptivists, who in my experience tend to be scarcely aware they're even advocating a philosophy, let alone one widely rejected by lexicographers and linguists. They present their criticisms of the way people speak and write as nothing more than commonsense conclusions that they remember better than others because they stayed awake during third-grade English. Rarely do prescriptivists question any of the traditional rules they were taught in school, many of which do not hold up to scrutiny. These rules are discussed at length by the linguist John McWhorter in his 2001 book Word on the Street, which presents a wealth of evidence that many of the so-called "rules," from the avoidance of split infinitives to the prohibition on using they with a singular antecedent (as in "everyone returned to their seat"), are rooted in the basically arbitrary decisions of a group of 18th- and 19th-century writers who often had a poor understanding of how English worked. But because these rules have been taught to generations of schoolchildren as ironclad truths, educated people have come to think of them as being on par with the laws of thermodynamics.

That's why no evidence from history or literature or any other field can possibly sway the fervent prescriptivist. Consider how Weingarten addresses the fact that many classic writers such as Jane Austen adopted the nonliteral literally on occasion: "That no more makes it right or acceptable than it makes it right for you to annihilate 100,000 people with a bomb just because Harry Truman once did it."

With this statement, Weingarten joins the honorable company of the critic John Simon, who wrote in 1980 that "The English language is being treated nowadays exactly as slave traders once handled the merchandise in their slave ships, or as the inmates of concentration camps were dealt with by their Nazi jailers." Most language scolds I've encountered aren't quite this colorful in their choice of analogies. A professor of mine made the point more simply when confronted by evidence that a usage he disapproved of appeared in the works of great writers: "It's still wrong."

The real problem with this argument is that it assumes a word's proper definition is some immutable law of nature, like gravity, that can never be shaped by the people who use the language, not even by the people who use it best. This view is positively blinkered. There's no reason why the English of Shakespeare is different from that of Chaucer, or from that of Weingarten, other than that human beings of every generation have spoken and written differently than their predecessors. And if there is one thing linguistic history absolutely makes clear, it is that today's error is tomorrow's rule. For example, nice once meant "foolish." It evolved to its present state because people kept using a "wrong" definition, but it's hard to see how English suffered as a result.

Of course, literally isn't just any evolving word. Its traditional definition is a useful concept to have a word for, and it would be a shame to see it become obsolete, which may happen if more and more people say things like "He literally puts his money where his mouth is." In that sense I'm with Weingarten that the looser definition should be avoided (though not excluded from dictionaries). What's striking is that he never makes this argument. His point is simply that it's the law, and we must obey. His indifference to judging word usages by their utility is revealed in his offhand comment, "although I may cringe at 'blogosphere' and 'webinar' and, sigh, 'whatevs,' I do not protest their appearance in dictionaries." Now, why would anyone cringe at a coinage like blogosphere? (Least of all a blogger?!) Only someone who believes that language should remain literally frozen in time, and that all change is bad, would find anything wrong with that type of innovation.

Weingarten doesn't even accurately explain the loose definition of literally. He claims it is being used to mean its opposite, figuratively. It is not. As the OED notes, it is being used as an intensifier. It's basically a synonym for really or actually, except that those words have been blunted by overuse, so when you want to express that you really, truly mean something, literally sometimes gets the point across with more force.

Hence, "the coach literally hates my guts" is meant to convey that you aren't exaggerating the coach's hatred. In a way this is a form of traditional literally; it's just being applied selectively, to the level of the coach's hatred rather than to the metaphor used to describe it. What this example shows is that a statement can have multiple layers of presumptive nonliteralism, and literally may be intended to unpack one layer while leaving the next alone.

My point here is not that I approve of the loose definition of literally, but that it isn't necessarily based on ignorance of the traditional definition. Rather, it's a reflection of the fact that our language is littered with dead metaphors that are all but invisible to us. (The mixed metaphor I just used is further evidence of that fact.) This helps explain why the traditional definition hasn't disappeared from the language, despite centuries of being disregarded. Annoying as it is, the loose sense has come to coexist alongside the traditional one instead of replacing it outright. Weingarten misses this point when he quotes Ambrose Bierce's supposedly accurate prediction that "within a few years the word 'literally' will mean 'figuratively.'" In fact most people today use literally in exactly the way it was originally intended. We just pay closer attention to the loose sense because of the way it literally sticks in our craw--suggesting the danger it poses to our ability to communicate may be overstated.

That's actually true of most gripes about language usage. Some are completely groundless (the most famous being the split-infinitive "rule"), while others, such as this one, at most point to bothersome trends that detract from our language's vitality. In neither case is any large-scale damage on the horizon. As McWhorter explains in his book:

What we must realize...is that during these changes, because renewal always complements erosion, all languages are eternally self-sustaining, just as while our present mountains are slowly eroding, new ones are gradually being thrown up by the movement of geological plates. Thus at any given time, a language is coherent and complex, suitable for the expression of all human needs, thoughts, and emotions. Just as linguists have encountered no languages that do not change, they have also not encountered any languages whose changes compromised their basic coherency and complexity. We have encountered no society hampered by a dialect that was slowly simply wearing out like an old car. Anthropologists report no society in which communication is impossible in the dark because the local dialect has become so mush-mouthed and senseless that it can only be spoken with help from hand gestures. In other words, there is no such thing as a language 'going to the dogs'--never in the history of the world has there existed a language that has reached, or even gotten anywhere near, said dogs.

Wednesday, September 11, 2013

Keep your Obamacare off my exchanges

After reading the recent news story about a man at the Kentucky State Fair who expressed interest in Kentucky's new health-care exchange, Kynect, by saying he hoped it beat Obamacare--apparently not realizing it was Obamacare--I decided to take a look at Kynect's website. What I found is that it seems to encourage exactly this sort of ignorance. Nowhere on the site is there a single mention of the words Obama, Obamacare, or the Affordable Care Act. The FAQ makes just one fleeting reference to federal law (regarding the requirement to purchase insurance) and then makes it sound as though the governor, Steve Beshear (a Democrat, for what it's worth), unilaterally chose to set up the exchange:
Why was [Kynect] created?

Governor Steve Beshear issued an executive order to create a state-based health benefit exchange to best meet the needs of Kentuckians. kynect, like other health benefit exchanges, will provide simple, one-stop shopping for individuals and small businesses to purchase health insurance and receive payment assistance or tax credits.

In contrast, the website for the exchange program in New York (where I live) says right upfront that it's a result of the ACA:
Under the federal Affordable Care Act, an Exchange will be operating in every state starting in 2014. States have the option to either set up an Exchange themselves or to allow the federal government to set up an Exchange in their state. New York has chosen to set up its own Exchange, called the New York Health Benefit Exchange. On April 12, 2012, Governor Cuomo issued Executive Order #42 to establish it within the NYS Department of Health.
This made me curious about whether there's some relationship between a state's political composition and how candid its exchange website is about its connection with the ACA. I did a little online research about the different state exchanges that have been set up (this webpage was particularly helpful), and my discovery was a bit anticlimactic: it turns out that almost all of the states that have set up exchanges were ones that voted for Obama in 2012. Kentucky, which Obama lost by 23 percentage points, is the one exception. Maybe not so surprisingly, it also has the only exchange website where the words "Affordable Care Act" are nowhere to be found (though in a few other states, such as Minnesota and New Mexico, mention of the law is buried deep within the website, and not, say, in a FAQ or "About Us" section). It will be interesting to watch how the law is sold in other red states, where ironically the exchanges will be mostly federally run due to the GOP's dogged unwillingness to cooperate with the law's implementation. Will the feds also adopt the principle that it's better to avoid disclosing the source of this cool new policy in the name of getting more people into the system?

Sunday, May 26, 2013

Why liberals became progressives--and why they'll stay that way

One of the most striking changes in political terminology to happen in my lifetime was the adoption of progressive as a substitute for liberal. What's weird about it is that most of the time people talk as though they've always used the word progressive this way, yet I can't remember hearing it until the 2000s. (Checking the archives of Google News and Google Books seems to confirm my suspicions.) When the topic comes up, the most common explanation (one I've offered myself) is that it was an attempt to escape the negative connotations of the word liberal, which had suffered from decades of abuse by conservative commentators. But that raises some questions. Since the negative use of liberal goes back at least to the 1970s, what took progressives so long to come up with their new name? Furthermore, why didn't they stick with liberal in a spirit of defiance against those who treat it as a dirty word? Doesn't abandoning it suggest that there really is something wrong with being a liberal, and that so-called progressives are simply doing a linguistic makeover to hide their flaws?

The answer to these questions lies partly in recent political history and partly in the difficulty of consciously changing a language. For several decades liberals did in fact try to wear the word liberal proudly, in spite of those who used it disparagingly. Progressive already existed in political parlance, but it had a broader, vaguer meaning than it does today and didn't necessarily imply an affinity for the left. In the 1980s, for example, the centrist Democratic Leadership Council called its think tank the Progressive Policy Institute. My guess is that the DLC aimed to evoke something along the lines of Teddy Roosevelt's bipartisan, reform-oriented "progressivism."

The degradation of the word liberal was gradual and, contrary to the oft-heard claim, not entirely due to the right's efforts. I think the process began in the late 1960s in reaction to the disillusionment and shattered dreams of the left. Around that time the term was undergoing a shift in meaning similar to what happened to a word like pious, where a formerly positive adjective comes to be used as a sneering description of those who fall short of the ideals they preach. Look, for instance, at how Roger Ebert used it in his 1972 review of Sounder, a movie he defended against charges of liberalism:

It is, I suppose, a "liberal" film, and that has come to be a bad word in these times when liberalism is supposed to stand for compromise--for good intentions but no action. This movie stands for a lot more than that, and we live in such illiberal times that Sounder comes as a reminder of former dreams.
By the 1970s, liberal was starting to be treated less like a political orientation than like a character type, describing an overzealous do-gooder who may even be a hypocrite and patronizing snob--someone much like the character of Meathead from All in the Family. When the right began using the word pejoratively, they were in part seizing on that stereotype. Of course there is a difference between the trait of "good intentions but no action" and the right's more malevolent view of liberals. But the image of the excessive do-gooder--and above all the connotation of weakness--prevailed.

For a long time, Democratic politicians were unsure whether to embrace the liberal label or run away from it. In 1988 Dukakis resisted it before finally admitting, late in his campaign, that "I'm a liberal in the tradition of Franklin Roosevelt and Harry Truman and John Kennedy." This comment was practically an apology, seeming to imply that liberalism had fallen from its lofty position in the ensuing decades. It was as if he was assuring the public, "I'm a liberal, but one of the good ones."

Indeed, when it came to presidential elections in the post-Vietnam era, it often seemed that the Democrats' victories rested on how successfully their candidates escaped the liberal label. This perception was probably delusional (Mondale and Dukakis were running against a popular administration, whereas Carter and Clinton were running against unpopular ones, and so their ideological character was probably not the determining factor in the outcome of those races), but it was a lesson the Democratic establishment took to heart.

The moderate, Third Way politics of the Clinton years disappointed many liberals at the time, but this was overshadowed somewhat by their disgust at the GOP's scandal-mongering against the president. By the end of the decade, when Clinton enjoyed sky-high approval ratings while the GOP ended up defeated and humiliated in its attempts to bring him down, there was a triumphant feeling among Democrats which, I believe, made many of them willing to forget (if not forgive) his policy betrayals.

This truce ended with the U.S. invasion of Iraq, an event that drove a wedge between the Democratic establishment and the left unlike anything seen in over a generation. As the left's antiwar position, dismissed at first as radical, eventually became the consensus not just within the Democratic Party but in the country as a whole, it damaged the establishment's credibility and made the left's early criticisms of the invasion seem prescient. I personally believe (but have rarely seen it expressed) that this factor was a large part of the reason for the DLC's demise. And of course it led to the rise of Obama, whose early opposition to the war may have been singlehandedly responsible for his narrow defeat of Hillary Clinton in the primaries. Despite GOP talking points about how he was the "most liberal Senator," the L-word commanded surprisingly little attention in the 2008 election, when compared with past races. Obama did, however, eagerly identify as "progressive," the first modern Democratic nominee to do so.

This new use of progressive arose during the boom in Internet political culture that came to be called the "netroots," dominated by activists who now had the tools to make their voices heard in a way that wasn't possible in earlier times. That was the main setting from which today's progressive movement emerged. Though they rarely explained why they preferred the term progressive, I believe there were two primary reasons: they associated liberal with the compromise and moderation of the hated establishment, and they wanted to free themselves from the influence of the conservative frames they felt had governed mainstream political discourse for too long. Creating a new word for themselves (or, rather, refashioning an old, nearly forgotten one) was a way of achieving that goal.

Naturally, the new progressives tended to be fairly young--people in their twenties when the millennium rolled around (basically my generation). Older figures who have come to be associated with the movement have had to adapt their language to the times. When I searched Paul Krugman's columns and books for the word progressive, all I found were some references to progressive taxation--until his 2007 book The Conscience of a Liberal, where he explains the difference between liberals and progressives:

The real distinction between the terms, at least as I and many others use them, is between philosophy and action. Liberals are those who believe in institutions that limit inequality and injustice. Progressives are those who participate, explicitly or implicitly, in a political coalition that defends and tries to enlarge those institutions. You're a liberal, whether you know it or not, if you believe that the United States should have universal health care. You're a progressive if you participate in the effort to bring universal health care into being. (p. 268)
Although Krugman isn't defining the two terms as mutually exclusive, there is an echo of Ebert's association of liberalism with "good intentions but no action." Progressives, Krugman maintains, are liberals who put their beliefs into action. While that's an inspiring thought, I'm not sure it fits the way most people use these words. I assume Krugman bases his definition on the activist roots of the progressive movement, but by now (at least in my experience) there are plenty of self-identifying progressives not actively involved in the fight for liberal causes.

The linguist Geoffrey Nunberg rounds up various pundit theories on the progressive/liberal distinction before observing, "none of them has much to do with how the labels are actually used." One problem I have with most of these theories is that they treat the categories as fixed and static. In reality, these words have had greatly varied meanings over time, and even within the same era have meant different things to different people. The fact that TR referred to himself as a Progressive while FDR considered himself a liberal doesn't shed much light on the differences between Clinton and Obama. With these caveats in mind, Nunberg offers his thoughts on what the progressive label is intended to signal today:

Far more than liberals, progressives see themselves in the line of the historical left. Not that America has much of a left to speak of anymore, at least by the standards of the leftists of the Vietnam era, who were a lot less eager than most modern-day progressives to identify themselves with the Democratic Party. But if modern progressives haven't inherited the radicalism or ferocity of the movement left of the 60's, they're doing what they can to keep its tone and attitude alive.
I tend to agree. I just wonder how long this situation will last. As the new progressives grow older and the word progressive becomes more ingrained, its anti-establishment overtones may well fade. Eventually it may come to be a simple descriptor of the average left-leaning Democrat, occupying more or less the same place that liberal used to--before it was turned into an epithet.

Perhaps sensing this possibility, some conservatives in recent years have been trying to do to progressive what they once did to liberal. Glenn Beck attempted something of the sort in his 2010 speech to CPAC, where he linked today's progressives with the alleged evils of the early-20th-century Progressive Movement. I doubt this strategy will work. These conservatives have grown too insulated from the mainstream to reach beyond their narrow audience (somehow I don't think most Americans would share Beck's outrage at TR's support for universal health care or Woodrow Wilson's creation of the Federal Reserve), and in any case the word progressive just doesn't carry the negative connotations that helped the right tarnish liberal. Whether conservatives or older liberals like it or not, progressive as a respectable self-description is here to stay.

Tuesday, July 17, 2012

The challenge of old movies

Want to know how certifiable a movie fanatic I am? I actually keep an Excel spreadsheet noting every movie I see and the date on which I first see it. About a year ago, realizing I'd been keeping this list for literally half my life (since 1994, just before my 17th birthday), I attempted to identify the movies I'd seen during the first half. I got some help from Wikipedia, which has articles listing the films released every year (most of the major ones, at any rate). One thing I've determined is that I've seen well over a thousand films in my life--but perhaps three-fourths of them have been ones made within my lifetime, starting in the late 1970s.

This was a bit of a surprise to me, since I remember watching lots of old movies as a kid. But when I think about it, there are indeed an astonishing number of classics I still have not seen. And when I do get around to seeing them, the experience isn't always as satisfying as it's supposed to be. Part of the problem is a feeling of being intimidated by a movie's reputation. It's tricky trying to sit back and enjoy a Great Movie when I'm conscious of how I'm supposed to be feeling the weight of its Greatness at every moment. This is a big reason why I still have never watched Citizen Kane, and why I had the DVD of Lawrence of Arabia for a long time before I gathered up the courage to stick it in the drive.

The genre I find easiest to appreciate regardless of period is comedy. I was raised on classic comedy--the Marx Bros., Laurel & Hardy, Chaplin (who remains one of my favorite filmmakers to this day), Danny Kaye, the Three Stooges. I believe comedy is essentially timeless as long as it avoids topical humor, as these old movies generally did. Good 21st-century comedies like The 40-Year-Old Virgin or Borat may be more profane than their predecessors, but the underlying principles of humor haven't changed. I can't say the same for dramas, westerns, romances, or horror films.

There are four basic challenges to becoming engaged in older movies. One I am not dealing with here is advancement in special effects and other technical matters. Everyone agrees movies have improved over time on that score. Instead, I wish to focus on those areas that pose significant and non-superficial barriers between modern viewers and even the best films of the past:

1. Changes in decency standards

While I find many of today's movies overly coarse, those made at the height of the Hays Code had the opposite problem. They couldn't talk about sex in anything approaching a candid manner and were forced to employ ridiculous euphemisms, which can be hard for a modern viewer to adjust to. When I watched His Girl Friday, I had already seen the 1974 version of The Front Page with Jack Lemmon and Walter Matthau, based on the same play. Being a big Matthau/Lemmon fan, I loved the '74 version, and while I enjoyed the older movie as well, I was conscious of its limited ability to depict certain plot points. This actually led to a couple of good lines, as when Hildy reports that a character got shot in the "classified ads." At other times I felt the movie suffered from the constraints, as when it presented Mollie Malloy as an old maid rather than (as in the original play) a prostitute. And I'm sorry, but I just can't have as much affection for a film that omits the play's hilarious closing line, "The son of a bitch stole my watch!"

Some old movies have an innocence that looks laughable today. I saw the Oscar-winning 1938 film Boys Town when Newt Gingrich hosted a showing of it on TNT in 1994. Gingrich felt people should watch this movie to learn how to help today's troubled youth. The movie tells the story of Father Flanagan (Spencer Tracy) and his heroic efforts running an orphanage. His motto is that there's "no such thing as a bad boy." And indeed, most of the boys we see in the orphanage behave like perfect angels, except for one played by Mickey Rooney as a juvenile delinquent so terrible he actually smokes, plays cards, and acts sassy toward the grownups. (As a neighbor of mine at the time put it, "Sounds like the typical yeshiva bochur.") Despite these immeasurable crimes, Father Flanagan somehow manages to get through to him in the end and make him into a good kid, a message of great relevance for today's crack babies.

2. Changes in moral sensibilities

This category covers a lot of ground, but it's most notable with attitudes about race. Movies from the '30s and '40s are often shockingly racist, and when they are, I'm thrown right out of the picture. Duck Soup is one of my favorite comedies, but when Groucho utters the line--"My father was a little headstrong, my mother was a little armstrong. The Headstrongs married the Armstrongs, and that's why the darkies were born"--the movie for me just stops dead. (Few people today are aware that Groucho was referencing a hit song of the time titled "That's Why the Darkies Were Born," which was supposedly satirizing racism, but it still sounds pretty offensive to modern ears.) And that's just dialogue. The stereotyping of nonwhite characters in movies of this era is so awful that it leads to a sad irony: films from this period tend to be more watchable when they feature all-white casts.

3. Lack of freshness

It may not be fair, but movie ideas that were once highly original can come to seem banal if they get imitated enough. Hitchcock went to great lengths to keep the surprise ending of Psycho from getting leaked, but by today's standards it seems almost trite. (As Nicolas Cage declares in Adaptation, "The only idea more overused than serial killers is multiple personalities.") I recently saw Casablanca for the first time, but it feels like I've seen it my whole life. Watching it was like deja vu, as memories surfaced of everything from my childhood that had referenced the film: an episode of Moonlighting, a scene from one of the Naked Gun films, parts of When Harry Met Sally, you name it. There's also something surpassing strange about hearing lines like "Here's looking at you, kid," "I think this is the beginning of a beautiful friendship," and "I'm shocked, shocked to find that gambling is going on in here" uttered in earnest. While I did enjoy the movie, it was certainly not the same experience moviegoers in the '40s had. It was like viewing some grand antique.

4. Differences in filmmaking style

This section will be much longer than the previous ones because it deals with a characteristic of movies from the '30s and '40s that has always been obvious to me but which, for reasons that escape me, I have rarely seen discussed: they look and sound a great deal like plays. Watching a movie from today doesn't usually feel like seeing a group of actors up on a stage; it's more like looking into a window at a real-life scene. I don't just mean that the sets look more convincing, but even more that when the actors talk, they tend to sound a lot more like real people having a conversation. To show what I'm talking about, let's examine two clips, one from the 1939 version of Of Mice and Men, the other from the same scene in the '92 version:

Did you notice what I noticed? Not only is the first movie clearly filmed on a stage while the second is filmed in actual woods (or at least provides the illusion of it), but the differences in acting style are striking. In the older movie, the actors deliver their lines loudly and in an almost sing-song manner; in the later film the actors speak practically under their breath, with minimal intonation. The acting in the first clip is more stylized, in the second more naturalistic. In short, the actors in the original film seem to be acting, whereas in the remake they're behaving. The second clip therefore has a more lifelike feel (despite the fact that it's the only one of the two to feature background music, a point to which I'll return shortly).

I am not cherry-picking here; this is something that has consistently stood out for me whenever I've watched movies from the '30s and '40s and compared them with later films. It is most noticeable in dramas, but it exists to varying degrees in all genres. And of course there are exceptions. Jimmy Stewart was always more naturalistic than Nicolas Cage has ever been, but in any of their films they are surrounded by actors whose style contrasts with theirs.

None of this should be surprising. When talkies were invented, movie actors naturally adopted the conventions of what had previously been the only dramatic art form involving speech. They imitated the way stage actors spoke because that was all they knew. Over time, as the technology improved and as film came more into its own as a respectable medium, the styles of stage and screen diverged and naturalism gradually became the norm on screen; I believe the transition was complete in American cinema by the early 1970s.

What do I think of the change? It depends. The two Of Mice and Men adaptations are similar overall, but as an admirer of Steinbeck's novel I always preferred the '92 version. I had an easier time connecting emotionally with characters who sounded like real people when they spoke. This standard of judging movies is surprisingly rare, from what I've seen; people just don't want to admit that the naturalism of modern film has advantages.

It also has disadvantages. Among other things, I believe it contributed to the disappearance of musicals in the 1970s. The musical is, after all, very much a genre of the stage, and to have today's movie characters burst into song can seem odd and inappropriate in a way that it never did in previous eras. Music is still important in today's movies, but most of it is in the background: the scores (one of the few non-naturalistic aspects of movies to have increased over time) and the video interludes (a form that gradually replaced musical numbers in the '60s and '70s, and which in my opinion is one of the most annoying features of modern cinema). If a modern movie character sings, usually there's a rationale within the story, such as if the character is a professional singer.

All of this has led to a looser definition of the word "musical," which nowadays often means simply "movie with lots of songs in it," even when there are no numbers. The Golden Globes, for example, have applied the term to films like Walk the Line which are only "musicals" by virtue of concert scenes, video interludes, and the like. When modern movies do feature traditional numbers, the effect is often curiously artificial.

A lot of people like to ignore this fact and pretend nothing's changed. There's not much acknowledgment that musicals didn't just happen to fall out of fashion (the way, say, westerns did), but that the whole underlying approach to filmmaking changed in a way that made the conventions of musicals seem out of place. In the olden days, making a film as a musical was such a normal and natural choice it could even be fairly peripheral to the film itself. For example, most of the Marx Bros. and Danny Kaye films, remembered primarily as comedies, happened also to be musicals. Today's movies don't have that freedom.

The resurgence of movie musicals following the success of Chicago happened in part, I think, because the 2002 film found a unique way to reconcile the conflicting conventions. In this film, a woman played by Renée Zellweger dreams of one day becoming a vaudeville star, and most of the song-and-dance numbers are presented as fantasy sequences in which she imagines herself and other people performing on stage. As a result, the distinction in this movie between a musical number and a music video is blurred to the point of irrelevance. One IMDb commenter suggested that the film was "ashamed to be a musical," but I'm not sure it would have been as successful if it had simply ignored the problem.

Alas, many of the movie musicals since then have done just that. When I first saw Dreamgirls, I noticed that it wasn't until about thirty minutes into the film that a character sang on the street (as opposed to on a stage or in a studio). Up to that point, the movie had seemed like a low-key, serious drama, and I have to admit I found the sudden break in realism that late in the story rather jarring. I thought to myself, "Wait a second...this is a musical?!" I've had that sort of experience with at least a couple of today's musicals, but I've never had it with the musicals of old. They don't have anything to apologize for.

Conclusion

My point here isn't that modern movies are intrinsically "better" than older ones, or vice versa. I just think there needs to be more recognition of the effect that the evolving conventions have on different generations of moviegoers. For sure, younger people who consider older movies boring or incomprehensible are missing out on something. But people who celebrate the old stuff as some kind of gold standard that nothing today could match up to, and imply that anyone who disagrees is simply lacking in culture or taste, aren't exactly helping matters either. Speaking personally, as I continue to enrich my knowledge of films of the past, I've had the best experiences when I've understood the movies in the context of their time and been prepared to adjust as needed. Holding them in godlike esteem doesn't do the trick for me.

Thursday, February 16, 2012

America's liberalism and GOP propaganda

Recently Sen. Marco Rubio repeated a common right-wing talking point: "The majority of Americans are conservative." In response, Politifact noted that only a plurality, not a majority, of the public identifies with the term "conservative" in polls, and that slightly more Americans identify as Democrats than as Republicans, while the largest group, independents, are evenly divided in their partisan leanings.

Rachel Maddow took Politifact to task for rating Rubio's statement Mostly True in light of these facts, but I think she misses the point. She is right to attack the Politifact article, but she attacks it for the wrong reasons. If Rubio had said, "The majority of Americans think of themselves as conservative," Politifact's rating would have been appropriate. The majority/plurality distinction is often ignored in colloquial speech, and partisan identification doesn't always match ideology or even voting tendency. What makes Rubio's statement misleading is that it implies Americans tend to fit his definition of conservatism, when the evidence strongly suggests otherwise.

He explains why he thinks Americans are conservative with his ridiculous, incendiary remark, "They believe in things like the Constitution. I know that's weird to some people." Of course few Americans of any political stripe would say they don't believe in the Constitution, but Rubio isn't basing his judgment on what they claim about themselves. President Obama may be a former professor of constitutional law, but according to conservatives like Rubio, his liberal policies prove he doesn't "believe" in his area of specialty, no matter what he says in public.

In other words, Rubio's logic implies he doesn't necessarily take people's self-descriptions at face value. I doubt he would accept, for example, the claimed conservatism of Andrew Sullivan, a staunch Obama supporter. How do we know the 40% of Americans who call themselves "conservative" are the Rubio type of conservative, as opposed to the Sullivan type, or some other type entirely?

As a matter of fact, Americans tend to prefer the policies of the Democratic Party to those of the GOP. According to polls, Americans overwhelmingly favor an increase in the minimum wage, higher taxes on the rich, and leaving Social Security and Medicare alone. The Affordable Care Act remains unpopular, but the public option that was discarded from the bill polled well.

Public opinion on social issues such as abortion is somewhat cloudier, though the public becomes increasingly accepting of gay rights with each passing year. If there's one area of liberal thought that is continually unpopular, it is civil liberties. But Democrats know that, which is why they shy away from implementing such policies. That makes Rubio's statement about the Constitution ironic, because it seems to me that Americans as a whole don't have a great deal of respect for many of the rights outlined in the Constitution, yet that's the one area in which their views are more in line with those of Rubio's party!

Given all these facts, why do twice as many Americans identify with the term "conservative" as with the term "liberal"? Partly it's because over the past several decades conservatives have successfully turned "liberal" into a dirty word, so that even people who hold liberal policy positions are reluctant to embrace the term. (That's the main reason the American left started calling itself "progressive.") It's a total flip from the past. Traditionally the word "liberal" had positive connotations, evoking someone open-minded and forward-looking, while "conservative" was often a slightly negative word, suggesting joyless, old-fashioned squareness. (That was presumably the meaning Rush Limbaugh had in mind when he complained about a reporter who allegedly described his neckties as conservative.) The rise of a vigorous conservative movement in the 1970s at a time when liberalism was flailing helped to change these perceptions.

I'm not saying the public is, in fact, "liberal." For one thing, the American system is itself to the right of most modern, developed nations, so that even the Democratic Party would look conservative in other countries. "I have no more intention of dismantling the National Health Service than I have of dismantling Britain's defenses." Who said those words? It was that bolshevik Maggie Thatcher.

Conservatives point to polls showing that Americans tend to say they favor "limited government." But when you examine their views more closely, you find they oppose just about anything that would actually lead to a smaller government. As a 2010 Washington Post poll found, "most Americans who would like to see a more limited government also call Medicare and Social Security 'very important' programs...[and] want the federal government to remain involved in education, poverty reduction and health care regulation."

The American appetite for shrinking the government in theory but not in practice is a big part of why it's so hard to combat the country's fiscal problems. Serious cuts will almost invariably cause pain to many voters. The only category of federal spending that a majority of Americans wants to see cut is foreign aid, and most Americans are unaware it constitutes a minuscule portion of the budget. It's comforting, I suppose, to pretend the source of our problems lies an ocean away.

However Americans may describe their political philosophy in the abstract, it's obvious from any serious examination of the polls that the economic policies of the Republican Party are, for the most part, highly unpopular. So why do Republicans continue to win office? For starters, elections tend to be driven by factors other than voter agreement with policy. According to political scientists, the biggest influence on national elections is the state of the economy. The 2010 Republican sweep demonstrated this to a tee: exit polls showed that 52% of the voters wanted to see the Bush tax cuts on the rich ended, and that although 48% wanted to see Obamacare repealed, 47% wanted to see it either left alone or expanded. Thus, the electorate that brought us this "Tea Party revolution" didn't clearly agree with two of the Tea Party's key policy priorities.

Of course Republicans don't have to wait till the economy goes south to gain power. They've developed a wealth of propaganda to make their views sound more palatable to the average voter. They'll point out that Democrats want to raise taxes, and once you mention that the proposed tax hikes will only fall on the richest of Americans, a mere 1% of the country, the Republicans then accuse the Democrats of "class warfare" and hostility to "job creators." All these rhetorical devices are attempts to give voters the impression that Republicans are defending the interests of the middle class, a delusion they can only maintain by not describing their policies in plain English.

It is when Republican politicians turn to entitlements that their propaganda descends into complete incoherence. "Keep your government hands off my Medicare" was a line seen and heard at a few Tea Party rallies, and although these were isolated incidents, they reflected a message that's pervasive in a party that calls Obamacare a "government takeover of the health-care system" while at the same time attacking it for its cuts to America's actual government health-care system.

The contradiction is sometimes laughably transparent. In 2010, a Republican Congressional candidate ran an ad blasting the Democratic incumbent for the following two sins: "Government run health care. Medicare cuts." In 2011, Michele Bachmann warned that Obama would try to turn Medicare into Obamacare (which would actually mean privatizing it, but never mind). Then there's Romney's recent statement simultaneously assailing the president for failing to make entitlement cuts and, well, making entitlement cuts: "This week, President Obama will release a budget that won't take any meaningful steps toward solving our entitlement crisis.... The president has failed to offer a single serious idea to save Social Security and is the only president in modern history to cut Medicare benefits for seniors."

Republican messaging reached its ultimate absurdity after the passage of Paul Ryan's bill (which Romney has endorsed) that not only includes the same Medicare cuts that Romney and other Republicans have attacked Democrats for implementing, but plans eventually to eliminate Medicare in all but name--whatever Politifact may tell you to the contrary. Ryan's plan isn't popular, but that hasn't stopped Republicans from claiming to be saving Medicare while accusing Democrats of trying to destroy it.

These obfuscations are necessary because Republicans truly seek reductions in America's safety net but must contend with a public deeply opposed to that project, including one of their core constituencies, the elderly. They have no choice but to lie about their positions, because describing them truthfully would prevent the GOP from getting into power. As Jonathan Chait explained in his 2007 book The Big Con, published before the rise of Obama, the Tea Party, or the mythical death panels:
There is also a natural--and, in many ways, commendable--skepticism about one-sided accusations of dishonesty. Those who confine their accusations to one side are usually partisans best taken with a grain of salt. Lying and spinning have always been a part of politics, and it is the rare elected official who prevails by offering the voters an objective and unvarnished assessment of his plans. Moreover, since we tend to think of lying as an idiosyncratic personal trait, there's no reason to think that one side has more liars than the other any more than there's reason to think one side has more drunks or adulterers.

Yet, as will become clear, the fact remains that dishonesty has become integral to the Republican economic agenda in a way that it is not to the Democratic agenda. The reason is not that Republicans are individually less honest than Democrats. Far from it. It is simply that the GOP, and the conservative movement, have embraced an economic agenda far out of step with the majority of the voting public. Republicans simply can't win office or get their plans enacted into law, without fundamentally misleading the public. Lying has become a systematic necessity. (pp. 118-9)

Thursday, February 02, 2012

The film, not the holiday

From the first time I watched it when it hit the theaters back in '93, Groundhog Day has held a special place in my mind. It isn't that it's based on a clever idea (Bill Murray stuck in a time warp forced to relive one of America's silliest holidays over and over). Lots of movies have clever ideas. It isn't even that it executes this idea flawlessly from start to finish--which is almost unheard of for a high-concept comedy. It's that it does all that, and then manages to tell a smart, perceptive tale about the human condition.

It's one of those movies where you're not even sure the filmmakers understood how good it was, because it has the trappings of a more frivolous endeavor. It belongs to a tradition of farces I call Cursed Schlub movies, which revolve around a character who becomes victim to some otherworldly fate with its own bizarre rules. Examples include The Nutty Professor, All of Me, Liar Liar, and Shallow Hal. Typically they're not much more than comic exercises, with plots that function as clotheslines for a string of elaborate gags.

While just as funny as the best of those pictures, Groundhog Day transcends the genre, fleshing out the story and supplying character development that doesn't feel the least bit contrived. It was something of a transitional role for Murray, who up to that point had been known merely as a brilliant comedian. His performance, in which he uses subtle facial mannerisms to great effect, paved the way for his more serious turns in movies like Rushmore and Lost in Translation.

Even its style of humor is uncommon for this type of film. It mostly avoids slapstick in favor of witty dialogue that showcases Murray's gift for understatement, as when he laconically remarks, "My years are not advancing as fast as you might think." Not only does his deadpanning make it easier for us to believe in the character of Phil, it enhances the laughs. (Part of the hilarity of the early scenes comes from watching the mounting panic of a man who keeps his emotions so tightly bottled up.) I have my doubts that a broader approach, like one Jim Carrey might have given, would have worked as well on any level.

The film also wisely never reveals the cause of the time loop. It's customary in Cursed Schlub movies to invent some harebrained rationale for the central plot device (the birthday wish in Liar Liar, the hypnotic suggestion in Shallow Hal, the cartoon mysticism in countless body-swap comedies). An early version of the script did just that, explaining Phil's predicament as--I kid you not--a voodoo spell cast by a former lover. By leaving the cause unexplained, the finished film lets viewers assume it is something more along the lines of a trial from God, and the movie acquires a certain poetic and even spiritual quality normally absent from this sort of material.

Its biggest divergence from other movies in the genre lies in its plot construction. Instead of the usual strategy of cobbling together a series of comic sketches and gluing on a formula ending, the plot develops as the step-by-step process by which Phil comes to terms with his strange condition. There's a surprisingly smooth progression to the story that never gets thrown off course by the jokes. In the entire movie only the Jeopardy scene feels like a skit that could have appeared just about anywhere in the proceedings (though its placement in the section where Phil becomes lethargic and depressed makes sense). The rest of the time the events fit together like clockwork, the end flowing naturally from the beginning, all of it focused on Phil's growth as a person. Along the way, it does a splendid job exploring a universal human trait.

Everyone on the planet, I'm convinced, sometimes imagines redoing past experiences. This can range from thinking up a snappy retort hours after an argument ended to harboring deep regrets over a life decision. Yet if you were somehow given the ability to go back in time and alter past events until you got them exactly as you wanted, you'd eventually go mad, because you'd be depriving yourself of the unpredictability that makes life worthwhile.

That's one of the movie's insights. I think of the sequence where Phil uses trial-and-error to determine Rita's likes and dislikes so that after she's forgotten divulging all that information he'll present himself to her as Mr. Right. From his perspective the project could be taking weeks or months or longer (the movie never says), but to her it's always their first date, and he is never able to seduce her on what to him is a day with no end. She always feels he is pushing too hard on their relationship, and even though she doesn't know the supernatural part, she senses he's manipulating the situation and concealing his true self from her.

In one crucial bit of dialogue after he thinks he has finally gotten through to her, she tells him, "It's a perfect day. You couldn't plan a day like this." He replies, "Well, you can. It just takes an awful lot of work"--once again making an honest observation secure in the knowledge she won't possibly take it literally. Yet one of their most romantic moments, where they fall on the snow together, is unplanned. We see that he is unable to repeat that moment in later iterations of Groundhog Day: his words get increasingly stilted, his movements increasingly clumsy. Even for a man given thousands of do-overs, the moment is gone forever, and only he will remember it. Our lives are filled with moments like that, but some of us lose sight of their significance when we're overwhelmed by the things we almost got right.

Thursday, January 05, 2012

Ivory tower crusaders

According to Ron Paul, "Libertarians are incapable of being racist, because racism is a collectivist idea, you see people in groups."

That remark reminds me of Pat Buchanan's response to charges of anti-Semitism: "I am as aware as any other Christian that our Savior was Jewish, His mother was Jewish, the Apostles were Jewish, the first martyrs were Jewish.... So no true Christian, in my judgment, can be an anti-Semite."

Not only do these statements both demonstrate the No True Scotsman fallacy, they raise some intriguing points about how the concept of prejudice is commonly misunderstood.

Let's start with the claim that a true Christian cannot be an anti-Semite. Somehow I doubt that assertion would much impress the Jewish victims of the Crusades, the Inquisition, and the numerous expulsions and pogroms and massacres committed in the name of Christ throughout the centuries. Presumably, Buchanan would respond that none of those attackers were "true" Christians. (I'm being charitable here, because I know there's a distinct possibility that he would defend the Crusades, as some on the right have.) It's a seductive argument because you can't possibly disprove it. Anytime a Christian assaults a Jew, you can either deny that person is a "true Christian" or deny that what that person did was anti-Semitic. It's one of those airtight defenses lawyers love.

It also shows a poor understanding of the historical roots of anti-Semitism. The simple fact is that most of the themes of modern-day anti-Semitism first emerged in a medieval Christian context. This happened not in spite of the fact that Christianity began as a rival Jewish sect, but in many ways because of it. Medieval Christians saw the continued existence of Judaism as an insult to their own faith which was supposed to have supplanted it. In theory this was a religious rather than racial prejudice, with the goal of converting Jews rather than killing them. And when it took on a racial character, as in 15th-century Spain, pointing out the Jewishness of the early Christians would probably not have swayed the persecutors.

Buchanan seems to be implicitly defining anti-Semitism as "the doctrine of hating all Jews who ever walked the face of the earth"--which is not how medieval Christians, even the Spanish, ever framed the issue--and then suggesting that this doctrine is logically incompatible with the theological claims of Christianity. And so it is--but only very mildly. The fact that his religion is founded upon worship of a long-deceased Jewish man does not automatically imply acceptance of the vast majority of Jews. History makes this all too clear. Centuries of persecution and bigotry can't be swept aside by one tiny, possible logical inconsistency.

That brings us to Ron Paul and his argument that libertarians can't be racist because racism is a form of collectivism, the opposite of libertarianism. If that's the case, then it's a funny coincidence how closely many of his policy views match those of the people he calls collectivist. As Stormfront founder Don Black said after endorsing his 2008 presidential bid, "We know that he's not a white nationalist...but on the issues, there's only one choice." What issues? Black mentions the Iraq War and immigration, but maybe there are just a few other things Paul has said that might appeal to white nationalists--say, his long-standing opposition to the Civil Rights Act of 1964. He insists he takes this position not because he harbors any animosity toward blacks (or "the blacks," as he phrases it in the earlier clip) but merely because he values freedom.
When you invade and violate the Constitution, you attack the personal liberties of the citizens of California and Maine, as well as the liberties of the people of South Carolina and Virginia. You cannot create new rights for one group by taking them away from another.

I am deeply concerned over the efforts of opposing groups to smear our effort with the false trappings of race hatred. We are interested solely in protecting the rights of states to manage their own internal affairs, which is a fundamental guarantee of the Constitution.
Actually, those aren't the words of Ron Paul. They're the words of Strom Thurmond during his 1948 segregationist campaign. (The first paragraph is from The Washington Post, Oct. 12, 1948, the second from The Baltimore Sun, Jul. 20, 1948--both obtained from my library's archive.) But if you read what Paul has actually said on the subject, you'll find that the above quote wouldn't sound at all out of place.

Of course Thurmond also once said, "there's not enough troops in the army to force the Southern people to break down segregation and admit the nigger race into our theaters, into our swimming pools, into our homes, and into our churches." Admittedly it's hard to imagine a remark like that escaping Paul's lips (though not so hard to imagine it appearing in a newsletter under his name). And Paul does talk favorably, as Thurmond would not have, about creating a "color-blind society."

But Paul's argument about collectivism is doubly flawed, first because it conflates a philosophy of government with a philosophy of human differences (a person can, with perfect consistency, believe that blacks and whites should be treated equally under the law while also believing whites will naturally come out on top), second because it's exactly the sort of rationalization that white supremacists have used for centuries to justify keeping racist institutions alive. They also talked about states' rights; they also depicted civil-rights legislation as an assault on freedom; they also claimed their preferred policies would benefit blacks; and they also repudiated certain manifestations of bigotry. (Thurmond, for example, opposed the poll tax and distanced himself from the racist, anti-Semitic preacher Gerald L. K. Smith.) Even if Paul's motives are entirely honorable, rooted only in his fealty to federalist principles and not to prejudice, it doesn't change the fact that racism has a long history of coming cloaked in such principles.

Paul and Buchanan both think they can refute charges of bigotry simply by identifying themselves with a favored belief system and defining that belief system in logical opposition to the charges. Their use of this defense reveals a cartoonish understanding of bigotry, and the philosophical basis on which they reject that bigotry is hopelessly feeble. They are men living in ivory towers, too attached to the elegant simplicity of their logic to appreciate its real-world implications.