Sunday, August 09, 2015

Political correctness and sincerity

One thing I found interesting about the recent showdown between Donald Trump and Megyn Kelly is how he brought "political correctness" into the discussion:
I think the big problem this country has is being politically correct. I've been challenged by so many people, and I don't frankly have time for total political correctness.
What I noticed was that he didn't actually bother to defend the behavior which Kelly complained about, namely his disparaging remarks about women. He wasn't so much making a bad argument as making no argument at all. He simply observed that his behavior is "not politically correct" and claimed that political correctness is a big problem in this country--as if to suggest that his behavior's taboo status was itself proof of its worthiness.

You could use this reasoning to defend any position at all. Hey, I think a man should beat his wife with a frying pan every night! You offended? Sorry, I'm not PC. I think black people are feeble-minded, Jews are cheap, and only the rich should be allowed to vote. Don't like what I'm saying? That's just because you're too PC.

It would be a mistake to dismiss this type of thing as simply another Donald Trump absurdity. On the contrary, it lies at the heart of most arguments attacking political correctness, and it's been a feature of these arguments for the last thirty years. Whenever you say something that offends someone, you say you're "not being PC" as if pointing that out automatically absolves you of responsibility for your remarks.

I mentioned the following anecdote a couple of years ago on this blog, but it bears repeating. I was once reading a blog discussion on a subject that had nothing to do with politics. One commenter referred to the author of some book as an idiot. The blogger said he agreed with the criticism but added that there was no need to engage in ad hominem attacks. The commenter retorted, "Oh, don't be so PC."

One of the assumptions underlying attacks on PC is that you're being more authentic, more truthful, than the other person. As a result, the anti-PC trend in our society has fostered an idea that civility and common courtesy are nothing more than strategies for hiding what people are really thinking.

This idea is reflected in the repeated claims I keep hearing--and not just from Trump admirers--that Trump is "speaking his mind" or engaging in "straight talk." This is a patent misunderstanding of Trump's whole public profile. It's obvious to anyone who bothers to pay attention that Trump's antics are pure theater. I literally have no idea what he really thinks about Mexicans or PMS or Obama's birthplace. It doesn't matter. He understands something which shock jocks began capitalizing on more than a generation ago, which is that outrage sells.

That's part of the whole allure of attacks on PC: they equate sincerity with a willingness to offend. The assumption is based on a fundamental fallacy. It's certainly true that professional politicians typically behave in a canned and artificial manner by avoiding saying anything that will offend their constituents. But it doesn't follow that going to the opposite extreme, acting rude and boorish in an erratic and unpredictable way, automatically implies authenticity.

Nobody argues that when Andy Kaufman did his Tony Clifton act, he was showing a truer version of himself. Yet that's just the sort of assumption people make whenever celebrities or politicians stray outside the boundaries of what is generally considered decent behavior. Trump may be a walking caricature, but like a lot of caricatures he throws some things about the real world into sharp relief.

Friday, July 17, 2015

The joys of being persecuted

In a piece on separating myth from reality in the history of anti-Irish bigotry, Megan McArdle writes:
As I read about these notices, I wondered: Why was I so glad to read that my ancestors had, in fact, faced nasty discrimination? It's a reaction that needs scrutiny.
This is something I was thinking about recently in light of all those stories on Rachel Dolezal, the white woman who posed as black and became an NAACP leader. It reminded me of the case of "Binjamin Wilkomirski," a man who published a memoir in the 1990s detailing his experiences as an Auschwitz survivor before it was discovered that the events he described never took place, he wasn't even Jewish, and he had a totally different name.

What fascinates me as a Jew and the grandson of Holocaust survivors is why people would want to engage in this kind of deception. Cases like these are as bizarre as they are uncommon, and surely mental illness is involved. But I also think they stem in part from the same tendency McArdle is alluding to, of taking pride in being a member of a historically persecuted group. Modern society has romanticized persecution to the point that everyone wants part of it, as when a billionaire last year claimed the wealthy in America today were being treated like the Jews during Kristallnacht. The statement was both silly and offensive, and it's hard to imagine anyone who actually lived in the 1930s making such a comment. Back then, being the victim of anti-Semitism or racism wasn't regarded as cool.

It brings to mind a Mark Twain quip: "A classic is something everybody wants to have read, but nobody wants to read." Nobody wants to be in a concentration camp or be lynched by the Klan or beaten by cops or harassed or discriminated against--but a lot of people, whether they admit it or not, wouldn't mind having those things on their résumé.

Wednesday, May 27, 2015

Conservative muggers

The recent story about the South Carolina man who was an avid Tea Partier before discovering the virtues of Obamacare has gotten me thinking about the old aphorism, "A conservative is a liberal who got mugged." What do we say in cases like this? Maybe "A liberal is a conservative who lost his health insurance."

That brings me to a general point: the "liberal who got mugged" expression is highly misleading and ought to be retired. It perpetuates the false notion that liberals live in ivory towers and that if they were better attuned to the real-world consequences of their policy preferences, they'd be conservatives. I believe that in many ways the reverse is true.

Where did this expression come from? A while back I searched newspaper archives for the answer, and with the help of The Atlantic's Yoni Appelbaum (previously known as Cynic, a commenter on Ta-Nehisi Coates' blog where we hung out), I found it. The line was apparently coined in 1972 by Philadelphia mayor Frank Rizzo when talking to a Newsweek reporter. "You know what a conservative is?" he asked. "That's a liberal who got mugged the night before." According to Appelbaum:

The article itself, 'Living with Crime, USA,' is a famous exploration of the intersection of crime and our fear of crime, written by David Alpern after he himself was mugged. Rizzo, a Democrat who ascended to the mayoralty through the police force, was making a point about the paramount importance of law and order. His point was that stopping muggings was a basic prerequisite for any other initiative. And most conservatives, at the time, were justifiably offended. The quip, after all, implies that liberals win in reasoned, principled debate, but that conservatives are fueled by fear.
As the expression gained popularity, its original meaning was forgotten, and--perhaps owing partly to Rizzo's own reputation as a law-and-order Democrat who eventually switched parties--it soon evolved into a condescending indictment of liberals. Its new meaning was encapsulated by Irving Kristol in the late '70s when he declared that a neoconservative was a liberal who got "mugged by reality."

Liberals over the years have struggled to come up with alternative expressions. One I've often heard is "A liberal is a conservative who got arrested." Actually, though, that to me almost sounds like just another diss of liberals, implying they're "soft" on crime if not criminals themselves. (Maybe it would work better if it went "wrongly arrested.") One thing liberals rarely do is question the original expression, probably because they believe it has kernels of truth to it. And so it does. The problem is that it stacks the deck against liberals in a way that just isn't accurate--and that's something we ought to point out more often, instead of searching in vain for our own quips.

For one thing, it comes with an assumption that the liberal who gets mugged has made a wise choice by turning conservative, rather than having simply given in to base fears that may have little grounding in reality. Let's say a liberal's house gets burglarized, and he goes out and buys a gun to protect himself from future home invasions. He may have become more conservative, but has he become objectively safer? The evidence wouldn't seem to support that conclusion.

The expression stops making sense altogether once you move beyond the subject of crime. (Even on crime it is questionable and betrays a white perspective since it doesn't account for a fear of police, a big factor in the lives of African Americans.) On issue after issue, from health care to Social Security to unemployment and beyond, it is conservative elites who don't have to deal with the real-world consequences of their policy-making. Kristol's statement in particular looks ironic now, since the Iraq War was essentially a case of neocons getting mugged by reality. And as we can see from the story about the South Carolina man, the GOP's war against Obamacare has opened up new avenues for conservatives to be mugged by reality.

But let's not kid ourselves that this will create scores of new Democratic voters. What will prevent that from occurring is something that has been a great friend to the GOP over the past several generations: namely, good old-fashioned cognitive dissonance.

It's what gets people to go to Tea Party rallies declaring, "Keep your government hands off my Medicare." It's what gets the actor Craig T. Nelson to say, "I've been on food stamps and welfare, did anybody help me out? No. No." It's what gets Kentuckians to like their health-care exchange but hate Obamacare. It's what gets some of the reddest states in the nation to have the highest rates of food stamp use.

Getting mugged isn't going to change you when you've been brainwashed into thinking the mugger is your savior.

Saturday, September 28, 2013

In (partial) defense of nonliteral literally

Gene Weingarten has written a piece decrying the Oxford English Dictionary's recent decision to include in its definitions the use of literally for nonliteral expressions (as in "I literally died of laughter"). While I share some of Weingarten's distaste for this usage and find it to be a fun topic, I cannot agree with his complaint. It has to do with what you consider a dictionary's purpose. Weingarten apparently believes it is to serve as an authority on how people ought to speak and write. This school of thought, known as prescriptivism, once dominated lexicography. But over the past century most dictionaries moved toward descriptivism, the idea that their purpose is simply to describe the language as it is currently used by its speakers. According to this view, if enough people use a word in a certain way, it deserves inclusion in a dictionary. Weingarten thinks this is simply "rewarding vapidity."

Weingarten's harangue is typical of prescriptivists, who in my experience tend to be scarcely aware they're even advocating a philosophy, let alone one widely rejected by lexicographers and linguists. They present their criticisms of the way people speak and write as nothing more than commonsense conclusions that they remember better than others because they stayed awake during third-grade English. Rarely do prescriptivists question any of the traditional rules they were taught in school, many of which do not hold up to scrutiny. They are discussed at length by the linguist John McWhorter in his 2001 book Word on the Street, which presents a wealth of evidence that many of the so-called "rules," from avoidance of split infinitives to the prohibition on using they with a singular antecedent (as in "everyone returned to their seat"), are rooted in the basically arbitrary decisions of a group of 18th- and 19th-century writers who often had a poor understanding of how English worked. But because these rules have been taught to generations of schoolchildren as ironclad truths, educated people have come to think of them as being on par with the laws of thermodynamics.

That's why no evidence from history or literature or any other field can possibly sway the fervent prescriptivist. Consider how Weingarten addresses the fact that many classic writers such as Jane Austen adopted the nonliteral literally on occasion: "That no more makes it right or acceptable than it makes it right for you to annihilate 100,000 people with a bomb just because Harry Truman once did it."

With this statement, Weingarten joins the honorable company of the critic John Simon, who wrote in 1980 that "The English language is being treated nowadays exactly as slave traders once handled the merchandise in their slave ships, or as the inmates of concentration camps were dealt with by their Nazi jailers." Most language scolds I've encountered aren't quite this colorful in their choice of analogies. A professor of mine made the point more simply when confronted by evidence that a usage he disapproved of appeared in the works of great writers: "It's still wrong."

The real problem with this argument is that it assumes a word's proper definition is some immutable law of nature, like gravity, that can never be shaped by the people who use the language, not even by the people who use it best. This view is positively blinkered. There's no reason why the English of Shakespeare is different from that of Chaucer, or from that of Weingarten, other than that human beings of every generation have spoken and written differently than their predecessors. And if there is one thing linguistic history absolutely makes clear, it is that today's error is tomorrow's rule. For example, nice once meant "foolish." It evolved to its present state because people kept using a "wrong" definition, but it's hard to see how English suffered as a result.

Of course, literally isn't just any evolving word. Its traditional definition is a useful concept to have a word for, and it would be a shame to see it go obsolete, which may happen if more and more people say things like "He literally puts his money where his mouth is." In that sense I'm with Weingarten that the looser definition should be avoided (though not excluded from dictionaries). What's striking is that he never makes this argument. His point is simply that it's the law, and we must obey. His indifference to judging word usages based on their utility is revealed in his offhand comment, "although I may cringe at 'blogosphere' and 'webinar' and, sigh, 'whatevs,' I do not protest their appearance in dictionaries." Now, why would anyone cringe at a coinage like blogosphere? (Least of all a blogger?!) Only someone who believes that language should remain literally frozen in time, and that all change is bad, would find anything wrong with that type of innovation.

Weingarten doesn't even accurately explain the loose definition of literally. He claims it is being used to denote its opposite, the word figuratively. It is not. As the OED notes, it is being used as an intensifier. It's basically a synonym for really or actually, except that those words have been blunted from overuse, so when you want to express that you really, truly mean something, literally sometimes gets the point across with more force.

Hence, "the coach literally hates my guts" is meant to convey that you aren't exaggerating the coach's hatred. In a way this is a form of traditional literally; it's just being applied selectively, to the level of the coach's hatred rather than to the metaphor used to describe it. What this example shows is that a statement can have multiple layers of presumptive nonliteralism, and literally may be intended to unpack one layer while leaving the next alone.

My point here is not that I approve of the loose definition of literally, but that it isn't necessarily based on ignorance of the traditional definition. Rather, it's a reflection of the fact that our language is littered with dead metaphors that are all but invisible to us. (The mixed metaphor I just used is further evidence of that fact.) This helps explain why the traditional definition hasn't disappeared from the language, despite centuries of being disregarded. Annoying as it is, the loose sense has come to coexist alongside the traditional one instead of replacing it outright. Weingarten misses this point when he quotes Ambrose Bierce's supposedly accurate prediction that "within a few years the word 'literally' will mean 'figuratively.'" In fact most people today use literally in exactly the way it was originally intended. We just pay closer attention to the loose sense because of the way it literally sticks in our craw--suggesting the danger it poses to our ability to communicate may be overstated.

That's actually true of most gripes about language usage. Some are completely groundless (the most famous being the split-infinitive "rule"), while others, such as this one, at best point to bothersome trends that detract from our language's vitality. In neither case is any large-scale damage on the horizon. As McWhorter explains in his book:

What we must realize...is that during these changes, because renewal always complements erosion, all languages are eternally self-sustaining, just as while our present mountains are slowly eroding, new ones are gradually being thrown up by the movement of geological plates. Thus at any given time, a language is coherent and complex, suitable for the expression of all human needs, thoughts, and emotions. Just as linguists have encountered no languages that do not change, they have also not encountered any languages whose changes compromised their basic coherency and complexity. We have encountered no society hampered by a dialect that was slowly simply wearing out like an old car. Anthropologists report no society in which communication is impossible in the dark because the local dialect has become so mush-mouthed and senseless that it can only be spoken with help from hand gestures. In other words, there is no such thing as a language 'going to the dogs'--never in the history of the world has there existed a language that has reached, or even gotten anywhere near, said dogs.

Wednesday, September 11, 2013

Keep your Obamacare off my exchanges

After reading the recent news story about a man at the Kentucky State Fair who expressed interest in Kynect, the state's new health-care exchange program, by saying he hoped it beat Obamacare--apparently not realizing it was Obamacare--I decided to take a look at Kynect's website. What I found was that it seems to encourage exactly this sort of ignorance. Nowhere on the website is there a single mention of the words Obama, Obamacare, or the Affordable Care Act. The FAQ makes just one fleeting reference to federal law (regarding the requirement to purchase insurance) and then makes it sound as though it was the governor, Steve Beshear (a Democrat, for what it's worth), who unilaterally chose to set up the exchange:
Why was [Kynect] created?

Governor Steve Beshear issued an executive order to create a state-based health benefit exchange to best meet the needs of Kentuckians. kynect, like other health benefit exchanges, will provide simple, one-stop shopping for individuals and small businesses to purchase health insurance and receive payment assistance or tax credits.

In contrast, the website for the exchange program in New York (where I live) says right upfront that it's a result of the ACA:
Under the federal Affordable Care Act, an Exchange will be operating in every state starting in 2014. States have the option to either set up an Exchange themselves or to allow the federal government to set up an Exchange in their state. New York has chosen to set up its own Exchange, called the New York Health Benefit Exchange. On April 12, 2012, Governor Cuomo issued Executive Order #42 to establish it within the NYS Department of Health.
This made me curious about whether there's some relationship between a state's political composition and how candid its exchange website is about its connection with the ACA. I did a little online research about the different state exchanges that have been set up (this webpage was particularly helpful), and my discovery was a bit anti-climactic: it turns out that almost all of the states that have set up exchanges were ones that voted for Obama in 2012. Kentucky, which Obama lost by 23 percentage points, is the one exception. Maybe not so surprisingly, it also has the only exchange website where the words "Affordable Care Act" are nowhere to be found (though in a few other states such as Minnesota and New Mexico, mention of the law is buried deep within the website, and not, say, in a FAQ or "About Us" section). It will be interesting to watch how the law will be sold in other red states, where ironically the exchanges will be mostly federal-run due to the GOP's dogged unwillingness to cooperate with the law's implementation. Will the feds also adopt the principle that it's better to avoid disclosing the source of this cool new policy in the name of getting more people into the system?
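(For the curious: this kind of check is easy to automate rather than eyeball. Here's a rough sketch of the idea--the URLs are just illustrative guesses that may have changed since this was written, and a plain fetch won't see text that a site renders with JavaScript.)

```python
# A rough sketch of the check described above: fetch each state exchange's
# homepage and see whether it ever names the federal law. The URLs are
# illustrative and may be out of date; JavaScript-rendered text won't be
# captured by a plain HTTP fetch.
import requests

EXCHANGES = {
    "Kentucky (kynect)": "https://kynect.ky.gov",
    "New York (NY State of Health)": "https://nystateofhealth.ny.gov",
}

PHRASES = ["affordable care act", "obamacare", "obama"]

for state, url in EXCHANGES.items():
    try:
        html = requests.get(url, timeout=10).text.lower()
    except requests.RequestException as err:
        print(f"{state}: could not fetch ({err})")
        continue
    found = [p for p in PHRASES if p in html]
    print(f"{state}: mentions {', '.join(found) if found else 'none of the phrases'}")
```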

Sunday, May 26, 2013

Why liberals became progressives--and why they'll stay that way

One of the most striking changes in political terminology to happen in my lifetime was the adoption of progressive as a substitute for liberal. What's weird about it is that most of the time people talk as though they've always used the word progressive this way, yet I can't remember hearing it until the 2000s. (Checking the archives for Google News and Google Books seems to confirm my suspicions.) When the topic is brought up, the commonest explanation (which even I have made) is that it was an attempt to escape the negative connotations of the word liberal, which had suffered from decades of abuse by conservative commentators. But that raises some questions. Since the negative use of liberal goes back at least to the 1970s, what took progressives so long to come up with their new name? Furthermore, why didn't they stick with liberal in a spirit of defiance against those who treat it as a dirty word? Doesn't abandoning it suggest that there really is something wrong with being a liberal, and that so-called progressives are simply doing a linguistic makeover to hide their flaws?

The answer to these questions lies partly in recent political history, partly in the difficulty in consciously making changes to the language. For several decades liberals did in fact try to wear the word liberal proudly, in spite of those who used it disparagingly. Progressive already existed in political parlance, but it had a broader, vaguer meaning than it does today and didn't necessarily imply an affinity for the left. In the 1980s, for example, the centrist Democratic Leadership Council called its think tank the Progressive Policy Institute. My guess is that the DLC aimed to evoke something along the lines of Teddy Roosevelt's bipartisan, reform-oriented "progressivism."

The degradation of the word liberal was gradual and, contrary to the oft-heard claim, not entirely due to the right's efforts. I think the process began in the late 1960s in reaction to the disillusionment and shattered dreams of the left. Around that time the term was undergoing a shift in meaning similar to what happened to a word like pious, where a formerly positive adjective comes to be used as a sneering description of those who fall short of the ideals they preach. Look, for instance, at how Roger Ebert used it in his 1972 review of Sounder, a movie he defended against charges of liberalism:

It is, I suppose, a "liberal" film, and that has come to be a bad word in these times when liberalism is supposed to stand for compromise--for good intentions but no action. This movie stands for a lot more than that, and we live in such illiberal times that Sounder comes as a reminder of former dreams.
By the 1970s, liberal was starting to be treated less like a political orientation than like a character type, describing an overzealous do-gooder who may even be a hypocrite and patronizing snob--someone much like the character of Meathead from All in the Family. When the right began using the word pejoratively, they were in part seizing on that stereotype. Of course there is a difference between the trait of "good intentions but no action" and the right's more malevolent view of liberals. But the image of the excessive do-gooder--and above all the connotation of weakness--prevailed.

For a long time, Democratic politicians were unsure whether to embrace the liberal label or run away from it. In 1988 Dukakis resisted it before finally admitting, late in his campaign, that "I'm a liberal in the tradition of Franklin Roosevelt and Harry Truman and John Kennedy." This comment was practically an apology, seeming to imply that liberalism had fallen from its lofty position in the ensuing decades. It was as if he was assuring the public, "I'm a liberal, but one of the good ones."

Indeed, when it came to presidential elections in the post-Vietnam era, it often seemed that the Democrats' victories rested on how successfully their candidates escaped the liberal label. This perception was probably delusional (Mondale and Dukakis were running against a popular administration, whereas Carter and Clinton were running against unpopular ones, and so their ideological character was probably not the determining factor in the outcome of those races), but it was a lesson the Democratic establishment took to heart.

The moderate, Third Way politics of the Clinton years disappointed many liberals at the time, but this was overshadowed somewhat by their disgust at the GOP's scandal-mongering against the president. By the end of the decade, when Clinton enjoyed sky-high approval ratings while the GOP ended up defeated and humiliated in its attempts to bring him down, there was a triumphant feeling among Democrats which, I believe, made many of them willing to forget (if not forgive) his policy betrayals.

This truce ended with the U.S. invasion of Iraq, an event that drove a wedge between the Democratic establishment and the left unlike anything seen in over a generation. As the left's antiwar position, dismissed at first as radical, eventually became the consensus not just within the Democratic Party but in the country as a whole, it damaged the establishment's credibility and made the left's early criticisms of the invasion seem prescient. I personally believe (but have rarely seen it expressed) that this factor was a large part of the reason for the DLC's demise. And of course it led to the rise of Obama, whose early opposition to the war may have been singlehandedly responsible for his narrow defeat of Hillary Clinton in the primaries. Despite GOP talking points about how he was the "most liberal Senator," the L-word commanded surprisingly little attention in the 2008 election, when compared with past races. Obama did, however, eagerly identify as "progressive," the first modern Democratic nominee to do so.

This new use of progressive arose during the boom in Internet political culture that came to be called the "netroots," dominated by activists who now had the tools to make their voice heard in a way that wasn't possible in earlier times. That was the main setting from which today's progressive movement emerged. Though they rarely explained why they preferred the term progressive, I believe there were two primary reasons: they associated liberal with compromise and moderation in the hated establishment, and they wanted to free themselves from the influence of conservative frames they felt had governed mainstream political discourse for too long. Creating a new word for themselves (or, rather, refashioning an old, nearly forgotten one) was a way of achieving that goal.

Naturally, the new progressives tended to be fairly young--people in their twenties when the millennium rolled around (basically my generation). Older figures who have come to be associated with the movement have had to adapt their language to the times. When I searched Paul Krugman's columns and books for the word progressive, all I found were some references to progressive taxation--until his 2007 book The Conscience of a Liberal, where he explains the difference between liberals and progressives:

The real distinction between the terms, at least as I and many others use them, is between philosophy and action. Liberals are those who believe in institutions that limit inequality and injustice. Progressives are those who participate, explicitly or implicitly, in a political coalition that defends and tries to enlarge those institutions. You're a liberal, whether you know it or not, if you believe that the United States should have universal health care. You're a progressive if you participate in the effort to bring universal health care into being. (p. 268)
Although Krugman isn't defining the two terms as mutually exclusive, there is an echo of Ebert's association of liberalism with "good intentions but no action." Progressives, Krugman maintains, are liberals who put their beliefs into action. While that's an inspiring thought, I'm not sure it fits the way most people use these words. I assume Krugman bases his definition on the activist roots of the progressive movement, but by now (at least in my experience) there are plenty of self-identifying progressives not actively involved in the fight for liberal causes.

The linguist Geoffrey Nunberg rounds up various pundit theories on the progressive/liberal distinction before observing, "none of them has much to do with how the labels are actually used." One problem I have with most of these theories is that they treat the categories as fixed and static. In reality, these words have had greatly varied meanings over time, and even at a given time have meant different things to different people. The fact that TR referred to himself as a Progressive while FDR considered himself a liberal doesn't shed much light on the differences between Clinton and Obama. With these caveats in mind, Nunberg offers his thoughts on what the progressive label is intended to signal today:

Far more than liberals, progressives see themselves in the line of the historical left. Not that America has much of a left to speak of anymore, at least by the standards of the leftists of the Vietnam era, who were a lot less eager than most modern-day progressives to identify themselves with the Democratic Party. But if modern progressives haven't inherited the radicalism or ferocity of the movement left of the 60's, they're doing what they can to keep its tone and attitude alive.
I tend to agree. I just wonder how long this situation will last. As the new progressives grow older and the word progressive becomes more ingrained, its anti-establishment overtones may well fade. Eventually it may come to be a simple descriptor of the average left-leaning Democrat, occupying more or less the same place that liberal used to--before it was turned into an epithet.

Perhaps sensing this possibility, some conservatives in recent years have been trying to do to progressive what they once did to liberal. Glenn Beck attempted something of the sort in his 2010 speech to CPAC, where he linked today's progressives with the alleged evils of the early-20th century Progressive Movement. I doubt this strategy will work. These conservatives have grown too insulated from the mainstream to reach beyond their narrow audience (somehow I don't think most Americans would share Beck's outrage at TR's support for universal health care or Woodrow Wilson's creation of the Federal Reserve), and in any case the word progressive just doesn't carry the negative connotations that helped the right tarnish liberal. Whether conservatives or older liberals like it or not, progressive as a self-respecting term is here to stay.

Tuesday, July 17, 2012

The challenge of old movies

Want to know how certifiable a movie fanatic I am? I actually keep an Excel spreadsheet noting every movie I see and the date on which I first see it. About a year ago, realizing I'd been keeping this list for literally half my life (since 1994, just before my 17th birthday), I attempted to identify the movies I'd seen during the first half. I got some help from Wikipedia, which has articles listing the films released every year (most of the major ones, at any rate). One thing I've determined is that I've seen well over a thousand films in my life--but perhaps three-fourths of them have been ones made within my lifetime, starting in the late 1970s.
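(The tally itself is trivial once the spreadsheet is exported to CSV. Here's a rough sketch of the arithmetic--the column names and the cutoff year are assumptions for illustration, not what my actual file looks like.)

```python
# A small sketch of the tally described above, assuming the Excel log has been
# exported to CSV with (hypothetical) columns "title", "release_year", and
# "date_watched". The column names and the birth year are assumptions.
import csv

BIRTH_YEAR = 1977  # assumed from "made within my lifetime, starting in the late 1970s"

def share_made_in_my_lifetime(path: str) -> float:
    """Return the fraction of logged films released in or after BIRTH_YEAR."""
    total = lifetime = 0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                year = int(row["release_year"])
            except (KeyError, ValueError):
                continue  # skip rows with a missing or malformed year
            total += 1
            if year >= BIRTH_YEAR:
                lifetime += 1
    return lifetime / total if total else 0.0

if __name__ == "__main__":
    print(f"{share_made_in_my_lifetime('movies.csv'):.0%} made within my lifetime")
```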

That finding was a bit of a surprise to me, since I remember watching lots of old movies as a kid. But when I think about it, there are indeed an astonishing number of classics I still have not seen. And when I do get around to seeing them, the experience isn't always as satisfying as it's supposed to be. Part of the problem is a feeling of being intimidated by a movie's reputation. It's tricky trying to sit back and enjoy a Great Movie when I'm conscious of how I'm supposed to be feeling the weight of its Greatness at every moment. This is a big reason why I still have never watched Citizen Kane, and why I had the DVD of Lawrence of Arabia for a long time before I gathered up the courage to stick it in the drive.

The genre I find easiest to appreciate regardless of period is comedy. I was raised on classic comedy--the Marx Bros., Laurel & Hardy, Chaplin (who remains one of my favorite filmmakers to this day), Danny Kaye, the Three Stooges. I believe comedy is essentially timeless as long as it avoids topical humor, as these old movies generally did. Good 21st-century comedies like 40-Year-Old Virgin or Borat may be more profane than their predecessors, but the underlying principles of humor haven't changed. I can't say the same for dramas, westerns, romances, or horror films.

There are four basic challenges to becoming engaged in older movies. (One issue I am not dealing with here is the advancement in special effects and other technical matters, since everyone agrees movies have improved over time on that score.) Instead, I wish to focus on the four areas that pose significant and non-superficial barriers between modern viewers and even the best films of the past:

1. Changes in decency standards

While I find many of today's movies overly coarse, those made under the Hays Code had the opposite problem. They couldn't talk about sex in anything approaching a candid manner and were forced to employ ridiculous euphemisms, which can be hard for a modern viewer to adjust to. When I watched His Girl Friday, I had already seen the 1974 version of The Front Page with Jack Lemmon and Walter Matthau, based on the same play. Being a big Matthau/Lemmon fan, I loved the '74 version, and while I enjoyed the older movie as well, I was conscious of its limited ability to depict certain plot points. This actually led to a couple of good lines, as when Hildy reports that a character got shot in the "classified ads." At other times I felt the movie suffered from the constraints, as when it presented Mollie Malloy as an old maid rather than (as in the original play) a prostitute. And I'm sorry, but I just can't have as much affection for a film that omits the play's hilarious closing line, "The son of a bitch stole my watch!"

Some old movies have an innocence that looks laughable today. I saw the Oscar-winning 1938 film Boys Town when Newt Gingrich hosted a showing of it on TNT in 1994. Gingrich felt people should watch this movie to learn how to help today's troubled youth. The movie tells the story of Father Flanagan (Spencer Tracy) and his heroic efforts running an orphanage. His motto is that there's "no such thing as a bad boy." And indeed, most of the boys we see in the orphanage behave like perfect angels, except for one played by Mickey Rooney as a juvenile delinquent so terrible he actually smokes, plays cards, and acts sassy toward the grownups. (As a neighbor of mine at the time put it, "Sounds like the typical yeshiva bochur.") Despite these immeasurable crimes, Father Flanagan somehow manages to get through to him in the end and make him into a good kid, a message of great relevance for today's crack babies.

2. Changes in moral sensibilities

This category covers a lot of ground, but it's most notable in attitudes about race. Movies from the '30s and '40s are often shockingly racist, and when they are, I'm thrown right out of the picture. Duck Soup is one of my favorite comedies, but when Groucho utters the line--"My father was a little headstrong, my mother was a little armstrong. The Headstrongs married the Armstrongs, and that's why the darkies were born"--the movie for me just stops dead. (Few people today are aware that Groucho was actually referencing a hit song of the time titled "That's Why the Darkies Were Born," which was supposedly satirizing racism, but it still sounds pretty offensive to modern ears.) And that's just dialogue. The stereotyping of nonwhite characters in films of this era is so awful that, sadly and ironically, the movies tend to be more watchable when they feature all-white casts.

3. Lack of freshness

It may not be fair, but movie ideas that were once highly original can come to seem banal if they get imitated enough. Hitchcock went to great lengths to keep the surprise ending of Psycho from getting leaked, but by today's standards it seems almost trite. (As Nicolas Cage declares in Adaptation, "The only idea more overused than serial killers is multiple personalities.") I recently saw Casablanca for the first time, but it feels like I've seen it my whole life. It was like deja vu as I watched: memories of things I'd seen as a kid that referenced the film kept surfacing--an episode of Moonlighting, a scene from one of the Naked Gun films, parts of When Harry Met Sally, you name it. There's also something surpassing strange about hearing lines like "Here's looking at you, kid," "I think this is the beginning of a beautiful friendship," and "I'm shocked, shocked to find that gambling is going on in here" uttered in earnest. While I did enjoy the movie, it was certainly not the same experience moviegoers in the '40s had. It was like viewing some grand antique.

4. Differences in filmmaking style

This section will be much longer than the previous ones because it deals with a characteristic of movies from the '30s and '40s that has always been obvious to me but which, for reasons that escape me, I have rarely seen discussed: they look and sound a great deal like plays. Watching a movie from today doesn't usually feel like seeing a group of actors up on a stage; it's more like looking into a window at a real-life scene. I don't just mean that the sets look more convincing, but even more that when the actors talk, they tend to sound a lot more like real people having a conversation. To show what I'm talking about, let's examine two clips, one from the 1939 version of Of Mice and Men, the other from the same scene in the '92 version:

Did you notice what I noticed? Not only is the first movie clearly filmed on a stage while the second is filmed in actual woods (or at least provides the illusion of them); the differences in acting style are also striking. In the older movie, the actors deliver their lines loudly and in an almost sing-song manner; in the later film the actors speak practically under their breath, with minimal intonation. The acting in the first clip is more stylized, in the second more naturalistic. In short, the actors in the original film seem to be acting, whereas in the remake they're behaving. The second clip therefore has a more lifelike feel (despite the fact that it's the only one of the two to feature background music, a point to which I'll return shortly).

I am not cherry-picking here; this is something that has consistently stood out for me whenever I've watched movies from the '30s and '40s and compared them with later films. It is most noticeable in dramas, but it exists to varying degrees in all genres. And of course there are exceptions: Jimmy Stewart was always more naturalistic than Nicolas Cage has ever been, but in their films each is surrounded by actors whose style contrasts with his own.

None of this should be surprising. When talkies were invented, movie actors naturally adopted the conventions of what had previously been the only dramatic art form involving speech. They imitated the way stage actors spoke because that was all they knew. Over time, as the technology improved and as film came more into its own as a respectable medium, the styles of stage and screen diverged and naturalism gradually became the norm on screen; I believe the transition was complete in American cinema by the early 1970s.

What do I think of the change? It depends. The two Of Mice and Men adaptations are similar overall, but as an admirer of Steinbeck's novel I always preferred the '92 version. I had an easier time connecting emotionally with characters who sounded like real people when they spoke. This standard of judging movies is surprisingly rare, from what I've seen; people just don't want to admit that the naturalism of modern film has advantages.

It also has disadvantages. Among other things, I believe it contributed to the disappearance of musicals in the 1970s. The musical is, after all, very much a genre of the stage, and to have today's movie characters burst into song can seem odd and inappropriate in a way that it never did in previous eras. Music is still important in today's movies, but most of it is in the background: the scores (one of the few non-naturalistic aspects of movies to have increased over time) and the video interludes (a form that gradually replaced musical numbers in the '60s and '70s, and which in my opinion is one of the most annoying features of modern cinema). If a modern movie character sings, usually there's a rationale within the story, such as if the character is a professional singer.

All of this has led to a looser definition of the word "musical," which nowadays often means simply "movie with lots of songs in it," even when there are no numbers. The Golden Globes, for example, have applied the term to films like Walk the Line which are only "musicals" by virtue of concert scenes, video interludes, and the like. When modern movies do feature traditional numbers, the effect is often curiously artificial.

A lot of people like to ignore this fact and pretend nothing's changed. There's not much acknowledgment that musicals didn't just happen to fall out of fashion (the way, say, westerns did), but that the whole underlying approach to filmmaking changed in a way that made the conventions of musicals seem out of place. In the olden days, making a film as a musical was such a normal and natural choice it could even be fairly peripheral to the film itself. For example, most of the Marx Bros. and Danny Kaye films, remembered primarily as comedies, happened also to be musicals. Today's movies don't have that freedom.

The resurgence of movie musicals following the success of Chicago happened in part, I think, because the 2002 film found a unique way to reconcile the conflicting conventions. In this film, a woman played by Renee Zellweger dreams of one day becoming a vaudeville star, and most of the song-and-dance numbers are presented as fantasy sequences where she imagines herself and other people performing on stage. As a result, the distinction in this movie between a musical number and a music video is blurred to the point of irrelevance. One IMDb commenter suggested that the film was "ashamed to be a musical," but I'm not sure it would have been as successful if it had simply ignored the problem.

Alas, many of the movie musicals since then have done just that. When I first saw Dreamgirls, I noticed it wasn't until about thirty minutes into the film that a character starts singing on the street (as opposed to on a stage or in a studio). Up to that point, the movie had seemed like a low-key, serious drama, and I have to admit I found the sudden break in realism that late in the story rather jarring. I thought to myself, "Wait a second...this is a musical?!" I've had that sort of experience with at least a couple of today's musicals, but I've never had it with the musicals of old. They don't have anything to apologize for.

Conclusion

My point here isn't that modern movies are intrinsically "better" than older ones, or vice versa. I just think there needs to be more recognition of the effect that the evolving conventions have on different generations of moviegoers. For sure, younger people who consider older movies boring or incomprehensible are missing out on something. But people who celebrate the old stuff as some kind of gold standard that nothing today could match up to, and imply that anyone who disagrees is simply lacking in culture or taste, aren't exactly helping matters either. Speaking personally, as I continue to enrich my knowledge of films of the past, I've had the best experiences when I've understood the movies in the context of their time and was prepared to adjust as needed. Holding them in godlike esteem doesn't do the trick for me.