Corollary to Rule Number Three

November 15, 2013

Elmore Leonard died back in August, as you have probably heard.  His passing has been characterized as the loss of a national treasure, and based on the five or six books of his I’ve read over the years, that seems like a pretty fair assessment.

Leonard began his career as a writer of pulp westerns, but achieved popular success in crime fiction, where he managed to locate a fertile patch of turf midway between wisecracking romantics in the Raymond Chandler mold and their more spartan hardboiled rivals, and then to cultivate that patch with awe-inspiring consistency.  Leonard was funny, but not zany; he was witty, but resisted being clever.  He didn’t let his plots push his characters around, and he also had no apparent use for the shadowy psychopathology that fuels the works of his most critically-lauded midcentury peers: nobody in an Elmore Leonard novel is a case study, or a symptom of anything.

Leonard’s great achievement, it seems to me, is his evocation of the idiosyncratic moral universe in which his books always take place; everything else for which he is justly praised—his off-like-a-shot narration, his fleshed-out characters, his ear for dialogue—proceeds from that.  The inhabitants of Leonard’s world can never be sure of the consequences of their actions, and they’re too foggy on their own motives to ever settle on a solid code of conduct.  What they fall back on instead—if they know what’s good for them, which most of them don’t—is an attitude, or maybe more accurately a stance.  Leonard’s heroes understand what they can and can’t get away with, and they know what they look like through other people’s eyes: what they seem to be.  Most importantly of all, they know when it’s time to shut up and pay attention.  (Leonard’s vaunted dialogue is as much a tribute to the value of listening as to the value of talking, if not more.)  The categorical imperative of this moral universe can be summed up with Leonardesque conciseness in two words borrowed from one of his titles: Be Cool.

Although Leonard’s dialogue is richly excerptable, his adroit treatment of violent action is less so: it’s hard to convey a sense of the slow build to these scenes, and harder still to convey what he cannily omits from them.  Nevertheless, I’d like to take a quick look at a few paragraphs from Pronto, the 1993 novel that introduces Deputy U.S. Marshal Raylan Givens, who went on to feature in two more Leonard books and is now the protagonist of the giddily acclaimed FX series Justified.  If you’re planning to read Pronto (and I think you should) and you don’t want me to spoil anything for you (as if that were really possible) then please skip on ahead to the Cayucas video below.

The scene from which I’ll quote is set in Liguria, on Italy’s northwest coast; Marshal Givens is there to retrieve an on-the-lam Miami bookie, and has, of course, run afoul of the mob:

Now the fat guy waved his pistol at Nicky, saying, “Come on,” and started toward Raylan again, getting a sincere look on his face as he said, “We want to talk to you, man.  Get a little closer, that’s all, so I don’t have to shout.”

“I can hear you,” Raylan said.

The fat guy said, “Listen, it’s okay.  I don’t mean real close.  Just a little closer, uh?  It’s okay?”

Getting within his range, Raylan thought.  If he knows what it is.  The guy was confident, you could say that for him.  Raylan raised his left hand, this time toward the fat guy.

Then lowered it, saying, “I wouldn’t come any closer’n right there.  You want to talk, go ahead and talk.”

The fat guy kept coming anyway, saying, “It’s okay, don’t worry about it.”

“You take one more step,” Raylan said, “I’ll shoot you.  That’s all I’m gonna say.”

This time the fat guy stopped and grinned, shaking his head, about sixty feet away now.  He said, “Listen, I want to tell you something, okay?  That you should know.”  He took a step.  He started to take another one.

And Raylan shot him.  Put the 357 Mag on him, fired once, and hit him high in the gut.  Raylan glanced at Nicky standing way over on his left, Nicky with his pistol about waist high Raylan put the Mag on the fat guy again, the guy with his hand on his gut now, looking down like he couldn’t believe there was a hole in him before looking at Raylan again, saying something in Italian that had a surprised sound to it.  When the guy raised his pistol and had it out in front of him, Raylan shot him again, higher this time, in the chest, and this one put him down.

The sound echoed and faded.

Raylan turned his head.

For characters who are slow to pick up on when to speak, when to listen, and when to make a move, things tend to end poorly in Leonard’s world.  Note how Leonard—the master of whip-smart dialogue—sets up the above-excerpted killing with a conversation that’s calculatedly lame.  The “confident” fat guy thinks he can smooth-talk his way into an advantage, but he’s not a very good talker: everything he says is vacuous, obviously serving no purpose but to cover his approach.  He also thinks this straight-arrow American lawman won’t shoot unless he’s shot at, won’t actually kill over the mere crossing of a line arbitrarily drawn.  Maybe we, the readers, think this too.

What’s effective in this scene is its faintly nightmarish quality, both surreal and super-real, as characters who have been playing a game with each other discover too late that they’re using different rules.  Like all crime novels, Leonard’s books necessarily depend on violence or the prospect of it for their appeal, and as such they’re open to the charge of being exploitative.  Still, for whatever it’s worth, Leonard’s scenes of bloodshed feel true in their circumstances to the way such altercations seem to happen in real life: somebody stubborn encounters somebody stupid, and they both have guns.  The situation with Raylan and the fat guy just seems too dumb to actually play out the way it does; we’re not quite ready for what happens.

It’s worth noticing how carefully Leonard lines us up for the slight shock the scene delivers when Raylan shoots; it’s also worth noticing how hard he works to make that care seem like carelessness.  Take a look at the description of Raylan raising and lowering his left hand: Leonard separates the two halves of this gesture with not only an ungrammatical period but a full paragraph break between the dependent and independent clauses, producing a queasy slow-motion effect at the level of syntax.  (This adventurousness with grammar gets even more pronounced after that first shot is fired, with subordinating elements receding to reflect Raylan’s broadcast attention: good luck diagramming the sentence that includes “. . . Nicky with his pistol about waist high Raylan put the Mag on the fat guy again . . .”)

Significantly, that paragraph break as Raylan moves his left hand also suggests a cinematic shot-reverse-shot.  As the hand goes up we’re in Raylan’s head, looking at the fat guy through his eyes; then—period, paragraph—the hand comes down, and we’re looking at Raylan now, from a distance, no longer privy to what he’s thinking.  This is critical to achieving the surprise that Leonard is aiming for, a surprise sprung not by revealing but by withholding information: although we’ve just been in Raylan’s head, hearing him assess the fat guy’s confidence and tactics, we weren’t told that he’d made any decisions about whether and when to shoot.  Most writers—even, or especially, thriller writers—would prolong this moment like a Sergio Leone shootout, milking it for maximum tension; Leonard knows this, and knows that we know this, and uses our expectations to catch us off balance.  (He tips the odds even further in his favor with the phrase “about sixty feet away now”—that adverbial now strongly suggesting that we’ll be receiving additional updates on the fat guy’s diminishing distance from Raylan, when in fact we won’t.)

The thrills of reading most mass-market fiction are pretty much those of watching competitive figure skating: the question is not whether the triple Salchow is coming, but only how adeptly it will be executed.  The selection from Pronto provides evidence, as if it were needed, that Leonard is playing a bigger game.  What’s most impressive to me is that he makes no attempt to do what virtually all other ambitious crime writers—even really good ones—do to qualify their work as “literary,” i.e. to tack on ornaments like social commentary, mythic allusion, and/or poetic language.  (After all, as Elisa Gabbert has demonstrated, even contemporary open-form poetry has its own set of stock jumps.)  Leonard’s art, by contrast, is subtractive, paring away received techniques and—as we saw above—even basic coordinating elements when it suits his narrative purposes.  The combined effect of all his small omissions, misdirections, and discombobulations is to induce us to grab hold of the figurative armrests, as if we’ve come to suspect that our driver might be a little drunker than we thought: we’re now wondering whether this guy really knows what he’s doing.  Leonard’s seemingly cavalier departure from the standard slow build of pulp-fiction syntax demonstrates that this book’s narration will not be leading us by the hand through the remainder of the tale.  And that realization, naturally, will increase our suspense as we close in on the ending, since our efforts to guess what’s going to happen are now complicated by our efforts to figure out what kind of a book we’re reading, and whether it gives a particular damn about delivering the conventional pleasures of a crime thriller.  When those conventional pleasures are indeed served up a few dozen pages later, it feels less like the clicking of formulaic genre gears than like a fortuitous accident, like things might not have worked out so cleanly after all.  That’s pretty much how Leonard does his thing: staying a step ahead of our expectations, managing our attention from moment to moment, and doing so with enough humility to disguise his best and smartest narrative moves as sloppiness or whimsy.

I’d been thinking about Leonard a lot over the summer, even before he died.  To be more specific, I’d been thinking about the famous and widely-quoted “Writers on Writing” feature that he contributed to the New York Times back in July of 2001, a piece that has come to be known universally among people inclined to compile, argue about, and obsess over putative “rules for writing”—of whom I am one—as Elmore Leonard’s Rules for Writing:

These are rules I’ve picked up along the way to help me remain invisible when I’m writing a book, to help me show rather than tell what’s taking place in the story.  If you have a facility for language and imagery and the sound of your voice pleases you, invisibility is not what you are after, and you can skip the rules.  Still, you might look them over.

The most famous of the rules is probably Number Ten: Try to leave out the part that readers tend to skip—and understandably so, since many rightly detect in this rule a shift of responsibility from reader to writer that vindicates certain resentments lingering from high school English class—but personally I find it a little too glib to be useful.  The Leonard rule that I hear quoted most often by actual writer-type-people is the pleasingly cut-and-dried Number Three: Never use a verb other than “said” to carry dialogue.  It’s pretty great advice: once you start paying attention to dialogue tags, I promise that you will find violations of this rule to be among the surest indications that Amateur Hour is underway.

But I would also argue that the near-gospel status achieved by this rule comes with a small downside, that being an overemphasis on the lexicon of dialogue tags at the expense of consideration of their placement, which is where competent writers of narrative can really show off their ninja skills.  Obviously the main purpose of dialogue tags is to inform us who’s speaking, and (when absolutely necessary) to help us situate the speaker in the scene, but that’s not all they’ll do for us.  A dialogue tag can also function as a non-grammatical pause—a beat—that captures the rhythm of speech in a way that punctuation simply can’t.  In the example from Pronto above, when Raylan says, “You take one more step, I’ll shoot you,” something happens at that comma that the comma by itself can’t convey: a squaring-off, a slight intake of breath that indicates that shit just got serious.  Although we don’t know it yet, this comma marks the determinate moment in the scene, the one that seals the fat guy’s fate, and maybe Raylan’s too.  The subtlety of the moment can’t really be rendered through the diffuseness of an ellipsis, or by the dramatic wait-for-it fermata of an em-dash.  Something else is needed.  Thus:

“You take one more step,” Raylan said, “I’ll shoot you.”

This placement-of-dialogue-tag trick does not, of course, qualify as a major discovery on my part, and is hardly unique to Leonard: any writer who knows what she or he is doing uses it from time to time.  It’s the sort of technique that’s particularly indispensable to comic writers who need to slow down their punchlines; I first became conscious of the practice during my otherwise entirely clueless adolescence, when I noticed Douglas Adams doing it in The Hitchhiker’s Guide to the Galaxy . . . although in Adams’ case it can get a little out of hand, verging on shrug-shouldered self-parody:

Ford looked at him severely.

“And no sneaky knocking down Mr. Dent’s house while he’s away, all right?” he said.

“The mere thought,” growled Mr. Prosser, “hadn’t even begun to speculate,” he continued, settling himself back, “about the merest possibility of crossing my mind.”

(Re “severely,” “growled,” and “continued,” q.v. Leonard’s Rules Three and Four.)

And yet dialogue tag placement is still not exactly what I was thinking about over the summer.  What I was thinking about, broadly speaking, is another basic but underappreciated function of dialogue tags: to reveal not only WHO is speaking but THAT someone is speaking at all.  In conventional narrative writing this isn’t such a big deal, since quotation marks signal that we’re in dialogue, but in certain unusual narrative situations—in unconventional writing that omits quotation marks, for instance, or in narratives that are encountered aloud rather than on the page—the writer has to adopt other strategies to make it clear when we have entered reported speech.

Unless, of course, the writer doesn’t WANT it to be clear.  What I was thinking about this summer, narrowly speaking, was the song “High School Lover” by the band Cayucas (which in a recording-studio context pretty much just means singer-songwriter Zach Yudin).  The song appears on their debut album Bigfoot, as well as, I gather, in, like, a Verizon commercial or whatever.

Speaking yet more narrowly still, I’ve been thinking about the song’s first thirty-odd seconds.  The reverbed hey (or is it more of an eh?) that cancels the crowd noise and kicks things off registers mostly as a canonical rock ’n’ roll hype move, but there’s also something ambiguous in its tone—a little glum, a little peevish—that positions Yudin’s opening salvo somewhere between let’s-get-this-party-started! and somebody-stole-my-bike!  The music it ushers in doesn’t exactly resolve the ambiguity: the boing-boing bassline sets out jauntily enough, but then retreats at its midpoint to an anxious whole tone (I think) from where it began before scrambling to complete its loop.  Still, the slap-happy percussion that pushes everything along seems to indicate that this will be a fun track for the kids to mash-potato to . . . maybe when Yudin drops some lyrics we’ll know for sure which direction he’s headed.  Ah, here we go:

Are you going to the party on Saturday?

Hell yeah!  Mystery solved!  It’s beach blanket bingo time!  Oh, wait:

she asked.  I said I didn’t know.
See ever since I saw you on the back of some guy’s bicycle
well I’ve been feeling kind of so-so.

See what I mean about the dialogue tag placement?  Rather than making us feel the loneliness and resentment of the narrator as he avoids the party and walks around it and thinks about it all night long, the music and the first line contrive to put us AT the party that he’s skipping before we even realize he’s skipped it, which locates us at a peculiar critical distance from him: maybe a little guilty at his ostensible exclusion from our pop pleasure, and also a little unmoved by and skeptical of his apparent sullenness—if only because, c’mon, man, that is clearly not what this song is about.  This critical disconnect is key to the somewhat subtle game that Yudin is playing.

You think I’m overthinking this?  My friends, I have not even begun to overthink this.  Because, check it out, after a first verse and a chorus that spell out the wronged narrator’s frustrations with the object of his unrequited affection, Yudin does it to us again:

Did you get the letters that I sent last summer?
you asked again and again.
Well they’ve been piling up on the top shelf in my closet
and I read them every now and then.

Wait, what?  When we reach the beginning of this second verse, everything we’ve heard so far has led us to believe that this is still our lovelorn narrator speaking, having spent his sorry summer trudging back and forth to an empty mailbox . . . but nope: we’re being quoted to again.  It is in fact our feckless hero who has been receiving these letters—letters he still hasn’t acknowledged (“again and again,” she asked!)—and now he’s the one who’s sulking because this girl caught a lift on some other dude’s bike?!  You’re not exactly proving your case here, cowpoke.

But let us consider the milieu of “High School Lover,” which is, of course, suburban adolescence (the narrator’s rival’s default means of transport being our tipoff).  And this, of course, is exactly what suburban adolescence was like: a maddening sequence of shared attractions that somehow never amounted to anything, a chaotic and dispiriting jumble of hoped-for rendezvous thwarted by awkwardness, cowardice, confusion, and a persistent revulsion at the plain prospect of growing up.  (Okay, that’s what suburban adolescence was like for ME.  If you are the sort of person who spends a typical evening celebrating your latest eight-figure real estate deal by speeding down empty freeways in your Audi coupe with the stroboscopic pulse of high-mast streetlights lashing your coke-blown pupils instead of hanging out in your jammies reading cultural criticism on the internet, then your experience may have been different than mine, and I honor that.)  What Yudin is attempting in this seemingly breezy summertime top-down pop song is an oblique investigation of the inexhaustible mystery of early adulthood, a mystery best summarized with the timeless question what was I thinking?

Toward this end, the song’s dialogue-tag confusion has one more trick to play, and this time it’s an ambiguity that isn’t—and can’t be—completely resolved:

It’s got me feeling kind of stuck, like what the fuck is going on,
someone tell me what is happening.
Yeah you’ve been acting like you’re too cool for far too long.
It’s okay, it’s just kind of embarrassing.

So . . . is this reported speech, or not?  Is this the narrator taking the girl (Elizabeth) to task for her alleged fickleness?  Or maybe imagining himself doing so, hours or days or years after her seemingly innocent query about his Saturday plans?  Or is this Elizabeth talking, trying to figure out what specific as-yet-undiagnosed psychological malady might lead the narrator to ignore an entire summer’s worth of envelopes inventively collaged with clippings from Sassy?  The text (ha! I just called some indie rock lyrics “the text”!) will support any and all of these interpretations.  We have now passed the song’s point of laminar-turbulent transition, beyond which complete understanding—not only between the two mixed-up teens, but also between the narrator and us, the real audience for his complaint—has become impossible.

(If we seek clarification on this point from an external authority, we encounter yet more evidence of subterfuge.  I had a brainstorm that checking the printed lyrics on the CD case—y’all remember CD players, right?  Laser Victrolas, we used to call them?—might clear things up, and lo and behold and sure enough, the telltale punctuation is indeed present:

Are you going to the party on Saturday? / She asked, I said “I didn’t know . . .”

Got that?  Those quotation marks are the only ones that appear in the printed lyrics to “High School Lover,” and they—just to be clear—CANNOT BE CORRECT.  Although, as I have just argued, there are a couple of ways that quotes might plausibly be deployed in these lyrics, this ain’t one of ’em.  Try it:  Are you going to the party on Saturday?  “I didn’t know.”  Um . . . you didn’t know WHAT?  And how is that an answer to my question?)

What is being dramatized in “High School Lover” is a progressive breakdown of language.  Yet this is not some gnarly poststructuralist aporia that Yudin has sprung on us to expose hitherto unsuspected glitches in the linguistic system: language itself is not the problem.  Rather, this collapse proceeds directly from a shortcoming in the narrator’s character, specifically his inability or unwillingness to communicate in words, to enter the semantic realm in any kind of committed way.  This deficiency manifests most clearly in his failure to respond to Elizabeth’s letters, but also in the evasive lameness of the few rote utterances he does seem to manage, from his overloaded response to her opening question (I didn’t know not being equal to I hadn’t decided), to his characterization of his jealousy and wounded bafflement as “feeling kind of so-so,” to his insistence—really an admission—that he’s been “saying the things [he] thought [he] should.”  The song’s various narrative muddles—its lack of clarity about who’s speaking to whom when, about whether actions and utterances are real or fantasized, etc.—are obviously also symptomatic of the problem.

But the song does evoke one moment when the narrator manages to speak up, to put aside his paralyzing concerns about coolness (and lacks and surfeits of it), to put himself at risk, and to say, for better or worse, what is on his mind: a moment when his “words came out one after another.”  This is the moment—hard to situate precisely on the song’s timeline, but let’s figure it happens early, prior to the summer of piled-up letters—when he opens a door to the sight of Elizabeth undressing.  Given that the incident shocks the narrator, like some reverse Actaeon, into unguarded speech, I think we can assume that it comes about by accident.  We should note too that this scene is described in the chorus, i.e. in the part that repeats.  (Okay, just once—it’s a short song—but still.)  Even as words fail, the song’s structure tells us something the narrator can’t articulate and may not even understand: that this oops-forgot-to-knock episode is the key moment in the story, a memory that our mixed-up hero can’t get rid of or get past.  What he really needs, whether he knows it or not, is to somehow regain access to the shared vulnerability of this unplanned encounter, this instant of grace and peril, this missed chance to connect and be transformed.

That classical reference in the previous paragraph was not entirely gratuitous, I’m afraid.  I have asserted elsewhere that myth, being the opposite of history, readily lends itself to depicting the suburban experience; it is also an easy medium for the dramatic and disempowered confusions of youth, for approximately the same reasons.  In Ovid’s version of the Actaeon tale—not the original, but it might as well be, cf. Whitney Houston’s “I Will Always Love You”—the hunter spies the goddess Diana nude at her bath, a turn of events with which she is SO not cool; she turns him into a stag, and he gets killed by his own hounds.  What’s interesting is that Ovid’s telling places less emphasis on Actaeon’s physical transformation than on the detail that Diana also renders him mute.  (Not the most intuitive choice of punishment for a voyeur: when Tiresias, for instance and by contrast, stumbles upon the nude Athena, she blinds him.)  “Go tell it, if your tongue can tell the tale,” Diana mocks (in the dogged blank verse of Brookes More’s translation), “your bold eyes saw me stripped of all my robes”—and sure enough, when the now-quadrupedal-and-antlered Actaeon tries to convince his ravening pooches to chill out for a second, he cannot speak to identify himself.  The hunter . . . has become the hunted.  Cue Twilight Zone theme.

Now, to be sure, Elizabeth is no goddess—she’s just another awkward teen—and at the sight of her nakedness our narrator is rendered a blabbermouth rather than struck dumb.  (Notably, he never fesses up to what he says on this occasion.)  But despite these differences, there remains a key correspondence between “High School Lover” and Ovid’s Actaeon story, in that both are built on the same conceptual armature: a right-angled axis of speech and sight.  Our narrator may be uncomfortable using language, but he is very damn comfortable being a spectator, and he makes persistent efforts to keep his interactions with Elizabeth as optical as possible.  His explanation of his noncommittal response to her opening question, for instance—if it’s actually articulated at all, which seems doubtful—employs the same verb twice in quick succession (“See ever since I saw you”); he then goes on to impugn the “look” in Elizabeth’s eyes, and to characterize the narrative circumstances as a “movie” that he’s been (passively) watching.

But the ultimate indictment of his retreat away from telling and toward looking arrives at the end of the second verse, in a cryptic final scene that isn’t much more than an image:

See I’ve been sneakin’ I’ve been sneakin’ I’ve been sneakin’
wondering just what I’ll see.
You turned around and stared, you squinted then you glared,
and I was leaning back in the passenger seat.

What to make of this?  Well, as an interpretive palate-cleanser, let’s figure that the switch from the bicycle of the first verse to the automobile of the second suggests a passage of time, the approach of adulthood, and the waning of what once might have passed for innocence.  Furthermore, the narrator’s location in the passenger seat indicates a persisting lack of agency and responsibility.  Cool?  Now let’s cut to the freakin’ chase: check out that semi-amazing sequence of looking verbs—the visual equivalent of dialogue tags—that describe what Elizabeth is doing in this scene.  Stared!  Squinted!  Glared!  Sounds like she’s none too happy with our guy.  But if the principal aim of his passive-aggressive efforts has been to keep things, y’know, purely ocular between him and this chick, then it seems he has succeeded abundantly.

Because what’s actually happening here?  Could it be that the narrator has talked one of his car-equipped sleazeball buddies into idling in front of Elizabeth’s house while he ogles her through the blinds, hoping for a deliberate repeat of his earlier fortuitous peepshow, only this time without any attendant obligation to account for himself: a purely retinal one-way encounter in which he risks nothing and holds all the cards?  And could it be that bright-eyed Elizabeth has just gotten wise to their creepy stakeout, rushing indignantly to the window, leading him to hunker out of sight?  You think?

Even if you don’t totally buy that scenario, I think it’s safe to say based on available evidence that this is a song about a guy who won’t talk to a girl—even though he digs her, even though she clearly digs him—because he’d rather just look at her.  Dyed-in-the-wool putz though this kid may be, the song leaves us enough room to grant him a small measure of sympathy: although his position is clearly more privileged than Elizabeth’s, he and she are both snared in the same set of social codes, codes that regulate his role as a spectator as assiduously as hers as an object on display, thereby making it just about impossible for them to encounter each other on equal footing.  In a sense this social system—invisible, internalized, all-pervasive—gets the first and last word, evoked by the crowd noise that we hear under the beginning and the end of the track.

Anyway, that is some—some—of what I think is going on in “High School Lover.”  And it’s not even my favorite song on the album.

During his brief time in the pop sphere—first with his solo project Oregon Bike Trails, and now with Cayucas—it’s safe to say that Zach Yudin has not exactly emerged as a critics’ darling.  In reviewing Bigfoot for Pitchfork (and let’s face it, Pitchfork remains the indie rock equivalent of Standard & Poor’s), Ian Cohen awarded it the eyebrow-elevating score of 4.9 out of a possible 10, the sort of grade one might receive on an undergrad survey-course pop quiz for simply bothering to show up.  Cohen’s major knock against Bigfoot seems to be that it’s derivative of the first Vampire Weekend album, an assessment that would probably be more damning if Bigfoot were obviously the inferior product.  I don’t think that’s obvious at all.

I mean, sure, Cohen’s not wrong: Yudin’s work IS derivative—of Vampire Weekend, of Beck, of the Beach Boys . . . hell, I think I’m hearing some Cure in there too, and maybe some Belle & Sebastian, and who knows who else.  But I’m afraid that if I make the pretty-much-self-evident point that Vampire Weekend is no less derivative—they seem to be doing their level best to approximate Matador-era Spoon playing Congolese soukous on a Wes Anderson soundtrack, and I say that with total admiration—then I will only succeed in reinforcing Cohen’s suggestion that being derivative is a bad thing, res ipsa loquitur, when in fact it’s one of the core strategies by which contemporary pop music connects with its audience, and has been so for quite some time, at least since the dawn of the hip-hop era.  It seems like we ought to be past arguing about whether or not this pop intertextuality is legitimate: when musicians are able to engage the full depth and breadth of their iTunes-account-holding audiences’ musical lexicons, our default assumption should be that they will avail themselves of this option, not that they won’t.  Back in the early 90s, a few rock and pop artists were buoyed to cultural prominence by their hip-hop-inspired facility with a vast range of styles and genres; these days such working methods are so run-of-the-mill as to barely merit comment.  Observing that Yudin’s music is heavily referential to other music is about one degree of perceptiveness beyond just recognizing that the stereo is on; I’m not awarding any points for that.  Am I out of line in expecting professional music critics to consider what a work is doing with its appropriations, instead of just consigning it to a twig on some imagined indie-pop version of the phylogenetic Tree of Life?

Cohen’s closing verdict on Bigfoot (or maybe on Yudin; the grammar seems a little deadline-damaged) is as follows: “observant, but lacks insight, descriptive without offering any commentary, nostalgic without feeling the pain from an old wound.  [Pitchfork’s link—and seriously, Ian, I know you’re not writing this for, like, your dissertation committee or whatever, but is that really the best you could do?]”  A few months after Pitchfork issued its thumbs-down of record, Paste ran a cover story on Cayucas that reads suspiciously like an awkward sidelong rebuttal of Cohen’s review; the author, Ryan Bort, pretty much argues that Bigfoot ought to be valued as a collection of catchy, simple, sentimental songs, and not faulted for its failure to blaze new trails, because, good lord, does everything have to be Metal Machine Music?  It does not!  According to Bort, Yudin . . .

. . . doesn’t approach his topics obliquely, instead simply remembering the nostalgic event he’s addressing and listing his associations, those things from that past that stick in your mind as mental totems of more innocent and blissful times.  A certain kind of car, a girl on the back of a bicycle, that true-to-scale Michael Jordan poster on a wall—it all transports listeners to another time and place, and because of how friendly Yudin’s voice is, it’s hard not to feel good about all that’s past—even if regret is involved.

Bort’s take on Cayucas, I’m sorry to say, misses the mark even worse than Cohen’s does.  The most cursory listen to “High School Lover” makes it clear that the girl on the back of the bike is no “mental totem” triggering bittersweet bliss, but instead a spur to contemplation and reassessment: very precisely the “pain from an old wound” that Cohen (through the expedient and imprecise filter of Don Draper) bemoans the supposed lack of.  Oh, and that Michael Jordan poster?  Damn glad you brought that up, Ryan, because it, on the other hand, is a perfect example of what Yudin is up to—just not for the reasons you’re arguing.

The lyric in question, which appears in the song “Will ‘The Thrill,’” is actually “Look at the posters that are on the wall / Michael Jordan standing six feet tall.”  Live and in person, of course, Michael Jordan is six foot six; in other words, the poster is big, but it’s not “true-to-scale.”  This is a detail that we’re meant to catch, and to think about.  The song is probably dead-on accurate about the size of the poster; the poster is not accurate about the size of Michael Jordan.  This is something that, say, a seven-year-old boy is unlikely to notice when that poster first goes up, but something that, say, a seventeen-year-old boy will notice when he’s wondering whether it’s finally time to take it down.  The image evokes nostalgia, sure, but also a twinge of resentment at the countless small frauds perpetrated on us before we’re old enough to notice; it prompts a consideration of how our perceptions are embodied, and how our perceiving bodies change over time.  I don’t think it’s too much of a stretch to say that it also provides an opportunity to think about the ways that gender in general and masculinity in particular are learned, rehearsed, and performed.  (Similar gender-tweaking case in point: the California beach town that provided the band’s name is Cayucos, not Cayucas.)

But the best thing about the Michael Jordan poster line—the thing that makes it effective, and makes it representative of Yudin’s whole project—is that its subtle and complex resonances seem to have been evoked entirely by accident.  In fact, these errors-that-aren’t-errors are all over Yudin’s songs, and their very ubiquity is the best clue that there is method in his apparent carelessness.  Take, for instance, the late-placement-of-dialogue-tags trick that I described above: the resultant confusion about who’s speaking comes off as merely sloppy until Yudin pulls it a second time.  There’s another, similar fakeout sixteen bars into the first verse of “High School Lover,” at the exact point where a half-century of pop-song convention leads us to expect the chorus to drop; Yudin knows this, and he knows that we know this, and he reinforces our expectation by placing the lyric “there was gonna be an ending”—which, thanks to the rhyme scheme, we can totally see coming—just prior to where we think there’s, y’know, gonna be an ending . . . and then he keeps the verse going for eight more bars.

For us listeners, this creates a tension akin to the sense that our cabbie has just driven past our exit . . . but tension, needless to say, is exactly what a songwriter wants leading up to the big release in the chorus, and delay and deferral are the classic means of achieving it.  (Clive “Kool Herc” Campbell’s famous realization that a DJ can extend a track’s instrumental break by switching back and forth between two turntables spinning two copies of the same record, thereby driving a dance floor insane—a discovery that pretty much gave birth to hip-hop, and by extension much of the past 35 years of popular culture—is only the most obvious example of this method.)  But here again Yudin’s trick isn’t to make the technique seem new, only to make it seem unintentional, which it most certainly is not.  The lyric that immediately follows “gonna be an ending”—“to the story to the story to the story”—is a shift into neutral, a sputtering pause that lets us realize what’s just happened: the approximate equivalent of a cartoon coyote running in place in midair, its plunge into the chasm unaccountably interrupted.  At first the repetition seems lazy, like a dummy lyric left in place, which reinforces Yudin’s feigned ineptitude; only when he does it again at the same spot in the second verse (“I’ve been sneakin’ I’ve been sneakin’ I’ve been sneakin’”) does the finer grain of the song’s structure become apparent.

Yudin seems to have a real fondness for repetitions like this: lyrics that give the initial impression of being redundant, or slack, or gauche.  Elsewhere on Bigfoot we find instances of shining sunshine, piled-up piles, and hidden-in hiding places (reiteration-with-grammatical-shift evidently being a Cayucas stock-in-trade), and these serve to advance Yudin’s purposes in a couple of ways.  First, they seem stupid but sound great, particularly when braided into the already dense weave of assonance, consonance, and internal rhyme in the typical Cayucas lyric.  Second, they signal something important about the scope of the project: about what it is and isn’t trying to accomplish, and how it does and doesn’t hope to engage its audience.  Stating repeatedly that things are equal to themselves is a way of indicating that Cayucas has no particular thesis to advance, no case to prove, no real desire to impress us with cleverness or insight.  The songs are playful, and thoughtful, but they’re not written in code; they contain depths and subtleties, but they don’t withhold them and don’t depend on them.  The listener doesn’t have to resort to analysis to intuit that they’re there.

In other words, Yudin’s songs are pop in the most basic sense: they welcome all comers while privileging none, evincing an aw-shucks modesty that I am inclined to ascribe—perhaps counterintuitively, but think about it for a second—to absolute confidence and clarity of purpose.  They take pains to lay no claim on any occult subcultural authority, but instead do business in an egalitarian zone where music is primarily a manifestation of musicians’ own fandom, and the barrier between band and audience is permeable to the point of impalpability: a precarious midrange sweet-spot that is, as Yudin’s own lyrics put it, “beautiful / somewhere in between dumb and kind of cool.”

Part of the deal with this egalitarian approach is that Yudin has to be really overt in citing his influences, lest they become Easter eggs for sophisticates: checkpoints instead of welcome signs, occasions for demonstrating that some listeners “get it” while others don’t.  With (for example) Yudin singing from the low end of his pitch range in the persona of a puerile and delusional would-be lothario, “High School Lover” sounds a hell of a lot like a Beck song . . . which is why Yudin pretty much lifts the chorus melody from “The New Pollution” for his own song’s coda, a not-admitting-but-insisting gesture which makes the lineage impossible to miss and snatches canny comparisons from would-be critics’ throats.  The opening song on Bigfoot (“Cayucos”) evokes the janglier end of the 1980s new wave pool . . . which I’m guessing is why it contains both the lyric “c-c-c-c-chameleon” AND some late-in-the-game guitar that very strongly recalls the main riff from “Just Like Heaven.”  And so forth.

As Cohen’s piece indicates, Yudin’s just-own-it strategy for hipster-proofing his influences—which depends on his reviewers to grasp the obvious and then refrain from pointing it out—has not been 100% successful.  Nevertheless, on this point I’m inclined to allocate Cayucas more sympathy than blame, especially since Yudin seems motivated by generosity toward critics and regular listeners alike . . . to the extent that there’s a difference these days.  Okay, I promise I’ll lay off this soon, but the layers of complex half-assedness in this Pitchfork review are still blowing my mind: Cohen complains that “Ayawa ’Kya,” Bigfoot’s penultimate track, “sports the kind of onomatopoetic fake patois that people who hate Vampire Weekend assume they use in all of their songs.”  But, um, dude, if Cayucas is—as you’ve just argued—worshipfully imitative of Vampire Weekend, shouldn’t we figure that Yudin knows they don’t use fake patois in all of their songs?  And that “Ayawa ’Kya” might therefore be better understood as an affectionate parody of, let’s say, “Cape Cod Kwassa-Kwassa”—a parody pointedly written from outside the moneyed East Coast milieu that Ezra Koenig and his bandmates position themselves within, as indicated by the last intelligible line of Yudin’s song, “feel like I feel like I feel like I feel like I’m so poor”?  Maybe?  (Also, that’s not what onomatopoetic means.  Glossolalic?  Whatever.)

Anyway, my point here, basically, is this: there are smarter songwriters than Zach Yudin in the world, and there are less pretentious songwriters than Zach Yudin, too, but I think you would have a hard time filling up a city bus with songwriters who are both smarter and less pretentious.  I think this is something that’s worthy of praise, and I’d like to see Yudin get more of it . . . or, at minimum, a closer examination than he seems to have received to date.

I’ve been giving the paid critics at Pitchfork and Paste a hard time—which I think they’ve earned—but I confess I’m not unsympathetic to the difficulty of their task: Bigfoot is a tough album to write a solid 700-(as opposed to 7,000-)word review of, if only because Yudin throws so few pitches into critics’ strike zones.  In fact, Yudin’s not throwing much of anything anywhere.  Toward the end of his review, Cohen—as if suddenly worried that he’s missed a stitch—accuses Yudin of “failing to recognize the difference between leaving something to the imagination and making the listener do all the hard work,” a charge which fails to acknowledge that these are not a songwriter’s only two options for engaging an audience.  Yudin doesn’t neglect to “offer commentary” on the milieus and circumstances evoked in his songs; he purposefully avoids it.  Rather than doing what virtually every other post-Johnny-Rotten indie-rock frontman does—i.e. asserting his status as a subcultural authority, a personality with something to say—Yudin has taken on a trickier mission, at once more modest and more ambitious: creating conditions for his listeners under which they can see the world through his eyes, consider the things he considers, and feel as if they got there entirely on their own.  To work in one final sports reference that I have no real qualifications to employ, Yudin operates like the proverbial no-stats all-star: it’s sometimes hard to say exactly what he’s doing, but remarkable things seem to happen in his presence with surprising consistency.

And this, at long last, brings me back to Elmore Leonard, whom I obviously understand to be linked to Zach Yudin by more than the adroit use of dialogue tags.  Leonard characterizes his famous tips for writing as “rules I’ve picked up along the way to help me remain invisible;” by way of explanation, he adds: “If I write in scenes and always from the point of view of a particular character—the one whose view best brings the scene to life—I’m able to concentrate on the voices of the characters telling you who they are and how they feel about what they see and what’s going on, and I’m nowhere in sight.”  That’s pretty standard show-don’t-tell writing-workshop fare up until that last clause . . . but that final step of killing your darlings, brushing over your own footprints, and vanishing behind the text is a doozy, and one that’s rare to see anybody even try to pull off successfully.  (I used to think I’d spend a lot more time here analyzing variations on this approach to making art, which is why this blog is called what it’s called.)

A great many young writers pound out tens of thousands of words in the hope of cultivating an ephemeral quality often called “voice”—which near as I can figure refers to an engaging combination of style, substance, and sensibility—with the understanding that this quality is what will win them access through the iron-clad gates of the cultural apparatus, where this “voice” might one day, with the proper promotional nourishment, be cultivated into a brand.  Leonard’s advice points the way down a vastly longer path, one that eschews pyrotechnic stagecraft in favor of a quieter and more measured approach that exerts a plausible claim on real magic—real because it occurs not on the page but in the minds of the audience, one at a time.  (Leonard’s eleventh rule, the one that sums up the previous ten: “If it sounds like writing, I rewrite it.”)  It’s always been rare to see artists who have the poise and humility to work this way, and it seems rarer all the time—maybe because the cultural apparatus just doesn’t grant them admittance anymore.

I guess we’ll see.  Props to Zach Yudin for giving it a shot.

Romney’s douchebag problem

November 3, 2012

One of the major obstacles that Mitt Romney faces in his campaign for the presidency is the fact that a great number of Americans regard him as a total douchebag.

Why is this so? When we call somebody a douchebag, what do we mean?

[Illustration: fountain syringe, from The People’s Common Sense Medical Advisor in Plain English, or, Medicine Simplified, by R.V. Pierce, M.D., 1895]

Let’s begin with an informal connotative survey of the term. The dutiful aggregators over at Dictionary.com cite the 2009 Collins English Dictionary and the 2012 Random House Dictionary as respectively providing “a contemptible person” and “a contemptible or despicable person” as definitions. For our purposes, these clearly will not do. They make “douchebag” out to be synonymous with virtually every other insult, which simply cannot be the case: a pejorative like “douchebag” derives half its force from specificity, from the sense that its punch has landed right where it hurts. (The other half, needless to say, is from the standard shock value that all profanity manages through the impassioned disruption of civil discourse.)

It’s safe to say that “douchebag” has been long unmoored from its literal referent—i.e. the reservoir of a semi-intrusive and dubiously effective personal hygiene device. A little monkeying around with date ranges in Google Book Search indicates that the word has been employed metaphorically as a slur since at least the 1930s, with appearances in print no doubt lagging far behind conversational usage. “Douchebag” seems to have been a beneficiary of the great American linguistic flowering that followed World War II: one among innumerable expressions extracted from their regional and socioeconomic origins, cross-pollinated in various barracks, and redistributed through the demobilized nation like dandelion seed.

Its meaning has retained a tendency to drift. Dictionary.com also gives us an entry from the Dictionary of American Slang and Colloquial Expressions—fourth edition, copyright 2007, and yes, this is indeed an instance of a website referring us to a five-year-old printed reference book on the subject of contemporary slang, which strikes me as about as desperate and pathetic as a cat trying to get a Post-It off the top of its head—which defines “douche bag” as 1) “a wretched and disgusting person,” and 2) “an ugly girl or woman,” definitions that I suspect went untouched in that 2007 revision. Early uses of the insult do indeed refer to ugly and/or undesirable females—I’ve found suggestions that the phrase “old bag” may be a derivation—and this is kind of interesting, since nowadays we hear the pejorative applied pretty much exclusively to dudes. (Perhaps not surprisingly, it seems to have been borne across the gender divide on the backs of gay men; Henry Miller’s Rosy Crucifixion novels, for instance, published between 1949 and 1959, contain passing references to a drag performer called Minnie Douchebag.)

In the early 1980s “douchebag” really blew up, tipping into broad use among middle-class male teens.  Although its application seems to have been fairly indiscriminate, by this point it had pretty much settled on male targets.  In keeping with my earlier assertion—i.e. that innovation in profanity, pre-cable and pre-internet, was mostly driven by armed conflict—I’m going to suggest that the groundwork for the sudden rise of “douchebag” was really laid by the U.S. military, which found itself obliged to conscript a bunch of relatively educated, relatively affluent young men and send them off to a highly suspect shooting war in Southeast Asia, and to do so in the midst of a sweeping and convulsive transformation of American culture.  Dealt this exceedingly crappy hand, a generation of frustrated drill sergeants had to come up with a rhetoric and a lexicon that would define the harsh, hierarchical, dangerous-as-hell masculine space inhabited by the combat serviceman against the polymorphously perverse rock ’n’ roll carnival then going on in the civilian world, to the advantage of the former.  Most people, it seems, would rather have sex and go to rock festivals than get shot at; unable to appeal plausibly to patriotism, or to a devil-may-care sense of adventure, military rhetoric set out instead to denigrate the youth culture—and the recruits plucked from it—as inane, ignominious, abject, feminized, and feckless.  Thus: douchebag.

With these preconditions established, the accomplishment of what I’ll call the First Great Douchebag Breakthrough was a matter of relative ease, somewhat akin to a successful volleyball return: the “set,” in this instance, almost certainly occurred on May 24, 1980, with the final episode of the fifth season of Saturday Night Live—the last one performed by the original cast—and a skit entitled “Lord Douchebag.” Although the skit relied more on the literal meaning of the word than on the connotations it had accrued during the Vietnam era, it made “douchebag” widely available to the late-night-TV-watching public: to an unprecedented extent, it put it into play. The fact that many young male viewers had no idea what the word meant or why it got such a big laugh made SNL’s use of it even more culturally potent, a seeming paradox we witness again two years later, when the “spike” came along.

If, hypothetically, you are an adolescent or preadolescent boy, and you see a movie in which a kid calls another kid a name, and that name is a word that’s unfamiliar to you, and the movie kid’s use of it draws an immediate reaction (shocked) and reproach (mild) from a movie adult, then you are damn sure going to remember that word. And once you figure out that although the word is clearly impolite, it’s not really a bad word—you can say it on TV—then you are going to start using that word a lot. We can therefore date the real watershed moment for “douchebag” to June 11, 1982, and an exchange between actors Henry Thomas, C. Thomas Howell, and Dee Wallace in a little arthouse flick called E.T. the Extra-Terrestrial:

Elliott: But Mom, it was real, I swear!
Tyler: Douchebag, Elliott.
Mary: [swatting Tyler upside the head] No douchebag talk in my house!

BOOM! Box office records indicate that every living man, woman, and child in the United States saw E.T. at least fifteen times—I’m rounding a little—and many of the most enthusiastic and attentive of those repeat viewers were preteen boys. While, true, “douchebag” is not the most memorable insult uttered in the movie, it’s culturally viable in ways that the first-place finisher, “penis-breath,” just ain’t: it’s delivered in a throwaway fashion by a cool older kid (C. Thomas Howell was a child stuntman for the love of god; is that job even legal anymore?) and it also requires a little research to make sense of, and therefore it’s a badge of sophistication. “Penis-breath,” by contrast, is a false friend to its prospective adopter: mannered, idiosyncratic, broadly-delivered by the awkward, earnest, deeply uncool young Elliott, it’s far too easily-sourced to earn its user any locker-room cred.

If you’re willing to entertain my conjecture about the military repurposing of “douchebag” as an anti-hippie slur, then it’s worth putting its cameo in E.T. into some sociopolitical context. If we dust off a useful Slavoj Žižek axiom that I’ve cited in the past—“the first key to horror films is to say: let’s imagine the same story, but without the horror element” (and, okay, sure, E.T.’s not a horror film . . . but it almost was)—then what we have here is a story about a lonely kid and his traumatized family as they try to get over the breakup of a marriage: Dad has skipped town in the company of a new girlfriend, and Mom, suffice to say, is not powering through like a champ. As A. O. Scott (among others) has pointed out, the suburban milieu of E.T. is a grim, lonely, anxious place, due at least in part to the failure of the movie’s baby-boomer parents to assume their proper authority and responsibility and act like grownups for once in their freaking lives. The father—less Peter Pan, we suspect, than Dorian Gray—is totally absent from the film, down in Mexico partying like it’s 1968; meanwhile Elliott’s stunned and frazzled mother treats her kids more like college roommates than legal dependents. It’s pretty clear that teenage boys like to hang around her house because they’ve got the run of the place: nobody’s laying down any law. Their casual use of “douchebag” signals their disdain for the entitled and ineffectual flower-power generation that spawned them but can’t quite manage to rear them, that can’t even keep its own affairs in order.

E.T. can easily—all too easily—be read as a paean to the restorative powers of imaginative fantasy, but it’s important to note that by implication it also advocates for a deus ex machina assertion of cryptofascist control.  I mean, c’mon: it is, after all, a shadowy band of all-but-faceless bunny-suited government scientists that swoops in in the third act to legitimate the family’s close encounter and reestablish (or maybe just establish) social order by imposing proto-X-Files martial/exobiological law.  The sympathetic researcher played by Peter Coyote is depicted as Elliott all grown up, his sense of enchantment preserved—but he also and more obviously represents the occult knowledge and limitless power of a reinvigorated nation-state.  The film’s virtuous abjuration of individual civil liberties and multicultural messiness—paired with its mystificatory and frankly kind of creepy simultaneous embrace of childlike wonderment and skunkworks technocracy—makes it a dead solid perfect fable for the Reagan era.

So that, my friends, is the story of the First Great Douchebag Breakthrough.  By the turn of the next decade, for all kinds of reasons, the word had gathered dust again: incautious overuse had blurred and blunted its impact, new media technologies had made R-rated alternatives more widely available and accepted, and kids had just flat outgrown it.  Meanwhile, throughout the land, the cultural circumstances that had made it operative in the first place had shifted, with rising yuppies definitively shunting aging hippies toward irrelevance.  “Douchebag” found itself buried deep in the pop-lexical humus, where, not surprisingly, it began to mutate once again.

At this point we need not continue to ramble forth without a guide; we can refer to Robert Moor’s essay “On Douchebags,” which appeared in Wag’s Revue in 2009 and was later revised, abbreviated, and reprinted in the n+1 anthology What Was the Hipster?, which is where I first came across it. Moor charts the abrupt rebirth of “douchebag” around the turn of the present century, when it woke from its slumber in answer to a need to name “a certain kind of man—gelled hair, fitted baseball cap, multiple pastel polo shirts with popped collars layered one atop another—who is stereotypically thought to have originated in or around New Jersey, but who, sometime around 2002, suddenly began popping up everywhere (perhaps not coincidentally) just as the nation became familiar with the notion of ‘metrosexuality.’”

This new referent, however, didn’t stick: as Moor points out, “few slang-savvy people today would describe a douchebag as a greasy, Italianate, overtanned, testosterone-rich gym rat.” The early-Aughts connotation of the term seemed to suffer an affliction opposite that of its late-’80s forebear: its meaning was too targeted, because the word was just too good—too potent, too much fun to say—to shackle to such infrequent use. Plus, of course, its rehabilitators no doubt recalled with fondness those middle-school-cafeteria days of yore when no verbal exchange went unadorned by the two-note leitmotif of douchebag; clearly greater semantic ambition was warranted.

Which brings us to the present. What does “douchebag” mean today? This, I suspect, is one of those rare but increasingly common situations when we’re just not going to do any better than Wikipedia:

The term usually refers to a person, usually male, with a variety of negative qualities, specifically arrogance and engaging in obnoxious and/or irritating actions, most often without malicious intent.

Typically wrongfooted too-many-cooks Wikipedia syntax aside, I think this is actually pretty good. The entire post-WWII history of the term fits under the umbrella, from the self-absorbed sanctimony of the hippies, to a broad and irregular litany of 1980s gaucherie (recall that Elliott gets called a douchebag not for claiming to have seen an alien but for ratting out the older kids), to the noxious narcissism of preening millennial meatheads, and more besides. Certain fundamental elements unify all these targets: excessive self-regard, paired with a cluelessness that manifests as incapacity to properly account for the subjectivity of others. The key word in the previous sentence is properly; it’s not that douchebags don’t care what other people think of them—they care a lot—it’s that they overestimate their ability to charm, with confidence based not on sympathetic intuition, nor even on perceptive analysis, but on received technique. A middle-class kid who asserts his participation in the discourse community of the urban lumpenproletariat based on his attentive listening to Chief Keef raps is a douchebag. A dude who professes understanding of ostensibly peculiarly female psychology based on his attentive reading of Men Are from Mars, Women Are from Venus is a douchebag. And so forth.

I’m sure there’s no shortage of candidates—from Sinclair Lewis’s Babbitt to, um, Chief Keef, actually—but based on its sustained focus, the pungency of its indictment, and the historical circumstances in which it emerged, my avant-la-lettre pick for the Greatest American Pop-Cultural Douchebag Case Study of All Time is “Ballad of a Thin Man” from Bob Dylan’s 1965 album Highway 61 Revisited. The song represents an early attempt to pin down a phenomenon in order to better resist it: to point out that a particular set of bothersome individuals can be defined as a type, and that doing so can allow complaints against them to register not (or not just) as petulant sneers but (also) as assertions of competing values.

You’ve been with the professors
And they’ve all liked your looks
With great lawyers you have
Discussed lepers and crooks
You’ve been through all of
F. Scott Fitzgerald’s books
You’re very well read
It’s well known

But something is happening here
And you don’t know what it is
Do you, Mister Jones?

The type of person that Dylan calls out in “Thin Man” was hardly a new feature on the cultural landscape back in ’65. These folks had been around for years, unnamed, rendered invisible by their ubiquity, their social positions fortified by that invisibility. What was new was the type of person that Dylan was: the harbinger and chief prophet of the coming late-’60s counterculture, and of every counterculture that has followed.

This is significant, because the lightning-bolt insight that Robert Moor uses to crack the douchebag code—an insight that doesn’t resist but instead incorporates the term’s propensity to shapeshift—is basically this: a douchebag is the opposite of a hipster. Slick, eh? In exactly the same way that the hipster seeks to stand apart, the douchebag seeks to fit in; in exactly the same way that the hipster seeks to resist hegemonically-imposed common culture, the douchebag seeks to internalize and master it. As Moor puts it (this is from the Wag’s Revue version of the essay):

The douchebag, above all else, seeks a kind of internal legibility, or in simpler terms, normalcy. [. . .] If you listen to his judgments of others, the douchebag reveals that, above all else, he strives just to be normal, to not be “weird”; in fact, to not be labeled at all. [. . .] He yearns more than anything for a stable, non-shifting center, where he can comfortably reside without receiving derision or ridicule. When he succeeds in this task, he is free of stigma, not invisible so much as omnipresent. For that moment he is structurally centralized, an ever-widening nucleus, invisible to himself but projected everywhere he looks.

Let’s look a little closer at the “stable, non-shifting center” that Moor references. To be clear, the douchebag does not achieve this stability and centering the same way that your yoga instructor does: i.e. through reflection and self-assessment, sorting priorities from distractions, articulating and asserting core values, etc. Instead, the douchebag surveys the social landscape, calculates the exact middle of it, and moves toward those coordinates as expeditiously as possible. The idea of assessing his own values and proclivities not only doesn’t enter into this process, but actually produces bafflement in the douchebag, who defines “character” as the extent to which one’s personal habits revert to the norm. The douchebag maintains a suspicion of interiority—his own, and that of others—that borders on revulsion.

One of the cool things about Moor’s analysis is that it posits an interdependent structural relationship between the douchebag and the hipster that’s basically unaffected by the historical specifics of style and fashion; he does a good job depicting the eternal circuit that these two opposed camps are forced to run, as signifiers of hipster quirkiness filter to the mainstream to become signifiers of douchebaggery. Thus neither the douchebag’s stable center nor the hipster’s frontier outpost can ever really exist; both remain perpetually moving targets.

The hipster’s attempts to avoid recuperation by the mainstream—which are motivated by her resistance to having her tastes dictated, and which therefore constitute an assertion of her selfhood—are generally caricatured as flustered and frenetic, accompanied by rolling eyes and furrowing brows. Skinny jeans now fill the racks at Target! Every ex-sorority sales rep now sports a facial piercing and a tattoo! Grrrr, says the stereotypical hipster.

The douchebag, by contrast, is blissful and confident in his conviction that his selfhood can always be assumed, and therefore need never be asserted or examined. Above all else he believes himself to be good at smoothing over potential sites of friction by “saying the right thing,” which amounts to telling people what they want to hear. He regards this not as a symptom of moral weakness but as a skill to take pride in. If this behavior causes him to contradict himself, he’s only fleetingly aware of the contradictions, and anyway doesn’t see what the big deal is. Because of the instinctive rejection of interiority that I mentioned above, accusations of insincerity just don’t mean anything to him: he thinks of himself as a nice guy—respectful and respected, good at his job, whatever it is—who knows how to get along with people. Well, with normal people, anyway. Some weirdoes, y’know, you just can’t do anything with.

Here’s Moor again, this time from the n+1 version of the essay:

The famous douchebag arrogance comes with the false assumption that normalcy has been achieved and that it’s a true triumph. The douchebag who considers himself “relatively normal” thinks he is speaking from a centralized location, a place of authority. To the outside observer, however, he simply looks mediocre and smug. And indeed, why should the douchebag be humble? He is at the center and apex of all things. The average American douchebag is a model citizen of our society: masculine, unaffected, well-rounded, concerned with his physical health, moral (but not puritanical or prude), virile without being sleazy, funny without being clever or snide; he is at all times a faithful consumer, an eager participant, and a contributor to society.

Is this, like, ringing any bells with anybody?

To a great extent, every presidential election since at least 1960 has been a coolness contest, with the cooler major-party candidate consistently prevailing. (I know what you’re thinking, but I am not wrong about this.) This year’s contest, however, seems to break along the hipster/douchebag divide with particular clarity. Sure, you’d have to stretch the definition of “hipster” to fit it around Barack Obama himself—David Brooks’ coinage “bobo” is probably a little closer to the mark (bobos : hipsters : : yuppies : hippies, I suppose)—but I think it’s interesting how the rhetoric of the two candidates and their supporters has recapitulated Moor’s eternal circuit of shifting signifiers: Romney is now freely using Obama’s 2008 conceptual frames, while Obama’s 2012 slogans evoke (as surely they must, given the circumstances) the hipsters’ restless movement toward the next new thing.


At this point, our analysis having reduced American politics to a vacuous fashion system, we might smirk sarcastically and conclude (not entirely without justification) that all the noise associated with the 2012 presidential election has at no point achieved the status of a substantive debate about the nation’s troubled circumstances, and has instead amounted to a pointless and protracted assault on our attention and our dignity: two entrenched constellations of lifestyle enclaves singing traditional fight songs and hurling customary insults at one another. We could conclude that the whole of American civic life adds up to a compulsive reiteration of the social dynamics of a high-school cafeteria, with every clique defined exclusively by its relationship to other cliques in a fixed hierarchy and otherwise devoid of significance or content.

All of that may indeed be the case. Even so, it’s an approximately opposite conclusion that I’d like to try to advance here.

As tawdry, stupid, petty, and degrading as our public discourse undoubtedly is—on both sides, and in every direction—there are real stakes on the table: we’re going to be living in a significantly different country four years from now depending on how Tuesday plays out. (And you are hearing this from a guy who unapologetically voted for Ralph Nader in 2000. Calm down; I was in Texas.) While I do believe that most of us decide our affiliations (political and otherwise) at a visceral, emotional, preconscious level and then rationalize them with evidence and analysis after the fact—and I also think it’s clear that some big dollars get spent in efforts to appeal directly to those preconscious attitudes in order to influence our behavior in all kinds of benign and sinister ways—I do NOT think it naturally follows that we should mistrust or reject our visceral reactions. Instead, we should examine those reactions and try to figure out how we came by them, what emotions animate them, and what principles they endorse. My point, in a nutshell, is this: calling Mitt Romney a douchebag is not—or, okay, is not just—a coarse ad hominem attack. It’s also a legitimate articulation of opposing values.

Calling Mitt Romney a douchebag is, for instance, a way of saying that he displays a cavalier disregard for facts. This is not the same thing as calling him a liar; to my way of thinking, in the context of a presidential election, it’s actually worse: liars respect facts enough to know when they’ve parted company with them. Pretty much all candidates for national office lie, in an elbow-throwing, hard-checking, if-you-ain’t-cheatin’-you-ain’t-tryin’ sort of way: Obama and his team certainly cherry-pick statements to paint him in a better and Romney in a worse light, but as lame as this practice is, it really just amounts to spin. Romney’s inconsistencies and factual detours are of a very different order—the kinds of things that make you think surely dude knows people are going to check him on this before you realize that he doesn’t care if anybody checks him.

The highlight reel so far—painfully familiar, but still worthy of review—includes the following: Romney wouldn’t cut taxes for the wealthy; his tax plan wouldn’t add to the deficit; he would maintain full funding for Pell Grants and FEMA, allow no restrictions on insurance coverage for contraception, and keep in place the most popular benefits conveyed by the Affordable Care Act; aside from forcing Chrysler into the Little-Red-Book-fondling hands of the Italians, the federal bailout of the auto industry was implemented exactly as he recommended; he doesn’t favor the strict enforcement of current immigration laws; the stimulus didn’t work; Obama made an “apology tour” of the Middle East after he took office; these are not the droids we’re looking for; and so forth. Every bit of this is either demonstrably factually untrue or has been directly contradicted by the candidate himself.

Romney’s tendency to just make stuff up has a number of troubling implications, but the most obvious one is this: despite the famous contrary assertion by Republican strategists (which I seem hell-bent on referencing in every post: check!), the President of the United States does not get to pick the reality that he or she governs in. Candidates like Romney, for whom candidacy is a full-time job, incur little risk by treating facts with contempt: they occupy no elected office and therefore have none to lose, and if they win they can defend against attempts to hold them to statements they made during the campaign by citing occasions when they stated the exact opposite. But presidents need facts to do their jobs; indeed, the job largely consists of weighing the quality and value of available information to determine a course of action. The terrorist is either in the compound or he’s not; the rogue state will either negotiate or it won’t; the bill either has the votes to pass or it doesn’t. The American electorate has a reasonable expectation of being told how a particular candidate will respond within the bounds of possibility to real conditions once he or she has taken office; Romney’s refusal to do so—or to acknowledge why people might care—is a total douchebag move.


But that ain’t the half of it, kids. Calling Mitt Romney a douchebag is also, critically, a way of saying that there’s just no there there with this guy.

And this is a really big problem. For a long time now (I can refer you to Mark Fisher for some interesting assertions as to just how long) political rhetoric in Western democracies has been unable to support a serious dialogue regarding just about anything of real consequence: it can acknowledge, to a limited extent, the many impending disasters that the world now faces, but realistic suggestions as to how these problems might be solved cannot be posited without being instantly dismissed as “politically impossible.” Examples of these unspeakable subjects include aggressive regulation to limit carbon emissions, the implementation of a national single-payer healthcare plan, across-the-board tax increases at every income level to shore up entitlement programs and pay down the national debt, honestly confronting the ugly legacy of the United States’ conduct in developing nations during the Cold War, creating disincentives to the speculative trade in financial instruments, setting constitutional limitations on campaign financing and the rights of corporate entities . . . you get the picture. All of these topics are “unrealistic,” which is unfortunate because they’re also absolutely necessary: very likely the only approaches that stand a chance in hell of effectively forestalling various looming catastrophes. Anybody with any genuine desire to see these problems addressed ought to be working to enable a public discourse in which these options can be put on the table and evaluated on their merits. There are worse places to begin the project of rehabilitating public discourse than making sure Mitt Romney is not only defeated but discredited on Tuesday, called out for his irresponsible rhetoric on the national stage.

Cuz here’s the thing: with disproportionate thanks to Governor Romney, the 2012 presidential election has actually moved us farther from a capacity to talk about serious stuff. Instead of adopting a bunch of transparently insane but laser-beam-consistent positions and refusing to budge from them—i.e. the Tea Party approach—Romney has declined to take a steady position on much of anything. This is an even bigger deal than the contempt for facts mentioned above: it engenders the absurd spectacle of an enormously expensive and closely-fought presidential election that’s almost completely devoid of politics.

I’m going to say that again: since the conventions, there have been almost no politics discernible in Romney’s campaign. This is not a good thing. Although there are obvious ideological and policy differences between the two major parties, and there’s no question that Romney and Obama would govern very differently, we’ve increasingly had to infer those differences from behind Romney’s ecumenical general-election smokescreen. The public statements that Romney has made in the very recent past (this was particularly apparent in the final debate) have conveyed a consistent message, which is that Romney plans to pursue the Obama administration’s aims using the Obama administration’s methods, but to do so more effectively than Obama has. If we limit ourselves to the public declarations issuing from the candidate’s own mouth—which is not possible since, as mentioned previously, they are almost always at variance with the facts, or with his own earlier statements, or both—then we are forced to conclude that Romney will differ from Obama in his extension of Bush-era tax cuts for the very wealthy, his readiness to privatize Medicare, and his belief that certain federal initiatives should be implemented by the states but remain otherwise unchanged. That’s pretty much it. Romney’s refusal to own and inhabit any single policy position for much longer than it takes him to draw his next breath is not only frustrating but also damaging, in that it further erodes the rhetoric available to all of us to argue productively about much of anything.

This is also pure, classic, peerless douchebag behavior. I really don’t believe that Romney intends to be deceptive; he just doesn’t see any value in wandering in the ideological weeds when that isn’t what—at least to him—this election is about. “I can get this country on track again,” he said during his closing argument in the second debate. “We don’t have to settle for what we’re going through.” He decorated these two sentences with a catalogue of not-too-specific promises of what he’d accomplish as president, none of which was escorted by an explanation of how he’d achieve it, all of which are also identifiable as goals of the Obama administration.

There’s a telling quote from one of the aides on Romney’s 2002 Massachusetts gubernatorial campaign in a recent New Yorker piece by Nicholas Lemann: “Mitt Romney believes in his competence as a manager,” the aide says. “If he’s elected, he’ll do an adequate job of dealing with the issues of the day. He’s not a vision guy. He’s not policy-driven. He thinks he’ll do a good job.” Romney’s sales pitch to the electorate is simply that he’ll be a better president than Barack Obama has been. Dyed-in-the-wool douchebag that he is, he believes this to be utterly self-evident, almost not worth articulating. He honestly doesn’t understand why anybody would need to know exactly what his values are, given that he’s already spelled out his qualifications. He’s absolutely confident in his ability to turn the country around, and he expects people to share his confidence.

Well, he expects 53% of people to share it, anyway. Some weirdoes you just can’t do anything with.

The defining words of Romney’s 2012 campaign—the infamous remarks surreptitiously recorded at a Boca Raton fundraiser—provide abundant evidence of his willingness to tell different stories to different audiences. This incident was initially assessed as damning because it was thought to show Romney’s true colors on display while safely surrounded by other members of his class; I disagree with that interpretation. I think what we hear in the Boca Raton video is a candidate whose comfort level is conspicuously inversely proportional to the number of people he thinks can hear him talking.

In his initial attempt at overdubbing these comments, Romney described them as “not eloquently stated.” The trouble with that characterization is that he’s more eloquent in the Boca Raton video than he’s been at any other point in the campaign: cogent, candid, forceful, decisive, at ease, in his element. By “in his element” I don’t just mean that he’s hanging out with other super-rich folks, but rather that he’s among people with whom he feels he has a business relationship—his investors, more or less—and for whom he’s in the midst of conducting an analysis. This is how I’m going to get the deal done, he’s saying. This is why it’s worth fifty grand to have dinner with me. He sounds kind of great, frankly. I listen to these remarks and I think: damn, this guy should be in charge of something—something other than the executive branch of the federal government.

The problem with the popular caricature of Mitt Romney as Mr. Moneybags Businessman is not that it’s inaccurate, but that it’s imprecise. There’s capitalists and there’s capitalists, and Romney has been a capitalist of a very particular sort: not an entrepreneur, or a CEO, or even a hedge fund manager, but a private-equity guy. Lemann’s smart and evenhanded New Yorker piece is really strong on Romney’s experience at Bain Capital, and on what that experience suggests about the way the candidate understands leadership and governance:

Within private equity, people don’t talk about the questions that are on the mind of the public. One professor at a leading business school whose subject is private equity put it simply: “Can I change the free cash-flow equation of the company? If I do, I win. If I don’t, I lose. It’s not the job of private equity to create jobs. The job is to create value. That sometimes creates jobs, and sometimes not.” A comprehensive study of private equity published last year found that the industry has a negligible effect on employment. Private equity is business on steroids: seek efficiency and economic return, not large social goals (unless you think those are large social goals).

If we can get a gigantic and one-hopes-obvious cavil out of the way right off the bat—i.e. that private equity operates largely by dumping unprofitable assets, while government at every level exists to provide services that will never be attractive to the private sector, and therefore any suggestion of equivalency between the two spheres is kind of insane—then I think certain operational aspects of private equity do make for a compelling analogy with the office of the Presidency. A new president, for instance, always takes over the management of an established organization; thanks to term limits (rather than the need to repay debt) he or she has limited time to get stuff done; replacing upper management is always the first order of business. The most illuminating similarity may involve the fraught and contentious circumstances under which the president takes office, and the disproportional importance of this process: national election as leveraged buyout.

“Romney likes to say that he was a consultant or a venture capitalist, not that he was in private equity,” Lemann writes. “Consultants think that people in private equity make most of their money from the way a deal is structured (Bain Capital aggressively pursued that aspect of its business), not from how well they analyze a company and its problems.” This seems consistent with Romney’s orientation as a candidate: he’ll say whatever he needs to say and promise whatever he needs to promise to get American voters to sign on the dotted line—provided, of course, those promises aren’t binding and can be unwound later—with the expectation that he’ll have no trouble sorting out the operational specifics after he’s been sworn in. He’s done it before! A bunch of times!

And on the outside chance that Romney does get himself elected next week—unlikely, but still, all hands on deck!—then rest assured those specifics will be sorted out in short order. Romney may not have been shy about using his veto pen during his tenure as Massachusetts governor, but he was also notably content to let the Democratic legislature set the broad agenda for the Commonwealth; given the determined lack of vision that Romney continues to display, there’s every reason to think that his presidency would run along the same lines. The major difference, of course, is that Romney’s own party would hold a filibuster-enabled minority in the Senate and an outright majority in the House—so it’s Congressional Republicans who’d be doing the agenda-setting for the next four years. While we can certainly hope that a President Romney would shake the Etch-A-Sketch again and redeploy his Beacon Hill moderate mode, it seems rather more likely that he’d devote himself to activities such as reengineering the operations of various regulatory apparatus to improve their “efficiency,” while the largely unimpeded House Republicans move to advance their lunatic disassembly of the commons.

The biggest piece of evidence that Romney is prepared to sit in the grandstands with a stopwatch while the Tea Party Caucus speeds around the track is, obviously, his selection of a running mate. After Paul Ryan joined the ticket, there was a flurry of punditry that attempted to read Romney’s own ideals—pliable, and therefore hard to pin down—through his veep pick. Did it mean that Romney was on board with the aggressive policy recommendations of the Republican Party’s primary elected wonk? Or that he perceived a need to borrow somebody else’s intellectual credentials to cover his own disinterest in policymaking? Or that Romney’s position relative to his own party was weak, and he was badgered into making the pick? At this point I think it’s become clear that these questions are irrelevant, as they credit Romney with a range of concerns that he simply doesn’t possess.

The conventional wisdom is that vice-presidential picks are used to shore up perceptions of weakness at the top of the ticket: Reagan picked Bush père for his foreign-policy cred, Bush père picked Quayle for his glamor and good looks (whoops), Clinton picked Gore for his technocratic gravitas, Obama picked Biden for his earthiness. (Cheney picked himself, of course.) I think we can be reasonably sure that Romney picked Ryan not although but rather because he’s a strident ideologue and an energetic leader who makes a lot of people nervous. He was picked, in other words, to mitigate Romney’s douchebag problem.

The Republican bench is not deep; I don’t have too many suggestions for better picks. Still, while Ryan seems to have helped Romney invigorate the conservative base—which is certainly worth something—if the main goal was to attract folks in the skittish middle, I’m not sure this plan worked. For while a great number of Americans do not seem to regard Paul Ryan as a douchebag, they do, unfortunately, regard him as an asshole.

Why is this so? When we call somebody an asshole, what do we mean?

This is an easier question to answer: for the sake of expediency, we can situate the asshole in relation to the douchebag. If, as Wikipedia tells us, a douchebag possesses “a variety of negative qualities, specifically arrogance and engaging in obnoxious and/or irritating actions, most often without malicious intent,” then you can get to asshole simply by deleting that final clause. The asshole relishes making enemies, running in packs, drawing lines in the sand. The asshole’s belligerence comes from the fact that while he shares the douchebag’s disinclination to introspect, he does NOT share the douchebag’s absolute confidence. The asshole knows that a lot of people—MOST people—don’t like him, don’t share his interests or values, would like to see him taken down a peg. He knows himself to be a member of a privileged minority, and he knows that a functional democracy will tend to erode his privileges, and he’s prepared to fight like hell to defend what he’s got. Therefore he has no interest in coalition-building, he avoids the concept of fairness, and he not only accepts but insists that his own team gets to play by a different set of rules. You can see this clearly in the faces of smart conservative assholes, of which Paul Ryan is certainly one: an awareness of how the deck is actually stacked, an understanding of what it will take to keep it that way, and a barely-submerged terror of how easily their regime could be swept aside.

As nationwide demographic trends assemble themselves into a wrecking-ball aimed squarely at the Republican Party, you can bet you’ll hear the voice of the asshole shouting ever more shrilly on the national stage. Leaders like Paul Ryan are good at moderating their tones to appear more reasonable than they are, and also at camouflaging their most thuggish initiatives in a blizzard of data. To discern this voice in its purest form it’s best to listen at the margins, to the angry faces that fill out the crowds at campaign rallies. Here is where you’ll hear asshole ideology most clearly expressed—in the unalloyed valorization of entrepreneurs, in the oft-uttered assertion that no one who hasn’t run their own business has a right to criticize or impose upon anyone who has, in the declaration that the majority of Americans are parasites living off government aid or off the largess of their employers. As if this is not only an accurate but an obvious characterization of the state of the union. As if a nation of entrepreneurs were possible or in any way desirable: an ecosystem consisting entirely of sharks.

As President Obama and his various surrogates have been at pains to point out, the 2012 presidential election is about values, about choosing who we as a nation want to be. There are many ways to pose this question. Here’s mine: do we really want to live in a country that’s governed by—and for the benefit of—a bunch of douchebags and assholes?

I’ll see y’all on Tuesday.

A brief survey of equestrian idioms in late-capitalist popular dance

September 17, 2012

Rodeo by Aaron Copland, performed by the Colorado Ballet, choreography by Agnes De Mille (1942)

“Don’t Tell Me” by Joe Henry, Mirwais Ahmadzaï, and Madonna, performed by Madonna, choreography by Alex Magno (2000)

“Gangnam Style” by Psy and Yoo Gun-Hyung, performed by Psy, choreography by Psy (2012)

Is he in heaven? Is he in hell? Where has he gone? No one can tell! (Part the Third)

March 16, 2012

In the last couple of posts, I’ve been trying to figure out just what the holy hell is going on in Terrence Malick’s recent The Tree of Life.  In this third and final post, I’d like to consider a possible set of explanations as to why the film works the way it works and looks the way it looks.

I’m gonna try to land this Spruce Goose with one last bit of Terrence Malick biographical trivia, the discovery of which felt for me like one of those moments when you’re playing expert-level Windows Minesweeper and you click on a square that’s adjacent to no mines and suddenly a huge swath of empty space opens up in front of you and you’re like: dude, I got this.

Am I the only person in the world who didn’t know (at least prior to starting this post) that Malick was a philosopher before he became a filmmaker?  I’m talking here about a rather different degree of scholarship than, say, Mick Jagger’s early studies at the London School of Economics, or even David Duchovny’s unfinished Ph.D. in English Lit: Malick studied with Stanley Cavell at Harvard, with Gilbert Ryle at Oxford as a Rhodes Scholar, and he taught philosophy at MIT for a while.  To borrow a line from an academic in a different field, I believe this has some significance for our Tree of Life problem.

Based on available evidence, the philosopher with whom Malick seems to have the strongest affiliation is Martin Heidegger.  Malick’s undergrad honors thesis with Cavell centered on Heidegger, at one point he evidently traveled to Germany and met Heidegger, and Heidegger was the major focus of his MIT course; additionally, in 1969, Malick published a fairly distinguished translation of Heidegger’s Vom Wesen des Grundes (as The Essence of Reasons).  I think it’s safe to say that these days Heidegger is understood to be the 800-pound gorilla of what’s commonly called “continental philosophy”: he was a major influence on Sartre’s formulation of existentialism (though he thought Sartre misread him), the term “deconstruction” (now common parlance, albeit with degraded specificity of meaning) arose as an attempt to capture his use of the German word Destruktion, and I suspect but cannot confirm that he’s at least an indirect inspiration (or supplier of unattributed talking points) for contemporary cultural trends like the Slow Food movement, “simple living,” and Portlandia-style hyperconscious consumerism.  Back in ’69, however, Heidegger was still pretty esoteric stuff, at least in the analytic and Anglophone circles where Malick made his academic rounds.  The point here is that as a young philosopher, Malick was 1) serious, 2) reasonably distinguished, and 3) on or slightly ahead of one of the cutting edges of his profession.

For several reasons, I’m not going to attempt any kind of serious examination of Heideggerian themes in Malick’s filmography here.  First, this whole tripartite thing is closing in on 8000 words, a length I normally like to reserve for assessing the works of mononymic pop stars.  Second, my knowledge of Heidegger is limited to a couple of his late essays—“The Question Concerning Technology” and “Building Dwelling Thinking”—that I read as an undergrad like twenty years ago.  Third, Heidegger is not so much a can as a forty-gallon drum of worms: any mention of him probably ought to arrive trailing asterisks the size of Macy’s Thanksgiving Day Parade balloons, with the biggest one representing his committed Nazi Party membership in the 1930s.  Finally, there is the risk of making too much of this comparison: calling Malick a Heideggerian filmmaker stands to interpose Heidegger between us and our experience of the films, such that we stop seeing them in their own right.  That said, I haven’t observed a ton of evidence—outside of academic publishing, maybe—that Malick’s critics have herded him into some Heideggerian corral; it seems to me that making too little of his background in philosophy, rather than too much, is the more pressing danger.  So at the risk of RADICALLY oversimplifying—hell, I’m not even gonna risk it; I’m just gonna do it—let’s take a quick swing at this.

Heidegger’s basic deal is an attempt to recover an apprehension of the world that’s firmly rooted in our experience of it.  Sounds pretty straightforward, right?  Well, it ain’t: according to Heidegger, Western philosophy since Plato—which pretty much means Western philosophy in its entirety—has basically amounted to a turn away from the direct experience of the world, in favor of imaginary perspectives, separated from time and space and specific circumstance, from which the world can be viewed, described, ordered, classified, and so forth.  (Cogito-ergo-sum-style thought experiments are rooted in such imaginary perspectives; so are the methods of the experimental sciences.)  Much of Heidegger’s project consists of an attempt to dismantle everything that preceded him, or at least to make the fundamental assumptions on which it’s all based visible and open to question.

When it comes to understanding Heidegger, it’s tempting to think that those of us who don’t really have any solid grasp on the history of Western philosophy might actually be better off—because, hey, we’re junking all that stuff anyway, right?—but again, no dice: over the past 2500 or so years, the errors of perspective and conception that he seeks to defuse have been encoded so deeply in our everyday language and habits of thought as to become entirely transparent to us.  Thus our experience is “always already” enmeshed in preconceptions and circumstances that we have no real means of extricating ourselves from, since we can’t reliably be aware of them at all.  Heidegger recommends that we instead try approaching key questions about existence by means of literary or poetic language that preserves (or restores) some of the mystery inherent to being—a prescription that lands him in the same neck of the woods as our friends the Russian formalists, something I am hardly the first to note.

Although it casts an impressively wide and deep net, Heidegger’s thought is pretty much billeted in the haunted vessel of German Romanticism: when the chips are down and he’s stuck in a tough rhetorical corner, he tends to reach for his Hölderlin or Rilke.  Much like the fictional O’Brien family—but, we probably ought to note, unlike the real-life Malicks—Heidegger’s people were observant Catholics: Heidegger began his schooling with a concentration in theology and an intent to enter the priesthood.  Although this plan didn’t pan out—he renounced the faith in 1919—he maintained a career-long antipathy toward humanism, rooted in his suspicion of anthropocentric conceptions of the universe.  He ends up with a philosophy that places being itself at the center of the world (think being in the sense of the state or quality of having existence and you’re on the right track) and he suggests that our job as thinking and perceiving subjects should be to remain open to being in all its sublime authenticity.  This gentle and attentive openness to the world has resonance with the Taoist concept of wu wei (at some point Heidegger evidently attempted to translate the Tao Te Ching into German), and it also bears a passing (but significant for our purposes) resemblance to the concept of grace that Malick distilled from Thomas à Kempis.

When the internet told me that Malick did time as a Heidegger scholar before he started slumming as an award-winning motion-picture auteur, the first thing that popped into my head—not, y’know, word-for-word, obviously—was a section of “Building Dwelling Thinking,” a briefish and kind of trippy Heidegger essay from 1951 (the translation is by Albert Hofstadter):

Mortals dwell in that they save the earth [. . .].  Saving does not only snatch something from a danger.  To save really means to set something free into its own presencing.  To save the earth is more than to exploit it or even wear it out.  Saving the earth does not master the earth and does not subjugate it, which is merely one step from spoliation.

Mortals dwell in that they receive the sky as sky.  They leave to the sun and the moon their journey, to the stars their courses, to the seasons their blessing and their inclemency; they do not turn night into day nor day into a harassed unrest.

Mortals dwell in that they await the divinities as divinities.  In hope they hold up to the divinities what is unhoped for.  They wait for intimations of their coming and do not mistake the signs of their absence.  They do not make their gods for themselves and do not worship idols.  In the very depth of misfortune they wait for the weal that has been withdrawn.

Mortals dwell in that they initiate their own nature—their being capable of death as death—into the use and practice of this capacity, so that there may be a good death.  To initiate mortals into the nature of death in no way means to make death, as empty Nothing, the goal.  Nor does it mean to darken dwelling by blindly staring toward the end.

Out of context, this passage is pretty perplexing; in context, it’s . . . still pretty perplexing.  If you can kind of roll along with Heidegger’s idiosyncratic use of certain terms—words like “mortals,” “dwell,” “earth,” “sky,” “divinities,” “save,” and “danger,” among others, all accrue very particular implications in his writing—this excerpt gives you a pretty good sense of the attitude toward existence that he says we should cultivate, i.e. one of engaged and mindful humility.  The passage reminds me of several things in Malick, most notably the developing outlooks of several of his films’ central characters (all of whom also serve as offscreen narrators): Sissy Spacek’s Holly in Badlands, Linda Manz’s Linda in Days of Heaven, and especially Caviezel’s Private Witt in The Thin Red Line.  (John Baskin’s essay in The Point, which I linked to in my last post, does a great job of analyzing all these characters.)  I think of Witt particularly with reference to the last paragraph in the block quote above: of both his ultimate fate and his statement early in the film about an appropriate attitude toward one’s own death.

I remember my mother, when she was dyin’.  Looked all shrunk up and gray.  I asked her if she was afraid.  She just shook her head.  I was afraid to touch the death I seen in her.  I couldn’t find nothin’ beautiful or uplifting about her goin’ back to God.  I heard people talk about immortality, but I ain’t seen it.

I wondered how it’d be when I died, what it’d be like to know this breath now was the last one you was ever gonna draw.  I just hope I can meet it the same way she did, with the same calm.  ’Cause that’s where it’s hidden: the immortality I hadn’t seen.

Maintaining a particular attitude toward death is a consideration in The Tree of Life, as well—the movie does, after all, conclude with the peculiar extratemporal-O’Brien-family-reunion-at-the-Bonneville-Salt-Flats “afterlife” sequence—but it’s really the third paragraph of the Heidegger excerpt above that seems applicable to Tree.  The attitude that Heidegger recommends dwelling mortals adopt toward the divinities seems to rhyme with that prescribed by the Book of Job, a quotation from which (Verses 38:4 & 38:7) serves as the film’s epigraph: “Where were you when I laid the foundation of the earth? When the morning stars sang together, and all the sons of God shouted for joy?”  If the big question posed by the adult Jack in The Tree of Life is: How do I make sense of a universe in which my gentle and decent brother dies at nineteen and I, jerk that I am, go on pointlessly living? then the film’s answer, like God’s to Job, is: You don’t, dude.  It’s not the place of the human subject to make such inquiries of the universe—not, according to Heidegger, because the universe will put you in your place with a bolt of lightning, nor even because such questions can’t really be answered, but because any answers you find will actually blind you to other truths and lead you further into error.  Although the universe may appear stable, it’s actually a constantly-shifting balance of opposing forces—a concept that Heidegger derives from the same fragment of Heraclitus that Malick more or less quotes in The Thin Red Line—and any truth you pin down concerning particular circumstances or beings is apt not to remain true relative to other circumstances or beings.

This is a major concern for Heidegger: the better able we are to master the world and to get it to disclose its secrets according to our will, the more screwed-up we become, alienated from the mysterious and self-disclosing aspects of existence, and therefore alienated from ourselves.  Heidegger isn’t a Luddite, opposed to technological advancement, but he does worry that our increasing dominance of our surroundings changes us: it inclines us to organize and classify the world solely as a collection of resources at our disposal, and then to perceive the world only in those terms, rather than as its unmediated, unclassified self.  We are so utterly surrounded by and entangled in systems and processes designed to exploit available assets that it requires a kind of breakthrough—and/or a breakdown of systems and technologies—to encounter the world as the world.

This is a major concern in The Tree of Life, as well, though that may not be immediately apparent.  It is, after all, exactly this kind of breakthrough that the predatory Ornithomimus experiences on the Cretaceous riverbank: rather than viewing the things of the world in purely functional terms—categorized as food and not-food—for a moment it seems to stop and consider the trapped parasaur as a being, both like it and unlike it.  The predator seems to do this without any circumstantial prompting.  Malick’s implication seems to be that as organisms have become more complex and more sophisticated—better able to dominate and organize their environments—they’ve become more and more estranged from the capacity to slow down and look, to see the world’s phenomena in anything other than functional terms, and therefore less given to this kind of fleeting openness.

Not for nothing, then, does Malick depict Jack O’Brien’s father as an aeronautical engineer, pursuing a career in America’s post-WWII golden era of aerospace.  (Malick’s own father, in case you were wondering, was a petroleum geologist.)  Mr. O’Brien—restless, ambitious, covetous, never satisfied (“It takes fierce will to get ahead in this world,” he tells his sons)—evokes the tunnel-visioned, results-oriented, technological will-to-power that Heidegger warns against, and that Malick connects with what Thomas à Kempis calls “nature.”  Viewed in this light, the young Jack’s cruel “experiment” of launching a frog on a bottle-rocket represents a grotesque parody of his father’s technical aspirations.  It’s also a more-or-less intentional blasphemy: “That’s where God lives!” Mrs. O’Brien tells her sons at one point, gesturing skyward.  We should note too that the heavenly trajectory of young Jack’s frog-bearing firecracker is recapitulated by the upright BB gun with which he later shoots R.L.’s fingertip.  The injury to R.L.’s finger resonates in turn with a brief scene in which R.L. places his small hand over a flashlight’s beam to see the shadows of veins and arteries there—a branching pattern that we might call, with a little metaphorical license, a tree of life.

And I don’t think it’s too much of a stretch to associate the artificial light that passes through R.L.’s hand with the light that passes through a camera lens to record images on film, the light that passes through a projector to strike a movie-theater screen: a light that reveals.  It’s a little easier to buy the notion that filmmaking might in some ways qualify as a continuation of Malick’s philosophical project when we consider that this is pretty much where Heidegger’s own project concluded: i.e. in approximate prescriptive accord with Viktor Shklovsky, with the idea that art represents the best way to reconnect ourselves with the unmediated textures of the world in which we make our home.

If this is indeed a legit way to characterize Malick’s career, it’s also worth mentioning that the approach he has chosen risks some pitfalls.  If, for instance, technology is the major force that alienates us from the world, it seems significant that film is an extremely technology-intensive artistic medium, certainly far more so than the poetry that Heidegger treasures.  And in fact examples abound of films that were wrecked by their makers’ access to technology, wrecked because making the movie became secondary to using the technology.  (The criminally odious Star Wars prequels are the test case here; I’d argue that Peter Jackson’s Lord of the Rings films are already aging poorly because of their tech-heaviness; even something as ostensibly high-minded and serious as Saving Private Ryan—which suffers badly in a head-to-head with The Thin Red Line from the same year—is hastened toward inauthenticity by a bottomless visual-effects budget.  Jaws, by contrast, remains a great film precisely because its technology consistently failed during production.)

I get the strong impression that Malick knows this, and has consciously adopted strategies to steer clear of this peril: while his dinosaurs are crafted from state-of-the-art CGI, he hired the legendary Douglas Trumbull to do the outer-space bits in his creation-of-the-world sequence; Trumbull is probably best known for doing effects for Stanley Kubrick on 2001, and returned to those working methods (which involve such high-art, low-tech approaches as filming clouds of milk and dye in a side-lit tank of water) for Tree.  It’s also worth noting that Tree’s image of maximum mystery—the auroralike curtains of light that open and close the film—wasn’t made or commissioned by Malick at all: it’s simply footage of Opus 161 by Thomas Wilfred, a work of light-art from the mid-1960s.  If art is our best bet for refreshing our encounters with the world, then I suppose it makes sense that Malick’s most overt evocation of the unmediated infinite should arrive in the form of a fairly modest work of studio art by an eccentric Danish immigrant, produced roughly contemporaneously with most of the events depicted in his film.

Just for the sake of tying up a loose end, it’s also worth mentioning that film tends to rely heavily on another technology that Malick has adopted strategies for keeping in check—a technology that’s very old, and hardly exclusive to film, and so fundamental that it’s typically only visible to us through its absence.  I’m talking, of course, about narrative itself.  I’ve already gone on at length about how Malick deliberately breaks narrative momentum in his films, so I won’t belabor that here, except to note the interesting tension in Malick’s work between film as a narrative medium for invention (let me tell you a story) and film as a documentary medium for recording (I was here, and these things were here, and I saw these things).  These opposed-but-overlaid conceptions of the function of film as a medium run parallel to Heidegger’s famous conception of the two realms of being: what he called the earth (the nonhuman world of objects and forces) and what he called the world (the world of human activity, with every tangible object and force classified according to its purpose or significance).  The intersection of these two ways of encountering the world is—for Heidegger, and perhaps for Malick too—where we encounter our own authentic humanity.

So I think Malick’s got a pretty good handle on the intrinsically technological nature of film.  My quibble—I MUST have a quibble!—is that I’m less convinced that he has a handle on the intrinsically collaborative nature of film.  Jameson and Davies make a good point that Days of Heaven is just about everybody’s favorite Malick film, and that this probably has something to do with the input of Malick’s collaborators and the fertile milieu in which the film took shape; the fact that it’s a more compromised product than Malick’s more recent efforts actually works more to the film’s benefit than its detriment, at least in terms of connecting with and engaging viewers.  (It’s not that Days is a more crowd-pleasing movie, necessarily, but rather that it admits a certain degree of rhetorical sophistication that Malick keeps out of his later films—intentionally, I’m guessing.)

My aim in bringing this up is not to criticize Malick for hitting what he aims at just because I’d prefer he picked a different target; I do think it’s worth considering the implications of that choice, though.  At the very beginning of this series of posts, I described The Tree of Life as “personal, even private,” and now that I’ve dragged you through some quick-and-dirty Russian formalism and my plagiarize-this-at-your-own-risk survey of Heidegger, I’d like to clarify what I meant by that.  I think this film would still feel personal and private if it lacked a single autobiographical reference: I think this feeling comes not (just) from its presentation of character but from its conception of character, specifically a suspicion of or antipathy toward the midrange psychic distance I talked about in the last post.  Virtually all—maybe just plain ALL—of The Tree of Life is presented from inside the adult Jack’s head, from a psychic distance of zero.  (There’s a reason for all that low-angled handheld camerawork.)  As such, we not only never feel like we have a complete understanding of Jack, we also never feel as if we know or understand any of the other characters (i.e. his parents, his brothers, his own younger self).  We never feel as if Jack has an understanding of these characters, either; nor do any of those characters seem to understand each other.  The film does, I believe, succeed in inducing us, the audience, to think about the things (i.e. our own births, childhoods, families, hopes, loves, aspirations, and eventual deaths in a vast and ancient universe) that the adult Jack thinks about, and to do so in the same way that he thinks about them, which is pretty impressive.  But I also believe the film wants me to believe—or at least assumes—that trying to achieve a sympathetic understanding of how other people might confront these same questions is either impossible or ought to be avoided.  And I don’t have a super-good feeling about that.

Let’s go back to Heidegger one last time.  One of the big knocks on Heidegger—and given the dude’s repellent careerist adventures in National Socialism, the contest is really for second place here—is that while he presents a sweeping and kind of mind-blowing account of the relation of the human subject to being, he is conspicuously unwilling or unable to say much of anything about the relation of the human subject to other human subjects.  Maybe I should say that he’s unwilling to say anything positive: his early writings are full of references to society as a force that clouds the consciousness of the human subject with received notions that encourage conformity and mediocrity and alienate it from its own authentic experience of being.  The gear that Heidegger idled in—as I mentioned above and will now repeat—was German Romanticism; its fetishism of intuitive individual genius is pretty much always playing in the background when he’s holding forth.  (It sounds a lot like a late Beethoven quartet, of course.)  As a result of this orientation, the atmosphere that Heidegger evokes is always kind of cool and awestruck and mist-shrouded and mysterious, and it is for damn sure a fun place to take an intellectual vacation.  But it doesn’t take very long before the environs start to feel a little like a theme park, before the mist starts to look like production design (is that a fog machine behind that boulder?), and before at least some of the mystery starts to feel like it might be mystification instead.

I feel like Malick may also be susceptible to this.  I’m intrigued by reports that Sean Penn has expressed some dissatisfaction with Tree, saying that the finished product didn’t entirely do the screenplay justice, and that the film might have been better served by a “more conventional narrative”; I’m tempted to think that by “more conventional narrative” he might mean “characters having actual conversations with one another.”  (This could be taken as another instance of Adrien-Brody-style um-where-did-my-performance-go? sour grapes, but I get the impression that Penn really gets Malick; evidently he helped him edit The Thin Red Line.)  This is, of course, pretty much the same conclusion that Mitchell arrives at (rather more efficiently than I have) at The Discreet Bourgeois with his comparison of Tree to She Wore a Yellow Ribbon: if Malick’s assumption is that you can’t really present a vivid and complete evocation of an inhabited world if you complicate it by depicting the complexities of human society, then Yellow Ribbon would sure seem to be a strong argument to the contrary.

An effective comedy constitutes a positive rejoinder to a movie presented from a Heideggerian perspective; an effective film noir—like Memento, mentioned parenthetically in the last post—presents a more cautionary critique.  Heidegger’s whole project, ostensibly, is a deconstruction of all the initial assumptions of Western philosophy, but it seems to me that he gives the idea of individual consciousness a free pass.  Yeah, sure, he provides an extensive account of how the individual human subject gets all manner of screwed up by listening to the prattle of its friends and neighbors—but the fact that his account of the operations of mass culture is so strongly negative indicates that he’s imagining something pure at the heart of subjectivity, something that could be brought to the surface if the complications of living amidst others could only be stripped away.  Heidegger doesn’t seem to consider that those complications might actually constitute the human subject.  He’s also very weird about emotions: the “divinities” that he mentions in the passage quoted above are best understood not as invisible quasi-animist forces, but rather as overriding moods or atmospheres that arise from somewhere outside the perceiving subject to determine the character of an experience.  That’s kind of nuts; one suspects that Heidegger has to locate these moods outside the self, because his idea of human consciousness can’t admit the unconscious or the irrational without inventing a mysterious inhuman external power to naturalize and legitimate it.

I should end by noting that many, many people have made attempts to draw lines between Heidegger’s philosophical project and its attendant view of the world—i.e. his extremely rich account of lived experience, his complete lack of interest in ethics and politics, his extremely accommodating account of individual selfhood—and his deplorable conduct during the Nazi era.  Some of those lines look pretty straight and pretty short to me.

Am I here to tell you that any work of art that’s coming from his general direction is ethically suspect, or that The Tree of Life is a piece of fascist propaganda?  Hell no.  I’m only suggesting that it’s worth spending a little time (and, apparently, a few thousand words) considering what The Tree of Life is—and is not—saying to us, and what it is and is not able to say, given its initial assumptions.  There are evidently keys in which Terrence Malick cannot or chooses not to sing; I think part of enjoying his performance probably ought to consist of being mindful of certain pitches that we never hear.

Is he in heaven? Is he in hell? Where has he gone? No one can tell! (Part the Second)

March 15, 2012

In yesterday’s episode, we took a gander at the much-discussed “dinosaur scene” from Terrence Malick’s The Tree of Life, which I believe is the riskiest, the most important, and the most memorable scene in the movie, as well as the most confounding and the most frankly ridiculous.  I started to argue that the scene is less important for its content—which is not difficult to interpret—than for its function, which is harder to figure out, and not by accident.  That’s where I’d like to pick up today.

The crucial thing to catch here, I think, is that the dinosaur scene is risky because it represents a major intentional rupture in the narrative.  In order to talk about how this scene works (as opposed to what it means, a topic that we kicked around yesterday) I’d like to reach into the Russian-formalist toolbox for a second.  In a 1925 essay, the critic Boris Tomashevsky famously describes how any story can be broken down into a bunch of component parts—bits of information that convey to the reader the events of the tale, and explain when, where, why, and how they transpire—and he calls the narrative function of each of these bits a motif.  He goes on to divide motifs into two types, which he calls bound and free: bound motifs are crucial to the plot—if one gets left out, the story will stop making sense—while free motifs aren’t.  Bound motifs carry us through the narrative, and make it intelligible AS a narrative, while free motifs require us to do some interpretation to figure out why they made the final cut.  The most common rationale for including a free motif is what might problematically be called “realism”: a bunch of the party descriptions and dialogue in The Great Gatsby, for example, do little to advance the story, but they sure give us a clear sense of the glib and venal milieu in which the book’s action takes place; not dissimilarly, rice paddies and water buffalo don’t figure into the plot of Rambo, but if we don’t see a few of them, then we’re not going to buy that Stallone is really grunting his way through Southeast Asia.  And so forth.

I am reasonably sure that my understanding of the O’Brien family history was not clarified by the experience of watching a dinosaur get its head stomped on; thus I think we can safely call the dinosaur scene—along with the whole creation-of-the-world sequence in which it appears—free rather than bound.  However, neither would I say that this sequence does anything to convince me of the reality or plausibility of what I’m watching; on the contrary, it completely derails my reception of the family drama that’s ostensibly what the film is about, just as I’m starting to get a grip on it.  Within the creation sequence, the dinosaur scene represents the moment of maximum narrative dislocation, the moment at which Malick’s dude-how-did-we-wind-up-in-the-planetarium detour really turns a corner: suddenly we’re in a free motif that contains bound motifs—i.e. 1) the parasaurs are feeding, 2) one among them is sick and immobile, 3) the healthy ones sense danger and flee in terror, 4) the predator appears, 5) the predator recognizes the sick parasaur as easy prey, 6) the predator pounces on it, 7) etc.  The creation sequence that preceded this scene was intelligible as a really, really digressive depiction of the adult Jack’s attempt to understand his own life’s smallness in the cosmos and to grasp the expanse of time that preceded his existence, yadda yadda yadda, but the dinosaurs are something else: the film has suddenly gone from asking me to understand and care about a kid growing up in Central Texas to asking me to watch an end-of-2001-style spacescape to now asking me to understand and care about a couple of large toothy lizards that died out 65 million years ago.  W—as the kids say—TF?

It’s all good, though: our boy Tomashevsky’s got our back.  His essay posits yet another function of free motifs, one which he broadly defines as “artistic.”  (Go ahead, roll your eyes.)  Artistic motifs perform all kinds of awesome, non-plot-related functions; very often these have to do with anticipating and getting in front of the audience’s familiarity with other, similar stories.  (And I do suspect that there’s a certain amount of this afoot in Tree; more to the point, I’m convinced that the references to Tarkovsky and Kubrick that A D Jameson and Jeremy M. Davies think they detect in the film really are there—q.v. their insightful, entertaining, fairly exasperated dialogue on the subject at Big Other, which is rewarding enough that I’m willing to excuse their location of Waco in the Texas panhandle.*)  But Tomashevsky makes particular mention of a different strategy, one he calls ostranenie, a term generally translated as “defamiliarization,” or “estrangement,” or “making-strange.”  The concept doesn’t originate with him; it pops up here and there pretty much throughout the history of art criticism, from Aristotle on, but the sense in which he’s using it was first and most emphatically formulated by Viktor Shklovsky in his 1917 essay “Art as Technique” (I’ll quote from the Lemon & Reis translation):

The purpose of art is to impart the sensation of things as they are perceived and not as they are known. The technique of art is to make objects “unfamiliar,” to make forms difficult, to increase the difficulty and length of perception because the process of perception is an aesthetic end in itself and must be prolonged.

Got that?  From a formalist perspective, Malick’s nutty space-time-dinosaur detour is one of the features that qualifies Tree as an honest-to-god (so to speak) Work Of Art—not although but precisely because it disrupts the orderly progression of the film’s narrative.  Shklovsky’s implication is that although art (in his sense) and story coexist more or less peacefully in just about every narrative work you can think of, they’re actually at cross-purposes: narrative is about motion, while art is about stasis; narrative wants our attention directed to what has happened and is going to happen, while art wants us focused on what’s happening (or not happening).  Every work of narrative art seeks its own particular balance between the headlong rush of plot and the obstructing drag of ostranenie.

But what exactly might Malick be aiming to slow down our perceptions of?  The narrative operations of the film itself, maybe—but surely not just that.  (A film that wants to make us aware of its own storytelling machinery tends to look more like Rashōmon or Breathless or The Long Goodbye or Memento or Mulholland Drive—and less like NOVA: The Fabric of the Cosmos.)  If the formalists maintain that the task of art is to induce us to look at our familiar surroundings with renewed alertness and attentiveness—“[A]rt exists that one may recover the sensation of life,” Shklovsky writes in another famous passage; “it exists to make one feel things, to make the stone stony”—and if this is indeed more or less what Malick’s up to with his digressions in Tree, then is his ultimate goal in the creation-of-the-world sequence to make the universe seem, um, universey?

To an extent, yeah, sure, I think it kind of is.  I mean, try imagining a different director—John Ford, Peter Weir, Zhang Yimou, Bertolucci, Spielberg, just about anybody—making a movie about Jack O’Brien’s recollections of his childhood.  When the adult Jack considers his place in the universe, virtually any other director would show us Jack thinking.  It’d probably be done via montage: a solitary Jack looking pensive and glum, maybe flipping through a family Bible and/or an old physics textbook while an orchestra mopes extradiegetically in the audio.  These directors would do this primarily in service to the film’s plot: for the story to work—to properly set up the climax and the dénouement—the audience needs to understand Jack’s frame of mind so his motives are clear and his behavior is intelligible.  Malick totally inverts these priorities: instead of showing us that Jack is thinking, he shows us what Jack thinks; he presents Jack’s thoughts to us not as ideas nor as motives but as an experience.

It’s significant too that the film’s elaborate depiction of what’s going on in the adult Jack’s head does NOT encourage us to understand or to sympathize with his dramatic circumstances: since the film almost always has us looking through his mind’s eye, we’re never really permitted to put any distance between ourselves and his point of view.  One of the weird, paradoxical-at-first-blush aspects of narrative is the fact that our emotional investment is diminished if a storyteller places us too close to a central character’s subjectivity: if we’re always seeing through that person’s eyes and feeling through his or her body, we’ll probably find ourselves totally immersed in each scene, but we won’t have enough macro-level perspective to keep tabs on who’s who and what’s at stake.  If, on the other hand, we’re given an occasional glimpse of the character’s situation from an objective and impersonal distance, or through the eyes of others, we’ll be better able to orient ourselves, and we’ll begin to feel as if we have adequate vantage to form opinions; this in turn will lead us to grant our sympathy to the central character and our investment to the story.  Makes sense, right?

(Okay, since I already mentioned Memento, I’m just gonna go ahead and cite it as an example of this phenomenon: Leonard, the protagonist, suffers from anterograde amnesia—he “can’t make new memories”—and the film imposes a roughly analogous condition on its audience by presenting most of its scenes in reverse chronological order, so we’re never sure what preceded a particular event; consequently we spend the whole movie pretty much trapped in Leonard’s point of view.  Although we’re with him for just about every frame, we never feel entirely at ease in his company—and indeed the film’s major themes and final resolution absolutely depend on maintaining this distance.  Complaints—of which there are many—that Memento is cold or overly fastidious strike me as somewhat akin to complaints that Saw is gory, or that Murder, She Wrote is repetitive: um, you think?  Leonard’s anterograde amnesia isn’t just a storytelling gimmick meant to conceal information from the audience; it’s also a fissure that the film uses to reveal limits to the basic human capacity to handle information, along with our apparent readiness to ignore and reinterpret facts that conflict with our favored personal narratives.  Against a dominant tendency in the storytelling arts to depict characters with coherent identities who move steadily toward epiphany and self-actualization, Memento—in keeping with film noir tradition—argues pretty forcefully that our most cherished notions of individual selfhood may not be significantly less contrived than the most fatuous output of Tinseltown: not by chance, then, does the movie come off as a little frosty.)

The classic nuts-and-bolts treatment of how this kind of narrative distancing functions may be the one that appears in The Art of Fiction by John Gardner; he calls it “psychic distance.”  (You’ll find a pretty good summary here.)  Gardner’s talking about fictional prose, obviously, but the grammar of film is strongly analogous to what he describes; in fact he even resorts to cinematic metaphors—close-ups, establishing shots, etc.—to make his points.  If we look at The Tree of Life with this kind of calculation in mind, I think the evidence once again suggests that Malick hasn’t blocked our sympathy for the adult Jack out of carelessness or perversity.  Although there are, for example, a ton of close-up shots all over Tree, I don’t recall many of them being of Sean Penn’s face; we typically view him from a distance (and often from above and/or behind), or he’s offscreen entirely.  In other words, Malick seems to intentionally avoid showing us the adult Jack from the midrange perspective that Gardner identifies as most apt to put us at ease and draw us in.  Malick’s aim seems to be to induce us to experience the contents of the film directly, for ourselves, rather than filtering them through the perspective of the main character.  Simply put, he wants us to feel like the film is happening to us, not to Jack.

Tree’s abundant voiceovers—probably the second-most-ridiculed aspect of the film—work in a similar way, in that their rhetoric exhibits even more pronounced restraint: they’re resolutely non-narrative, even anti-literary, leaning heavily on one-syllable words and consisting either of aggrieved questions (“Lord, why?  Where were you?  Did you know what happened?  Do you care?”) or imperative-mood prescriptions and bald assertions devoid of any argument.  (“Help each other.  Love everyone.  Every leaf.  Every ray of light.  Forgive.”)  The voiceovers take pains to offer the audience very little to analyze or interpret.  They’re so simple—so untextured and atomized—that it’s almost difficult to imagine them being written out at all: they have the character of phrases and fragments that might drift through our heads while we’re going about our daily business, preoccupied by some lingering trouble that we don’t have the time or the inclination to really sit down and work through.  (Jameson’s and Davies’ conversation includes a useful comparison of the voiceovers from The Tree of Life and Malick’s highly-regarded Days of Heaven from 1978 that throws the difference into almost excruciating relief.)

At some level, all or most serious films aspire to give their audiences something to think about; Tree, I believe, emphatically does not.  Instead, it seems to want to show us thought, to make thought visible to us, to provide us with a critical vantage on it by impeding our capacity to engage in it, to make us aware of it by taking us outside it.  But Terry, why?  What are you getting at?  Do you know how irritating this is?  Do you care?

At this point I suspect it may be helpful to take a very quick backward glance at some of Malick’s earlier films, which are largely free from the complicating specters of overt autobiography and orthodox religion.  (If you can’t get enough of this stuff and are looking for a more rigorous backward glance at the Malick oeuvre, I highly recommend this piece, by Jon Baskin at The Point.)  Here I say “I suspect” because I can’t claim a ton of authority to present this survey: I have seen just four of Malick’s five feature-length films: I—along with just about everybody else—skipped The New World from 2005, and I haven’t seen his undisputed masterpiece Days of Heaven since I was a kid.  (I believe I was home sick from school at the time, suffering an acute case of whatever the opposite of ADHD is.)

Based on what I HAVE seen, though, I feel confident in asserting that Malick has always been—and seems increasingly to be—really, really comfortable monkeywrenching the narrative progression of his films with what the audience might regard as, like, scenery: his camera will linger on amber waves of grain or dust billowing behind a distant farm truck for longer than seems necessary, or appropriate, or functional; he’ll cut away from dialogue that seems as if it could be, y’know, important in order to follow a flock of birds as it takes wing.  There seems to be a tension in these films between foreground (i.e. the story and the characters) and background (i.e. the setting and the various free motifs that emerge from it), and a constant tendency for the latter to supersede the former.  The most overt example of this is in Days of Heaven, when a quirky shot of a couple of grasshoppers hanging out on a stalk of wheat quickly escalates into the film’s tragic climax.

After Days, Malick somewhat notoriously took a twenty-year vacation from directing.  Since dude is pretty much a straight-up recluse, we don’t know why this is.  The standard Wonder Boys version of events is that he got bogged down in an unfilmable project about the origins of life in the universe (ring any bells?), had a few bad meetings in post-Star-Wars Hollywood, and decamped to Paris, where he set about becoming really mysterious and interesting as his film-world legend grew.  The awesome thing about reclusive artists is that people like me have carte blanche to come up with theories about their rationales for doing and not doing things, and that is exactly what I’m going to do now.

As great a movie as Days of Heaven is, if Malick’s goal in it is indeed to depict the world of human concerns and entanglements being overwhelmed by the inhuman, natural world, then it fails.  To be more precise, it succeeds within the fictional world of the film—nature does indeed wreck the aspirations of Sam Shepard’s farmer, and dooms the scheme of Richard Gere’s conniving fugitive—but outside that fictional world, from the perspective of the audience, invented human systems of ordering still reign supreme, particularly the system we know as narrative: the clean mechanics of Malick’s pared-down storyline would probably earn him an approving fist-bump from Sophocles.  As gorgeous and enigmatic as Days is, it’s still a spectacle, engrossing but not quite immersive: something we watch happen, but not something that happens to us.  I have a strong suspicion—and what are you gonna do, call Terry up and prove me wrong?—that Malick came away from Days of Heaven with a pretty specific next-time-I’ll list, and I’ll bet that eliminating narrative armature was riding pretty high on that list.

Twenty years and a whole lot of sitting in cafés later, Malick was back in the saddle with The Thin Red Line, an adaptation of the James Jones World War II novel costarring something like thirty percent of the bankable male actors in the Anglophone world, with the performances of an additional twenty percent left on the cutting-room floor.  (Viggo Mortensen, Gary Oldman, Mickey Rourke, Martin Sheen, and a ton of other dudes were all apparently in this thing at some point; Adrien Brody—who has expressed something akin to fury at having what was essentially a lead role edited down to a few minutes of screen time—can probably be forgiven for being upset at losing his star turn to a bunch of tropical birds and a marine crocodile.)  Line was the first Malick film I saw in a theater, and also the first one that left me wondering just what the hell the deal is with this guy: the cast list certainly led the unsuspecting filmgoer to anticipate something like A Bridge Too Far, or at any rate something less reliant on lingering shots of wind-purled island grasses.

Yet for all the actorly firepower in Line, fourteen years later I hardly recall any of the performances.  What I remember, of course, are the not-infrequent moments when the film breaks away from the war and the soldiers entirely, into what looks like an IMAX documentary about flora and fauna in the Solomon Islands.  (This tendency is paralleled by the habit of the film’s central character—played by Jesus-to-be Jim Caviezel—of going AWOL to hang out with natives and wander beatifically in the woods.)  In other words, I remember the film’s narrative less than the disruption of the narrative, the foreground less than the background.  I don’t recall enjoying these natural-world reveries—I recall being fairly irritated by them, and by Caviezel’s semi-stoned performance and goofy voiceover—but they’re what has stayed with me.

And this, I believe, is mostly because Malick never really permits a narrative arc to take shape in the film.  The emergence of such an arc would almost certainly supersede Line’s evocations of the natural world in the audience’s memories, because that’s how our brains are trained to work.  The full sensory rush of lived experience—while no doubt way cool—isn’t particularly useful to us: it’s too much data to store and to process, and therefore our tendency is always to look for “the takeaway,” to identify the braided strands of cause and effect that explain why things happen the way they happen, so we can internalize the rules and forget the specifics.  The human capacity to do this is probably one of the big factors that accounts for our near-total dominance of the planet we inhabit; it’s also precisely the tendency that Viktor Shklovsky says art should make opposing its first order of business.  The swarming locusts in Days of Heaven may evoke the terror of the sublime in viewers while they’re sitting in the theater (or at least make them take a hard look in the popcorn bucket to make sure it’s, y’know, just popcorn in there) but by the time they’re wandering the parking lot trying to find their cars, their brains have classified the swarm as a plot point, an event that brings about the film’s resolution—and maybe as a literary device, a symbol and/or a Biblical allusion—but not as a whirring, squirming, wheat-stalk-munching thing.

In The Thin Red Line, Malick takes pains not to trap himself like that again.  Images of the natural world are prominent, but the film never justifies their presence by giving them any plot function.  Furthermore, the events of the plot aren’t linked to each other by clear causal chains: much of what happens just kind of happens, analogously to the steady series of trees and vines and birds and reptiles and Guadalcanal topography that Malick places before us, all of it ultimately declining to mean anything aside from itself.  Thus the film ends up with the texture of a mosaic, or maybe a cubist painting: something lacking any illusory depth, a patchwork in which all the component elements—plot and setting, human and nature, fiction and documentary—are arranged without reference to an obvious hierarchy.  This equanimous perspective is reinforced by the first words we hear in the film, in Caviezel’s moony voiceover, words which serve as a thesis statement for Line rather more reliably than Mrs. O’Brien’s nature/grace passage does for Tree:

What is this war in the heart of nature?  Why does nature vie with itself, the land contend with the sea?

This is very likely an unattributed and grammar-shifted paraphrase of Heraclitus’s Fragment 80, written around the end of the 6th Century BCE: war is all around us, strife is justice, and all things come into being and pass away through strife.  The baldly-stated point here, obviously, is that viewers are NOT encouraged to interpret The Thin Red Line as a depiction of unspoiled nature being violated by human war, since humans and their wars are part of nature, too.

Many viewers interpret Line that way anyway, of course; many more don’t really bother to interpret it at all.  While any number of naysayers have complained that the film is slow-paced and dull—at least by war-movie standards—or that its structure is too loose, or that its themes aren’t clearly articulated, just about everybody offers grudging or enraptured praise for its visual richness, or lushness, or gorgeousness.  These words all indicate surplus: the implication being that the film contains more images than the narrative requires or can justify.  Not too many folks, however, seem to get around to asking what the significance of these surplus images is, or why Malick shot them in the first place, or why they made his final cut when Viggo Mortensen didn’t.  Although everybody remarks on them, pretty much everybody also seems comfortable regarding them as merely ornamental, or maybe just as instances of lazy synecdoche illustrating the aforementioned ostensible man-versus-nature conflict.  Although I would bet that Malick is more satisfied with the structure of Line and its balance of elements than he is with those of Days, I would also guess that the propensity of his audience to regard Line’s narrative-derailing images of the nonhuman natural world as nothing more than B-roll footage that Malick lacked the self-control to omit might persist as a source of frustration to him, one he has finally had an opportunity to address in The Tree of Life.  “If I cut away to the Big Bang and a CGI plesiosaur,” I like to imagine Malick thinking, “ain’t nobody gonna say I did it on a freaking whim.”

None of this quite gets at the why question, however—at least not in a way that’s totally convincing to me.  At this point we’ve talked about how Tree works, and what the artistic aim of deploying such filmic techniques might be, but I still feel as though we’re skating across the surface of what’s really going on.  Tomorrow, in Part Three (of three, thank god!) I’d like to try to get past Malick’s methods to talk about the values that may have given rise to them, and also to see just how much I retained from the Intro to Western Philosophy course I took from Dr. Chuck Salman back in like 1992.  Stay tuned, true believers!

*RETRACTION: As attentive link-clickers have no doubt already discerned, I owe Adam Jameson and Jeremy Davies an apology . . . I carelessly read The Tree of Life for Days of Heaven in their Big Other dialogue and mistook a reference to the latter for a reference to the former.  (Not sure how I managed that, as there are certainly plenty of trees in . . . what’s it called?  Oh yeah: The Tree [ahem] of Life.)  Their handle on Lone Star State geography should remain unbesmirched.

This of course raises grave doubts about my own capacity to carefully read and gloss complex material BUT I’M GOING TO DO IT ANYWAY!  See you in [checks watch] twelve hours!!!

Is he in heaven? Is he in hell? Where has he gone? No one can tell! (Part the First)

March 14, 2012

For a while now, the estimable Mitchell Brown has had a great post up at The Discreet Bourgeois that contrasts Terrence Malick’s depiction of time and of place in the recent and much-argued-about The Tree of Life with—dig this—John Ford’s depiction of approximately the same in his She Wore a Yellow Ribbon from 1949.  Both Mitchell’s post and She Wore a Yellow Ribbon—which I hadn’t seen until he screened it for us back in November—are very much worth your time.

I’m pretty sure The Tree of Life is, too—and that hedged “pretty sure” is basically what Mitchell’s post is about: Malick’s film is easy to admire (visually stunning, etc.) but not so easy to love, or to feel satisfied by.  Although it’s remarkable for its inventiveness, as well as for both the vastness and the specificity of its ambitions, the film ultimately feels very personal, even private, in its perspective and its rhetoric (whether it actually is or not) in a way that’s distancing for its audience.  Its successes come at the expense of engaging us on certain levels.

In his post, Mitchell does a great job of describing how John Ford goes about telling a story of similarly sweeping scope to Malick’s in such an adroit and hospitable way that his audience is barely aware of his ambitions until the theater lights have come back up.  I don’t really have much to add to Mitchell’s reading of Ford; what I’d like to do here is come at this comparison from the other direction: to talk about how The Tree of Life does and doesn’t work, and about what Malick’s filmmaking choices earn him and cost him.

In the course of arranging my thoughts on this subject, it’s become painfully clear to me that there’s just no good way to hammer this stuff out in a single post.  Thus I hereby present what I project to be Part One of a short series.  My aim—in order to avoid marooning poor Terrence Malick in New Strategies limbo with Cairo the Anti-Terror Dog and the eleven-year-old children from The Birds—is to present these posts on successive days.  We’ll see how that plays out.  Place your bets!

Okay.  What The Tree of Life sets out to do, to my way of thinking, is to depict—with maximal mimetic precision and minimal concession to narrative clarity—the response of an individual consciousness to a range of existential questions.  More specifically it’s about the efforts of the film’s protagonist, in light of his brother’s untimely death, to make sense of his life, and of the universe, and to figure out what the former and the latter have to do with each other, if anything.  The film works in approximately the opposite way that most serious dramatic films work: these films generally depict a few significant events and the reactions of a group of characters to them, and then leave it up to the audience to infer what’s going on in the characters’ heads and hearts.  Tree, on the other hand, doesn’t much care about telling us a story, but DOES want to show us exactly what its protagonist is thinking and feeling.  (One almost imagines Malick starting with a truism about film—that it can show images, but has to induce its audience to infer ideas and emotions—and then setting out to disprove it.)

As Mitchell correctly notes, many of The Tree of Life’s admirers tend to come across as dismissive of efforts to “figure out” the film, or to pin down its meaning.  The consensus among these folks seems to be that there’s really not that much to figure: Malick’s movie may be unconventional, but it’s also basically straightforward and sincere.  These fans are apt to reassure us that any puzzlement we may feel isn’t due to philistinism on our part; it merely comes from the fact that most of the standard interpretive stuff we’re accustomed to doing with “serious” movies has pretty much been done for us in this case.  What the film has to tell us, it clearly states; it doesn’t claim to have any better answers to big questions than its befuddled characters do.  Even the nuts-and-bolts content that ISN’T made clear in Tree—content that seems to have been deliberately elided or withheld, such as the precise chronology of depicted events, or some of the characters’ biographical particulars, or the precise ratio of fantasy to reality in what we’re shown—isn’t necessarily rewarding for us to puzzle through: knowing the circumstances of the brother’s death, for instance, wouldn’t exactly make us more receptive to our encounters with dinosaurs and nebulae and stuff.  If the film’s earnestness and directness leave it open to charges of self-indulgent sentimentality, well, then it’s redeemed by its sheer beauty and its evocative strangeness.  And that’s pretty much all you needed to know to fill out your Oscar ballot.

Or so go the usual pro-Tree arguments, at any rate.  Until I began working on this—initially it was just supposed to be a comment on Mitchell’s post—I was pretty much coming from the same place: I saw Tree, I basically liked it, and I felt like I, y’know, got it or whatever.  I was aware of a bunch of divergent opinions on the film—at Cannes it was famously both jeered and awarded the Palme d’Or—but I figured that all the back-and-forth basically boiled down to viewers’ varying appetites for metaphysical earnestness: if you’re cool with it, then you thought Tree was one of the best movies of the year; if you’re not, then you didn’t.  The case against the film that corresponds to the pro argument set forth above was crystallized for me by an urbane middle-aged couple who sat in front of us at the theater in Evanston where K and I saw Tree: sometime round about the eighth reel, during one of Sean Penn’s plaintive voiceovers, the gentleman leaned over to his companion and audibly muttered the words “Jesus freak.”  Shortly thereafter the couple walked out.  (I should mention for the benefit of those unfamiliar with the folkways of Chicagoland that Evanston is the kind of town where comments like “Jesus freak” are indeed intelligible as heckling.)

Although—just to be clear—the not-infrequent protestations made by certain persons of faith about their perceived oppression by the forces of secularism kind of make me want to shoot something with a gun, I still think it’s possible—possible—that we arugula-eating, pew-eschewing, art-film-watching liberal elites have gotten a little bit lazy in our viewing habits.  It could be that we’ve grown so accustomed to seeing “serious” film directors use religiosity as a quick signifier—of rooted steadiness at best, of cruel bigotry at worst, of a disinclination to doubt in any case—that when a film makes a clear sympathetic effort to convey the complex and conflicted worldviews of religious characters, our assumption tends to be that the filmmaker must share that worldview.  The protagonist of The Tree of Life grows up in an observant Catholic family in small-town Texas; as an adult he works as an architect in the city, and although he doesn’t seem overtly religious, his voiceover—which is addressed to a supreme otherworldly power—makes it apparent that he still tries to make sense of the world through the lens of faith: even if his belief in God has been shaken, his faith is his only framework for asking the questions that trouble him.  Although Terrence Malick is famously reticent regarding his private life, most watchers of Tree will know that he too was raised in Central Texas in the 1950s, and many will also know that he too lost a younger brother at an early age; therefore we can hardly help but view Tree as near-autobiography, and conclude that the perspective from which the film’s protagonist views the world is very close to Malick’s own.

I am not sure that that’s the case.  (At minimum, I strongly suspect that Malick is not a Jesus freak.)  With the benefit of Google and a few months’ hindsight, I have become convinced that I—along with many others—was a little too quick to make up my mind about The Tree of Life.  Whether they lionize it as a heart-on-its-sleeve address to the infinite, or they write it off as self-involved reverie, I think the majority of opinions I’ve heard or read about the film don’t credit it with the complexity it actually possesses, and don’t really take into account the full measure of its weirdness.  Since I’ve only seen the movie once—and since I am frankly not in a huge hurry to sit through it again—I can’t claim to be able to accomplish that full measure-taking here.  But I AM going to take a crack at arguing that, sincere though it may indeed be, it possesses more moving parts than might initially be apparent.

We might as well start with the dinosaurs.  In Tree’s most-talked-about (and certainly most-ridiculed) scene, we see what Wikipedia informs me is a young Parasaurolophus being set upon by a predatory Ornithomimus on a riverbank.  (You can watch the scene here, courtesy of the New York Times.)  The young parasaur is injured or sick; it huddles helplessly on the ground while its fellows flee the premises.  The predator scampers over in a very Jurassic-Park-velociraptory way, stomps on the parasaur’s crested head, and is clearly ready to start noshing.  Then it stops.  It lifts its foot, as if to get a better look at the parasaur’s face; the parasaur raises its head, and the predator smooshes it down again—more gently this time, as if only concerned with maintaining its control over the situation.  The predator lifts its foot again; the parasaur remains still, and the predator’s foot comes down a third time: just a tap, a touch that seems curious, exploratory, and almost—not quite—affectionate.  Unless I’m mistaken, the predator brushes a clawed toe along the parasaur’s distinctive crest, as if suddenly wondering: Just what the hell ARE these things I’ve been eating?  Then it departs across the river, leaving the young parasaur unharmed.

Am I a total dweeb for being moved by this scene?  Maybe it’s partly that NPR has been awash lately in stories of animals that exhibit capacities for cooperation and caring that seem to match (or exceed) those of humans—rats will rescue each other! Tom Brokaw concludes a long interview with a portentous story about elk!—but thinking about the scene now, I find I’m MORE affected than I was when I actually watched it.

For those who haven’t seen Tree, it may be helpful to describe—and for those who have, it may be helpful to recall—the context in which Malick presents the dinosaur scene: it appears in the midst of a condensed history of, um, everything, starting with the Big Bang and passing (briskly by cosmological standards, unhurriedly by cinematic ones) through the formation of the solar system and the earth, the appearance of increasingly complex organisms, and their migration from the oceans onto the land.  (This sequence is introduced by the voiced-over interrogatory of the film’s protagonist, Jack O’Brien, played as an adult by Sean Penn and as a child by Hunter McCracken.  As grown-up Jack’s ruminations on his brother’s death lead him to imagine the vastness of time and space, Malick shows us time and space—or shows us Jack’s imaginings of them, at any rate.)  Even during the dramatic events of the dinosaur scene, there are strong reminders that this episode is only a flicker in a sequence—let’s not call it a story—with a beginning and an end that vanish into infinity: the dinosaurs’ encounter, we notice, is underlain by the constant sound of the river beside them, and the earth beneath them is covered by stones that have been worn conspicuously smooth by that river.  After the dinosaur scene ends, the next thing we’re shown is an asteroid striking the earth, presumably dropping the curtain of extinction on the two players we just finished watching, along with the rest of their kind.

To my way of thinking, the peculiar encounter between the Parasaurolophus and the Ornithomimus is—and kind of HAS to be—the most important scene in The Tree of Life: the key (well, certainly a key) to everything else that Malick shows us.  But what are we, as viewers, supposed to do with it, exactly?

We should note that what’s confounding about the scene isn’t that it’s all that difficult to interpret.  Most viewers will pick up pretty quickly on the fact that what we’ve just witnessed contradicts—or at least complicates—a certain declaration that has been quoted by many if not most reviewers of Tree, and that I recall as the first major assertion we hear made in the film:

The nuns taught us there are two ways through life: the way of nature and the way of grace.  You have to choose which one you’ll follow.  Grace doesn’t try to please itself.  Accepts being slighted, forgotten, disliked.  Accepts insults and injuries.  Nature only wants to please itself.  Get others to please it too.  Likes to lord it over them.  To have its own way.  It finds reasons to be unhappy when all the world is shining around it.  And love is smiling through all things.  The nuns taught us that no one who loves the way of grace ever comes to a bad end.

We hear this spoken in the voice of Jack’s mother; much of it, come to find out, is an unattributed paraphrase of Book 3, Chapter 54 of The Imitation of Christ by Thomas à Kempis—which makes it a quotation (Mrs. O’Brien) of a quotation (“the nuns”) of a quotation (Thomas) filtered in turn through the adult Jack’s recollections.  Thus the film is interposing something like five reportative layers between us and the content of the statement, a fact that many incautious commenters have tended not to pick up on.

To be sure, the nature/grace dichotomy is handy for charting how Jack understands his parents’ personalities—i.e. Brad Pitt’s stern Mr. O’Brien = nature while Jessica Chastain’s gentle Mrs. O’Brien = grace—as well as the internal tensions that make Jack who he is.  (The validity of this interpretive schema seems to be at least semi-confirmed by the adult Jack’s late-in-the-film voiceover: “Mother.  Father.  Always you wrestle inside me.  Always you will.”)  But the dinosaur scene serves up a pretty clear signal that the nuns’ assertion is at least somewhat out of whack: you’d have to work pretty hard to find a critter that’s a purer product of nature, redder in tooth and claw, and less, y’know, christlike than a predatory bipedal dinosaur, and yet the film presents us with the spectacle of just such a beast acting against what we have to assume are its best interests when it mercifully passes up an easy meal.  (Unless of course the helpless parasaur is totally infected with like listeria or something—in which case, clever girl!—but I don’t think that’s the most fruitful reading of the scene.)

The dinosaurs’ encounter indicates that manifestations of what the nuns call “grace” are present in nature, pretty much right from the starting whistle.  In fact, Tree seems to suggest that the kind of jerk-ass behavior that the nuns—in imitation of Thomas’s Imitation—ascribe to “nature” may be uniquely human, or at least arise from particularly human existential circumstances.  Another interpretive connection that most viewers will make pretty quickly: the predatory dinosaur’s apparently motiveless sparing of its prey is mirrored (and inverted) by the scene late in the film in which the young Jack convinces his younger brother R.L.—the guitarist brother whose death at age nineteen is presented as the film’s central problem—to place his fingertip over the barrel of a skyward-aimed BB gun; young Jack then pulls the trigger.  As instances of wanton cruelty go, this is a pretty good one; the scene also reinforces the family schema that’s developing along the nature/grace axis: mild, artistic R.L. is clearly his mother’s child, while Jack, to his own chagrin, takes after his dad.  (“I’m as bad as you are,” the young Jack says to his father at one point; “I’m more like you than her.”  See also his blunt, effective paraphrase of Romans 7:15 slightly earlier in the film: “What I want to do, I can’t do.  I do what I hate.”)  And of course the association of the doomed R.L. with his mother and with what the nuns call “grace” serves to further erode the validity of the passage quoted above, particularly its closing statement that “no one who loves the way of grace ever comes to a bad end.”  The failure of this statement to be true is, in a nutshell, what the film is about; the context in which the assertion is made—the problematic opposition of nature and grace—is the key to how the film works.

So the dinosaurs, like I said, aren’t difficult to interpret.  What really provokes all the strong reactions to the scene—the eye-rolling, the snickering, the irritation, the bafflement—is that they’re difficult to justify.  Years and years of narrative works that genuflect to Aristotelian unities have trained us to expect that stories will limit themselves to depicting only as many times and places as are absolutely necessary; The Tree of Life doesn’t so much throw these unities out the window as shoot them from a cannon.  (During the creation-of-the-world sequence, I couldn’t help thinking of that old Bloom County strip where Bill the Cat has come back from the dead and all the tearful celebrations are captured in a soaring and widening crane shot that ends up showing the entire earth from orbit: “TOO WIDE!  And too damned silly!”)

By broad unscientific consensus, the dinosaur scene is the most memorable one in the film—it’s the one everybody wants to talk about around the water-cooler—and this is surely not an accident.  (I mean, I doubt very much that Malick has been complaining to his therapist about how he made a beautiful movie about faith and family but all anybody wants to talk about is the bit set in the Late Cretaceous; I’m pretty sure dude knew what he was doing.)  The dinosaurs are memorable precisely because they’re so flummoxing: they represent the riskiest moment in the film, the moment at which Malick lays out his cards and more or less demands to know whether the audience is with him or not.

That’s probably enough for today’s installment; we’ll return to the terrible lizards—and what they’re doing in this movie—tomorrow.  Don’t touch that dial!

A huge translation of hypocrisy, / vilely compiled, profound simplicity.

November 18, 2011

So—have you seen Anonymous yet?  Do you plan to?

Yeah, me neither.  I did, however, read Stephen Marche’s “Riff” on the movie in the New York Times Magazine a few weeks ago, and I recommend that you do the same.  It’s pretty entertaining, and more importantly it does a good job of articulating what’s objectionable about Anonymous—which is not just that it promotes nonsense about the plays of Shakespeare, but that it doesn’t seem to give much of a damn whether the case it argues against the historical record is accurate or not.

One minor quibble with Marche: at one point he refers to the Oxfordian quasi-scholars (who hold that Edward de Vere was the “real” Shakespeare) as “the prophets of truthiness”—and though he’s absolutely right to evoke truthiness in the context of Anonymous, I think his aim is a little off.  Truthiness isn’t being perpetrated, exactly, by the committedly snobbish Oxfordians, whose attempts to braid the stems of a few cherry-picked facts seem quaint and almost respectable by contrast to the film their research has inspired: wrongheaded though they may be, the Oxfordians hold their ground when they’re called out.

Anonymous, by contrast, DOES traffic in something like truthiness—it may be more accurate to call it bullshit—in that its makers are perfectly content, and indeed prefer, to lob irresponsible assertions and then fall back with a shrug, like drunk revelers who shoot pistols in the air at a carnival and melt into the crowd when all the shouting and running starts.  Pressed on his film’s fast-and-looseness by NPR’s Renée Montagne, screenwriter John Orloff tap-dances a little about how, y’know, the Shakespeare plays themselves stretch the truth for the sake of a good story—as if depicting Richard III as a hunchback is comparable to suggesting that his reign was entirely the invention of Polydore Vergil—and then says the following:

At the end of the day, what we’re really doing is having a question about art and politics, and the process of creativity, and where does it come from—and THAT’S what the movie’s about.  It’s not about who wrote these plays.  It’s about, how does art survive and exist in our society?

Montagne only has a few seconds left, but she doesn’t let this pass: um, of COURSE the movie is about who wrote the freaking plays; to suggest otherwise is absurd.  Among the jawdropping qualities of Orloff’s vacuous statement—and they are many: I mean, are we seriously supposed to accept that a film that assumes that a regular guy from Stratford couldn’t have had the skillset or the résumé to produce great literature is really about “the process of creativity”? or that the best way to show “how art survives and exists in our society” is by means of a byzantine conspiracy yarn set in late-Elizabethan England?—surely the worst is the maddening implication that Orloff himself may not be totally convinced of the veracity of his own film’s premise, and indeed may not have thought about it all that much.  It’s just material to this dude: it’s a pitch, a tagline, designed to stir things up; it possesses and aspires to no more substance than the hook of a pop hit.  What if I told you that Shakespeare never wrote a single word!  Eh?  You see what I did there?  Do I have your attention?

But where, one might legitimately ask, is the harm?  Does Anonymous really cheapen the culture when the culture is already, y’know, pretty cheap?  Does it really make us dumber than we already are?

Yeah, actually, I think it does.  Sure, one can (and many will) argue that Anonymous is a net win for Shakespeare (or whomever) and also for the literate culture at large because it will prompt new and closer readership of the work, in much the same way that Dan Brown sent tens of thousands of fresh-minted armchair art historians into museum gift shops and onto the internet in search of Leonardo’s Last Supper.  (This seems to be the position that Sony Pictures is taking, promoting the film with a classroom study guide ostensibly intended “to encourage critical thinking by challenging students to examine the theories about the authorship of Shakespeare’s works and to formulate their own opinions,” which sounds just excruciatingly fair-’n’-balanced to me.)  Sorry, but I just don’t buy this argument.  Yeah, maybe there’s value to motivating attention toward works that have become inert from neglect or (and?) over-familiarity, but I see approximately zero evidence that the works of Shakespeare have been neglected of late, nor any sign that their reception has grown inert.  To the extent that Anonymous introduces the works of the Bard to a new audience, it introduces those works not as artifacts of a superlatively imaginative human consciousness using language to engage an audience, to curry political favor, to struggle with major questions of existence, to earn cash and prestige, and to tell an ascendant nation complex stories about itself, but rather as an already-cracked code: an enormous crossword puzzle with all the letters already filled in and all the clues missing.  Now, don’t get me wrong, I can be as postmodern as the next guy when it comes to issues of interpretation . . . but I also try to be pragmatic on such questions, and the ultimate rubric in a case like this one is probably whether the works being interpreted become more interesting or less interesting when viewed through the lens of the theory under consideration.  The Oxfordian hypothesis does not perform well on this particular racetrack, and Anonymous can barely roll itself out of the pit.

Okay, then, lit snob (one might also legitimately ask): since you brought up Dan Brown, how is Anonymous any worse than The Da Vinci Code?  Or, for that matter, worse than JFK?  Now that the guy who directed Independence Day is crapping on your precious Shakespeare you’re urging the troops to the battlements, but where were you when those other conspiracies were getting mongered, huh?  You’re saying Anonymous is different somehow?

Yup, pretty much: different and worse, for a number of reasons.  There is, first of all, the matter of Anonymous’s target selection.  While Dan Brown and Oliver Stone seek to encourage suspicion of powerful and entrenched institutions that have earned close scrutiny and that frankly can afford to take the hit—Roman Christianity and the American military-industrial complex, respectively—I’m not sure what corrupt institution Anonymous aims to disinfect with daylight.  No matter how you felt while trying to memorize Romeo’s But, soft! what light through yonder window breaks speech back in middle school, Shakespeare isn’t really oppressing anybody.  Opulent and hidebound though it may be, the British monarchy isn’t exactly being propped up by the plays of Shakespeare.  And if the villain here is supposed to be the academic establishment—I’m imagining a scrapped preview trailer set at the MLA Conference, featuring cloaked adjunct professors darkly muttering stuff like our secrets must be preserved! in the incense-befogged corridors of a Midwestern convention center—well, to me that seems a little like bullying the skinny, bespectacled dork on the playground.

I tend to roll my eyes at tales like Brown’s and Stone’s, but they don’t make me nervous.  When conspiracy stories start accusing groups that are relatively powerless in practical terms of hiding the truth, or perpetrating hoaxes, or exerting occult and improper influence over the unsuspecting rabble, then I feel obliged to start clearing my throat.  If the implicated groups are made up of harmless managerial-class dissenters like Shakespeare scholars (or for that matter environmental activists, the favored late-90s-post-Communist-pre-terrorist bogeymen of contemptible hacks like Tom Clancy and Michael Crichton: a particularly hilarious premise for those of us who actually know environmental activists, and understand that they’re unlikely to accomplish a successful mass mailing, never mind world domination), then I think it’s sufficient to simply mock the offending conspiracy yarn, the way I’m mocking Anonymous now.  When the alleged conspirators are groups that are broadly disempowered, and are defined by ethnicity, religion, gender identity, national origin, etc. . . . well, then it’s not really funny anymore, is it?  I’m not remotely suggesting that Anonymous is guilty of that—but the irresponsibility of its approach is the same kind of irresponsibility.

There is also, with Anonymous, the problem of narrative mode: i.e. the manner in which screenwriter Orloff and director Roland Emmerich choose to present their tale.  More specifically, Anonymous dispenses with a frame narrative—which is to say the story of de Vere’s conspiracy is the only story it has to tell.  Terrible though it may be, The Da Vinci Code doesn’t suffer this shortcoming: Dan Brown has the good sense to keep his pseudohistorical esoterica strictly confined to his book’s backstory, while all the action in the present involves the unmistakably fictional adventures of his Indiana-Holmesian protagonist.  (In other words, while Mary Magdalene—understood by Christians to have been an actual person—is a key figure in the book’s plot, she is not a character in the book.)  Although JFK is played in a far more urgent and sincere key, we should note that it too uses a frame narrative: it’s presented (at least initially) not as the story of a conspiracy to murder President Kennedy, but of the efforts of Orleans Parish District Attorney Jim Garrison to uncover and prove that conspiracy.  The closest thing to a frame narrative Anonymous has, however, is the what-if-I-told-you teaser prologue spoken by Derek Jacobi.  (Who also played the non-diegetic narrating Chorus in Kenneth Branagh’s Henry V!  And who’s also an outspoken Oxfordian!  See?  Evidence accumulates!)

This absence of a frame narrative might not seem like a big deal, but it is.  In a conspiracy yarn, the use of a frame accomplishes a couple of important things: first, it creates a point of entry for the audience, a detective character who’s (almost) as ignorant of the conspiracy as we are, and who’ll walk us through it as she or he figures it all out.  While this character’s constant exclamations of eureka! and/or this goes all the way to the top! may grow tiresome, they perform the useful function of signposting the audience’s own reception of the conspiracy as it unravels.  Second, a frame narrative allows some critical daylight to creep between the story and the peculiar theories that make up its content.  This fictional ambiguity—an ambiguity that’s present in the audience’s experience of the narrative, not the kind that materializes outside it, when some jackass screenwriter backpedals in an interview—has the effect of critically engaging us, and making us work to get out in front of the story.  Is this conspiracy just something that the main characters believe in, or is it literally true in the world of the narrative?  Is this story pulling our legs, or do its makers really expect us to buy into this stuff?

Thanks to its frame, The Da Vinci Code can function effectively even for readers who aren’t prepared to get on board with its swipe at orthodox Christianity; I think we can safely assume that the 80 million people who own a copy haven’t wholeheartedly embraced the gnostic gospels.  In purely functional terms, Dan Brown’s conspiracies are MacGuffins; the linchpin of his plot could be the Golden Fleece or the toothbrush of Odin as easily as the Holy Grail.  JFK—which is both more overheated and more serious about what it’s doing—handles its subject conspiracy in roughly the opposite way: as it becomes clearer and clearer that not only the fictionalized Jim Garrison but also the film itself both really believe and really want us to believe that the historical JFK assassination was in reality a vast antidemocratic plot, the Garrison narrative begins to recede and collapse (along with Costner’s accent—zing!), and the illusionistic continuity of the fictional world is repeatedly ruptured.

Oliver Stone famously characterized JFK as a “counter-myth” to the official account, which it really isn't: it's a fictional depiction of the Clay Shaw trial that gradually transforms into a polemical documentary heavy on speculative reenactments.  The term “counter-myth” could be better applied to the all-but-frameless Anonymous, which doesn't bother to depict the uncovering of the Oxfordian conspiracy, but only the conspiracy itself.  Anonymous doesn't argue for de Vere's authorship of the plays, nor does it display any understanding that that's the sort of thing that might need to be argued for; instead it just presents de Vere's authorship as part of a series of plot events, which may or may not correspond to some extra-filmic historical record.  Were I convinced of Anonymous's sincerity, I'd be inclined to regard it in sort of the same way I do contemporary Christian music—i.e. I'm not especially eager to entertain its initial assumptions, I'm irritated that it seems to assume that I am, and I'm therefore unable to set those considerations aside and just enjoy the craftsmanship of the product, such as it is—but Anonymous is not sincere.  And were I convinced of its insincerity—if I thought its aim was just to make a little mischief with history and literature, in a manner akin to that of, say, Shakespeare in Love (which fills in gaps in the Bard's scant biography without coloring outside the lines), or monumental goofs like Abraham Lincoln, Vampire Hunter, or even an honest-to-god masterpiece like Brad Neely's (NSFW!) “George Washington”—well, then I could muster some respect for it as harmless entertainment, something clearly not intended to mislead or confuse anybody.

But with Anonymous, of course, sincerity and insincerity aren’t even on the table.  It doesn’t make sense to ask whether the film really believes the story it’s telling, because the film itself doesn’t know and doesn’t care.  In keeping with its extraordinary lack of narrative and rhetorical ambition, its only goals are functional: it wants the extra jolt of adrenal seriousness that comes from rooting its story in supposed real-world events, but it’s unwilling to surrender the freedom to make stuff up . . . and its bookkeeping regarding what among its contents is factual versus speculative versus full-on fictional appears to be slapdash and/or nonexistent.

That being said, I might still be willing to let Anonymous pass in silence if irresponsibility were not so central to its project.  I thought, for instance, that Braveheart was pretty stupid on the whole, but at least its tagline was “Every man dies, not every man really lives,” rather than “What if William Wallace was the illegitimate father of Edward III?”  I guess I'm also bugged by the fact that unlike other conspiracy potboilers, which generally take historical events as their raw material, Anonymous is a dramatic work that wants to function as a gloss on another body of dramatic work, essentially laying its cuckoo's egg in the well-feathered Shakespearean nest; this amounts not to boldness but to laziness.  The move also seems to position Anonymous as a successor to, and possibly a substitute for, the plays of Shakespeare: all that gnarly iambic pentameter sure is tough to parse, but thanks to Anonymous we now know that it's just a bunch of coded propaganda intended to sway contemporary court intrigues, and we can comfortably interpret it—and dismiss it—as such.

Parting shot, disguised as a clarification: in the foregoing paragraphs I have made comparisons between Anonymous and conspiracy stories like The Da Vinci Code and JFK to the detriment of the former; I hope I have been clear that my intention has NOT been to endorse the latter.  Conspiracies are the potato chips of the narrative food pyramid: very appealing, fleetingly satisfying, and nutritionally void.  Unless a particular conspiracy story is robustly, voluptuously fictional—willfully impossible to take seriously, and therefore more concerned with the nature of truth and knowledge than with whatever cabal or plot it posits (I'm thinking here of a family of works that includes The Crying of Lot 49 and Foucault's Pendulum, Twin Peaks and The X-Files)—then its contribution to the culture is probably a net negative.  Even if their intentions are honorable, such stories always encourage us to think of history as something to which we're spectators: a few of us can rattle off all the players' stats from memory, while most of us spend the whole game trying to flag down a beer vendor, but all of us are stuck in the stands while the real action happens on the field.

This is not an accurate or a productive way to understand the world.  The course of history is not generally set by small groups of scheming individuals, but rather by enormous impersonal institutions; we are not passive subjects, but active and implicated (if individually powerless, and generally unthinking) participants.  Power is invisible, sure enough, but it doesn’t maintain its invisibility by hiding; it doesn’t have to.  We willfully avert our eyes from it, or we fail to see it when we’re looking right at it.

In a piece that appeared in Z in 2004, Michael Albert does an admirable job of explaining the appeal and the limitations of conspiracy theory; he also presents an instructive contrast between it and what’s often called institutional analysis.  I suspect—and hope—that his piece has been widely circulated among the campers in Zuccotti Park, McPherson Square, Frank Ogawa Plaza, and Seattle Central Community College as they collectively plan their next move.  The Occupy protestors’ habit of identifying a particular human sin (i.e. greed) and/or a small group of individuals (i.e. the One Percent) as the perpetrators of our present international crisis has been rhetorically effective, but it’s kind of a philosophical dead end: sure, there are indeed scoundrels out there with a lot to answer for, but rather than heating up the pine tar and gathering the feathers, now seems like a good time to focus on the complex unmonitored systems that empowered and encouraged those scoundrels, and maybe even to try fostering some kind of broad and serious national conversation about the way we assign value to things.  To pick up that dropped baseball metaphor, rather than dissecting the weaknesses of the visiting team, or speculating about who’s been doping, it may be time to consider reassessing the rulebook and redesigning the ballpark.

If we’re to stand any kind of chance of doing that successfully—of doing much of anything successfully—we’ll have to cultivate and safeguard our capacity to sort through facts.  And by facts I mean, y’know, facts: independently verifiable data about conditions and circumstances, causes and effects.  You can’t make policy without facts; not honestly, anyway.  Twenty-odd years of a pretty-much-constantly ballooning economy proved to be a golden era for postmodern ideologues both left and right (although the pomo left mostly seems to have used its rhetorical chops for seducing impressionable undergrads, while the pomo right mostly used theirs to, like, invade Iraq and stuff), and made this notion easy to deny or forget.  The assumption always seemed to be that the value of ideas, just like everything else, is best proved in the consumer marketplace: what’s true is what polls best, what goes viral, what pulls in the best ratings, and to suggest otherwise was to reveal oneself as a member of the pathetically outmoded “reality-based community.”  (That’s a shot at Dubya, of course, but Clinton governed more or less the same way.)

And this cavalier indifference to facts, of course, brings me back to Anonymous.  What's most troublesome about the movie isn't that it's a 130-minute-long lie; it's that it doesn't even bother to lie: unlike the Oxfordians, who make their feeble case by means of citation, quotation, and coincidence, Anonymous aims to convince via bald-faced assertion amped up with fancy production design and ample CGI.  Anonymous seems to suggest—even to declare, by means of its de-Vere-as-Shakespeare-as-propagandist premise—that this is how ALL history is made: not by contest or argument, but by spin and obfuscation and special effect.  This thesis might not always be mistaken, but it should never be regarded as acceptable.

Wait wait wait wait, some of you are now saying.  You seriously just blew 3500 words trash-talking a movie you admit you HAVEN’T SEEN?  If you haven’t seen it, how do you know it’s bad?  To which I’ll respond—just as I have said in the past—that whether the movie is any good has nothing to do with the point I’m making.  I’m not saying Anonymous is BAD.  I’m saying it’s EVIL.  Dig?

In other, me-related news, the latest victim of my short-fiction campaign is Joyland, the awesome web-based literary magazine published by Emily Schultz and Brian Joseph Davis.  If you aren't familiar with it, Joyland is to my knowledge unique among litmags in that it publishes work from throughout North America, but is curated regionally by several geographically dispersed editors.  Since I'm currently a Chicagoan, my story falls in the domain of Midwest editor Charles MacLeod, and it seems appropriate to thank him for encouraging me to get off my ass and send him something.  (K and I know Charles from our extended honeymoon in Provincetown.)

The story of mine that's up at Joyland is called “Seven Names for Missing Cats”; it came about while I was in grad school in early 2004.  I was studying at the time with the novelist Jane Alison, who is a genius of the highest order and who is very good at fostering an atmosphere that's conducive—at least it was for me—to reassessing the core principles underlying whatever it is you think you've been doing, writer-wise.  With “Missing Cats,” the object of the game was to write a story that omitted as many standard narrative operations as possible, or anyway left them up to the reader to create, or infer; it's probably the thing I've written that I am most happy with, and I'm grateful to Joyland for giving it a home.
