How Has the Erosion of Tenure Affected Literary Scholarship?

Last fall the AAUP published data revealing that, as of 2016, 73% of instructional positions in higher education were off the tenure track. Even at R1 institutions, only 22% of faculty were tenured, with an additional 8% on the tenure track. Furthermore, the percentage of tenured and tenure-track faculty has dwindled over time as universities have turned increasingly to part-time labor.

Most literary scholars are well aware of the adjunctification of the professoriate. But how does it shape the literary scholarship and criticism that we write?

The most obvious point regards quantity. According to a “publish-or-perish” market ideology, the best strategy for untenured scholars seeking tenured positions is to secure as many reputable publications as possible. Anyone with an interest in “keeping up” with their field knows that there’s no shortage of new scholarship!

But this leads to a more interesting question: What happens when the great majority of published scholarship is written by untenured scholars?

This has long been a well-known problem for academic journal editors. Once scholars get tenure—and especially if they attain the status of full professor—they have little professional obligation or even incentive to subject their work to the often-unpleasant process of anonymous reader reports. Untenured scholars, meanwhile, and especially those on the job market, aggressively submit to journals because the blind peer-review process is especially valued by hiring committees. The scholarly “article,” therefore, is a genre dominated by writers without the kind of comprehensive field knowledge that comes with decades of research experience.

Due to the ever-increasing numbers of untenured scholars striving to publish journal articles, the competition to find a home for one’s work in a top-tier venue is severe. This means that a “modest” contribution to knowledge in one’s field—e.g. a clever close reading of a lesser-known poem or short story—is unlikely to appeal to editors. The “stakes” of one’s argument are expected to be high. Compelling articles nowadays offer revolutionary challenges to established fields and introduce new conceptual terms applicable to scholars across fields and disciplines.

The proliferation of articles by junior scholars claiming to revolutionize or reconceptualize the profession isn’t really a problem. Far more worrisome is the suggestion that decades of study—the digestion of literally thousands of books—bears little “market value” in the arena of scholarly publication. To what extent should the production of good scholarship require dozens of years of devoted reading?

Without rigorously connecting all the dots, I might simply point to two scholarly trends that have risen alongside the growth of the untenured professoriate.

The first is New Historicism, particularly the aspect of it that celebrates the revelations afforded by a very specifically cultivated cultural-archival context (e.g. what farm equipment manuals from the 1880s can show us about regionalist fiction)—bold and arresting conclusions, in other words, that can be drawn from temporally manageable research inquiries. This approach is of course very different from the less manageable “Old Historicist” approach that privileged an authorial context: If you want to write an article on Nathaniel Hawthorne, you begin by reading all of Hawthorne’s writings and all extant scholarship on Hawthorne.

The second (and more recent) trend is Computational Literary Studies. As Nan Da’s recent takedown makes clear, the promotion of a computer program that can swiftly “analyze” a corpus of a thousand novels should be balanced against the fact that an English professor reading two novels per week could get through a thousand novels in a decade—more slowly than a computer, certainly, but the point is that a tenured professor can peruse a remarkable quantity of literature during his or her career.
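(For the back-of-the-envelope arithmetic, assuming roughly fifty reading weeks per year: 2 novels/week × 50 weeks/year × 10 years = 1,000 novels.)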

In other words, the percentage of scholarship by tenured professors doesn’t just determine the level of “academic freedom” that can be applied to critical arguments. It also determines the kinds of research that seem reasonable—or even possible—for scholarly work.

The Black Hole of Humor

At an anniversary screening of his 2006 movie Idiocracy, director Mike Judge was asked—as he often is—about the prophetic nature of the film. At what point did he realize that he had made a documentary rather than a satire?

Judge responded with a story about how he realized this while he was filming—long before the movie was actually released. One of the film’s quick jokes is Ass, a movie released in twenty-sixth-century America that sweeps the Oscars. Ass is a 90-minute closeup of a man’s naked buttocks intermittently farting. For Idiocracy, Judge had to shoot part of this fake ass movie and then screen it in a theater full of extras. He had earlier done some location scouting at a reform school, and he was impressed by the fact that the students there looked (in his words) “kinda stupid.” So he went back and hired them as extras for a day.

When it was time to shoot, Judge became nervous. He found himself nearly alone in a room with 250 “juvenile delinquents,” and he needed them to take direction. Basically, he needed them to laugh—to act as if they thought a film of a naked ass farting was hilarious. But his worries soon shifted. The kids, it turned out, really did find the ass hilarious. They laughed continuously, on and on, without encouragement.

Judge wondered whether his entire film project was a waste of time and money. He could just release Ass instead and have a hit on his hands. These doubts would only be amplified by the fact that Idiocracy performed poorly with test audiences.

I love this story less for the punchline (they loved the farting ass!) than for Judge’s confusion. It’s a real problem—how do you successfully mock humor? Aziz Ansari had a similar struggle. In the 2009 movie Funny People, he played an obnoxious hack comedian named Randy, who turned out to be the funniest part of the film. Randy, in fact, was arguably a funnier stand-up comedian than Ansari himself. As a 2010 New Yorker profile noted, “In Funny People, it’s supposed to be obvious that Ansari’s character is a hack, but it’s just as obvious that the character is funny—and, if it’s possible for a comedian to be bad and funny at the same time, then who cares about badness? And if a hack comedian is a guy who will do anything for a laugh—well, what, exactly, is wrong with that?”

In “The Road, Part 2,” the very last episode of the TV series Louie, Louis C.K. finds himself bombing at a comedy club in Oklahoma. His opening act, a man named Kenny, kills by lighting his farts on fire, while Louie’s observations of the New York City subway system go nowhere. To make things worse, Louie is stuck sharing a condo with Kenny, and they find each other insufferable. Eventually they confront each other, Kenny calling Louie a snob and Louie calling Kenny a hack (an especially offensive insult for comics). But when Kenny tells Louie to “look me in the eye and tell me farts aren’t funny,” Louie breaks down and starts crying. He can’t do it. “Every fart is funny,” Louie admits.

This is the conclusion of the entire five-season run of Louie. At the end of the day, farts are funny. This is the terrible truth at the center of all comedy—the black hole of humor. Witty bon mots, clever ripostes, trenchant observations, complex situational fiascos—all of these are functionally unnecessary. “Pull my finger” will do just as well.

In his conversations with other comics in Comedians in Cars Getting Coffee, Jerry Seinfeld comes across as a big believer in the idea that stand-up comedy is radically meritocratic—either you get laughs or you don’t, and those who can reliably get them deserve fame and fortune. “It’s the hardest job in the world,” he says to Tracy Morgan. Anyone who’s ever stood on stage alone with a silent—or distracted—audience would likely agree. But what if it’s not that hard to make people laugh? What if it’s depressingly easy? What if success in humor—and, by extension, in arts and entertainment broadly—has little to do with talent and hard work?

These sorts of doubts form a fascinating backside to the business. What if the vast majority of your labor and energy is simply misplaced? What if the work you’re most proud of is only distantly related to any success you enjoy?

Dickinson’s Dashes

Emily Dickinson’s dashes are the most famous punctuation marks in all of American literature. And they are famously ambiguous – “among the most widely contested diacriticals in the modern literary canon,” notes Ena Jung.

Scholars don’t simply debate what the dashes mean; they debate what the dashes are. How do you reprint Dickinson’s idiosyncratic handwriting using a standard typeface? Do you go with the short hyphen (-), the medium en-dash (–), or the long em-dash (—)? Adam O’Fallon Price observes that this particular style of punctuation is so frequently associated with Dickinson that the “em” in “em-dash” might stand for “Emily”!
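For the digitally curious, the three candidate marks are distinct Unicode characters, so every keyboarded edition must commit to one of three code points. A minimal Python check (assuming nothing beyond the standard library):

    import unicodedata

    # The three candidates for rendering Dickinson's mark in standard type.
    for mark in ["-", "\u2013", "\u2014"]:
        print(f"U+{ord(mark):04X}  {unicodedata.name(mark)}")

This prints U+002D HYPHEN-MINUS, U+2013 EN DASH, and U+2014 EM DASH: three different characters, and arguably three subtly different Dickinsons.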

When Mabel Loomis Todd and Thomas Wentworth Higginson first collected and published some of Dickinson’s poems in 1890, they omitted most of the dashes, replacing some with commas. For the few they retained, the publisher chose the longer em-dash, a choice that was naturally repeated as Dickinson’s poems were increasingly anthologized in the first half of the twentieth century.

The em-dash is what is most generally thought of as a “dash.” But “are the marks we refer to, for convenience, as Dickinson’s dashes truly conventional dashes?” wonders Mary Loeffelholz. Scholars have generally preferred shorter marks. Thomas Johnson’s monumental 1955 edition of The Poems of Emily Dickinson went with the medium-length en-dash (my own preference). Ralph W. Franklin’s authoritative 1998 edition of The Poems of Emily Dickinson shrank them further by using mere hyphens. (And now the online Emily Dickinson Archive makes it easy to view the poet’s original manuscripts, potentially eliminating the need for standard typography.)

Such a shortening is not without its effects. “Has our experience of Dickinson’s writing altered, if subliminally, with these changes?” asks Loeffelholz. Are we reading a different Dickinson?

The Dickinson dash might be read merely as a cue for a pause – a breath – a hesitation. But many understand it to signify more powerfully. “The dash sensitizes the reader’s reactions, activates the responsive reservoirs of the reader,” wrote David Porter back in 1966. It is “a graphic representation in the poem of the presence of the creative impulse, of the spontaneity of the emotional force that went into the composition.”

For Paul Crumbley, Dickinson’s many different dashes – penned with different lengths and different angles of slant – likewise reach out to readers and suggest multiple possibilities of discourse. “The dash suggests that disjunction, to Dickinson, is one of the defining characteristics of the self in language.” Deirdre Fagan, however, counters that the dash – which appears more often than any single word in Dickinson’s poetry – represents “the unutterable” itself. “The dash is silent,” she writes, but “the potency of the dash remains, nonetheless, and becomes, cataclysmically and without words, emotion both expressed and unexpressed.”

I like both of these views, Crumbley’s and Fagan’s, and I’ve seen them expressed beautifully in two recent texts. Ben Lerner takes Fagan’s side, referring to Dickinson’s dashes as “markers of the limits of the actual, vectors of implication where no words will do.” For what it’s worth, Gavin Jones, echoing Crumbley, offers my favorite articulation, insisting that the dash is “a prostrate ‘I,’ the punctuation mark of a fallen and discontinuous self.” The little tippler – lying prone in the gutter!

 

Sources:

Crumbley, Paul. Inflections of the Pen: Dash and Voice in Emily Dickinson. UP of Kentucky, 1997.

Fagan, Deirdre. “Emily Dickinson’s Unutterable Word.” The Emily Dickinson Journal 14.2 (2005): 70–75.

Jones, Gavin. Failure and the American Writer: A Literary History. Cambridge UP, 2014.

Jung, Ena. “The Breath of Emily Dickinson’s Dashes.” The Emily Dickinson Journal 24.2 (2015): 1–23.

Lerner, Ben. The Hatred of Poetry. Farrar, Straus and Giroux, 2016.

Loeffelholz, Mary. The Value of Emily Dickinson. Cambridge UP, 2016.

Porter, David T. The Art of Emily Dickinson’s Early Poetry. Harvard UP, 1966.

Price, Adam O’Fallon. “Regarding the Em Dash.” The Millions (4 January 2018).

Tuition vs. Information

The word tuition always seems to signify a fee rather than the act of instruction. For the latter, we tend to use the word teaching. But I think that tuition is a very useful word, especially as the neoliberal digitization of teaching (especially teaching in the humanities) suggests that what professors primarily offer to students is information.

Tuition comes from the Latin tueri, which means to watch, to guard, to preserve. It refers to the action of a guardian—someone who’s looking out for the best interests of students. An emphasis on such tuition can of course suggest a parental role for professors that risks infantilizing students. But a greater focus on observation and preservation seems to me necessary for upholding the value of a liberal arts education. The responsibility and the vocation of teaching are threatened by the Silicon Valley discourse of “content delivery”—the suggestion that a good teacher is someone who transfers packets of information to students efficiently and satisfyingly.

A good teacher is a tutor, not a lecturer or an informer. To reduce teaching to content delivery is to take the human element out of the humanities—to suggest, in other words, that students need data rather than guidance. The very performance of interpersonal conduct—especially in the acts of listening and responding to questions—seems, to me, essential to my job as a college professor.

Tuition requires sympathy. The best chapter of James Banner and Harold Cannon’s book The Elements of Teaching is titled “Compassion.” “Anyone contemplating teaching as a profession should consider compassion as a measure of suitability,” argue Banner and Cannon. To fail to bear the ethical obligations of the profession “means in effect to be not a teacher but someone who merely offers information.”

I often spend too much time preparing lecture notes, placing too much faith in the value of facts uttered into the atmosphere. The most valuable education is not one that bestows knowledge upon a student but rather one that leads a student forth into knowledge. (“Give a man a fish….”)

Information is cheap, and students living in an Information Age would do well to keep this in mind. If you’re going to spend thousands of dollars on tuition, don’t accept content delivery in return.

The Real Thing

Realist fiction is an oxymoron. Realist fiction is basically anti-fiction. It is fiction that has internalized doubts and anxieties about fictionality. It is the closest to the truth that one can get while remaining fundamentally untrue. Realism is neither real nor unreal. It’s realish. Like Stephen Colbert’s truthiness, realism appeals not to actual reality but to felt reality—realishness.

Realism is part genre and part mode. It is a mode that becomes a genre whenever it takes over an entire text. Because of this relationship between its genre and its mode, realism is a genre that insists on its own metrics of evaluation. A realist text is bad if it is unrealistic; it is good if it is true to life—a life understood to be fairly predictable. Its enemy is pernicious illusion, which it aims to dispel (if indeed it aims to do anything at all).

Realism is often therefore defined negatively—whatever it is, it is most certainly not romantic, melodramatic, fantastic, sentimental, sensational, or modernist. It is often considered “conventional,” but it resists the categories of “genre fiction”—mystery, western, science fiction, etc. Realism is neither didactic nor inspirational; it’s all mirror and no lamp.

Realism can be “political” in the sense that it can feature “political” material. (Henry Adams’s 1880 novel, Democracy, might be a good example here.) It can certainly make readers upset about real problems in the real world. But it generally does so by attempting to narrate the status quo. And because it is fundamentally fiction and not journalism, it always leaves itself open to charges of levity: if it were “serious” about its politics, it wouldn’t have to make up anything.

Uncle Tom’s Cabin is a prime example of this problem. Harriet Beecher Stowe claimed to be using the “allurements of fiction” to achieve the correct “sympathetic influence.” Her descriptions of slavery, as she insisted in A Key to Uncle Tom’s Cabin, were all “true”—that is, based on factual occurrences. These facts were merely robed in fiction in order to elicit a more powerful sympathetic response than factual material could elicit on its own.

But of course Uncle Tom’s Cabin, because of this very appeal to sympathy—because of its melodramatic pulling of the heartstrings—has also been criticized as other than realist. There seems to be an implied understanding here that realism is or should be essentially unexciting. At its most provocative, it merely provokes contemplation (not action) and is therefore an intensely private genre. Donal Harris can thus label mainstream literary realism as “quietist realism.” (One could argue that this is the default mode of the writer of demographic privilege.) Its acolytes tend to promote the sublimity of boredom—the profundity of a not-too-meaningful life. (Roland Barthes thus thought that the “reality effect” in literature was achieved by including a series of meaningless details that in no way contribute to a text’s plot—because real life itself generally either lacks meaning or means in superabundance.)

We might thus identify two poles of literary realism: individual realism and social realism. Because it seems to be philosophically opposed to idealism, realist fiction often portrays a solid social reality that resists the solipsism of individual rumination. (It owes more, perhaps, to Hobbes than to Montaigne.) Rather than record the idiosyncrasies of personal experience, realistic novels, notes Amy Kaplan, “attempt to imagine and contain social change.” Along these same lines, Joe Cleary has questioned the preference in postcolonial studies for modernist literary aesthetics (e.g. “magical” realism, another oxymoron), arguing instead in favor of “realist-associated conceptual categories such as historical transition, class consciousness, and totality.”

When realism takes a tack toward individual rather than social reality (as it often does in postwar American literature), it tends to require an unexceptional protagonist. It needs, in other words, an Everyman (like Harry “Rabbit” Angstrom) who might represent a very large number of real people. Furthermore, this Everyman should have, in lieu of an exciting adventure, an “everyday” life. (Contra Aristotle, character evokes real life better than plot.) The very normalcy of the events narrated challenges the singularity of narration itself, which tends to insist that its events occurred “once” upon a time. For Fredric Jameson, realism thus lies at the crossroads of destiny and the “eternal present.” This dialectic is perhaps best represented by what Gérard Genette called iterative narration—the difference between “it once happened that” and “it usually happened that.” This difference between the singular occurrence and the everyday phenomenon privileges the mundane, the ordinary, and the quotidian over the exotic, the extraordinary, and the unusual.

Teaching Jones Very

In 1838, Harvard graduate Jones Very (pronounced JO-nǝs VEE-ry) went around Salem telling people that he was the Second Coming of Christ. This faux pas secured him a two-month stay at the McLean Asylum for the Insane. (Bronson Alcott remarked that Very was “diswitted in the contemplation of the holiness of Divinity.”) Upon his release, he began displaying hundreds of sonnets he was writing—or rather, that God was writing through him. It was a tremendous burst of literary productivity; he was, writes Helen Deese, a “dynamo of poetic energy.” Elizabeth Palmer Peabody recommended Very’s work to Ralph Waldo Emerson, who served as the literary agent and anonymous editor of his Essays and Poems (Boston: Little and Brown, 1839).

Very’s sonnets can be interesting texts for the classroom. Many of them, such as “The New Birth,” feature the “born again” rhetoric that structures so much early American literature. Very’s conviction that one becomes a new, different person upon spiritual rebirth led him into some grammatical vagaries. How to distinguish between the you you are and the you you will be? “’Tis to yourself I speak; you cannot know / Him whom I call in speaking such an one, / For thou beneath the earth lie buried low, / Which he alone as living walks upon,” he claims in the aptly titled sonnet “Yourself.” Perhaps this is what Margaret Fuller had in mind when she generously referred to his “elasticity of spirit.”

In the 1830s and 1840s, Very’s poems gained some notoriety because they were potentially blasphemous. A sonnet such as “I Am the Way” suggested that Very still thought he was Christ reincarnate. (And because Very believed, at the least, that his poems were divinely inspired, he insisted that even slight revisions to his manuscripts would constitute an affront to God—to which a bemused Emerson supposedly replied that the Holy Spirit should have a better command of spelling and grammar!) In his best sonnets, such as “The Created,” Very’s syntax troubles the distinction between the poet, the reader, and God: “Behold! as day by day the spirit grows, / Thou see’st by inward light things hid before; / Till what God is, thyself, his image, shows.” That last line artfully effects the transcendentalist dictum that divinity might be found within the self. As Lawrence Buell notes, “The alternation between divine, prophetic, and human voices” in Very’s poems has “a provocatively disorienting effect on the reader.”

In the twentieth century, some scholars found Very’s voice to be in harmony with Modernist sensibilities. Yvor Winters was largely responsible for bringing Very’s name back into consideration. Winters championed “The Created” especially and maintained that Very’s transcendentalism was more authentically experienced than Emerson’s. F. O. Matthiessen promoted some of Very’s sonnets such as “The Dead,” which bears a noticeable thematic similarity to T. S. Eliot’s “The Hollow Men.”

For today’s audience, Very’s emphasis on self-annihilation (as opposed to Emerson’s self-reliance) is probably his most interesting trait. The opening lines of his poem “Thy Better Self”—“I am thy other self, what thou wilt be, / When thou art I, the one see’st now”—anticipate Rimbaud’s far more famous “Je est un autre.” Very took the “death of the author” as seriously as any other nineteenth-century poet. He firmly believed that he had died to the world. Subjectivity is strangely both overconfident and terrifically unstable in his sonnets.

If nothing else, Very is worth a moment for his character alone, which was a mass of contradictions. According to biographer Edward Gittleman, “He was an irritating mixture of brilliance and absurdity, profundity and simplicity, piety and blasphemy, equanimity and anxiety, excitement and dullness, innocence and guilt, humility and messianic delusion.” David Dowling more succinctly diagnoses Very with a “messianic persecution complex.” His work is a fascinating facet of American transcendentalism—and an extreme example of American poetry writ large.

 

Select Scholarship:

Yvor Winters, “Jones Very and R. W. Emerson: Aspects of New England Mysticism,” Maule’s Curse: Seven Studies in the History of American Obscurantism (New Directions, 1938), 123–146

William Irving Bartlett, Jones Very: Emerson’s “Brave Saint” (Duke UP, 1942)

Edward Gittleman, Jones Very: The Effective Years, 1833–1840 (Columbia UP, 1967)

Lawrence Buell, “Transcendental Egoism in Very and Whitman,” Literary Transcendentalism: Style and Vision in the American Renaissance (Cornell UP, 1973), 312–330

David Robinson, “The Exemplary Self and the Transcendent Self in the Poetry of Jones Very,” ESQ 24 (1978): 206–214

Helen R. Deese, ed., Jones Very: The Complete Poems (U of Georgia P, 1993)

Sarah Turner Clayton, The Angelic Sins of Jones Very (Peter Lang, 1999)

Benjamin Reiss, “Emerson’s Close Encounters with Madness,” Theaters of Madness: Insane Asylums and Nineteenth-Century American Culture (U of Chicago P, 2008), 103–141

David Dowling, “Jones Very: A Poet’s Zeal,” Emerson’s Protégés: Mentoring and Marketing Transcendentalism’s Future (Yale UP, 2014), 206–237

Cui Bono? American Beneficiaries

“In order that England may live in comparative comfort, a hundred million Indians must live on the verge of starvation,” wrote George Orwell in 1937. This simple observation forms the basis for Bruce Robbins’s impressive new book, The Beneficiary (Duke UP, 2017). Robbins is interested in what he calls “the discourse of the beneficiary,” in which those who benefit from global inequalities explain to other privileged consumers that their comforts and commodities indirectly cause great harm to people in other countries—that they are in some way responsible for foreign poverty.

Since this realization comes about with the rise of global capitalism, its expression only begins to be discernible in the works of eighteenth-century writers such as Adam Smith. And it is only with twentieth-century figures such as Orwell that this zero-sum game between haves and have-nots in separate nations becomes a somewhat common notion.

I found this work to be particularly valuable for thinking about antebellum American literature. In “Man the Reformer,” a lecture offered to the Mechanics’ Apprentices’ Library Association in 1841, Ralph Waldo Emerson reflects on the “derelictions and abuses” rampant in commercial trade. But he doesn’t think that anyone is innocent. “We are all implicated of course in this charge,” he maintains. “It is only necessary to ask a few questions as to the progress of the articles of commerce from the fields where they grew, to our houses, to become aware that we eat and drink and wear perjury and fraud in a hundred commodities.” This is a systemic problem, not merely the result of a few unscrupulous actors. “Every body partakes,” he adds, “yet none feels himself accountable.”

Robbins calls this “commodity recognition”—becoming aware of the complex histories of our commodities’ production. (What really goes into your cup of coffee?) This recognition was clearly a terrible problem for Emerson, forcing him to consider the ethics of a complete withdrawal from civil society. And while he doesn’t go quite this far (though his acolyte Thoreau perhaps did), he advocates “a certain rigor and privation” in one’s habits in order to discourage “the taste for luxury.” There may be a moral impetus to adopt a bread-and-water diet.

Ultimately, Emerson preached continual consideration: “We must clear ourselves each one by the interrogation, whether we have earned our bread to-day by the hearty contribution of our energies to the common benefit; and we must not cease to tend to the correction of flagrant wrongs, by laying one stone aright every day.” In other words, work hard to make sure that everyone is a beneficiary.

The Hegemony of Close Reading

Jonathan Kramnick’s recent essay “The Interdisciplinary Fallacy” makes a strong case for disciplinarity—specifically, that English Departments have a unique perspective to offer, different from the perspectives offered by other academic departments. Worrying that the radical edge of interdisciplinarity is a kind of scientific reductionism (a single scientific worldview, fundamentally physical, unifying the university—i.e. Edward O. Wilson’s “consilience”), Kramnick sees the long-standing trends in English Departments toward Cultural Studies and New Historicism as threats to disciplinary coherence.

Kramnick fights back against these historicist tendencies by championing “close reading,” which is “the skill that establishes the baseline of competence for work in literary studies.” Close reading, he says, is “our grounding practice,” the analytical form that structures the discipline. And it must be defended against those (he cites Mary Poovey and Clifford Siskin) who are bent on “getting rid of close reading.” (One could also imagine Franco Moretti’s “distant reading” and Stephen Best and Sharon Marcus’s “surface reading” as paradigmatic threats.)

Kramnick isn’t alone in his insistence that close reading is what makes English English. Jane Gallop has likewise suggested that the precarious “fate of close reading” is the “fate of literary studies.” “We became a discipline,” Gallop notes (gesturing to the replacement of the Old Historicism by the New Criticism), “when we stopped being amateur historians and became instead painstaking close readers.”

I too like close reading, but I resist its fetishization. I worry that privileging close reading as the ur-skill to be acquired by every ideal English major risks the kind of reductionism that worries Kramnick in the first place. Why should English Departments (literary critics and scholars) be committed to one single professional technique? I prefer to think of English Departments as fostering a literary criticism that features three tools working in tandem (including close reading).

I tell my own students that when I grade their papers I look for the 3 C’s: Curating, Connecting, and Close Reading.

Curating chiefly regards quotations and citations. What are the most significant moments in a long text? An essay implicitly answers this question with the specific passages to which it gestures. The best literary critics tend to have a keen curatorial eye—able to pick out the choicest phrases, the deepest sentences, the strangest scenes. They seem to find gems wherever they scan.

Connecting involves linking separate passages, either in a single or in multiple texts. (“Only connect!” cries Forster.) Erudite critics see parallels: sometimes cause-and-effect relationships due to one author’s influence over another, sometimes shared themes among authors responding to similar cultural cues. Attentive readers can write not just of unique moments but of common ones—repeated phrases, stylistic tics, formulaic tendencies. And such readers can track developments in character, plot, and theme across various places in a text.

Close Reading, finally, is a third tool for criticism. In its “classic” form, dating back to the midcentury New Critics such as Cleanth Brooks, close reading takes an apparently simple expression and reveals it to be irreducibly complex—an intricately ambiguous or ironic superposition that resists hermeneutic collapse (not unlike the wave function of quantum mechanics). This is a valuable task (so the thinking goes) because real life is irreducibly complex; as Lionel Trilling reflected in 1949 (writing in the aftermath of the Second World War), “The world is a complex and unexpected and terrible place which is not always to be understood by the mind as we use it in our everyday tasks.” Close reading allows us to appreciate how the most talented authors can pack multi-layered meanings into the briefest spaces. The most valuable passages cannot be simplified or summarized or paraphrased; rather, they mean in abundance.

I’m happy to celebrate close reading as an excellent technique, but I’m wary of claiming that the entire discipline of literary study rests on it. Close reading in a vacuum merely insists that a thing is never what it at first seems, a curious tenet by which to live. Curating and connecting strike me as equally important—and necessarily related—skills. They can be grasped by novices, but they can also be honed to a professional expertise. Curating key passages makes possible connecting and close reading; connecting lends coherence to literary texts (otherwise liable to dissolve into scattered fragments); close reading allows one to discern value in literary style. They work best when they work together.

Bill Murray’s Liberal Arts

February 2nd has become the Feast Day of Saint Bill. Every year, this date promises an airing of Bill Murray’s now-classic 1993 film Groundhog Day. For aficionados of the liberal arts, it’s a day for rejoicing. More than any other popular movie, Groundhog Day offers a powerful defense of the humanities.

One specific scene is the key to this strange film, in which Murray’s character—Pittsburgh weatherman Phil Connors—inexplicably and inescapably relives the same day over and over again. While having a beer in a bowling alley with two Punxsutawney locals, Phil turns to one and says, “What would you do if you were stuck in one place, and every day was exactly the same, and nothing that you did mattered?” The local, a man named Ralph, replies, “That about sums it up for me.” (The scene is especially central to the Broadway musical adaptation.)

Here the film tips its hand, exposing the allegory structuring the supernatural fantasy. Rather than a bizarre paranormal prank outside the realm of actual possibility, Groundhog Day is about the mundane nature of all our lives. Everyone knows this feeling of being trapped in an endless recursion—stuck in one place, every day the same, nothing you do matters. So what do you do about it?

Suicide is Phil’s initial solution, but when that repeatedly fails, he drifts toward a liberal education. He begins with the study of French poetry in a botched attempt to seduce his producer, Rita (played by Andie MacDowell). But while Baudelaire’s verses fail to deliver the desired goods, they succeed as a hook. Phil starts reading for himself—British Romanticism, Russian Realism, whatever moves him. He takes up piano and practices Rachmaninoff. He learns to sculpt. It’s been estimated that Phil spends between eight and thirty-four years reliving the same day over and over. And he uses those years to work on himself, undertaking an education that yields impressive results.

One could claim that Groundhog Day is simply a reboot of Scrooged (1988), in which Murray’s Frank Cross has a change of heart after being visited by three Christmas ghosts. But Phil Connors is no simple Scrooge. By the end of the film, Phil hasn’t just transformed from grouchy to generous; he’s become more poetic and eloquent. His spheres of perception and influence expand tremendously. To the admiration of bystanders, he records the following conclusion to his news footage of the holiday ceremony:

“When Chekhov saw the long winter, he saw a winter bleak and dark and bereft of hope. Yet we know that winter is just another step in the cycle of life. But standing here among the people of Punxsutawney and basking in the warmth of their hearths and hearts, I couldn’t imagine a better fate than a long and lustrous winter.”

Instead of a change of heart overnight (as is all too often the case in ninety-minute films), Phil experiences a growth of mind over years. He’s not just better, he’s more, displaying the fruits of a larger and more comprehensive soul.

In the evening of what, unbeknownst to him, will be his final Groundhog Day, Phil claims that it’s been the best day of his life. The claim is doubly ironic: first, because at this point Groundhog Day is his life; and second, because he initially saw the day as a disaster. Nothing in the world has changed; the social and natural environment remains exactly the same throughout the film. But Phil has changed, and that has made all the difference. His emergence into a new morning at the end of the movie is less an escape than a graduation.

In a fascinating reevaluation of a film about reevaluations, Roger Ebert reappraised Groundhog Day a dozen years after its release. Having initially written a lukewarm review in 1993, in 2005 Ebert hailed it as one of the all-time greats in film history. In his reconsideration, he highlighted the heightened aesthetic capacities of the transformed Phil Connors: “There is a moment when Phil tells Rita, ‘When you stand in the snow, you look like an angel.’ The point is not that he has come to love Rita. It is that he has learned to see the angel.” In other words, the appreciation of arts and culture is just as important as the passion of a human relationship. The film is ultimately not a romantic but a Romantic comedy—a sweeping affirmation of the powerful human spark within all of us that can swell in might and magnitude if properly fostered.

Bill Murray has thus become quite a spokesman for the liberal arts. And why not? This is an actor who parlayed the success of Ghostbusters into an unprofitable adaptation of Somerset Maugham’s The Razor’s Edge—then put his career on hold to study philosophy at the Sorbonne. Phil Connors similarly becomes a Murrayesque dilettante while caught in his repetitive limbo. Instead of honing his on-camera skills in hopes of a more lucrative meteorological job in the Big Apple, Phil develops his humanity in pursuit of a more fulfilling life. Perhaps it should come as no surprise that, in 2009, eminent humanities professor Stanley Fish claimed that Groundhog Day was the best American movie to appear in the last thirty years. As Fish notes, the film depicts “a self-help project that takes forever.” If there’s a better motto for a liberal education, I can’t think of one.