Poe and Wolfe

Building off of my post about modern Frogpondians yesterday, at the same time that I started studying Poe’s life more deeply—especially his letters and criticism—I also read more of the late great Tom Wolfe’s journalistic monographs. These include From Bauhaus to Our House, a takedown of modern architecture; The Painted Word, a similar treatment of modern art; and Radical Chic and Mau-Mauing the Flak Catchers, a brilliant pair of essays about high-minded left-wing activism and its distance from grungy reality.

With all of these assorted things floating around in my mind I realized one day that Poe and Wolfe took a similar glee in identifying and attacking cliques. Both objected to the self-identified, self-satisfied, and self-righteous cognoscenti who have found a way to dominate a particular field and enforce an orthodoxy, all while feathering their nests and basking in a success lauded primarily by the clique’s own members, whom they treat as the only people who count. Poe had Longfellow and Emerson, Wolfe had Mailer, Updike, and Irving. And Leonard Bernstein. And Le Corbusier. And…

And once I noticed this similarity, I noticed others. I’ve kicked this idea around with a few of y’all in conversation, but wanted to get some of this down in writing. Consider the following notes toward a comparative study of Poe and Wolfe:

  • Both were Southerners

  • Both were Virginians specifically—Wolfe by birth, Poe by rearing and explicit self-identification

  • Both worked primarily in big northeastern cities

  • Both were accounted personally charming and gentlemanly despite their acid literary criticism

  • Both worked in journalism and fiction—Poe considered himself a poet who worked for magazines to (barely) make ends meet, while Wolfe was a successful journalist who moved into fiction mid-career

  • Both, in rejecting and attacking the dominant literary cliques, made themselves political outsiders, though neither was particularly interested in politics except as an epiphenomenon of something more important

  • Both had an intense concern for authenticity in fiction

  • Both developed immediately identifiable styles intended to convey something more truthful than the dominant style at the time

  • Both were mocked for their style

I’ve returned to this and thought about it a lot, especially since realizing that the similarities are not just biographical but thematic.

The regional dimension, especially in Poe’s case, is too easily overlooked, but I think it’s fundamental to understanding both men. Back in the spring I watched Radical Wolfe, an excellent recent documentary on Wolfe’s life and career that I meant to review here but never found the time to. I recommend it. It doesn’t cover Wolfe’s youth and education in detail, but the sense of Wolfe as a Southerner amused by the unquestioned pretensions of the Yankees in the society he was forced to keep from Yale onwards comes through clearly. It certainly resonated with me.

And now, after mentally connecting Wolfe with Poe, I have to wonder whether the man in black, whom we are so used to imagining with a far-off gaze and a tired frown, used to wander the streets of New York and Philadelphia with a small, wry smile on his face the way the man in white did.

Speaking of Wolfe, Joel Miller recently posted about the delicate art of book cover design, beginning with the recent news that Picador is reprinting thirteen of Wolfe’s books with new matching covers. I’m not crazy about the cover art, personally, but my Wolfe shelf is a jumble of different trim sizes and if I can someday tidy that up and Wolfe can experience a much-deserved posthumous resurgence, all the better.

More writing advice from Lewis

Years and years ago I collected lists of writing advice from three authors—CS Lewis, George Orwell, and Elmore Leonard—and shared them here, both for my own reference and for anyone else who might benefit from them. The Lewis advice came from two separate sources, a letter from the 1950s and his final interview in 1963, and amounted to eight interrelated points about clarity and precision.

This morning I came across the following, from a 1959 letter to an American schoolgirl collected in Letters of CS Lewis. I own this book, so I don’t know how I’ve missed this set of writing advice before, especially since it may be the best and most systematic that I’ve seen from Lewis. I reproduce it here in full:

It is very hard to give any general advice about writing. Here’s my attempt.

(1) Turn off the Radio.

(2) Read all the good books you can, and avoid nearly all magazines.

(3) Always write (and read) with the ear, not the eye. You shd. hear every sentence you write as if it was being read aloud or spoken. If it does not sound nice, try again.

(4) Write about what really interests you, whether it is real things or imaginary things, and nothing else. (Notice this means that if you are interested only in writing you will never be a writer, because you will have nothing to write about. . . .)

(5) Take great pains to be clear. Remember that though you start by knowing what you mean, the reader doesn’t, and a single ill-chosen word may lead him to a total misunderstanding. In a story it is terribly easy just to forget that you have not told the reader something that he wants to know—the whole picture is so clear in your own mind that you forget that it isn’t the same in his.

(6) When you give up a bit of work don’t (unless it is hopelessly bad) throw it away. Put it in a drawer. It may come in useful later. Much of my best work, or what I think my best, is the re-writing of things begun and abandoned years earlier.

(7) Don’t use a typewriter. The noise will destroy your sense of rhythm, which still needs years of training.

(8) Be sure you know the meaning (or meanings) of every word you use.

Excellent, generally applicable advice. I’d say his attempt succeeds. A few of my own glosses:

#1 is a good caution against technological or media distraction. Most writing advice from the last few years includes something about staying off Twitter or putting your phone in another room while writing. Same principle.

#2 is evergreen but perhaps even more important now thanks to the exponential proliferation of trash reading material on the internet. AI-generated textual “content” will only aggravate the problem. Read a wide variety of old books.

Speaking of Elmore Leonard, he’s a good illustration of #3. His dialogue always sounds natural and his third-person narration is so effortlessly conversational that one is not conscious, after a while, of reading it. Great writers can achieve this effect in a variety of ways, not necessarily Leonard’s.

My worst experience with #5 is simply leaving a detail out. Attentive readers of Griswoldville might note the word musketoon in the glossary at the back, though the word appears nowhere in the novel. Well, it was supposed to. One character, a cavalryman who encounters the narrator just before the climactic battle, rests a musketoon on his thigh in my mental picture of the scene, but that detail either never made it onto paper or was trimmed and never reinserted in a better place. Fortunately the omission doesn’t hurt the scene, but it has always bothered me—and cautioned me to make sure I know which details I’ve actually written.

This is where revision and having other people read your manuscript proves most helpful. When writing The Snipers, I had a clear, concrete picture of all of its locations in my head, but I didn’t effectively describe all of them on paper. JP Burten (whose second novel has just come out, by the way) pointed out that the geography of one early scene was totally unclear. I worked hard to fix that, and it strengthened that scene.

#8 has been on my mind a lot recently thanks to YouTube. Listening to—rather than watching—a lot of aspiring YouTube documentarians (I have specifically American YouTubers in mind) has made me wonder whether they know how English works or what words mean. Malapropisms abound. Most often they misuse words as they strain to sound more serious and intellectual than necessary. Basic attention to meaning is sacrificed for a pretentious (or portentous) tone. Which becomes self-defeating, in the manner of Michael Scott trying to use big words.

The mercenary aspect of seeking views by producing videos on the same handful of sensational stories—how many Dyatlov Pass documentaries does a man need?—also plays a role. Per #4, someone who isn’t interested in material for its own sake will not take the care over it that Lewis’s advice requires.

Dramatic irony and plot contrivance

Bates and Anna in Downton Abbey and Abby in Blood Simple

Last night RedLetterMedia posted their review of the first season of “The Acolyte,” the latest Star Wars show. I have no interest whatsoever in watching “The Acolyte” but in the course of Mike and Jay’s discussion Jay specifically critiques it for an overused storytelling technique:

One of my least favorite plot contrivances that’s used for, like, lazy screenwriting is the misunderstanding and the not explaining to a character what is going on because the plot demands it. . . . the lazy contrivance of not knowing all the information and not being told the information because if you were then there would be no story.

I should say “misused” rather than “overused.” What Jay is describing is dramatic irony, a form of literary irony in which the audience knows more than the characters do. This can create tension and pathos as characters ignorant of the full significance of their own actions carry on, ignorant not only of what they’re doing but of the consequences they will face later. Shakespearian and especially Greek tragedy are rich in dramatic irony, as are modern horror and suspense movies—as exemplified by Hitchcock’s famous example of the bomb under the table.

But dramatic irony, as Jay suggests, becomes a plot contrivance when the ignorance of the characters is maintained unnaturally. The best example I can think of is “Downton Abbey.” Earlier seasons of the show produce dramatic irony much more organically, but as the show goes on and becomes more obviously a high-toned soap opera, characters get into increasingly melodramatic situations and increasingly refuse to talk to each other about them. Most of the show’s problems could be resolved with one conversation, a conversation the characters will not have.*

This is particularly the case with any plot involving Mr Bates, whose aloof taciturnity is taken to a ridiculous extreme when he is accused of murder—among other things. He has numerous opportunities simply to explain to someone else what is going on and why he is acting in the way that he is, and he doesn’t.** Over and over, “Downton Abbey” prolongs the drama artificially in exactly the same way.

For dramatic irony done not just well but brilliantly, watch Blood Simple, the Coen brothers’ first film. The film has four primary characters: Abby, the young wife of a shady nightclub owner; Marty, the husband; Ray, a friend with whom Abby begins an affair; and Loren Visser, a private detective. Briefly, Marty hires Visser to look into what Abby and Ray are up to and, when he finds out, pays Visser to kill them. Visser double-crosses and shoots Marty, and Ray discovers the body.

Without giving too much away, as the rest of the movie unfolds:

  • Ray thinks that Abby killed Marty (she didn’t) and decides that he has to cover for her

  • Abby thinks Marty is still alive and out to get her (he isn’t) and decides to fight back

  • Visser thinks he has gotten away with his crime (he hasn’t) until he realizes he has left evidence in Marty’s nightclub, which he thinks Ray and Abby have (they don’t), and decides to eliminate them to cover his tracks

All of the characters operate in ignorance of the whole picture—with the possible exception of Visser, who makes mistakes despite knowing more than Ray and Abby—and make their decisions based on what they think they know, which is often wrong. This ignorance continues right up until the final lines of the film, following a climactic confrontation in which the two surviving characters can’t see each other. And it is unbearably suspenseful rather than, like “Downton Abbey” or “The Acolyte,” merely frustrating.

Dramatic irony is a powerful device, and it’s a shame it isn’t better used. Writers hoping to create tension in their stories through the ignorance or misperceptions of their characters would do well to revisit a movie like Blood Simple, some of Elmore Leonard’s crime fiction, or, even better, go back to Aeschylus and Sophocles.

* This is, I think, part of what makes Maggie Smith’s Dowager Countess such a breath of fresh air whenever she appears, as she actually says what she means.

** My wife and I refer to these as “Shan’t” moments, as in “I could resolve this with a simple explanation, but—” turning one’s head away, “shan’t.”

On artistic innovations that don’t make art better

For years now I’ve wanted to write a blog post about the Coke machines on my college’s campus. They’re sleek, modern, and high tech, with WiFi-integrated chip card readers and LED lights and a system of robotic conveyor belts that whisk your drink out of the rack to dispense it in a rotating receptacle with its own recessed lighting.

They also don’t work very well. All that innovation has resulted in more points of failure than the rudimentary, purely mechanical Coke machines I grew up with. One of those might occasionally have eaten your change. These can break in at least twenty ways. I’ve counted. Technological sophistication has actually made the machines worse for their original purpose.

Here’s a quotation I’ve been meaning to share for a while, a passage from poet Dana Gioia’s essay “Notes on the New Formalism,” which was published in 1999 but that I first ran across last year:

These young poets have grown up in a literary culture so removed from the predominantly oral traditions of metrical verse that they can no longer hear it accurately. Their training in reading and writing has been overwhelmingly visual not aural, and they have never learned to hear the musical design a poem executes. For them poems exist as words on a page rather than sounds in the mouth and ear. While they have often analyzed poems, they have rarely memorized and recited them. Nor have they studied and learned poems by heart in foreign languages where sound patterns are more obvious to nonnative speakers. Their often extensive critical training in textual analysis never included scansion, and their knowledge of even the fundamentals of prosody is haphazard (though theory is less important in practice than mastering the craft of versification). Consequently, they have neither much practical nor theoretical training in the way sounds are organized as poetry. Ironically this very lack of training makes them deaf to their own ineptitude. Full of confidence, they rely on instincts they have never developed. Magisterially they take liberties with forms whose rudimentary principles they misconstrue. Every poem reveals some basic confusion about its own medium. Some misconceptions ultimately prove profitable for art. Not this one.

The failures of both modern poetry and modern Coke machines stem from a fundamental misapprehension of their purpose—what they’re for, how they’re supposed to work. The basics are neglected. Not for nothing does the phrase mechanical failure apply in both instances. What you end up with is pointless sophistication (cf. Jacques Barzun’s definition of “decadence”) that often doesn’t even work.

A silly comparison, probably, but one that is broadly applicable.

Scruton on what children can teach us about art

From the late Sir Roger Scruton’s documentary “Why Beauty Matters”:

 
Art needs creativity, and creativity is about sharing. It is a call to others to see the world as the artist sees it. That is why we find beauty in the naïve art of children. Children are not giving us ideas in the place of creative images, nor are they wallowing in ugliness. They are trying to affirm the world as they see it and to share what they feel. Something of the child’s pure delight in creation survives in every true work of art.
— Sir Roger Scruton
 

Scruton makes this aside as a point of contrast with modern art—which is intentionally insular, confrontational, transgressive, and over-intellectual if not ideological—but in doing so he makes a broader point about what art is and what it’s for. This description of children’s art is also honestly and accurately observed.

I’ve thought of this passage many times over the last few weeks, ever since my eldest son eagerly presented me with a picture he had drawn. It was a pencil and highlighter drawing that showed me holding my youngest son at the dinner table—a picture of his dad and one of his little brothers. It was drawn from life without my noticing, and the joy he took both in drawing it and in giving it to me, the joy in and care taken over the details, including the stubble of my beard, and the simple, straightforward, honest love in the picture itself have stuck with me. My kids have drawn many things for me, but this one in particular struck me as a clear example of Scruton’s “pure delight” in “sharing.”

Last week I tacked it to the wall of my office at school. May any art I create be motivated as purely as my son’s.

“Why Beauty Matters” is worth your while, as I wrote here almost four years ago following Scruton’s death. You can watch the whole thing on Vimeo here.

Scruton on style

Last week I revisited the late Sir Roger Scruton’s Beauty: A Very Short Introduction via audiobook on my commute. It’s an excellent precis of much that is fundamental to his thinking and, true to the subtitle, a wide-ranging introduction to many topics that bear further thought. Here’s one.

From a discussion of the role proportion plays in the creation of vernacular architectures by launching the builder on “a path of discovery” to what “fits” and is “suitable” for each detail in relation to the others in Chapter 4, “Everyday Beauty”:

One result of this process of matching is a visual vocabulary: by using identical mouldings in door and window, for example, the visual match becomes easier to recognize and to accept. Another result is what is loosely described as style—the repeated use of shapes, contours, materials and so on, their adaptation to special uses, and the search for a repertoire of visual gestures.

I like the idea of a style as mastery of a discipline’s “repertoire,” the selective, purposeful use of a shared vocabulary. Scruton’s example is architectural, but he also refers throughout the book to painting, sculpture, cinema, and most especially music. My mind naturally suggested literary style, with its literal shared vocabulary and the many effects and fine shades of meaning that a firm control of English can yield.

Scruton himself raises the idea of control as a component of style in the next chapter, “Artistic Beauty”:

True artists control their subject-matter, in order that our response to it should be their doing, not ours. One way of exerting this control is through style . . . Style is not exhibited only by art: indeed, as I argued in the last chapter, it is natural to us, part of the aesthetics of everyday life, through which we arrange our environment and place it in significant relation to ourselves. Flair in dressing, for example, which is not the same as an insistent originality, consists rather in the ability to turn a shared repertoire in a personal direction, so that a single character is revealed in each of them. That is what we mean by style, and by the ‘stylishness’ that comes about when style over-reaches itself and becomes the dominant factor in a person’s dress.

The tension between originality and a common vocabulary and the need for balance is an important topic and one Scruton returns to later in the book, but he continues by introducing another consideration:

Styles can resemble each other, and contain large overlapping idioms—like the styles of Haydn and Mozart or Coleridge and Wordsworth. Or they might be unique, like the style of Van Gogh, so that anyone who shares the repertoire is seen as a mere copier or pasticheur, and not as an artist with a style of his own. Our tendency to think in this way has something to do with our sense of human integrity: the unique style is one that has identified a unique human being, whose personality is entirely objectified in his work.

This passage in particular offers a lot for the writer to think about. Every writer has heroes and idols and role models, other writers whose control over their work has influenced our own technique, consciously or not. This starts young. It’s been more than twenty years since I read Stephen King’s On Writing, but I still remember and think often about this passage:

You may find yourself adopting a style you find particularly exciting, and there’s nothing wrong with that. When I read Ray Bradbury as a kid, I wrote like Ray Bradbury—everything green and wondrous and seen through a lens smeared with the grease of nostalgia. When I read James M Cain, everything I wrote came out clipped and stripped and hard-boiled. When I read Lovecraft, my prose became luxurious and Byzantine.

All of which is, for King, a crucial developmental stage in the writer’s life, one that should be refined through constant reading and writing, so that eventually one is no longer writing in imitation but in “one’s own style.”

But when I’m aware of what I’m doing and working hard at it, particularly in order to achieve a certain specific effect—so that, per Scruton, the reader’s response will be my doing, not his—it’s hard not to become anxious that I’m working merely in pastiche or even accidental parody. Have I sacrificed my integrity to sound like someone else? Inconsistency doesn’t help. I’ve worried more about this on some projects than others. Why am I confident that I can use tricks learned from Charles Portis but not those from Cormac McCarthy? Food for thought.

I think, naturally, of John Gardner and his description of “mannered” prose, a term he’d certainly have applied to McCarthy. “Mannered” suggests artificiality or phoniness, the lack of integrity Scruton suggests above, which is how every good writer hopes not to come across. But I also think of Elmore Leonard, another author whom I’ve quoted here many times, and who worked hard to make his style the absence of style. Scruton contends that that is impossible:

Style must be perceivable: there is no such thing as hidden style. It shows itself, even if it does so in artful ways that conceal the effort and sophistication . . . At the same time, it becomes perceivable by virtue of our comparative perceptions: it involves a standing out from norms that must also be subliminally present in our perception if the stylistic idioms and departures are to be noticed. Style enables artists to allude to things that they do not state, to summon comparisons that they do not explicitly make, to place their work and its subject-matter in a context which makes every gesture significant, and so achieve the kind of concentration of meaning that we witness in Britten’s Cello Symphony or Eliot's Four Quartets.

This is exactly right, and Leonard would agree. Leonard’s style, which was precisely designed to “conceal the effort and sophistication” of his writing and make it seem effortless, was immediately recognizable because it was distinct from the “norms” described above in particular ways—something Leonard himself noted. Those “norms” or context are the broader shared vocabulary we began with—which gives shape to one’s work through contrast.

And that final sentence on what a firm, controlled, purposeful, precise style can do, using the power of allusion, implicit comparison, the subtle significance of every detail to “achieve . . . concentration of meaning”—is there a writer who wouldn’t die happy having that said of his work?

Melancholy in the outfield

A few weeks ago I revisited a childhood favorite with my own kids. Angels in the Outfield came out when I was ten years old and an enthusiastic baseball fan. I must have watched it fifty or sixty times over the next few years, before I aged out of it and the real-life drama of the mid-90s Braves gently edged it out of my imagination.

What I remembered most about Angels in the Outfield was the comedy, the slapstick baseball action, the standard sports movie joys of becoming a team and winning the big game, and the music. (I noticed, though very young, that composer Randy Edelman’s score had a lot of cues suspiciously similar to his work on the previous year’s Gettysburg, one of my favorite soundtracks.) What I was not prepared for upon rewatching it as an adult was just how firmly the plot’s foundation was built upon pain, sorrow, and longing.

Roger, the main character, lives in foster care because his mom has died and his dad is a negligent, uncommunicative deadbeat. When the film starts his father has already signed over his rights to his son and has shown up just long enough to tell Roger so, a job he performs badly. Is that guilt we see in his eyes, or just awkwardness in performing the unwanted duty of talking to his child? When an oblivious Roger asks when they can “be a family again,” his dad replies with a “when pigs fly” scenario that Roger takes literally. And Roger’s younger friend JP seems bright and happy all the time but collapses into grief when another boy is moved out of the foster home, an emotional response the movie suggests is always ready just below the surface. This is clearly a child struggling with abandonment.

But the vein of sadness runs through the adults, too. California Angels manager George Knox seethes with grievance, not only having had his career cut short when a dirty player slid into him cleats-first, but also becoming a manager only to be saddled with the worst team in the league. The man who injured him, Ranch Wilder, is now the Angels’ radio announcer and loathes the team as well as Knox. His entire demeanor suggests he resents being kept down when he is meant for greater things. And Mel Clark, a former star pitcher who developed a pain pill addiction under Knox’s managership at Cincinnati and who has the film’s clearest redemption arc, is revealed at the end to be only six months away from death. He has lung cancer and doesn’t even know it yet. And so even the longed-for victory in the playoffs is tinged with loss.

I’m not going to pretend that Angels in the Outfield is a great movie or serious drama; it’s simply well and honestly crafted and it treats all of these scenarios seriously. None of it feels forced, none of it is used merely to jerk tears, and none of it is tidily and painlessly resolved. In fact, most of the characters don’t actually get the specific thing they want at the beginning of the film.

This brought to mind two things I had reflected on long ago. The first is an essay from Film School Rejects called “The Melancholy of Don Bluth,” an excellent read on animated films like The Land Before Time, All Dogs Go to Heaven, or An American Tail—all three of which were in constant rotation in the Poss household when I was growing up. Bluth’s movies have a reputation for going to dark places Disney typically balks at, to the point that they’re sometimes the subject of internet memes about “trauma.” Please.

The artistic upshot of Bluth’s willingness to include death and—perhaps more importantly—mourning in his films is a truth and richness often missing from comparable animated films:

Thematically, there is an ever-present air of death about Bluth’s work that is profoundly sad. Bones litter certain set-pieces; illness and age are veritable threats (shout out to Nicodemus’ gnarly skeleton hands); and characters can and do bleed. Critically, Bluth films don’t gloss over grief, they sit with it. From Littlefoot’s straight up depression following the on-screen death of his mom, to Mrs. Brisby’s soft sorrow at finding out the details of her husband’s death. There is a space for mourning in Bluth’s stories that feels extra-narrative, and unpretentious. Critically, this is distinct from, say, wallowing. Bluth’s films have a ridiculously productive attitude towards mourning, most lucidly articulated through Land Before Time’s moral mouthpiece Rooter: “you’ll always miss her, but she’ll always be with you as long as you remember the things she taught you.” Disney meanwhile, tends to treat death as a narrative flourish, or worse, a footnote. And in comparison, even notable exceptions like Bambi and The Lion King seem immaturely timid to let palpable grief linger for longer than a scene, let alone throughout a film’s runtime.

The other thing that came to mind was a podcast conversation on The Sectarian Review concerning Hallmark Christmas movies. At some point during the conversation I drew a comparison between Hallmark romantic comedies and older romcoms by pointing out that films like You’ve Got Mail, as fun and bubbly and appealing as they are, also have a vein of genuine pain running through them. Kathleen Kelly takes her mom’s little bookshop up against the big chain store and loses, an event the film doesn’t gloss over and doesn’t paint as some kind of moral victory. Who doesn’t feel the pang of her loss as she closes up shop for the final time and walks away into the night, her mom’s shop doorbell jingling in her hand?

Only Pixar, in older movies like Up and Toy Story 2 and Inside Out, has recently attempted to include such real pain in their stories. By comparison, most of the recent crowd-pleasing PG-13 action fare or animated kids’ movies in theatres or the mass-produced dramas of the Hallmark Channel are pure saccharine—thin, fake, and probably carcinogenic.

I have no firm conclusions to draw on this topic except to note that, for whatever reason, even in our simplest and cheapest stories we’ve lost something important. And if you feel some of this and hope for catharsis, one of the oldest reasons for watching a drama that there is, you’ll have to go to older films for it.

The Mysteries

 
‘In our world,’ said Eustace, ‘a star is a huge ball of flaming gas.’
‘Even in your world, my son, that is not what a star is but only what it is made of.’
— CS Lewis, The Voyage of the Dawn Treader
 

I feel like the publication of a new book by Bill Watterson, whose “Calvin and Hobbes” ended its run twenty-nine years ago and who has remained almost entirely quiet since, should be more of an event than the release of The Mysteries has proven. But then, given the book’s title and most especially its subject matter, maybe that’s appropriate. Call it a mystery, but not one of the Mysteries.

The story is simple enough. This blog post will probably end up several times longer than the entire book. The Mysteries introduces the reader to a medieval-ish world of castles and half-timber towns in which the people and their king are bounded by dark forest. The forest is the domain of the Mysteries, whom no one has ever seen but everyone knows have terrible powers. At first the people strive not to understand but to protect themselves from the Mysteries, putting huge efforts into building walls and chronicling the long history of their fears in tales and art.

Then one day the king decides to strike back against the Mysteries, dispatching knights into the forest on a quest to capture and bring back a Mystery. After a long stretch of futile searching, one knight succeeds, returning with an iron box chained to a cart.

At last, a Mystery is revealed—and the people discover that there’s not, apparently, very much to them. Their fearful powers turn out to be “mundane.” And capturing one Mystery opens the way to capturing others, to the point that the people not only lose their fear of the Mysteries but come to find them boring. One clever illustration shows a medieval newspaper stall full of headlines like “YAWN.”

Then, the Mysteries understood and no longer feared or the object of much attention at all, the people demolish their walls, cut down the forest, and overspread the land. They mock the old paintings inspired by the Mysteries. They now live in a world of jet aircraft and skyscrapers and the king no longer appears on the balcony of his castle but on TV or behind the wheel of a car on a busy freeway, drinking a Big Gulp. At last, the narrator tells us, they control everything.

Or do they? The sky turns strange colors and, ominously, “things” start “disappearing.” The king assures them that this is normal, wizards study the phenomena, and life continues apace. Then, “too late,” the people realize that they’re in trouble. An indifferent universe wheels on.

In the final pages the viewpoint of the illustrations pulls back farther and farther from the people and their conquered land, into space, beyond the solar system and the Milky Way. “The Mysteries,” the story concludes, “lived happily ever after.”

One notable aspect of The Mysteries is that Watterson did not illustrate it alone: though he wrote the story, the pictures are the work of Watterson and caricaturist John Kascht, who collaborated closely for several years, experimenting with and abandoning many styles before arriving at an atmospheric, unsettlingly dreamlike aesthetic combining clay figures, cardboard scenery, and painted backdrops. The effect is powerfully eerie, especially as the pace of the story accelerates and the fairytale world at the beginning of the book gives way to one that resembles, disconcertingly, our own.

If the pictures are murky, moody, and ambiguous, often more allusive than concrete, so is the story. This, according to Watterson, is by design. I’m not typically one for deliberate ambiguity, but it works brilliantly here. This “fable for grownups,” as the publisher describes it, achieves a timelessness through its strangely specific soft-focus art and a broad applicability through its theme.

And what is that? The most obvious and easy reading of the consequences the people face in the book’s closing pages is climate change, whether anthropogenic or not. But The Mysteries is a fable, not an allegory. To narrow its message, if it has one, to a policy issue is to cheapen and limit it.

The core theme of The Mysteries is disenchantment. Since the Scientific Revolution uncovered the wheels and levers of the universe and the Enlightenment insisted that the wheels and levers were all there is, was, or ever will be, the mysteries of our own world have retreated further and further from our imaginations and the place we once gave them in our daily lives. The powers that once kept people within their walled towns have been banished—or rather seized and repurposed, put to work for the people’s desires. Fear or, to put it more positively, awe of the world has given place to self-assured technical mastery. We control everything.

Or do we?

The Mysteries is probably not what anyone anticipating the return of Bill Watterson would have expected. I was certainly surprised, but pleasantly. As befits the creator of “Calvin and Hobbes,” a work that prized imagination above all else, The Mysteries treads lightly but surefootedly across deep ideas, and powerfully suggests that whatever Mysteries once lived in the forest, we have not sufficiently understood them to warrant our boredom, apathy, and self-indulgence, and we certainly are not free of them. We are, in fact, in graver danger through our indifference to the Mysteries than we ever were when we feared them.

John Gardner on art and democracy

Yesterday during my commute I revisited a short radio interview with John Gardner, one of the writers and writing teachers I most admire. The entire interview is worth listening to for Gardner’s trenchant comments on, well, everything, but I found the following exchange most striking.

Considering the way “the rise of middle class literature”—a “bad thing” in Gardner’s view—was satirized by Henry Fielding and Daniel Defoe, interviewer Stephen Banker goes back to Gardner’s preference for premodern work like Beowulf or Dante or Chaucer and his belief that literature has decreased in quality since then:

Banker: There’s so much in what you said. First of all, are you seriously suggesting that the literature of the aristocracy is the right kind of literature?

Gardner: Yeah, sure, sure. And I think that, as a matter of fact, I don’t think that’s snobbism, I think that every kid in a democracy would like that literature better if he knew it. But of course the thing that happens in a democracy is that the teachers lose touch with what’s good—they don’t know, you know? How many art teachers, you know, in ordinary public schools, have been to an art museum? Just that. How many teachers of creative writing in high schools and colleges for that matter really know what the Iliad is about? I’ve talked with an awful lot of professors. I think there are a handful of people in America who understand the poem Beowulf. And I don’t think there’s even a handful in England. It’s just lost knowledge.

Banker: Well, what—

Gardner: I don’t know anybody who knows about Dante! I don’t know a single person who understands what Dante is doing. I don’t mean that as arrogance, it’s just a fact. They read little sections of it, they talk about the dolce stil nuovo, that’s all.

The reading of great literature in context-free excerpt with a primary focus on formal or—increasingly—political qualities still rings true, as does the well-expressed observation that kids even in democracies will prefer the adventure of aristocratic literature to middle-class realism. The problem comes in the line “if he knew it.” Many kids today are deprived of that literature, often for ideological rather than artistic reasons, and I can see their thirst for this kind of storytelling anytime I describe, in detail and for its own sake, a work of ancient or medieval literature to a class of students. They respond.

I do think there is more cause for hope than Gardner suggests—consider the wave of relative popularity greeting Emily Wilson’s recent translations of Homer—but the situation is dire.

Banker next moves the discussion on to whether old literature is still relevant in a more technologically sophisticated world and Gardner comes out swinging, while also rounding out some of his statements above:

I don’t think that’s snobbism, I think that every kid in a democracy would like that literature better if he knew it.
— John Gardner

Banker: I think one could make a case—

Gardner: Mm-hm.

Banker: —that things that happened five, six, seven hundred years ago are not really relevant to the way we live now, that those people didn’t live with machinery, they didn’t live in the age of anxiety, they didn’t live with the kind of tensions, the kind of communications we have today.

Gardner: I think that’s probably not true. I think, in fact, that—pick your age, pick the age, for instance, of Alexandrian Greece, with Apollonius Rhodius writing in an overpopulated, effete, decadent society, he writes a book which is a bitter, ironic, very Donald Barthelme-like book in imitation of the epic form but actually making fun of the epic form and expressing, you know, his ultra-modern kind of disgust and despair and all this kind of business.

Banker: And what period are you talking about now?

Gardner: Oh, I don’t know about dates. Third century BC. One can find at the end of every great period decadent literature very much like ours. The difference is that we have for the first time—and it’s a great thing—real democracy, in which everybody can be educated. And as everybody begins to be educated and as everybody begins to say what education ought to be, then education changes, and so that the kind of values which make first-rate philosophy or art or anything else disappear—or become rare, at least. There are obviously lots of writers in America who are still concerned about great art and are trying to create it but, mostly, that’s not true.

Food for thought.

The interview ranges widely and it’s hard not to transcribe large parts of the rest, particularly, in considering the value of fiction, Gardner’s comparison of the way Nietzsche and Dostoevsky attacked the same philosophical problems, the first in abstract aphorism and the second in concretely realized fiction, and why Dostoevsky’s fictional interrogation of the Übermensch was more successful—and truthful.

Listen to the whole thing.

For more from Gardner on what’s great about Beowulf and what’s wrong with modern “realism,” check out this Paris Review interview from 1979, a year after the radio interview above. It’s paywalled but a generous, tantalizing chunk is available to read before it cuts off. I’ve written about Gardner here several times before, most importantly on his concept of fiction as the painstaking creation of a “vivid and continuous fictive dream.” This is a crucial idea to me, one I often reflect on. I also considered the role of sensory detail in Gardner’s “fictive dream” using the example of the novel Butcher’s Crossing here.

Literary cameos

Yesterday Alan Jacobs posted a longish recommendation of Francis Spufford’s latest novel, an alternate history detective noir titled Cahokia Jazz. I’m intrigued. But I especially enjoyed this minor note from the end of Jacobs’s post:

At one point, late in the story, our hero is at Cahokia’s railway station and happens to see a family, “pale, shabby-grand, and relocating with their life’s possessions”—including, curiously enough, butterfly nets: “white Russians on their way to Kodiak, by the look of it.” One of them, “a lanky twenty-something in flannels and tennis shoes,” is called by his family Vovka, and he briefly assists our hero. Then off they go, leaving our story as abruptly as they had arrived in it. Assuming that they made their way to Kodiak—or, more formally, as our map tells us, NOVAYA SIBIRSKAYA TERRITORII—it is unlikely that their world ever knew Lolita or Pale Fire.

This is “one of several delightful cameos” in the novel, and Jacobs’s recommendation and praise got me thinking about such cameos in fiction.

I haven’t read Cahokia Jazz yet, though I intend to, but I’m willing to take Jacobs at his word that Spufford does this well. The example he cites certainly sounds subtle enough to work. But done poorly, such cameos awkwardly shoehorn a well-known figure into the story and call unnecessary attention to themselves. Think Forrest Gump in novel form. They can also, if used to denigrate the characters in the story, turn into the kind of wink-wink presentist authorial irony that I deplore.

I think the best version of the literary cameo functions much like a good film cameo—if you spot the cameo and know who it is, it’s a nice bonus, but if you don’t it doesn’t intrude enough to distract. And, ideally, it will work with and add to the story and characterization of the main characters.

A good and especially subtle example comes from Declare, which I’m almost finished reading. Early in the novel we read of protagonist Andrew Hale’s background, specifically where he was in the early stages of World War II before embarking on his first espionage assignments in occupied France:

In November he successfully sat for an exhibition scholarship to Magdalen College, Oxford, and in the spring of 1941 he went up to that college to read English literature.

His allowance from Drummond’s Bank in Admiralty Arch was not big enough for him to do any of the high living for which Oxford was legendary, but wartime rationing appeared to have cut down on that kind of thing in any case—even cigarettes and beer were too costly for most of the students in Hale’s college, and it was fortunate that the one-way lanes of Oxford were too narrow for comfortable driving and parking, since bicycles were the only vehicles most students could afford to maintain. His time was spent mostly in the Bodleian Library researching Spenser and Malory, and defending his resultant essays in weekly sessions with his merciless tutor.

A Magdalen College tutor ruthlessly grilling a student over Spenser and Malory? That can only be CS Lewis.

They’re not precisely cameos, but I have worked a few real-life figures into my novels in greater or lesser supporting roles: David Howarth in Dark Full of Enemies, Gustavus W Smith and Pleasant Philips in Griswoldville. I’ve aimed a little lower in the name of realism, I suppose. But the precise dividing line between a cameo of the kind described here and a real person playing a serious role in a story is something I’ll have to figure out.

At any rate, a well-executed literary cameo is a joy. Curious to see who else might surprise us in the pages of Cahokia Jazz.

Further notes on Indy and Oppie

July was a big movie month here on the blog, with three reviews of movies ranging from “adequate compared to Kingdom of the Crystal Skull” to “great.” Two of them I’ve reflected on continually since seeing them and reviewing them here, especially as I’ve read, watched, and listened to more about them.

Here are a few extra thoughts on my summer’s movie highlights cobbled together over the last couple of weeks:

Indiana Jones and the Curse of Woke

When I reviewed Indiana Jones and the Dial of Destiny a month and a half ago, I didn’t dwell on the malign influence of woke ideology in its storytelling, only mentioning that I had justifiable suspicions of any Indiana Jones film produced by Disney. I wanted to acknowledge those doubts without going into detail, because after actually watching and, mostly, enjoying the movie, I found that the problems I had with Dial of Destiny weren’t political at all, but artistic. It isn’t woke, it’s just mediocre.

That didn’t stop a certain kind of critic from finding the spectral evidence of wokeness in the film and trumpeting their contempt for it. I’m thinking particularly of a caustic YouTube reviewer I usually enjoy, as well as this review for Law & Liberty, which comes out guns blazing and attacks Dial of Destiny explicitly and at length along political lines.

The problem with these reviews is that in their hypersensitivity and their mission to expose ideological propaganda they do violence to the object of their criticism, not just misinterpreting things but getting some things completely wrong. Here’s a representative paragraph from that Law & Liberty review:

Next, we cut to 1969, the Moon Landing. Indy is an old tired man, sad, alone, miserable. The camera insists on his ugly, flabby naked body. His young neighbors wake him up with their rock music and despise him. His students don’t care about his anthropological course. His colleagues give him a retirement party and soon enough they’re murdered, by Nazis working secretly in the government, with the complicity of the CIA or some other deep state agency. We see the wife is divorcing him; we later learn, it’s because his son died in war, presumably Vietnam—Indy told the boy not to sign up.

What was remarkable about this paragraph to me is how much it simply gets wrong. Indy’s hippie neighbors wake him up by blasting the Beatles, yes, but they also treat him perfectly amiably. (In fact, it’s Indy who knocks on their door armed with a baseball bat.) It is never clear that Voller’s men have help from the CIA or any other “deep state agency;” I kept waiting for that connection but it never came. And Indy did not try to stop his son from joining the army, a point made so clear in the film—Indy’s one stated wish, were time travel possible, would be to tell him not to join—that it’s staggering to think a critic went to print with this.*

From later in the same review: “But turning from obvious metaphors to ideology, Indy is replaced by a young woman, Helen [sic—her name is Helena], daughter of his old archaeological friend Basil, but the film suggests you should think of her as a goddess to worship.” One of my chief complaints about Dial of Destiny was its failure to deal with Helena’s criminality, giving her a half-baked or even accidental redemptive arc that spares her a face-melting, as befitted all similar characters in Indy’s inscrutable but always moral universe. That bad writing again. But how one could watch her character in action and conclude that the audience is meant to “worship” her is beyond me. This is anti-woke Bulverism.

What these hostile reviewers describe is often the opposite of what is actually happening in the film. I’ve seen multiple critics assert that Helena has “replaced” Indy and “controls” and “belittles” him. The Law & Liberty reviewer describes Indy as just “along for the ride.” Helena certainly intends to use him—she’s a scam artist and he’s a mark. This is all made explicit in the film. But it is also made explicit that Indy does, in fact, keep taking charge and leading them from clue to clue and that he is a much tougher mark than Helena was counting on.

Dial of Destiny’s actual problems are all classic artistic failures—poor pacing, overlong action sequences, plodding exposition, weak or cliched characters,** slipshod writing, and a misapprehension of what matters in an Indiana Jones movie that becomes clearest in the ending, when Indy is reunited (for the third time) with Marion. Here the filmmakers make the same mistake as the team behind No Time to Die by giving Indy, like Bond, romantic continuity and attempting to trade on sentimentality when that is not what the character is about.

Again—these are artistic problems. Helena Shaw isn’t a girlboss or avenging avatar of wokeness; she’s a poorly written villain who doesn’t get her comeuppance. But I saw little such criticism among the fountains of indignation from the reviewers who pursued the “woke Disney” line of criticism.

Perhaps this is the greatest curse of wokeness: that it distorts even its critics’ minds. Once they’ve determined that a movie is woke, they’ll see what they want to see.

Call it woke derangement syndrome and add it to all the other derangement syndromes out there. Woke ideology is real, even if the ordinary person can’t define it with the precision demanded by a Studies professor or Twitter expert, and it is pernicious, and it produces—even demands—bad art. It is a kind of self-imposed blindness, as are all ideologies. But zeroing in on wokeness as the explanation for bad art can blind us to real artistic flaws, and if any good and beautiful art is to survive our age we need a keen, clear, unclouded vision of what makes art work. We need not just a sensitivity to the bad, but an understanding of the good.

Douthat on Oppenheimer

On to better criticism of a better movie. Ross Douthat, a New York Times op-ed columnist who writes film criticism for National Review, has been one of my favorite critics for the last decade. Douthat begins his review of Oppenheimer with an abashed confession that he feels guilty saying “anything especially negative about” it, but that as brilliantly executed as it is, he is “not so sure” that it is “actually a great film.”

Fair enough. What gives Douthat pause, then? For him, the problem is Oppenheimer’s final third, which he sees not as a satisfying denouement but simply a long decline from the height of the Trinity test, a decline complicated by thematic missteps:

There are two problems with this act in the movie. The first is that for much of its running time, Oppenheimer does a good job with the ambiguities of its protagonist’s relationship to the commonplace communism of his intellectual milieu—showing that he was absolutely the right man for the Manhattan Project job but also that he was deeply naïve about the implications of his various friendships and relationships and dismissive about what turned out to be entirely real Soviet infiltration of his project.

On this point I agree. As I wrote in my own review, I thought this was one of the film’s strengths. Douthat continues:

But the ending trades away some of this ambiguity for a more conventional anti-McCarthyite narrative, in which Oppenheimer was simply martyred by know-nothings rather than bringing his political troubles on himself. You can rescue a more ambiguous reading from the scenes of Oppenheimer’s security-clearance hearings alone, but the portions showing Strauss’s Senate-hearing comeuppance have the feeling of a dutiful liberal movie about the 1950s—all obvious heroes and right-wing villains, no political complexity allowed.

The second problem, as Douthat sees it, is that the drama surrounding Oppenheimer’s political destruction and Strauss’s comeuppance is unworthy of the high stakes and technical drama of the middle half of the movie concerning the Manhattan Project: “I care about the bomb and the atomic age; I don’t really care about Lewis Strauss’s confirmation, and ending a movie about the former with a dramatic reenactment of the latter seems like a pointless detour from what made Oppenheimer worth making in the first place.”

There is merit here, but I think Douthat is wrong.

I, too, got the “dutiful liberal” vibe from the final scenes, but strictly from the Alden Ehrenreich character. Ehrenreich is a fine actor unjustly burdened with the guilt of Solo, but his congressional aide character’s smug hostility to Strauss as Strauss is defeated in his confirmation hearing feels too pat, too easy. What saves the film from simply being standard anti-McCarthy grandstanding is Robert Downey Jr’s sympathetic and complicated portrayal of Strauss—not to mention the fact that the film demonstrates that, however Strauss acted upon them, his concerns about espionage and Oppenheimer’s naivete were justified.***

Regarding the seemingly diminished stakes of the final act, I too wondered as I first watched Oppenheimer whether Nolan might have done better to begin in medias res, to limit himself strictly to the story of the bomb. But that story has already been told several times and Oppenheimer is very much a character study; this specific man’s rise and fall are the two necessary parts of a story that invokes Prometheus before it even begins.

The key, I think, is in the post-war scene with Oppenheimer and Einstein talking by the pond at Princeton. Nolan brings us back to this moment repeatedly—it’s therefore worth paying attention to. The final scene reveals Oppenheimer and Einstein’s conversation to us:

Oppenheimer: When I came to you with those calculations, we thought we might start a chain reaction that would destroy the entire world.

Einstein: I remember it well. What of it?

Oppenheimer: I believe we did.

Cue a vision of the earth engulfed in flames.

A technology that can destroy the entire world is not just the literal danger of Oppenheimer’s project, but a metaphorical one. The Trinity test proves fear of the literal destruction of the world unfounded, but the final act of the film—in which former colleagues tear each other apart over espionage and personal slights and former allies spy and steal and array their weapons against each other and the United States goes questing for yet more powerful bombs, a “chain reaction” all beginning with Oppenheimer’s “gadget”—shows us an unforeseen metaphorical destruction as it’s happening. The bomb doesn’t have to be dropped on anyone to annihilate.

This is a powerful and disturbing dimension of the film that you don’t get without that final act.

Finally, for a wholly positive appraisal of Oppenheimer as visual storytelling—that is, as a film—read this piece by SA Dance at First Things. Dance notes, in passing, the same importance of the film’s final act that I did: “The two threads are necessary to account for the political paradox of not just the a-bomb but of all technology.” A worthwhile read.

Addenda: About half an hour after I posted this, Sebastian Milbank’s review for The Critic went online. It’s insightful and well-stated, especially with regard to Oppenheimer’s “refusal to be bound” by anyone or anything, a theme with intense religious significance.

And a couple hours after that, I ran across this excellent Substack review by Bethel McGrew, which includes this line, a better, more incisive critique of the framing narrative than Douthat’s: “This is a weakness of the film, which provides all the reasons why Oppenheimer should never have had security clearance, then demands we root against all the men who want to take it away.”

Tom Cruise does the impossible

The most purely enjoyable filmgoing experience I had this summer was Mission: Impossible—Dead Reckoning, Part I. To be sure, Oppenheimer was great art, the best film qua film of the summer, but this was great entertainment. I enjoyed it so much that, after reviewing it, I haven’t found anything else to say about it except that I liked it and can’t wait for Part II.

Leaving me with one short, clearly expressed opinion—a truly impossible mission, accomplished.

Endnotes

* In fairness, the review has one really interesting observation: in reference to the film’s titular Dial being Greek in origin, unlike the Ark of the Covenant or the Holy Grail, “Jews are replaced by Greeks in the Indiana Jones mythology, since our elites are no longer Christian.” The insight here is only partially diminished by the fact that the elites who created Indiana Jones were not Christian, either. Steven Spielberg, Philip Kaufman, and Lawrence Kasdan—key parts of Raiders—are all Jewish.

** Here is where Dial of Destiny drifts closest to woke characterization. The agents working for Voller in the first half include a white guy in shirt and tie with a crew cut and a thick Southern accent and a black woman with an afro and the flyest late 1960s fashion. Which do you think turns out to be a devious bad guy and which a principled good guy? But even here, I don’t think this is woke messaging so much as the laziness of cliché. Secondary characters with Southern accents have been doltish rubes or sweaty brutes for decades.

*** A useful point of comparison, also involving a black-and-white Robert Downey Jr, is George Clooney’s engaging but self-important Good Night, and Good Luck. Watch both films and tell me which is “all obvious heroes and right-wing villains.”

A thesis

The following started as only semi-serious off-the-cuff pontification in my Instagram “stories.” I’ve expanded on it and fixed a lot of autocorrect “help” along the way.

A favorite web cartoonist, Owen Cyclops, shared the following on Instagram this morning:

If you’re unfamiliar with semiotics, which I discovered via Umberto Eco late in high school, here’s the first bit of Wikipedia’s intro:

Semiotics (also called semiotic studies) is the systematic study of sign processes (semiosis) and meaning making. Semiosis is any activity, conduct, or process that involves signs, where a sign is defined as anything that communicates something, usually called a meaning, to the sign's interpreter. The meaning can be intentional, such as a word uttered with a specific meaning; or unintentional, such as a symptom being a sign of a particular medical condition.

The phrase “usually called a meaning” should give you some sense of how arcane, abstract, and high-falutin’ this can get. Emphasis on abstract. But semiotics is not really my point here. Owen’s cartoon brought Dr Johnson’s refutation of Berkeley to mind. Per Boswell:

After we came out of the church, we stood talking for some time together of Bishop Berkeley’s ingenious sophistry to prove the non-existence of matter, and that every thing in the universe is merely ideal. I observed, that though we are satisfied his doctrine is not true, it is impossible to refute it. I never shall forget the alacrity with which Johnson answered, striking his foot with mighty force against a large stone, till he rebounded from it, “I refute it thus.”

This is the “appeal to the stone.” Wikipedia classifies it as “an informal logical fallacy.” I don’t care. When confronted with academic disciplines that have descended to this level of abstraction, I join Dr Johnson’s stone-kicking camp.

At some point, something has to be real. Argument divorced from concrete reality simply turns into sophisticated dorm room bickering.* That’s what Owen’s cartoon captures so well—argue about the “meanings” of “signs” like carrot tops and foxholes all you want, the real carrot and the real fox are going to present an inarguable ultimate meaning to those rabbits. I refute it thus.

I was struck that Wikipedia’s article on Johnson’s stone-kicking compares this appeal to the reductio ad absurdum, which it also treats as a fallacy. Its full article on the reductio is more circumspect, classifying it as a legitimate line of argument, though I’ve always regarded the reductio more as a useful rhetorical device, a way of comically** setting the boundaries to an argument or of twisting the knife once the logic has worked itself out as impossible. But, tellingly, the article’s “see also” points us toward slippery slope. This is, of course, described not just as an informal fallacy but “a fallacious argument.” I contend that slippery slope is not a fallacy but, at this point, an ironclad empirical law of Western behavior.

And that’s what brought the late Kenneth Minogue to mind. In my Western Civ courses I use a line from his Politics: A Very Short Introduction, to impart to students that the Greeks and Romans were different from each other in a lot of fundamental ways. Chief among these differences was the Greek and Roman approach to ideas:

The Greek cities were a dazzling episode in Western history, but Rome had the solidity of a single city which grew until it became an empire, and which out of its own decline created a church that sought to encompass nothing less than the globe itself. Whereas the Greeks were brilliant and innovative theorists, the Romans were sober and cautious farmer-warriors, less likely than their predecessors to be carried away by an idea. We inherit our ideas from the Greeks, but our practices from the Romans.

Succinct, somewhat oversimplified, sure, but helpful to students who mostly assume the Greeks and Romans were the same, just with redundant sets of names for the same gods. It’s also correct. Minogue goes on to note that this mixed heritage manifests differently culture to culture, state to state, but that “Both the architecture*** and the terminology of American politics . . . are notably Roman.”

Were, I’d say.

So, a thesis I’ve kicked around in conversation:

Given Minogue’s two categories of classical influence, as the United States was founded along (partially but significantly) Roman lines by men who revered the Romans, a large part of our cultural upheaval has arisen as the country has drifted more Greek—becoming progressively more “likely . . . to be carried away by an idea.”

The emphasis has shifted from the Founders’ “Roman” belief in institutions governed by people striving for personal virtue to a “Greek” pattern of all-dissolving ideologies pursuing unachievable ends. This reflects both political and social changes. Like Athens, the US became more aggressive and more inclined to foreign intervention the more it embraced democracy not just as a system but as an end. And note the way that, when an ideal butts up against an institution in our culture, it’s the institution that’s got to go—as does anything that stands in the way of the fullest possible fulfilment of the implicit endpoint of the ideal. How dare you impede my slide down this slope, bigot.

And this is not a new problem. A whole history of the US could be written along these lines.

* During my senior year of college I once listened to two roommates argue over whether the Trix Rabbit was a “freak of nature.” This lasted at least an hour. Take away the humor and you’d have enough material for several volumes of an academic journal.

** Comically, because what’s the point of arguing if you can’t laugh the whole time? That’s not an argument, but a quarrel. See note above.

*** Not always for the best, as I’ve argued before.