On tunnels

Nada and Frank discover the alien tunnels under Los Angeles in They Live (1988)

Over the weekend I finally got a chance to watch They Live, John Carpenter’s 1988 action-comedy-thriller about a working man unmasking the alien domination of the world. It was a delight. Carpenter brilliantly presented his vision of a world whose true nature is concealed by a powerful malevolence exploiting the ignorant masses, and he made it funny, creepy, and exciting in equal measure. It was also deeply paranoid.

That’s the point, of course. Rowdy Roddy Piper’s famous bank heist—a heist in which he steals no money—and the film’s climactic TV station shooting spree wouldn’t be nearly so enjoyable had the film not made the aliens’ domination so palpably real in the first half. But two things in particular struck me about They Live’s paranoid view of the world.

First, its vision of manipulative elites and passive, cattle-like masses is broadly applicable. They Live provides a template for just about any critique of the way society is run. The obvious target, and the one Carpenter intended, is the consumerism and haves-and-have-nots dynamic of 1980s America. But you could apply it to almost any menace you care to pick. In fact, the image of a hidden, rich minority of foreigners using the media to control the masses for profit suggested itself strongly enough to certain groups that Carpenter himself spoke up against the misuse of his story.

For myself, the aliens of They Live reminded me of nothing so much as latter-day tech CEOs: manipulating people, selling garbage, flogging unrealistic standards of luxury and beauty, clouding minds with useless information and busywork, justifying their existence through convenience, and—just occasionally—suppressing people they don’t want talking too much.

Second, and even more striking to me, were the tunnels. Following our hero Nada’s epiphany and initial, impulsive shooting spree, he falls in with a more organized resistance which is almost immediately destroyed by the foot soldiers of the alien overlords. Nada and his only friend, Frank, manage to escape using one of the aliens’ own wristwatches, which allows them to disappear in emergencies. Nada and Frank find themselves in a maze of tunnels under Los Angeles, the secret infrastructure supporting the aliens’ domination.

The tunnels are an interesting feature of the plot because they pop up in so many other paranoid visions of the world. Pizzagate, QAnon, the Satanic panic—all feature tunnel systems as prominent parts of their narratives. Even the rescue of twelve soccer players from a cave in Thailand has been spun in conspiratorial directions.

And this isn’t limited to recent theories: the anti-Catholic paranoia of the 1830s included fraudulent stories like that of Maria Monk, who claimed that tunnels permitted priests access to nunneries at night and convenient burial places for the children born of these unions, who were strangled at birth. Like its more recent counterparts, this hoax prompted investigations. Like those more recent investigations, it found no evidence that the stories were true.

So I’ve wondered more than once: what is it with tunnels?

If I were a Jungian—and I’m not, for reasons I intend to unfold here at some point—I might suggest that tunnels have some subconscious archetypal power that forces them to recur in our fears and anxieties and, inevitably, our stories. A little closer to reality, I find it interesting that tunnels make common conspiratorial metaphors literal. The image of the underground, the underworld, the subterranean, the hidden is always ready to hand in conspiracist rhetoric.

More to the point, I think tunnels keep popping up in paranoid narratives for two practical reasons.

First, tunnel systems really exist, and they’re not hard to find. Major cities, theme parks, malls, factories, and public works often have elaborate underground infrastructure, and that’s not even taking account of things like mining and military use. Even my undergrad college campus had a legendary tunnel network that was the subject of much rumor in the early 2000s. (One wonders how the rumors have morphed since.) These often vast systems are real, but they’re there for maintenance or logistics.

Not that the mundane has stopped paranoid speculation in the past. Look at any “abandoned places” video on YouTube and you will see two sets of people in the comments: people who have worked in maintenance tunnels and know what they’re for and try to explain it, and people who think all underground spaces are used solely for human trafficking and won’t change their minds.

Second, and perhaps more important psychologically, something that happens out of sight is not falsifiable the way something that happens out in the open, potentially under observation, is. Conspiracy theories need tunnels because tunnels allow the conspiracy to unfold both here and somewhere else at the same time. And a good paranoid vision needs that, not just for atmosphere but so that the theory can perpetuate itself, unproven and impossible to disprove. Just look at all these tunnels!

John Carpenter used those trappings brilliantly in They Live. But in real life, living like Nada and looking for their tunnels will only lead you further away from reality.

Song of Songs and particularity

I’m finally finishing Peter Kreeft’s Three Philosophies of Life, the final section of which is a 26-point meditation on love as described in Song of Songs. Here, Kreeft considers the particularity or specificity of love:

The object of love is a person, and every person is an individual. No person is a class, a species, or a collection. There is no such thing as the love of humanity because there is no such thing as humanity. If your preachers or teachers have told you that the Bible teaches you to love humanity, they have told you a lie. Not once does the Bible say that; not once does it even mention the word humanity. Jesus always commands us to love God and our neighbor instead.

If, as I’ve often argued here, particularity is the key to good literature, it is fundamental in love. Sine qua non. If it’s not particular, it’s not love.

Particularity is also important philosophically, as Kreeft makes clear in the next paragraph:

How comfortable ‘humanity’ is! ‘Humanity’ never shows up at your door at the most inconvenient time. ‘Humanity’ is not quarrelsome, alcoholic, or fanatical. ‘Humanity’ never has the wrong political, religious, and sexual opinions. ‘Humanity’ is never slimy, swarmy, smarmy, smelly, or smutty. ‘Humanity’ is so ideal that one could easily die for it. But to die for your neighbor, to die for Sam Slug or Mehetibel Crotchit—unthinkable. Except for love.

To paraphrase Edmund Burke: Abstract humanity is not to be found; “humanity” inheres in specific people. Each of whom is more important than the abstract category, I would add.

Compare the people who talk a lot about “the planet” or “our species” but not about our families, friends, neighbors, communities, hometowns, countries, and nations. The people who love those things love them very specifically for themselves, flaws and all, and not because they are part of a whole too big for any honest person to grasp. Abstractions, “thinking in categories” as Malcolm Muggeridge put it, are dodges—or, increasingly, solvent.

The one spot where I’ll disagree with Kreeft—provided he isn’t being ironic, and I’m more inclined to think so the more I reread this passage—is when he calls humanity an abstraction “so ideal that one could easily die for it.” I’m not sure that actually happens. Soldiers, famously, die for each other far more readily than for democracy or freedom or big-picture geopolitical objectives. (See the Chesterton quotations here.) Likewise martyrs, whether religious or political. Stories of far-seeing men approaching the gallows with the class struggle on their lips smack of Soviet propaganda, not reality.

No, your neighbor—whether the person singing off-key in church or the dipstick you have to share a foxhole with—is easier to die for than “humanity.” Because they’re easier to love. After all, we’ve been commanded to.

Here’s some of what Kreeft had to say about Job in the middle section of this book. I’ve written about particularity in storytelling several times: with regard to John Gardner here, in a short note on what novels are for here, in much more detail with regard to James Bond and Honeychile Rider in Dr No here, and in memory of Cormac McCarthy here.

The mores of Zorro

Yesterday during a quick day-trip to see my parents with my older kids we listened to a great favorite: The Mark of Zorro, a radio drama starring Val Kilmer. I reviewed it here a few years ago. It’s great. Give it a listen.

Something that struck me upon this third or fourth listen was the character of Don Diego de la Vega’s public disguise. Like his most famous imitator, Bruce Wayne, Don Diego adopts a foppish, ineffective persona to prevent his alter ego’s detection. But his playacting goes well beyond providing cover.

Almost all of the other characters have flaws, most of which are characteristic of their class. The old aristocrats of the caballeros fuss over pedigree, protocol, and inheritance. The young caballeros are idlers eager for any ruckus so long as it’s diverting. The merchants and traders care only about money, whether honest businessmen like the tavernkeeper, who is sincerely anxious about being paid by the drunken soldiers who frequent his bar, or swindlers like the hide dealer who tries to defraud a monastery. Low-class soldiers like Sergeant Gonzalez are characterized by pride, braggadocio, and pointless cruelty, while officers like Captain Ramón are pragmatically ruthless and ambitious. And the actual rulers of Alta California are either openly corrupt or easily misled by lying subordinates.

These are recognizable types—all too familiar, I’d say—and understandable. They have all given in to the besetting sins of their social station.

But Don Diego’s public weaknesses go much further. Not only is he a weakling and a dandy, he is indifferent to the customs and community that usually incentivize men like him to stand up for others. Nothing has a claim on him. He “abhors violence” of any kind, views marriage as a mutually beneficial economic arrangement, pooh-poohs honor for making men “thin-skinned” and quarrelsome, and is not interested in “being a man” as he prefers simply to be “a human being.” He is a parody of modern culture.

All of which, tellingly, places him beneath contempt. Even the rapacious Captain Ramón despises him. Justifiably.

These themes are present in Johnston McCulley’s original Zorro novel, but the radio adaptation plays them up to great effect. It’s well worth your time to listen to, and think about.

Ready to spew

Trigger warning: This post contains untranslated French words and phrases. Appropriately, as you may be able to infer.

After some internationally public tableaux generated predictable—and, I think, entirely intentional—online outrage, I saw some equally predictable condemnations of the outraged for doing the thing all the kindhearted internet bien pensants love to condemn: “spewing hate.”

If a cliché is a “dead metaphor,” spewing hate must be the deadest of them all. But where most clichés are merely overused word pictures or verbal shortcuts, this one is also dangerous. J’accuse!

Spew is a very old word, almost unchanged in pronunciation from Old English spíwan, and it has retained both literal and figurative senses for its entire history. But what’s striking to me about spew is that as vomit or throw up or even puke have become far more common for its literal meaning, its metaphorical use has been whittled down to almost the single expression spew hate. It’s rare now to see spew without hate tagging along behind it.

This is a relatively recent development. Here’s Google’s Ngram viewer for various versions of the phrase:

This particular combination of words originated in the 20th century but has taken off since 2000, especially in its most common form, spewing hate.

This jibes with my observations. I first noticed this phrase during college, when it became the de rigueur description of Mel Gibson’s drunken rant following his 2006 DUI arrest. (The unbroken climb in frequency for spewing hate in the chart above begins in 2005.) Given Gibson’s state of intoxication and what he had to say during his arrest, this was an almost accurate description.

But then I noticed that the phrase wouldn’t go away. To my increasing annoyance, within a few years the advocate of every bad opinion and every person caught saying something mildly rude on camera would inevitably be described as “spewing hate”—regardless of whether they could be described as “spewing” or whether what they had said was hateful. As Orwell and CS Lewis observed, words that get stuck within easy reach of popular use soon become yet more synonyms for something one either does or doesn’t like. They become clichés.

And this cliché isn’t just lazy, unimaginative, or gauche. Given the political and cultural valence it usually has, spewing hate also functions as a thought killer. This is where the metaphorical image does its nastiest work. Someone spewing hate is not communicating, they’re just vomiting, and what they have to say is vomit. It needs no consideration or engagement, just a mop and a man to hustle the sick person out the door.

This makes spewing hate a handy phrase for shutting down debate and preventing argument. And a cliché being a cliché, it is, of course, overused.

Its overuse makes it especially dangerous, for two reasons. First, it prevents legitimate argument. With regard to the events that prompted this post, lots of people have legitimate concerns and complaints, and describing them simply as “spewing hate” is an imperious culture war dismissal. Leave us, hateful paysan. Second—and more insidious—any open-minded person who sees through this cliché, who investigates someone accused of “spewing hate” and finds them a reasonable person offering measured argument over legitimate concerns, will be more open to people who actually are in the hate business. It’s not only annoying and thought-killing, it’s self-defeating.

As always with clichés, avoid this one. Don’t use it. Don’t share material that does. Make yourself think about your words. And, in this case, just maybe, you’ll be able to consider someone else’s opinion, too.

Introducing historiography at Miller’s Book Review

Earlier this month I was humbled to be asked to contribute to Miller’s Book Review, an outstanding and wide-ranging Substack run by Joel Miller. Joel asked that I put together an essay on the nuts and bolts aspects of historiography, one of my favorite subjects and a regular topic on this blog. After a few abortive attempts to summarize everything (“It is a great mistake to include everything,” the late John Lukacs once said, accurately) I turned in an essay organized around a few of the books I like to recommend to students who are curious about how history works as a discipline.

I’m pleased to say that the essay is now available! Read the whole thing here. Expect some Herodotus, some basic research questions, some philosophy of history, some theory, some deadball era baseball, a warning or two, one salvaged reputation, a little dunking on Ridley Scott, a whole lot of Hitler, and several books I heartily recommend.

And be sure to subscribe to Joel’s reviews. I’ve added many more titles to my to-read list thanks to him. I’m grateful to him for the invitation to write—and to learn a little about Substack at last—and hope that y’all will enjoy the finished product! Thanks for reading.

Maturity and evolution in military history

A friend with a deep interest in Celtic and specifically Welsh history recently shared this passage from a popular book on ancient Celtic warfare, in which the author tries to see through legendary material relating to Irish warbands:

If the Fianna of the Irish epics are actually celebrated in epic verse as a heroic archetype, an in-depth and disillusioned examination can recognize their historical characters as unruly elements and promoters of endemic political unrest, taking part in conflict only for the sake of conflict and, due to the absence of alternative adversaries, maintaining an obsolete, un-evolving developmental phase of warfare.

Elsewhere in the same book the author describes Celtic warfare in the British Isles as not “mature” compared to the warfare of their Continental cousins. My friend was puzzled by this passage (and wryly noted that it “sounds like it was written by a Roman colonial governor”) and its suggestion that geographic isolation left British Celtic warfare moribund and pointless.

That language of maturity and evolution and development—even the simple noun phase—is a giveaway. There is a whiggish approach to military history that views warfare as progressing linearly, from the primitive, ritualized fighting of the tribe to the pragmatic modern professional army in the employ of a nation-state pursuing rational material objectives. As Jeremy Black puts it in his introduction to The Age of Total War: 1860-1945, which I serendipitously picked up just after seeing my friend’s posts on this topic, this “teleological” approach describes history as “mov[ing] in a clear direction, with developments from one period to another, and particular characteristics in each. This approach is an aspect of modernization theory.”

I’ve written on this topic before, and with reference to another book by Black, coincidentally, but what I didn’t get into as much in that post was the dangers of this view of linear historical progress.

There are two big problems with this approach. The first is that it encourages an assessment of historical subjects as good or bad, better or worse, primitive or modern, depending on how closely they approximate what a modern person recognizes as warfare. A culture’s warfare, in this view, is “mature” insofar as it resembles us, the implicitly assumed endpoint. Judgments according to modern standards are sure to follow.* The condemnation of “endemic political unrest” gives away the author’s assumption that “rest,” so to speak, is the norm. Ancient people didn’t see it that way.

The second, related problem is that, with this viewpoint in place, you need not actually understand a given culture and why it would fight the way it did on its own terms. You can simply slot it into place in a linear scheme of technical and/or tactical evolution and ignore their own viewpoint on the subject.

The result, which has been pointed out as far back as Herbert Butterfield’s Whig Interpretation of History, is that you train yourself either to dismiss or simply not to see anything falling outside the thread of development you’ve chosen to follow and you blind yourself to what’s actually going on with that culture. The search for through lines and resemblances warps the overall view. This is, at base, a form of presentism.

There’s quite a lot of this in the older historiography of Anglo-Saxon warfare. Like the ancient Britons and Irish, the Anglo-Saxons were geographically isolated from related cultures like the Franks for centuries following the Migration Period and continued to fight in recognizably older ways than their cousins. So a common whiggish approach to the story of the Conquest was that the outdated (notice the use of obsolete in the quotation we started with) infantry levy of Harold Godwinson was quite naturally defeated by the combined arms of the Normans, who deployed infantry, cavalry, and dedicated archers at Hastings. It’s a step in evolution, you see, the end of a “phase.” It’s easy to detect a faint tone of contempt for the Anglo-Saxons in a lot of those old books.

This is, of course, to ignore the entire history of this culture, its past enemies and conflicts,** and the good reasons they had to develop and use the military institutions and methods that they did. And so a historian can blithely describe a culture’s unique response to the situations it had found itself in as simply stuck in a rut—until the inevitable triumph of something more modern. No further investigation needed.

Not only is this approach presentist, it fosters an incuriosity that is the bane of good history.

* And the modern always gets the benefit of the doubt, which is morally questionable. Tribal warriors fighting for prestige on behalf of their king is “primitive” and bad but a state nuking civilians in the name of democracy is “modern” and therefore good.

** As well as the fact that William the Conqueror’s victory was down more to luck than to battlefield performance.

Dramatic irony and plot contrivance

Bates and Anna in Downton Abbey and Abby in Blood Simple

Last night RedLetterMedia posted their review of the first season of “The Acolyte,” the latest Star Wars show. I have no interest whatsoever in watching “The Acolyte” but in the course of Mike and Jay’s discussion Jay specifically critiques it for an overused storytelling technique:

One of my least favorite plot contrivances that’s used for, like, lazy screenwriting is the misunderstanding and the not explaining to a character what is going on because the plot demands it. . . . the lazy contrivance of not knowing all the information and not being told the information because if you were then there would be no story.

I should say “misused” rather than “overused.” What Jay is describing is dramatic irony, a form of literary irony in which the audience knows more than the characters do. This can create tension and pathos as characters ignorant of the full significance of their own actions carry on, ignorant not only of what they’re doing but of the consequences they will face later. Shakespearian and especially Greek tragedy are rich in dramatic irony, as are modern horror and suspense movies—as exemplified by Hitchcock’s famous example of the bomb under the table.

But dramatic irony, as Jay suggests, becomes a plot contrivance when the ignorance of the characters is maintained unnaturally. The best example I can think of is “Downton Abbey.” Earlier seasons of the show produce dramatic irony much more organically, but as the show goes on and becomes more obviously a high-toned soap opera, characters get into increasingly melodramatic situations and increasingly refuse to talk to each other about them. Most of the show’s problems could be resolved with one conversation, a conversation the characters will not have.*

This is particularly the case with any plot involving Mr Bates, whose aloof taciturnity is taken to a ridiculous extreme when he is accused of murder—among other things. He has numerous opportunities simply to explain to someone else what is going on and why he is acting in the way that he is, and he doesn’t.** Over and over, “Downton Abbey” prolongs the drama artificially in exactly the same way.

For dramatic irony done not just well but brilliantly, watch Blood Simple, the Coen brothers’ first film. The film has four primary characters: Abby, the young wife of a shady nightclub owner; Marty, the husband; Ray, a friend with whom Abby begins an affair; and Loren Visser, a private detective. Briefly, Marty hires Visser to look into what Abby and Ray are up to and, when he finds out, pays Visser to kill them. Visser double-crosses and shoots Marty, and Ray discovers the body.

Without giving too much away, as the rest of the movie unfolds:

  • Ray thinks that Abby killed Marty (she didn’t) and decides that he has to cover for her

  • Abby thinks Marty is still alive and out to get her (he isn’t) and decides to fight back

  • Visser thinks he has gotten away with his crime (he hasn’t) until he realizes he has left evidence in Marty’s nightclub, which he thinks Ray and Abby have (they don’t), and decides to eliminate them to cover his tracks

All of the characters operate in ignorance of the whole picture—with the possible exception of Visser, who makes mistakes despite knowing more than Ray and Abby—and make their decisions based on what they think they know, which is often wrong. This ignorance continues right up until the final lines of the film, following a climactic confrontation in which the two surviving characters can’t see each other. And it is unbearably suspenseful rather than, like “Downton Abbey” or “The Acolyte,” merely frustrating.

Dramatic irony is a powerful device, and it’s a shame it isn’t better used. Writers hoping to create tension in their stories through the ignorance or misperceptions of their characters would do well to revisit a movie like Blood Simple, some of Elmore Leonard’s crime fiction, or, even better, go back to Aeschylus and Sophocles.

* This is, I think, part of what makes Maggie Smith’s Dowager Countess such a breath of fresh air whenever she appears, as she actually says what she means.

** My wife and I refer to these as “Shan’t” moments, as in “I could resolve this with a simple explanation, but—” turning one’s head away, “shan’t.”

Tim Powers on chronocentrism and conformism

For the last week I’ve been reading Tim Powers’s 1987 pirate fantasy On Stranger Tides, a book that everyone seems to agree Pirates of the Caribbean couldn’t have come into existence without—even before Disney optioned the title for the fourth one—and that got me watching Powers interviews on YouTube again.

In this interview with a channel called Through a Glass Darkly, host Sean Patrick Hazlett asks, as a wrap-up, “What advice would you give to new writers?” Powers responds with a list of “the old, traditional advice, which is solid-rock true”—advice, I should add, that remains good for people who’ve been writing for years or decades. Here’s the first part of his answer in bullet-list form:

  • “Read very widely, read outside of your field, read outside of your time, don’t restrict yourself simply to stuff published since 2000 or 1980 or whatever. You don’t want to be chronocentric.

  • “Have as wide a base as you can, chronologically and [in] subject matter. Read mysteries, read plays, read poetry, non-fiction, et cetera.

  • “Write a lot. Set yourself a schedule and keep to it. Even if it’s only a thousand words a month, stick to it. Use guilt and fear as motivators. Tell yourself you’re worth nothing if you don’t get the writing done.

  • “Get it in front of editors, send it out. Don’t get trapped in a revision whirlpool. A story doesn’t exist until an editor has looked at it. It’s like Schrödinger’s cat.”

He follows this up with an elaboration on his first point of advice:

Don’t be a conformist. Don’t bend your writing to fit what you see as trends, even if they seem to be mandatory trends. They’re not.
— Tim Powers

Okay, all that’s true. Then I would say—goes back to chronocentrism—don’t be a conformist. Don’t try to clock what’s selling now, because even if you could correctly gauge that and then write a story, it’s very likely not to be what’s selling now by the time your story comes out. Don’t be a conformist. Don’t bend your writing to fit what you see as trends, even if they seem to be mandatory trends. They’re not. If you say, “Oh this is what they’re buying now. This is what you have to do now in order to get published. There’s some boxes you have to check.” No. Be different. Be a nonconformist. Because if you go along that conformist road, even if it gets published your work is just going to be one more of that generic type, and what’s the value in that? So I would say, ignore trends.

Hear, hear.

Powers has said versions of this before—here’s a blog post I wrote last October based on a similar interview conversation—but it’s stated more firmly and in more detail here.

I especially like Powers’s framing of the problem in terms of “chronocentrism.” As I recently told one of my classes, the most neglected form of diversity in our diversity-obsessed age is chronological diversity. Powers is steeped in CS Lewis and loves his non-fiction, so he’s probably got Lewis’s concept of “chronological snobbery” and passages like this from “On the Reading of Old Books” at the back of his mind:

Every age has its own outlook. It is especially good at seeing certain truths and specially liable to make certain mistakes. We all, therefore, need the books that will correct the characteristic mistakes of our own period. And that means the old books. . . . None of us can fully escape this blindness, but we shall certainly increase it, and weaken our guard against it, if we read only modern books. Where they are true they will give us truths which we half knew already. Where they are false they will aggravate the error with which we are already dangerously ill. The only palliative is to keep the clean sea breeze of the centuries blowing through our minds, and this can be done only by reading old books.

For a similar concept, see Alan Jacobs’s “temporal bandwidth.”

Kreeft on Job

detail from Job Rebuked by his Friends, by William Blake

I’ve been reading Peter Kreeft’s Three Philosophies of Life, a short examination of Ecclesiastes, Job, and Song of Solomon as visions of competing worldviews, bit by bit over the last month or so. Appropriately, over the weekend I read the section on Job—theme: Life as Suffering—while getting over an illness. Here are two passages that struck me:

First, from Kreeft’s introductory remarks:

Though bottomlessly mysterious, [the Book of Job] is also simple and obvious in its main “lesson”, which lies right on the surface in the words of God to Job at the end. Unless you are Rabbi Kushner, who incredibly manages to miss the unmissable, you cannot miss the message. If Job is about the problem of evil, then Job’s answer to that problem is that we do not know the answer. We do not know what philosophers from Plato to Rabbi Kushner so helpfully but hopelessly try to teach us: why “bad things happen to good people”. Job does not understand this fact of life, and neither do we. We “identify” with Job not in his knowledge but in his ignorance.

Identifying with Job in his ignorance is an elegant way to put it. Job doesn’t know at the beginning of the book—the Accuser executes his plan to have Job curse God without any forewarning—and Job still doesn’t know at the end, and yet he is satisfied. Cf. Chesterton’s comments on Job, which I quoted here two years ago in connection with another great ancient confrontation with Not Knowing: The Epic of Gilgamesh.

In the final third of the section on Job, Kreeft leaves the Problem of Evil behind and turns his attention to what he calls “The Problem of Faith versus Experience.” Here he engages in exactly my kind of comparative history:

In previous ages, especially the Middle Ages, which were strong on reason but weak on psychological introspection, and attention to feeling and experience, the crucial problem was the relation between faith and reason. . . . In our age, which is weak on reason (and even doubts reason’s power to discover or prove objective truth) and strong on psychology and experience, the crucial problem is the relation between faith and experience. Today many more people lose their faith because they experience suffering and think God has let them down than lose their faith because of any rational argument. Job is a man for all seasons but especially for ours. His problem is precisely our problem.

I’ve seen compelling arguments that conversion is often if not always pre- or sub-rational. CS Lewis’s account of his acquiescence to God in Surprised by Joy comes to mind. Traveling in the opposite direction, compare the recent evangelical phenomenon of “deconstruction,” which seems primarily to be a process of publicly washing off political cooties (Christians have been mean to gays! Christians owned slaves! Christians voted for Trump!) rather than a Christopher Hitchens- or Bertrand Russell- or Friedrich Nietzsche-style coldly reasoned apostasy.

Kreeft published this book in 1989, yet here he foresees not only our postmodern but our post-truth world and the need for an apologetics not based solely on rational argument. Alister McGrath is the theologian I’m most familiar with who has made a deliberate effort at this.

Three Philosophies of Life has been excellent so far, not least because Ecclesiastes and Job are two of my favorite books of the Bible. I look forward to the final part on Song of Solomon, “Life as Love.” While Ecclesiastes and Job have spoken to me where I am for years, Song of Solomon has always been something of a mystery to me. Having read Kreeft’s examination of Job, I’m prepared to embrace that.

Gladiator II trailer reaction

Naval combat in the Colosseum in Gladiator II

On Tuesday, the first trailer for Ridley Scott’s Gladiator II appeared on YouTube. I immediately watched it and opened up a draft post here on the blog. A few thoughts:

I’ve been skeptical of a sequel to Gladiator for as long as Scott and friends have been talking about it. Not only was Gladiator a great movie and a perfect standalone story, it was—like Star Wars or Pirates of the Caribbean—lightning in a bottle, a lucky product of the planned, the unforeseen, and the ability of imaginative craftsmen to adapt to unique circumstances. Recreating the magic of such a great movie for a sequel would not only be unnecessary, I thought, it would probably prove impossible. It hasn’t helped that some of the leaked proposals for a follow-up were insane. Add to this the aging Sir Ridley’s increasingly unconcealed indifference to history and Napoleon’s thudding arrival last year and I hope you’ll understand why I wasn’t excited to learn, in the middle of all that, that Gladiator II was finally shooting.

Well, now that a trailer has arrived I have to say I’m pleasantly surprised.

Gladiator II picks up the story of Lucius (Paul Mescal), the son of Lucilla (Connie Nielsen), Commodus’s sister and Maximus’s love interest in the original, about twenty years later. When we last see him in Gladiator he’s leaving the sand of the arena where Maximus and Commodus have just killed each other. Now he is, per the trailer and scraps of information online, living in North Africa. Apparently he is captured in an amphibious raid by a Roman army under Marcus Acacius (Pedro Pascal) and sold into slavery as a gladiator, where he follows his hero Maximus’s example by taking the fight to Rome via the Colosseum.

Lucius’s owner and trainer is Macrinus (Denzel Washington), who appears to have a similarly intimidating semi-mentor role to that of Proximo in the original. Macrinus has designs on political power, which is currently wielded by brothers and co-emperors Geta and Caracalla (Joseph Quinn and Fred Hechinger). Bloodsport ensues.

What most surprised me about this trailer is the extent to which it recaptures the feel of the original. Gladiator had a look you could smell. The sharp, sun-drenched palettes, the sand and grit, the backlit smoke, the lavish textiles, the metal that looks hot to the touch, and the towering classical architecture are all present in Gladiator II. The seamless fit of this with the original’s style, more than anything else, made me excited for this movie.

This is, of course, playing to Scott’s strong suit. None of it means that the story will adequately support the visuals. (See again Napoleon.) Scope is guaranteed, but depth?

Other observations:

  • I’m honestly thrilled to see more of the Colosseum, including its famous mock naval battles—complete with dolphins? sharks?—and a beast fight. When you learn about the Colosseum in school this is the stuff you really wish you could see. And this sample looks great.

  • Speaking of the beast fight, the segments with the rhino reveal the starkest visual difference between this and the original: obvious CGI. Gladiator had some but here, nearly a quarter century later, it’s more apparent. The rhino looks pretty great but Lucius’s little tumble doesn’t.

  • I like the nods to Maximus’s arms and armor. A proper Roman touch, and a nice callback to Maximus and Lucius’s scene in the original.

  • Marcus Acacius’s amphibious attack looks rather too much like the climactic fight in Indiana Jones and the Dial of Destiny. Potential unintentional comedy.

  • Plotwise, this really looks like a rehash of Maximus’s story from the original. Not necessarily a bad thing, but I hate to see a great movie followed up twenty-four years later with the standard same-but-different sequel plot.

  • Hans Zimmer has not returned to compose the score, which is a bummer. I’m curious to see whether Harry Gregson-Williams, a fine composer, repurposes some of Zimmer’s themes or writes entirely new music.

  • “The greatest temple Rome ever built: the Colosseum.” Great line. I’m reminded of an observation my undergrad Rome professor made: you can learn a lot about a civilization by looking at the buildings it spends the most time and effort on and that dominate its skyline. In the Middle Ages it was the castle and the cathedral. Today it’s the skyscraper. In Rome it was the arena.

Okay, history stuff:

  • I expect no attempt at accuracy. Like the original, I plan to enjoy this as a good movie and nothing more—provided it’s a good movie.

  • Geta and Caracalla were real emperors who ruled together following the death of their father, Septimius Severus (r. AD 193-211). Geta was assassinated, presumably at his older brother’s bidding, after less than a year of co-rule, so that places the events of this movie in AD 211. Caracalla ruled another six years, though, and has entered history as a byword for imperial cruelty and bloodthirstiness alongside Caligula and Nero. Presumably his fratricide will play some role in the film.

  • Caracalla was eventually murdered while on campaign and succeeded by Macrinus, Denzel Washington’s character. The real Macrinus was Berber rather than black, a fact internet comment sections are already full of fulmination about, and reigned a little over a year. At least one production still of Washington sitting on what looks like a throne has been released. After being murdered in his turn, Macrinus was succeeded by Elagabalus, a notoriously perverted teenage tyrant who has been the subject of a recent move to spin him as a “transgender woman.” Gladiator II is probably already biting off more than it can chew, history-wise. Lord help us if the filmmakers go there.

  • Geta and Caracalla get a stereotypical depraved Roman emperor look, with an uncanny resemblance to John Hurt’s Caligula in I, Claudius. They creeped my wife out when I showed her the trailer.

  • Marcus Acacius has a rather presentist line about not wishing to “waste another generation of young men for their [presumably Geta and Caracalla’s] vanity.” You’ll have to look hard to find someone outright defending Caracalla—who, in addition to his personal violence and cruelty, also debased the coinage and granted citizenship to nearly everyone in the Empire—but he didn’t campaign pointlessly. Scott’s modern posturing creeping in, as usual.

Verdict: cautiously optimistic.

So we’ll see. I don’t precisely have high hopes for Gladiator II but the trailer looks good and I’ll certainly be there when the film opens in November.

Hitchcock and the eggheads

Ethel Griffies in The Birds (1963) and Simon Oakland in Psycho (1960)

Speaking of experts, last week, during our Independence Day trip to the beach with my in-laws, I rewatched The Birds for the first time in several years. What most struck me the last time I watched it—how long it takes to get to the bird attacks—seemed less remarkable to me this time. Hitchcock, master craftsman, spends the first half of the film both lulling the audience and foreshadowing the terror to come, all through the whimsical romance he creates in a realistic-feeling world.

No, what struck me this time was Mrs Bundy (Ethel Griffies), the elderly ornithologist who strides into the film just before the first major attack looking for cigarettes. She knows her birds. She’s observed them for decades and knows what they do and do not do. She has facts and figures, including a strangely precise calculation of the number of birds currently living in North America. Presented with Melanie’s stories of bird attacks, Mrs Bundy pooh-poohs them. Confidently, firmly.

She reminded me of a character who appears at the end of Psycho, the film Hitchcock made immediately before The Birds. Following that film’s unbearably suspenseful climax and shocking twist, Hitchcock treats the viewer to a good five minutes of Dr Richman (Simon Oakland) talking, and talking, and talking. Dr Richman explains to the other characters—and, by extension, the audience—what they’ve just witnessed and how Norman Bates came to be what he is. He knows his Freudian psychobabble and is strangely precise in his diagnosis of Norman. He’s confident, firm. He also feels like he talks forever, a strange inclusion in what is otherwise a terrifically paced, highly visual film.

I’ve seen a few explanations for Dr Richman’s protracted, stentorian lecture:

  1. It’s intended as a genuine scientific explanation of Norman and the events of the film based on the pop Freudianism of the day

  2. It’s intended as a parody of Freudian psychology and the way it can explain away anything

  3. It’s there for structural purposes, to give the audience a few minutes to come down from the suspense and terror of the climax before wrapping up with the film’s genuinely chilling final moments

  4. It’s some combination of the above

I think #3 is indisputable as a formal consideration, and so incline toward #4. But which of #1 and #2 is it?

The huge amount of time Hitchcock gives to Dr Richman suggests #1. Hitchcock loved his jokes but constructed them economically. Also, screenwriter Joe Stefano has said in interviews that he was heavily committed to Freudian analysis at the time, so his contribution was probably intended sincerely.*

On the other hand, Dr Richman acts like a blowhard and his explanation is too pat, too easy, fitting the mystery of Norman Bates snugly within the die-cut confines of theory. His explanation—and based on a single police station interview!**—is incommensurate with what the audience has seen over the preceding hour and a half. His confidence smacks of cocksureness rather than insight. Tellingly, even after his lecture we are left uneasy by Norman in his final scene, during which we leave the safe confines of law and order and expertise and travel down the hall to Norman’s cell and whatever is contained there. One senses that the cops guarding the door have a clearer grasp on Norman than Dr Richman.

The Birds reinforced my gut feeling that the latter is the better understanding of Psycho. Here, the expert shows up nearer the middle of the film rather than at the end and—most unlike Dr Richman, whose explanation is seemingly allowed to stand—is thoroughly humiliated. We see Mrs Bundy twice: the first time as an imperious expert holding court, the second as a traumatized survivor of the thing she denied was possible minutes before. She can’t even bring herself to look at Melanie and Mitch.

Hitchcock learning lessons between films? Or simply a difference in source material and screenwriter? I don’t know, but I think Mrs Bundy’s role in The Birds is the better of the two, heightening rather than explaining away the film’s central mystery.

* I know a psychiatrist does appear in Robert Bloch’s original novel, but I haven’t read it and can’t comment on how this information is handled there.

** Mark Twain comes to mind: “There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.”

Credential envy

I’m currently reading Histories and Fallacies: Problems Faced in the Writing of History, by Carl R Trueman, a good introduction to the historiographical traps laid in the way of students of the past.

In his first full chapter, which covers Holocaust denial, Trueman briefly explores a side-topic he calls “the aesthetic fallacy”—the assumption that if something looks scholarly and scientific (by some subjective image of what “scholarly” and “scientific” should look like) it must be. This, Trueman notes, is more a fallacy of the reader of history than of the historian, but bad historians often tailor their work and images with it in mind.

Trueman looks specifically at the case of Fred Leuchter, who undertook a chemical study of one gas chamber at Auschwitz and claimed to have found little or no evidence of Zyklon-B residues in the bricks. After picking apart Leuchter’s study, which was methodologically unsound but provided a seemingly scientific talking point for certain audiences, Trueman makes an important side observation:

On close examination, we can easily see that his method is so flawed that it is not really scientific at all, but it has all the appearance of being scientific. He uses all the right words, even down to his claim in the title that he is an engineer. In fact, he is not; he is a designer of execution machines. Indeed, he has been barred from using the title “engineer” with reference to himself because of his lack of formal qualifications. The title gave him weight and plausibility; he presumably hoped that it would provide him with the credibility to have a seat at the table and be taken seriously in discussions. One could say that the scientific form of his writing, or perhaps better (though slightly more pretentiously), the scientific aesthetics of his work gave his arguments credibility. For this reason, I am always suspicious of books that print “PhD” on the cover after the author’s name. Why do they need to do this? The person has written a book, so surely her competence can be judged by the volume’s contents? Perhaps, after all, many books are judged at least somewhat by their covers as well as what is printed on the inside.

The phenomenon Trueman describes here is common across self-published crank literature (just look through the Goodreads giveaways sometime) but is also apparently felt instinctively by a lot of people. I call it “credential envy.” It has a few iterations:

  • Insisting on a title that is irrelevant to the topic under discussion

  • Claiming a title one is not legitimately entitled to

  • A version of both the former and the latter: insisting on being called doctor for an unearned doctorate

  • Pure fraud

The fundamental quality of credential envy is a craving for legitimacy—or, per Trueman’s “aesthetic fallacy,” the appearance of legitimacy. There’s a defensive, chip-on-the-shoulder aspect to credential envy. People who insist on impressive titles want to preempt criticism through intimidation or grandeur. And this attitude only becomes more apparent when the credentials are false or irrelevant or when they’re being used to mislead, as Leuchter’s appropriation of “engineer” was.

Credentials and qualifications matter enormously. But, like Trueman, I find that the more someone insists on their credentials and titles, the warier I become. Real expertise is effortlessly confident and worn lightly. Or should be. Perhaps the behavior of some real experts today is part of the reason the broader public increasingly finds it hard to distinguish them from the cranks.