A quick personal update

Books and Bede—a favorite gift from my wife and kids

The hot but unhurried days of the summer gave way, right at the beginning of this month, to the haste and chaos of preparation for the fall semester. In my case, I am preparing for three fall semesters, as I have picked up adjunct classes at two other colleges in addition to my full-time teaching. Just keeping deadlines straight will be an adventure.

The reason for all of this is a happy one that I’m not sure I’ve directly addressed here—my wife and I are expecting twins, our fourth and fifth children. I’ve taken on this extra work for the time she will be out following their birth. These adjunct courses were mercifully easy to find. One was even offered to me sight unseen thanks to a recommendation from a colleague. How often now does someone need work and have it dropped into his lap like that? We are blessed and have had a lot of cause this summer to reflect on God’s provision—in time, in work, in material needs—for these babies and for us.

That said, when exactly the twins will arrive is up in the air. Were they to go full-term they would arrive three weeks into September, but my wife’s OB doesn’t let twins go past 38 weeks. So we were looking toward the second weekend in September. Now, though, the doctor may decide to induce around 37 weeks, bumping the twins’ arrival another week nearer. There is also the possibility—just a possibility, but a possibility that has a startling way of focusing one’s attention—that they may be induced this week, depending on how my wife’s checkups go. She spent last night at the hospital under observation, a common enough occurrence for women at this stage of expecting twins but still a reminder of how near we are. Fortunately all signs were good and she’ll be released this morning.

And of course the babies could do their own thing and come at any time now, something we’ve been working to prepare for over the last couple of weeks. We have a “go bag” in the back of the van, waiting.

All of which is to say that my writing here, already spotty since the end of the summer session, may be more sporadic in the coming weeks. I may not, for instance, have the time or stamina to complete a summer reading list. Then again, being able to work on something one paragraph at a time might be just the thing. There’s no way to tell at this point. But I hope y’all will keep checking in and stay in touch, and most of all that y’all will celebrate with us.

In the meantime, here’s a short reflection on birth and life, inspired by an offhand metaphor in Beowulf, that I wrote following the birth of our third child four years ago. Please check that out.

Further notes on Indy and Oppie

July was a big movie month here on the blog, with three reviews of movies ranging from “adequate compared to Kingdom of the Crystal Skull” to “great.” Two of them I’ve reflected on continually since seeing them and reviewing them here, especially as I’ve read, watched, and listened to more about them.

Here are a few extra thoughts on my summer’s movie highlights cobbled together over the last couple of weeks:

Indiana Jones and the Curse of Woke

When I reviewed Indiana Jones and the Dial of Destiny a month and a half ago, I didn’t dwell on the malign influence of woke ideology in its storytelling, only mentioning that I had justifiable suspicions of any Indiana Jones film produced by Disney. I wanted to acknowledge those doubts without going into detail, because after actually watching and, mostly, enjoying the movie, I found that the problems I had with Dial of Destiny weren’t political at all, but artistic. It isn’t woke, it’s just mediocre.

That didn’t stop a certain kind of critic from finding the spectral evidence of wokeness in the film and trumpeting their contempt for it. I’m thinking particularly of a caustic YouTube reviewer I usually enjoy, as well as this review for Law & Liberty, which comes out guns blazing and attacks Dial of Destiny explicitly and at length along political lines.

The problem with these reviews is that in their hypersensitivity and their mission to expose ideological propaganda they do violence to the object of their criticism, not just misinterpreting things but getting some things completely wrong. Here’s a representative paragraph from that Law & Liberty review:

Next, we cut to 1969, the Moon Landing. Indy is an old tired man, sad, alone, miserable. The camera insists on his ugly, flabby naked body. His young neighbors wake him up with their rock music and despise him. His students don’t care about his anthropological course. His colleagues give him a retirement party and soon enough they’re murdered, by Nazis working secretly in the government, with the complicity of the CIA or some other deep state agency. We see the wife is divorcing him; we later learn, it’s because his son died in war, presumably Vietnam—Indy told the boy not to sign up.

What was remarkable about this paragraph to me is how much it simply gets wrong. Indy’s hippie neighbors wake him up by blasting the Beatles, yes, but they also treat him perfectly amiably. (In fact, it’s Indy who knocks on their door armed with a baseball bat.) It is never clear that Voller’s men have help from the CIA or any other “deep state agency”; I kept waiting for that connection but it never came. And Indy did not try to stop his son from joining the army, a point made so clear in the film—Indy’s one stated wish, were time travel possible, would be to tell him not to join—that it’s staggering to think a critic went to print with this.*

From later in the same review: “But turning from obvious metaphors to ideology, Indy is replaced by a young woman, Helen [sic—her name is Helena], daughter of his old archaeological friend Basil, but the film suggests you should think of her as a goddess to worship.” One of my chief complaints about Dial of Destiny was its failure to deal with Helena’s criminality, giving her a half-baked or even accidental redemptive arc that spares her a face-melting, as befitted all similar characters in Indy’s inscrutable but always moral universe. That bad writing again. But how one could watch her character in action and conclude that the audience is meant to “worship” her is beyond me. This is anti-woke Bulverism.

What these hostile reviewers describe is often the opposite of what is actually happening in the film. I’ve seen multiple critics assert that Helena has “replaced” Indy and “controls” and “belittles” him. The Law & Liberty reviewer describes Indy as just “along for the ride.” Helena certainly intends to use him—she’s a scam artist and he’s a mark. This is all made explicit in the film. But it is also made explicit that Indy does, in fact, keep taking charge and leading them from clue to clue and that he is a much tougher mark than Helena was counting on.

Dial of Destiny’s actual problems are all classic artistic failures—poor pacing, overlong action sequences, plodding exposition, weak or clichéd characters,** slipshod writing, and a misapprehension of what matters in an Indiana Jones movie that becomes clearest in the ending, when Indy is reunited (for the third time) with Marion. Here the filmmakers make the same mistake as the team behind No Time to Die by giving Indy, like Bond, romantic continuity and attempting to trade on sentimentality when that is not what the character is about.

Again—these are artistic problems. Helena Shaw isn’t a girlboss or avenging avatar of wokeness; she’s a poorly written villain who doesn’t get her comeuppance. But I saw little such criticism among the fountains of indignation from the reviewers who pursued the “woke Disney” line of criticism.

Perhaps this is the greatest curse of wokeness: that it distorts even its critics’ minds. Once they’ve determined that a movie is woke, they’ll see what they want to see.

Call it woke derangement syndrome and add it to all the other derangement syndromes out there. Woke ideology is real, even if the ordinary person can’t define it with the precision demanded by a Studies professor or Twitter expert, and it is pernicious, and it produces—even demands—bad art. It is a kind of self-imposed blindness, as are all ideologies. But zeroing in on wokeness as the explanation for bad art can blind us to real artistic flaws, and if any good and beautiful art is to survive our age we need a keen, clear, unclouded vision of what makes art work. We need not just a sensitivity to the bad, but an understanding of the good.

Douthat on Oppenheimer

On to better criticism of a better movie. Ross Douthat, a New York Times op-ed columnist who writes film criticism for National Review, has been one of my favorite critics for the last decade. Douthat begins his review of Oppenheimer with an abashed confession that he feels guilty saying “anything especially negative about” it, but that as brilliantly executed as it is, he is “not so sure” that it is “actually a great film.”

Fair enough. What gives Douthat pause, then? For him, the problem is Oppenheimer’s final third, which he sees not as a satisfying denouement but simply a long decline from the height of the Trinity test, a decline complicated by thematic missteps:

There are two problems with this act in the movie. The first is that for much of its running time, Oppenheimer does a good job with the ambiguities of its protagonist’s relationship to the commonplace communism of his intellectual milieu—showing that he was absolutely the right man for the Manhattan Project job but also that he was deeply naïve about the implications of his various friendships and relationships and dismissive about what turned out to be entirely real Soviet infiltration of his project.

On this point I agree. As I wrote in my own review, I thought this was one of the film’s strengths. Douthat continues:

But the ending trades away some of this ambiguity for a more conventional anti-McCarthyite narrative, in which Oppenheimer was simply martyred by know-nothings rather than bringing his political troubles on himself. You can rescue a more ambiguous reading from the scenes of Oppenheimer’s security-clearance hearings alone, but the portions showing Strauss’s Senate-hearing comeuppance have the feeling of a dutiful liberal movie about the 1950s—all obvious heroes and right-wing villains, no political complexity allowed.

The second problem, as Douthat sees it, is that the drama surrounding Oppenheimer’s political destruction and Strauss’s comeuppance is unworthy of the high stakes and technical drama of the middle half of the movie concerning the Manhattan Project: “I care about the bomb and the atomic age; I don’t really care about Lewis Strauss’s confirmation, and ending a movie about the former with a dramatic reenactment of the latter seems like a pointless detour from what made Oppenheimer worth making in the first place.”

There is merit here, but I think Douthat is wrong.

I, too, got the “dutiful liberal” vibe from the final scenes, but strictly from the Alden Ehrenreich character. Ehrenreich is a fine actor unjustly burdened with the guilt of Solo, but his congressional aide character’s smug hostility to Strauss as Strauss is defeated in his confirmation hearing feels too pat, too easy. What saves the film from being standard anti-McCarthy grandstanding is Robert Downey Jr’s sympathetic and complicated portrayal of Strauss, not to mention the film’s demonstration that, however Strauss acted upon them, his concerns about espionage and Oppenheimer’s naivete were justified.***

Regarding the seemingly diminished stakes of the final act, I too wondered as I first watched Oppenheimer whether Nolan might have done better to begin in medias res, to limit himself strictly to the story of the bomb. But that story has already been told several times and Oppenheimer is very much a character study; this specific man’s rise and fall are the two necessary parts of a story that invokes Prometheus before it even begins.

The key, I think, is in the post-war scene with Oppenheimer and Einstein talking by the pond at Princeton. Nolan brings us back to this moment repeatedly—it’s therefore worth paying attention to. The final scene reveals Oppenheimer and Einstein’s conversation to us:

Oppenheimer: When I came to you with those calculations, we thought we might start a chain reaction that would destroy the entire world.

Einstein: I remember it well. What of it?

Oppenheimer: I believe we did.

Cue a vision of the earth engulfed in flames.

A technology that can destroy the entire world is not just the literal danger of Oppenheimer’s project, but a metaphorical one. The Trinity test proves fear of the literal destruction of the world unfounded, but the final act of the film—in which former colleagues tear each other apart over espionage and personal slights and former allies spy and steal and array their weapons against each other and the United States goes questing for yet more powerful bombs, a “chain reaction” all beginning with Oppenheimer’s “gadget”—shows us an unforeseen metaphorical destruction as it’s happening. The bomb doesn’t have to be dropped on anyone to annihilate.

This is a powerful and disturbing dimension of the film that you don’t get without that final act.

Finally, for a wholly positive appraisal of Oppenheimer as visual storytelling—that is, as a film—read this piece by SA Dance at First Things. Dance notes, in passing, the same importance of the film’s final act that I did: “The two threads are necessary to account for the political paradox of not just the a-bomb but of all technology.” A worthwhile read.

Addenda: About half an hour after I posted this, Sebastian Milbank’s review for The Critic went online. It’s insightful and well-stated, especially with regard to Oppenheimer’s “refusal to be bound” by anyone or anything, a theme with intense religious significance.

And a couple hours after that, I ran across this excellent Substack review by Bethel McGrew, which includes this line, a better, more incisive critique of the framing narrative than Douthat’s: “This is a weakness of the film, which provides all the reasons why Oppenheimer should never have had security clearance, then demands we root against all the men who want to take it away.”

Tom Cruise does the impossible

The most purely enjoyable filmgoing experience I had this summer was Mission: Impossible—Dead Reckoning, Part I. To be sure, Oppenheimer was great art, the best film qua film of the summer, but this was great entertainment. I enjoyed it so much that, after reviewing it, I haven’t found anything else to say about it except that I liked it and can’t wait for Part II.

Leaving me with one short, clearly expressed opinion—a truly impossible mission, accomplished.

Endnotes

* In fairness, the review has one really interesting observation: in reference to the film’s titular Dial being Greek in origin, unlike the Ark of the Covenant or the Holy Grail, “Jews are replaced by Greeks in the Indiana Jones mythology, since our elites are no longer Christian.” The insight here is only partially diminished by the fact that the elites who created Indiana Jones were not Christian, either. Steven Spielberg, Philip Kaufman, and Lawrence Kasdan—key parts of Raiders—are all Jewish.

** Here is where Dial of Destiny drifts closest to woke characterization. The agents working for Voller in the first half include a white guy in shirt and tie with a crew cut and a thick Southern accent and a black female with an afro and the flyest late 1960s fashion. Which do you think turns out to be a devious bad guy and which a principled good guy? But even here, I don’t think this is woke messaging so much as the laziness of cliché. Secondary characters with Southern accents have been doltish rubes or sweaty brutes for decades.

*** A useful point of comparison, also involving a black-and-white Robert Downey Jr, is George Clooney’s engaging but self-important Good Night, and Good Luck. Watch both films and tell me which is “all obvious heroes and right-wing villains.”

Poe on Progress

The capital P above is intentional. Here’s a passage by Poe that I’ve run across in excerpt several times, from an 1844 letter to fellow poet James Russell Lowell, who had requested “a sort of spiritual autobiography” from Poe. In the course of laying out his beliefs and opinions, Poe writes:

 
I have no faith in human perfectibility. I think that human exertion will have no appreciable effect upon humanity. Man is now only more active—not more happy—nor more wise, than he was 6000 years ago. The result will never vary—and to suppose that it will, is to suppose that the foregone man has lived in vain—that the foregone time is but the rudiment of the future—that the myriads who have perished have not been upon equal footing with ourselves—nor are we with our posterity.
— Edgar Allan Poe, July 2, 1844
 

I’ve written often enough about the myth of Progress—whether applied to politics as Progressivism, to the study of history as Whig or Progressive history, or in the popular imagination as the constant general improvement of everything over time—but Poe captures both my beliefs and my mood just about perfectly. Not only does the myth of Progress blind us to our own potential for failure, it rubbishes and belittles our forebears. It is not only incorrect, but impious.

You can read Poe’s entire letter to Lowell, which is full of personal asides and opinions, here. It’s available as part of a great archive of Poe correspondence made available online by the Edgar Allan Poe Society of Baltimore, an act of service that I deeply appreciate. You can peruse that here.

A thesis

The following started as only semi-serious off-the-cuff pontification in my Instagram “stories.” I’ve expanded on it and fixed a lot of autocorrect “help” along the way.

A favorite web cartoonist, Owen Cyclops, shared the following on Instagram this morning:

If you’re unfamiliar with semiotics, which I discovered via Umberto Eco late in high school, here’s the first bit of Wikipedia’s intro:

Semiotics (also called semiotic studies) is the systematic study of sign processes (semiosis) and meaning making. Semiosis is any activity, conduct, or process that involves signs, where a sign is defined as anything that communicates something, usually called a meaning, to the sign's interpreter. The meaning can be intentional, such as a word uttered with a specific meaning; or unintentional, such as a symptom being a sign of a particular medical condition.

The phrase “usually called a meaning” should give you some sense of how arcane, abstract, and high-falutin’ this can get. Emphasis on abstract. But semiotics is not really my point, here. Owen’s cartoon brought Dr Johnson’s refutation of Berkeley to mind. Per Boswell:

After we came out of the church, we stood talking for some time together of Bishop Berkeley’s ingenious sophistry to prove the non-existence of matter, and that every thing in the universe is merely ideal. I observed, that though we are satisfied his doctrine is not true, it is impossible to refute it. I never shall forget the alacrity with which Johnson answered, striking his foot with mighty force against a large stone, till he rebounded from it, “I refute it thus.”

This is the “appeal to the stone.” Wikipedia classifies it as “an informal logical fallacy.” I don’t care. When confronted with academic disciplines that have descended to this level of abstraction, I join Dr Johnson’s stone-kicking camp.

At some point, something has to be real. Argument divorced from concrete reality simply turns into sophisticated dorm room bickering.* That’s what Owen’s cartoon captures so well—argue about the “meanings” of “signs” like carrot tops and foxholes all you want, the real carrot and the real fox are going to present an inarguable ultimate meaning to those rabbits. I refute it thus.

I was struck that Wikipedia’s article on Johnson’s stone-kicking compares this appeal to the reductio ad absurdum, which it also treats as a fallacy. Its full article on the reductio is more circumspect, classifying it as a legitimate line of argument, though I’ve always regarded the reductio more as a useful rhetorical device, a way of comically** setting the boundaries to an argument or of twisting the knife once the logic has worked itself out as impossible. But, tellingly, the article’s “see also” points us toward slippery slope. This is, of course, described not just as an informal fallacy but “a fallacious argument.” I contend that slippery slope is not a fallacy but, at this point, an ironclad empirical law of Western behavior.

And that’s what brought the late Kenneth Minogue to mind. In my Western Civ courses I use a line from his Politics: A Very Short Introduction, to impart to students that the Greeks and Romans were different from each other in a lot of fundamental ways. Chief among these differences was the Greek and Roman approach to ideas:

The Greek cities were a dazzling episode in Western history, but Rome had the solidity of a single city which grew until it became an empire, and which out of its own decline created a church that sought to encompass nothing less than the globe itself. Whereas the Greeks were brilliant and innovative theorists, the Romans were sober and cautious farmer-warriors, less likely than their predecessors to be carried away by an idea. We inherit our ideas from the Greeks, but our practices from the Romans.

Succinct, somewhat oversimplified, sure, but helpful to students who mostly assume the Greeks and Romans were the same, just with redundant sets of names for the same gods. It’s also correct. Minogue goes on to note that this mixed heritage manifests differently culture to culture, state to state, but that “Both the architecture*** and the terminology of American politics . . . are notably Roman.”

Were, I’d say.

So, a thesis I’ve kicked around in conversation:

Given Minogue’s two categories of classical influence, as the United States was founded along (partially but significantly) Roman lines by men who revered the Romans, a large part of our cultural upheaval has arisen as the country has drifted more Greek—becoming progressively more “likely . . . to be carried away by an idea.”

The emphasis has shifted from the Founders’ “Roman” belief in institutions governed by people striving for personal virtue to a “Greek” pattern of all-dissolving ideologies pursuing unachievable ends. This reflects both political and social changes. Like Athens, the US became more aggressive and more inclined to foreign intervention the more it embraced democracy not just as a system but as an end. And note the way that, when an ideal butts up against an institution in our culture, it’s the institution that’s got to go—as does anything that stands in the way of the fullest possible fulfillment of the implicit endpoint of the ideal. How dare you impede my slide down this slope, bigot.

And this is not a new problem. A whole history of the US could be written along these lines.

* During my senior year of college I once listened to two roommates argue over whether the Trix Rabbit was a “freak of nature.” This lasted at least an hour. Take away the humor and you’d have enough material for several volumes of an academic journal.

** Comically, because what’s the point of arguing if you can’t laugh the whole time? That’s not an argument, but a quarrel. See note above.

*** Not always for the best, as I’ve argued before.

Price and Keegan on walking the ground

Yesterday on my commute I listened to the latest episode of The Rest is History, “Viking Sorcery,” in which Tom Holland and Dominic Sandbrook interview archaeologist Neil Price, author of Children of Ash and Elm, a massive archaeological and historical study of the Norse, which I read two summers ago.

Holland begins by reading a striking passage from Price’s earlier book The Viking Way. Having relocated from Britain to the University of Uppsala, Price realizes how the landscape of the Norse homeland is reshaping his understanding:

I was disturbed by the fact that the ancestral stories of the North should seem so much more intelligible when looking out over those Swedish trees than they had done while sitting in my office in England.

Price himself elaborates on this point not long into the interview:

[W]hatever you’re studying about the past, it really does help to go to the places that you’re talking about to see the landscapes.

I was conscious when I wrote The Viking Way—it came out in 2002 originally, so it’s twenty years old now—that sort of sentiment that you quoted about me being disturbed by the fact that those ancestral stories seem so much more intelligible when looking out over Swedish trees, there’s a risk that that’s a kind of romanticizing view. There’s me thinking, ‘Wow, I’m in touch with the Viking Age,’ and of course I’m not. So you have to guard against that as well. But I do think that, whatever you’re studying about the past, it really does help to go to the places that you’re talking about to see the landscapes, to experience what a Scandinavian winter is like. When you look at, say, reconstruction drawings, it’s always summer. They’re never sort of hunkered down in a snowed-in building, and yet that’s a very large part of the year. So to sort of try and get that kind of experiential aspect of things I think is quite important.

What Price calls the “experiential” dimension of historical understanding is what Chesterton called “the inside of history”—a recurring theme of my work as a historian, teacher, and novelist, and of my reflections on this blog. Getting at this dimension is not just a matter of trying to grasp alien minds or dressing up in a lost people’s clothing but of feeling and understanding the actual physical places where they lived and died.

Price’s discussion immediately reminded me of one of the passages that first brought this home to me as a grad student and reshaped how and why I study history. From Sir John Keegan’s great study The Face of Battle, first published in 1976:

Anecdote should certainly not be despised, let alone rejected by the historian. But it is only one of the stones to his hand. Others—reports, accounts, statistics, map-tracings, pictures and photographs and a mass of other impersonal material—will have to be coaxed to speak, and he ought also to get away from papers and walk about his subject wherever he can find traces of it on the ground. A great pioneer military historian, Hans Delbrück in Germany in the last century, demonstrated that it was possible to prove many traditional accounts of military operations pure nonsense by mere intelligent inspection of the terrain.

This passage took root in my mind as “walking the ground,” something I have few resources to do but which always, always helps when I can. My writing of Griswoldville was based closely not only on the specific locations around Macon where the battle takes place—which I walked in appropriate winter weather—but on the landscapes of north and central Georgia generally: the hills, farms, fields, orchards, pecan groves, and the weather. The land and what it is like is a fundamental part of that story. And, naturally, that understanding transferred to my historical narrative of the battle for the Western Theater of the Civil War Blog a few years ago. I work hard on everything I write, but my own best work always has walking the ground behind it.

A small but important point in Price’s chat with Holland and Sandbrook, but the entire interview is excellent. I strongly recommend it.

Oppenheimer

When I reviewed the new Mission: Impossible a few weeks ago, I rather lamely called it “a whole lot of movie.” I should have saved that description another week or so for Oppenheimer.

Oppenheimer is an accurate title. Despite the big budget, world-historical sweep, and powerful story, it’s fundamentally a character study tightly focused on J Robert Oppenheimer. Fortunately, its subject, by virtue of his unique role in American history and the course and conduct of World War II, gives the film both scope and depth. And though the film’s marketing leaned heavily on the Manhattan Project, Los Alamos, and the Trinity test, the film encompasses a huge swath of its protagonist’s life.

The film is told through a pair of overlapping and interweaving flashbacks in the 1950s but begins, chronologically, with the American Oppenheimer (Cillian Murphy) studying at Cambridge in the 1920s. He bounces around through the rarefied world of quantum physics, from Cambridge to Germany and back to the US, where he introduces this strange new subject to American universities in California. Study of quantum theory grows rapidly. So does Oppenheimer’s noncommittal involvement with radical leftwing politics—supporters of the Republicans in the Spanish Civil War, labor organizers who want to unionize laboratory assistants, overt Communists. He develops an unstable, on-and-off sexual relationship with the Communist Jean Tatlock (Florence Pugh) but moves on and marries Kitty (Emily Blunt), a divorcee with an alcohol problem. He also butts heads with other scientists at his university, who object to his tolerance and occasional endorsement of Communist projects, especially when such projects intrude into the classroom and the lab.

The war comes, and Oppenheimer is approached to head the Manhattan Project. His contact with the military and government is General Leslie Groves (Matt Damon), a bullheaded tough who gets Oppenheimer everything he wants, most specifically a brand new lab complex and supporting town in the remote New Mexico desert. This third of the film shouldn’t need much explanation—it is the literal centerpiece of the story and leads to the film’s most stunning, exhilarating, and terrifying sequence.

The final third covers Oppenheimer’s postwar life. Recruited by Lewis Strauss (Robert Downey Jr) to work at Princeton and given a key role on the Atomic Energy Commission, Oppenheimer finds his past threatening to ruin him when the US military detects the Soviets’ first atomic test. Every former member of the Manhattan Project comes under scrutiny. This event, Oppenheimer’s caginess and seeming indifference to the security of the Manhattan Project, and his personal conflict with and callousness toward Strauss, a former admirer, cause Strauss to turn on him. After Oppenheimer is denounced as a probable Communist agent, an AEC tribunal unearths all of his former sins and picks them over minutely. Even former close associates like Groves and Edward Teller (Benny Safdie), who vigorously assert Oppenheimer’s loyalty to the United States, make damning concessions about his unreliability and strange behavior. Oppenheimer loses his security clearance and his job.

But Oppenheimer, indirectly, has his revenge. When Strauss is appointed to President Eisenhower’s cabinet and sits for Senate confirmation hearings, his scapegoating of Oppenheimer and underhanded manipulation of the AEC cost him his cabinet position.

That’s the story of Oppenheimer in chronological order. But this being Christopher Nolan, it is not told so straightforwardly. It’s easy to get hung up on the structures of Nolan’s films, and in my original draft of this review I labored through how Oppenheimer works and why it works so well, but that’s spending too much time on how the story is told. The real strengths of Oppenheimer are its masterful technical execution and its performances, especially the central one by Cillian Murphy.

Oppenheimer looks brilliant. Much has been made, quite rightly, of the film’s IMAX cinematography.* Nolan and DP Hoyte van Hoytema use IMAX’s resolution and shallow depth of field to maximum effect, capturing everything from an atomic explosion to the irresolution and doubt on a man’s face with startling immediacy. Oppenheimer is also beautiful—New Mexico landscapes, the stately traditional architecture of old college campuses,** and the black and white of Strauss’s sequences are all stunning to look at. Additionally, the costumes, sets, and props are all excellent. If “immersion” in an “experience” is what brings you to the movies, Oppenheimer’s 1920s, 30s, 40s, and 50s are as immersive as Hollywood gets.

I’ve seen a few people complain about the wall-to-wall score, especially in the first half, but I honestly didn’t notice that. Ludwig Göransson’s music, like the intercutting flashbacks, helps establish and sustain the film’s dramatic momentum early on. It’s also a good score, not nearly as punishing and concussive as previous Nolan film scores. And unlike, say, Tenet, I could hear all of the film’s dialogue, so no complaints with the sound design and sound editing here.

My one technical problem is with the editing, which reminded me of some of Nolan’s earlier films, especially Batman Begins. Conversations often play out in unimaginative shot-reverse-shot style, and it sometimes feels like all the pauses have been cut out of the dialogue. Some scenes barely have room to breathe. I noticed this especially clearly with the handful of jokes and one-liners in Nolan’s script, where timing is crucial. Fortunately this evens out by the middle portion of the film concerning Los Alamos, but it gives Oppenheimer an odd, rushed feel in the first third.

As for the performances, Oppenheimer rivals those crazy CinemaScope productions of the 1950s and 60s for its huge cast. Nolan, not unlike Oppenheimer himself, built a small army of amazing talent for this movie, with even small roles played by well-known actors. Perhaps my favorite is Gary Oldman as Harry Truman, who appears for one scene that can’t last more than three minutes. And Oldman is excellent, turning in a rich, complicated performance despite his limited screentime and Nolan’s understated writing.

The same is true of everyone else in the film. Robert Downey Jr is excellent as Strauss, playing him sympathetically but still as a clear antagonist. Downey has said that he understands where Strauss was coming from and so didn’t play him as a villain, and it shows. His performance is the perfect counterbalance to Murphy. Other standouts include Benny Safdie as H-bomb theorist and engineer Edward Teller and Matt Damon as Leslie Groves. Groves’s and Oppenheimer’s odd-couple working relationship is one of the highlights of the film. Emily Blunt makes the most of an underwritten role as Oppenheimer’s difficult, morose, alcoholic wife—who nevertheless comes through when it counts—and Josh Hartnett and David Krumholtz are especially good playing two different kinds of colleague to Oppenheimer. I also enjoyed the many, many historical cameos, including Werner Heisenberg (Matthias Schweighöfer), Niels Bohr (Kenneth Branagh), and, in a slightly larger role, Tom Conti as Albert Einstein.

But as I hinted above, this is Murphy’s movie. He appears in almost every scene across all three hours and remains continuously interesting. He plays Oppenheimer as a cipher; as we watch, we feel we understand him from scene to scene, but—as becomes especially clear at the end—our impressions don’t add up in any satisfactory way. What we get is an unpleasant character full of flaws: a resentful outsider, an arrogant insider, an adulterer, a recklessly naïve and self-regarding political do-gooder, a man with astonishingly bad judgment and enormous blind spots, who can devote himself to a project that will inevitably result in mass murder and celebrate its completion only to reverse himself later, who chooses the wrong moments to stand on principle and whose one moment of keen self-awareness comes when he realizes he is being approached with an offer to spy for the Soviets and refuses—a good decision that he still manages to bungle. And yet he is undoubtedly brilliant at what he does, people as different as Einstein and Groves like him, and he sees a crucial project through to completion.

This tension is never resolved, and Oppenheimer only becomes more inscrutable as the film progresses. When Edward Teller wishes he could understand him better, he could be speaking for the audience. As one of Oppenheimer’s rivals in the race for the Bomb might have suggested, the more we see of him, the less we actually know. No wonder he rubbed people the wrong way.

The film opens with an epigraph explaining, in brief, the myth of Prometheus, who stole fire from the gods as a gift for mortals and was punished by being chained to a rock where birds would peck out his liver all day, every day, for eternity. This myth is apropos—especially since Nolan’s source material was the Oppenheimer biography American Prometheus—and I found myself reflecting on Oppenheimer as a Greek tragedy. Oppenheimer is a hero who has achieved great things for a thankful citizenry but is undone by his own past sins. He has no one to blame but himself. In this way, Oppenheimer also becomes a human metaphor for the entire project to split the atom. The film’s final moments make this clear in a genuinely chilling way.

I’m struck that, of Christopher Nolan’s twelve films, three are Batman movies, three are contemporary thrillers, three are near-future sci-fi action adventures, and three are historical films. Of the latter, two concern World War II. After seeing and thinking a lot about Oppenheimer, I can see the attraction of the period for Nolan. What other modern event offers such a variety of combinations of the technical, theoretical, and personal—and with such high stakes? World War II is ideal Nolan country. I hope he’ll return soon.

In the meantime, Oppenheimer is a great film—excellently produced, powerfully acted, and thematically rich. I strongly recommend it.

*As of this writing I still haven’t had a chance to see Oppenheimer in IMAX, because the one screen near me has been jam-packed during every showing except the one that gets out at 2:00 AM. I hope to see it as it was intended soon and will amend this review if seeing it in IMAX alters my judgment in any way.

**If Nolan wanted to make a spiritual sequel to Oppenheimer, another period film about amoral Communist-adjacent theorists and their world-destroying experiments, his next project could be Bauhaus.

Scope vs depth

Depth: Ralph Ineson and Kate Dickie as an outcast Puritan mother and father in The Witch (2015)

An insightful line from this good short video essay on The Witch and how writer-director Robert Eggers created such a beautiful, authentic film with such limited resources:

Scope requires money. Depth only requires knowledge.

This has been the peculiar pleasure of Eggers’s three films so far: whether narrowly focused on a single family or a pair of lighthouse keepers or having a sweep encompassing Scandinavia, Eastern Europe, Iceland, and Asgard, all three films go deep. Eggers has done the work. However little or however much he shows, you feel the reach and fullness of the worlds his films take place in.

A gloss for writers, whether for the screen or the page: Your story may or may not have scope—breadth, epic sweep, intricate complications, civilization-size conflict—but there is no reason it shouldn’t have depth. In an ideal world, every story could balance both. But if you can only have one, go deep. Learn everything you can. Let it inform, strengthen, and deepen the story. If your story has limitations, let it be because of conscious artistic choice to work within self-imposed boundaries rather than overreach, bad judgment, or unknowing error.

As the essayist puts it in another good line from that video, “Know your limits. Don’t show your limits.”

Robert Downey Jr on historical accuracy

In a recent video breakdown of his career for Vanity Fair, Robert Downey Jr reflects on his experience preparing for and filming the 1992 biopic Chaplin, which was directed by Sir Richard Attenborough:

[I]t’s hard to tell a story any more interestingly than the way it actually occurred.
— Robert Downey Jr

When you’re twenty-five and you’re given the keys to the kingdom you’re probably going to come out of center, maybe out of fear, maybe out of confidence. And for me, I, at that point—not to boast, but I was as much of a Chaplin expert as anyone involved in the project, and I was making corrections to the things that were factually and historically inaccurate. To which Attenborough said, “But, poppet, we’re making a film. It’s not a documentary.” I did learn at that point, though, that it’s hard to tell a story any more interestingly than the way it actually occurred.

That closing observation is exactly right. I think a lot of people assume that those of us who complain about historical inaccuracy in film adaptations of true stories are just humorless scolds or nitpickers. Certainly those exist, but for a true lover of history or of a specific historical period inaccuracy rankles because whatever Hollywood comes up with can never be nearly as good or surprising as the real thing. Credit to Downey for recognizing that and acting upon it.

Filmmaking as a medium has limitations, of course. Information, in film, is best communicated visually. Adaptation is necessary and inevitable. But those limitations shouldn’t be an excuse for inventing things where the reality is much more interesting. The more so where “inventing things” means molding history to the shape of clichés.

A few years ago on this blog I complained about the film Tolkien in precisely these terms. You can read that review here. You can watch Downey’s Vanity Fair breakdown here. The whole thing is fun and informative but it’s worth watching just for his perfect Richard Attenborough impression.

The Twilight World

Filmmaker Werner Herzog and Japanese soldier Hiroo Onoda (1922-2014) upon his surrender in 1974

Werner Herzog is a filmmaker famously drawn to the obsessive, the fanatical, and the single-mindedly self-destructive. He also, based on my limited engagement with his filmography, appreciates grim irony but can tell ironic stories with great sympathy. So the story of Hiroo Onoda—a man we’ve all heard of even if we don’t know his name—is a natural fit for Herzog’s fascinations as well as his set of storytelling skills.

Onoda, a junior officer in the Imperial Japanese Army stationed on Lubang, a small island in the Philippines near the mouth of Manila Bay, took to the jungles after the American invasion began in late 1944. He had been specially detailed for acts of scorched earth sabotage—dynamiting a pier, rendering an airfield useless—and, having completed those objectives, to carry on the struggle against the enemy using “guerrilla tactics.” He had three other soldiers under his command. One turned himself in to Filipino forces in 1950, five years after the end of the war. The other two were killed, one in the mid-1950s and the other in 1972. Onoda held out alone until 1974, the next to last Japanese soldier to surrender.

Herzog met Onoda during a trip to Japan in 1997. This novel, The Twilight World, published in 2021, seven years after Onoda’s death at the age of 91, is the result of that meeting and Herzog’s enduring fascination.

Herzog explains, by way of prologue, the embarrassing circumstances that led to his meeting Onoda. He then begins Onoda’s story in 1974, with Norio Suzuki, a young adventurer whose stated goal was to find and see Hiroo Onoda, the yeti, and a giant panda, “in that order.” Suzuki camped out on Lubang until Onoda found him. Suzuki convinced Onoda to pose for a photograph and insisted that the war was over—long over. Onoda agreed to turn himself in if Suzuki could bring his commanding officer from thirty years before to Lubang and formally order him to stand down.

The novel then returns to the fall of 1944, the fateful days when a twenty-two-year-old Onoda received his orders. Frustrated in his attempts to carry out his acts of sabotage, Onoda and his three subordinates move into the jungles and slowly figure out how to survive as guerrillas. They give up their tent, set up caches of ammunition, move repeatedly from place to place, crack coconuts, and attack isolated villages for food and supplies. Onoda broods. He has lost his honor by failing to complete his objective, and the bravado of a final banzai charge would be absurd. What to do?

Herzog narrates this story dispassionately and without embellishment. His style is minimalistic but deeply absorbing. Michael Hofmann’s English translation reads like a cross between a screenplay—I wondered often while reading if this novel hadn’t begun life as a screenplay—and the stripped-down style of late Cormac McCarthy in No Country for Old Men and, especially, The Road. Herzog evokes mood and character through small, telling details and sharply observed environments.

This simple, direct approach proves richly rewarding. Most interesting to me were the ways in which Onoda and his comrades try to make sense of their own situation as the years pass. Evidence that the war is still going on is, from their perspective, plentiful and obvious. The Filipinos are still trying to kill them, aren’t they? And Onoda and his men regularly spot squadrons of American warplanes—ever larger and more sophisticated as the years pass, but still headed northwest toward mainland Asia. Herzog is here able to use the dangerous tool of dramatic irony for maximum pathos.

Most striking was Onoda and company’s wrestling with repeated rumors that the war had ended. The American and Philippine militaries dropped leaflets explaining that the war was over. Onoda and his men interpreted mistakes in the leaflets’ Japanese typography as evidence that they were fake—a ruse. The Filipinos left a newspaper in a plastic bag at one of Onoda’s known resting points as proof that the war was long over. This, too, Onoda interpreted as a fabrication—what newspaper would ever print so many advertisements? Thus also with news heard on a transistor radio. Even when relatives of the holdouts travel to Lubang and call to them to come out over loudspeakers, Onoda finds reasons to believe they are being lied to. The Twilight World is, in this regard, one of the best and most involving portraits of the insane logic of paranoia that I’ve read.

But Herzog is, thematically, most interested in the passage of time. The scale of Onoda’s tenacity is almost unimaginable—twenty-nine years in the jungle. Twenty-nine years of surviving on stolen rice, of annual visits to Onoda’s hidden samurai sword to clean and oil it, of eluding Filipino police and soldiers, of watching American aircraft fly north, of attacking villages and avoiding ambush. What is that like?

In Herzog’s version of this story, after his initial commitment to his guerrilla campaign Onoda settles into a routine in which the years pass like minutes. In the jungles of Lubang Island, Onoda comes into some kind of contact with eternity. One is tempted to call this contact purgatorial, but Onoda is neither purged nor purified by his experience. Neither does this timelessness offer the beatific vision or even an experience of hell—if it had, Onoda might have surrendered in 1950 like his most weak-willed soldier. Instead, this eternity is an impersonal, indifferent one of duty lovelessly and unimaginatively fulfilled, forever.

I’ve seen The Twilight World accused of making a hero out of Onoda or of reinforcing a preexisting impression of Onoda as a heroic romantic holdout—an absurd accusation. As with many of Herzog’s other subjects, whether the self-deluded Timothy Treadwell or the innocent Zishe Breitbart, Herzog relates this story out of pure interest. Herzog, laudably, wants to understand. That he presents Onoda sympathetically does not mean that he condones his actions. If anything, the intensity with which Herzog tries to evoke Onoda’s three decades in the jungle is an invitation to pity and reflection. That’s certainly how I received it.

I’ve also read reviewers who fault Herzog for either downplaying or refusing to acknowledge Onoda’s violence against the Filipinos of Lubang Island. Onoda and his men’s depredations have quite justifiably received more attention in the last few years, notably in this spring’s MHQ cover story, rather provocatively if misleadingly titled “Hiroo Onoda: Soldier or Serial Killer?”

But Herzog does acknowledge this side of Onoda’s story. An early incident in which Onoda and his men attack villagers and kill and butcher one of their precious water buffalo is especially vivid. By the end, Onoda is walking into villages and firing randomly in the air, just to remind them he’s around. None of this is presented as heroic or even necessary. When Filipino troops try to ambush and kill Onoda and his men, the reader understands why.

Perhaps all of this is why Herzog begins his novel with a curious—but quintessentially Herzog-esque—author’s note:

Most details are factually correct; some are not. What was important to the author was something other than accuracy, some essence he thought he glimpsed when he encountered the protagonist of this story.

Seen in this light, and not forgetting that The Twilight World is a work of fiction—based on a true story—Hiroo Onoda’s bleak years in lonely touch with eternity are a fitting subject for a filmmaker who has spent his career teasing the mythic out of the real. The Twilight World is one of the most interesting and most involving books I’ve read this year, a testament not only to the strength of the dark and ironic story it tells but to the skill and clear-eyed compassion of its storyteller.

Mission: Impossible - Dead Reckoning, Part One

Tom Cruise enjoys the clear air of the Austrian Alps in Mission: Impossible—Dead Reckoning, Part One

It’s interesting, so shortly after having reviewed Indiana Jones and the Dial of Destiny, to see another movie that has so many similarities—a dangerously powerful object sought by multiple nefarious parties, action in Middle Eastern and Mediterranean locations, car chases in comically small vehicles, even a fight atop a speeding train—but that does everything so much better. The baton was clearly passed some time ago. But I reflect on Mission: Impossible—Dead Reckoning, Part One with a twinge of melancholy. The series seems set to end with Part Two.

Well, sufficient unto the day is the evil thereof. Today, let me briefly recommend the newest Mission: Impossible.

This double-barreled story begins under the ice of the Bering Sea aboard a Russian nuclear sub. The sub is lost in a spectacular and catastrophic failure owing, apparently, to computer malfunction. Or is it something more sinister?

The two keys carried by the sub’s captain and first officer, both of which are required to make a crucial system aboard the submarine work, reemerge separately on the black market. When the US government detects that one of the keys is about to come up for sale, the IMF assigns Ethan Hunt to track it down and acquire it. Ethan and his old comrades Benji Dunn and Luther Stickell fly to Abu Dhabi with their usual stock of masks, gadgets, and guts and lay plans to intercept both seller and buyer.

It’s in Abu Dhabi that the plot grows more complicated through two outside interventions. First, a professional thief, Grace, lifts the key from the seller and Ethan is forced into a confrontation and uneasy collaboration with her. Second, a figure from Ethan’s past, Gabriel, appears. His purpose is unknown, but he is able to cloud his presence on all security monitoring in real time, even disappearing from Ethan’s augmented reality glasses, and his arrival coincides with Benji’s discovery of a nuclear bomb. The bomb requires vocal answers to riddles and personal questions to be disarmed.

The two keys, it turns out, relate to an AI referred to as “the Entity.” Created for sabotage and espionage and apparently responsible for the sinking of the Russian sub, the Entity is a protean self-learning program that security experts believe has become sentient. It has hacked and assimilated data from all major governments and can predict and react to seemingly all possible contingencies. Everything used against it makes it smarter. But it has one weakness: its source code. Whoever possesses the two keys Ethan is chasing will have the ability to stop the Entity—or use it.

This makes the keys more than a MacGuffin, like Mission: Impossible III’s “rabbit’s foot.” The keys, and the Entity, are the film’s One Ring. And as becomes clear over the course of the film, the sick desire for the ring and the mastery it offers infects everyone. Like Boromir, all the parties angling to acquire the keys and control the Entity want to use it. Only Ethan is determined to destroy it.

This gives a layer of thematic depth to all the action and stunts that I found surprisingly thought-provoking, not least because the film was in the works long before ChatGPT and the other Nazgûl of AI arrived. It also makes the film genuinely eerie, as the Entity can use facial recognition, create deepfakes, interfere with its perceived enemies’ electronics, and generate real people’s voices for its own purposes. (In an amusing subplot, the US government is forced to go analog, breaking out derelict pre-internet satellites, banks of old-fashioned cathode ray tube computer monitors, and armies of interns on typewriters to make hard copies of all their precious intel.) Ethan, Luther, Benji, Grace, and Ilsa Faust eventually realize that they can trust nothing that is not happening in the flesh, before their own eyes. This paranoia is a wonderful new complication in a series that has already excelled at the expected espionage twists and turns.

And Mission: Impossible—Dead Reckoning, Part One excels at everything else you’d expect it to excel at. The action is well-executed and exciting. An extended car chase through Rome keeps finding ways to reinvent itself, staying clever and exciting throughout. The dirtbike BASE jump featured in all the trailers was not a letdown. In the IMAX screening where I saw the film, the people around me lifted out of their seats and held their breath in the palpable hush following the leap over the edge. And the train chase aboard the Orient Express that Ethan is trying to reach is a brilliantly executed series of action and suspense sequences.

The plot, of course, is not resolved and the Entity and Gabriel are not defeated by the end of Dead Reckoning, Part One. But this first part tells a coherent and satisfying story that is intricately plotted and grows steadily more complicated without becoming impossible to follow. It is also surprisingly moving.

I’ve left a lot out, but Mission: Impossible—Dead Reckoning, Part One is a whole lot of movie in the best way possible. I could quibble with a few things. The first twenty minutes or so is unevenly paced and heavy on exposition, and the film uses more obvious CGI than some previous installments. But these are just that: quibbles. With its complex and unusually rich plot, its genuine paranoia and ethical complications, its beautiful locations, exciting action, loathsome villains, and stable of beloved characters, this is a great new entry in the series and my favorite film of the summer so far.

Now, for Dead Reckoning, Part Two. As one of my friends said as we left the screening, “I’m going to need that to come out sooner.”

Cormac McCarthy and the power of the particular

As I was closing out unused browser tabs yesterday I was glad to rediscover this in memoriam post on Cormac McCarthy by Declan Leary at The American Conservative, written after McCarthy’s death last month. It’s a good piece, making some insightful comments on McCarthy’s style, his philosophy, and his intentional resistance to easy didactic interpretation—as well as having some fun mocking the insufferable ego and faux intellectualism of wannabe auteur James Franco—but I especially appreciated it for two related points Leary makes near the end.

First, Leary responds to a 1992 New York Times profile of McCarthy in which the interviewer lazily turns the desolate setting of Blood Meridian into a mere metaphor:

Yet [Richard] Woodward, like later students at McCarthy’s feet, is bothered by the master’s resistance to interpretation. He slips up at one point, writing that McCarthy “has made dozens of similar scouting forays to Texas, New Mexico, Arizona and across the Rio Grande into Chihuahua, Sonora and Coahuila. The vast blankness of the Southwest desert served as a metaphor for the nihilistic violence in his last novel, ‘Blood Meridian,’ published in 1985.”

This is one way of putting it, but it is not a very good one. Blood Meridian is set against the deserts of Mexico and the American West because it happened there; it cannot have happened anywhere else. If there is symbolism in the landscape, it is God’s, not Cormac McCarthy’s.

Second, “a related mistake,” Leary cites an obituary that makes a common but fundamentally mistaken assumption about how and why good fiction lasts:

Graeme Wood, eulogizing McCarthy in the Atlantic, makes a related mistake. He writes that “the McCarthy voice was timeless—not in the pedestrian sense of ‘will be read for generations,’ but in the unsettling, cosmological sense that one could not tell whether the voice was ancient or from the distant future.”

The interpretation is understandable, but the more one reads McCarthy the more firmly located his work feels. It is rock-solid in time and place and bound by historical force, even as it indulges the same fantasy and mystery of other Southern gothic greats. It is a failure either of imagination or of piety to assume that myth and Americana cannot coexist.

What Leary is driving at in his critiques of these incomplete appreciations is particularity. His assertion that Blood Meridian “cannot have happened anywhere else” is spot-on. To shift its action in time or place would be to change it utterly and almost certainly to weaken it. Likewise with McCarthy’s other books, all of which are closely observed and deliberately specific in every detail. And yet Blood Meridian—and No Country for Old Men and All the Pretty Horses and, most spectacularly, The Road—speak to us wherever we are and will continue to do so.

This is the paradox of universality or “timelessness,” as Wood puts it in the passage Leary quotes above: If you want to say something genuinely universal, you have to get specific.

The works of literature that speak most universally, that have the greatest longevity and staying power and that readers come back to over and over, are not those with the most broadly applicable free-floating themes or messages, but those most firmly rooted in a specific time and place, among specific people and their specific mores and customs. What could be more seemingly parochial than Jane Austen’s matchmaking and county balls? Or Dante’s score-settling over the vicissitudes of one town’s politics? Or Shakespeare’s dramas of royal intrigue? Or Homer’s war stories? Or Moby-Dick’s painstaking account of every facet of whaling? And yet what books have dug deeper into human nature, heroism, home, love, sin, or salvation?

It took me a long time to grasp this (and it is largely thanks to Jane Austen, Dante, and Homer that I did). But how many young writers striving for greatness through theme or message or—worst of all—political enlightenment miss out on permanence because they don’t first humble themselves and attend to particulars? Know thyself is not only a philosophical necessity.

I wrote about the particularity of good fiction—and the present day’s lazy resort to “thinking in categories”—in another context last year. That post was inspired by an observation about the “antagonistic relationship” between politics, “the great generalizer,” and fiction, “the great particularizer.” And of course particularity of the kind McCarthy evinced contributes to the “vivid and continuous fictive dream.”

Swuster sunu

Peter Dennis’s depiction of the Battle of Maldon for Osprey’s Combat: Viking Warrior vs Anglo-Saxon Warrior

One of the noteworthy aspects of The Battle of Maldon is the large number of named individuals, presented as real people, included in what we have left of the poem. Byrhtnoth, the Ealdorman of Essex, is the central figure in the poem’s action and themes, but there are many others like Æthelric and Offa, members of Byrhtnoth’s retinue; or Dunnere, “a simple ceorl” or non-noble freeman; or the brothers Oswold and Ealdwold. Many, like the latter, are given just enough biographical information to identify them to an audience presumably familiar with the event and the men who, overwhelmingly, died in it.

And the poet is careful to distinguish men with shared names, noting the presence of both a Wulfmær and a “Wulfmær the young” and, most damningly, Godric Æthelgar’s son who died fighting as opposed to “that Godric that forsook the field.” Others offer pure tantalization: Æschferð, Ecglaf’s son, from Northumbria, who “showed no faint heart,” is a “hostage” (gysel) of Byrhtnoth’s household. Who is he? Why is he a hostage? What’s the Northumbria connection? And is it a coincidence that his name is so similar to Unferð Ecglaf’s son? We’ll probably never know—the poem is concerned only with recording his bravery.

In his notes on Maldon, Tolkien writes this of the first Wulfmær, Byrhtnoth’s nephew specifically by his sister (his swuster sunu): “The relationship was one of special import in Germanic lines and the especially close tie existing between uncle and sister’s son is motive in several legends (notably Finnsburg).”

Tolkien then makes a broader point about the relationship of stories like this to actual historical events and their treatment by modern critics and historians:

Things do not become legendary unless they are common and poignant human experiences first.

There is however no reason to suspect that Wulfmær was not actually swuster sunu of Byrhtnoth, and this is a good caution to that kind of criticism which would dismiss as falsification actual events and situations that happen to be [the] same as familiar motives of legends. Things do not become legendary unless they are common and poignant human experiences first. The traditional affection of the relationship (whether or not it be a last survival of matriarchy or not!) may however have been the cause of the poet’s special mention.

I have complained before about the tendency of a certain kind of historian to doubt or dismiss any story that has even the rudiments of a literary shape. To such historians, that shape represents the intrusion of fiction into reality, or perhaps the work of some shadowy figure reshaping raw material to suit a literary design. At worst, it represents deliberate falsehood with a political purpose—that is, propaganda.

Tolkien here correctly inverts that suspicion. The kind of historian or critic he describes has gotten the relationship of legend to reality backwards, and, more specifically in the case in question, they have ignored many other possible explanations for the inclusion of details like Wulfmær’s kinship with Byrhtnoth—not least that it might actually be true.

Later in his notes, Tolkien writes this of Byrhtwold, the old retainer (eald geneat) who gives the famous final speech of the poem, in which he declares his intention to die avenging Byrhtnoth: “We have here another instance of old traditional situation and actual occurrence coinciding. We need not doubt that Byrhtwold was an eald geneat, and that he actually spoke memorable words not unlike the remarkable ones here enshrined. Yet it was traditional for the eald geneat to be relentless and dauntless and ‘speak winged words.’”

Historians and critics would do better to accept that the literary and the actual “coincid[e]” a lot more often than they suspect.

I’ve previously written about a related problem, the tendency of suspicious historians, having seen through everything that strikes them as literary falsehood, to make history boring, here. (Cf CS Lewis on “seeing through” things.) For my thoughts on describing ancient and medieval works as “propaganda,” see here.