On the fine art of insinuation

Eric Idle and Terry Jones in Monty Python's "Wink Wink, Nudge Nudge" (1969)

Earlier this week, in my notes on a recent historical controversy, I mentioned some of the “dark insinuations” that were one part of the furor. That particular aspect of the controversy wasn’t the point of my post, but I did want to revisit it in general terms—especially since I was working on a post on the same topic last year, a post I eventually abandoned.

Since facts and sound historical interpretation prove dangerously prone to turn against them, conspiracists rely heavily upon insinuation—the “you know what I mean,” “wink-wink, nudge-nudge” implications of whatever factual information they do put forward. This approach allows them to present information in what seems to be a purely factual way, but with a tone that implies the conclusion they want you to reach. It’s a technique characteristic of what David Hackett Fischer called “the furtive fallacy” in historical research.

The fine art of insinuation crossed my mind again just before the interview that prompted my previous post, when I watched a recent short video on the Cash-Landrum incident, a genuinely weird and interesting—and genuinely unexplained—UFO sighting in Texas in 1980. Briefly, during a late night drive on a remote East Texas highway, two ladies and a child spotted a glowing, white-hot, fire-spewing object that hovered in their path for some time before drifting away, apparently escorted by US Army helicopters. The ladies subsequently developed severe illnesses consistent with radiation poisoning.

It’s a decent enough video, so please do watch it, but the YouTuber behind it provides a few textbook examples of insinuation. After describing the ladies’ attempts to get compensation from the military and the government following the incident, the narrator relates the first formal third-party research into the incident this way:

[Aerospace engineer and MUFON co-founder John] Schuessler agreed to investigate the case and was taken by Betty and Vickie to the site where they claimed it had happened. When they arrived, they found a large circular burn mark on the road where the UFO had supposedly been levitating, cementing more credibility to their claims. However, several weeks later, when Schuessler returned to the spot, the road had been dug up and replaced, with witnesses claiming that unmarked trucks came by and took the burnt tarmac away.

This is already a UFO story, and now we have unmarked trucks destroying evidence! The story autopopulates in your mind, doesn’t it? But this part of the story, as presented for maximum insinuation, is vague—which points toward the best tool for combating the use of insinuation: specific questions. For instance:

  • What’s so unusual about a damaged road being repaired?

  • Did Texas DOT vehicles have uniform paint schemes or other markings in 1980?

  • Who were the witnesses who saw these unmarked trucks?

And, granting for a moment the conclusion that the narrator is trying to imply:

  • If some powerful agency was trying to cover up what had happened, why did it take “several weeks”? And why did they allow witnesses to watch them?

Insinuation relies on context, especially our preconceptions and prejudices, to do its work. It’s a mode of storytelling that invites listeners to complete the story themselves by automatically filling in details. Questioning the vague prompts and implications that start this process can bring the discussion back down to earth and the basic level of fact and source. And, perhaps more importantly in this kind of discussion, specific questions can force people to say what they mean rather than letting them get away with insinuation and implication.

A 44-year-old UFO sighting offers a pretty harmless test case for interrogating this technique, but pressing for clearly stated details might have proven more helpful to everyone—as well as more revealing—during that interview last week.

The Cash-Landrum incident was memorably dramatized in a 1991 episode of “Unsolved Mysteries,” which you can watch here. It’s worth noting that the ladies involved always assumed the UFO was purely terrestrial and that they were the victims of some kind of government or military test gone wrong.

Notes on the Churchill kerfuffle

V for Victory? Or accidentally signaling the best response to his critics?

Speaking of The Bridge on the River Kwai, in a revealing moment early in the film the antagonist, Col Saito, speaking to his British counterpart about prisoners who had been shot in an escape attempt, shows pride in his enemy’s behavior: “For a brief moment between escape and death… they were soldiers again.”

Well, last week, for a brief moment between TikTok and college football, people cared about history again.

Background and backlash

Briefly, last week podcaster Darryl Cooper of Martyr Made appeared in an interview with Tucker Carlson on Twitter. Carlson feted Cooper as “the best and most honest popular historian” in America, fulsome hyperbole that did Cooper no favors once the discussion started and Cooper ventured his unconventional opinions about World War II. These resulted in immediate controversy.

While early reporting on the interview floated a number of possible points of outrage, including wobbly suggestions of Holocaust denial and—more accurately and damningly—Cooper’s dark insinuations about the Zionists who had financial connections to Winston Churchill, the controversy eventually settled around Cooper’s examination of Churchill’s decision-making and leadership, and not least his description of Churchill as a “psychopath” and “the chief villain” of the war. Churchill’s crimes? Needlessly antagonizing Hitler before the war, bullheadedly refusing peace offers during the war, and pushing for things like the strategic bombing of German cities. Cooper even repeats the meme-level cheap shot that Churchill was “a drunk.” (He wasn’t.)

Journalistic outrage-baiting ensued, all conducted in the breathless tone with which I assume Puritans reported the discovery of witches. I found it pretty rich that the same media that justified and celebrated anti-Churchill protests and vandalism in 2020 used a podcaster’s profanation of the same man for clicks. Well, it worked. I couldn’t escape this story as it unfolded.

I don’t intend to wade into the details. Churchill biographer Andrew Roberts, to whom I have referred many times here on the blog, handled those with aplomb in a blistering essay for the Washington Free Beacon. Read that, then follow it up with Roberts’s appearance on the School of War podcast in an episode that dropped just last night. The past week has produced many more apologias for Churchill and critiques of Cooper, but Roberts has done the work and is worth listening to on any subject he’s researched.

For his part, Cooper posted a characteristically discursive response on his Substack, which you can read here.

Hyperreality and post-literate history

What I found interesting and, at first, a little baffling about the controversy was the… prosaicness of some of Cooper’s views. Churchill as warmonger, Churchill as manipulator of America, Churchill as the real instigator of the bloodiest war in history, even Churchill as drunk—these are all pretty pedestrian contrarian takes. Pat Buchanan published a book laying out many of these arguments sixteen years ago, and he was drawing on a current of anti-Churchill interpretation that was already decades old. (Roberts does a good job explaining some of the historiography of this controversy on School of War.)

The fact that such perspectives are and ought to be old news to anyone who has studied Churchill or the Second World War even a little bit suggests that most people—journalists, media personalities, podcasters, and the general public—simply haven’t.

For most people, Churchill is a recognizable character with no depth in a simplistic good-and-evil tale rather than a complex real person living through uncertain and dangerous times. This reduction of the man to the icon means that an attack of Cooper’s kind will generate either outrage at the profanation of a sacred image (when, again, we should have heard all this before) or the frisson of the conspiracy theorist discovering forbidden (false) knowledge. Beyond Cooper’s bad history, the fact that this interview generated the controversy that it did is revealing.

It’s this broader context that I’m most interested in, and two essays in particular offer a lot of food for thought in response.

First, writing at Compact, Matthew Walther sees the Carlson-Cooper interview and the resulting controversy as symptoms of a “post-literate history,” there being an “epistemic gulf between the current consensus . . . of practicing historians on any given subject and the attitudes of the ordinary person of general education.” The appetite of the public for charismatic purveyors of dark, hidden truths—usually old, debunked ideas that can still be used to surprise the ignorant—is part of the problem, but historians and educators generally share the blame. Take a few minutes and read the whole essay.

Second, Sebastian Milbank, one of my favorite writers at The Critic, published an essay this morning that only glances across the Cooper controversy as an example of our present absorption into “hyperreality,” an imaginary world shaped by social media that, through information overload and partisan polarization, turns real people and things into symbols and erodes discernment, judgement, and wisdom. Simplification, detachment from reality, the reduction of knowledge and rival truth claims to mere content, and the “openness to everything” of online hyperreality create an environment in which false views appear more inviting, and not only for the ignorant or wicked:

Anyone with a modicum of knowledge will be able to spot the huge gaps in Cooper’s argument here. But what is more interesting is how he came to embrace such a grotesque viewpoint. Cooper isn’t stupid, or wicked, or even ill-informed in a conventional sense. Instead, we could say that he is “overinformed”. He is the product of hyperreality, supersaturated with information to the point that his analytical faculties and sense of reality breaks down. One gets a sense of this in the interview alone, where he describes reading, not systematically, but omnivorously, consuming over eighty books for his podcast on Israel/Palestine, and not being able to recall all the titles.

Milbank’s essay is longer and richer than the discussion surrounding Cooper—and Milbank includes a favorite passage about madness from Chesterton—so be sure to read the whole thing. For an even more dramatic parallel case, including another pertinent Chesterton quotation, see Jonathon Van Maren’s essay on Candace Owens at the European Conservative here.

Caveats and crankery

Churchill lived a long time and involved himself in a lot of things, not always successfully. Far from the “correct” view being the flawless and burnished bronze lion of British defiance in the face of tyranny, Churchill is open to legitimate lines of critique that historians still debate. Irish and Australian critics, for dramatically different reasons, sometimes take a more negative view of Churchill, and he is the object of an entire subfield of anti-imperialist Indian criticism. But all of this persists despite the role he played in World War II, and all of these grievances and arguments are subject to evaluation according to the evidence.

Which is the first place Cooper fails. And when Cooper asserts that the reactions to his interview are evidence that he’s correct, he fails even more seriously by falling into a trap I’ve written about here before: crankery.

Cooper is not, as Carlson tried to puff him, an historian. I’ve tried to avoid pointing this out, but others, like Niall Ferguson, have been much less polite about it. Cooper is, however, as Walther and Milbank’s essays suggest, a gifted autodidact. But the problem for autodidacts in any field is that their enthusiasm is not a substitute for the basic intellectual formation provided by formal, guided study under those who have already mastered the subject. There is a moral dimension to this as well—enthusiasm and omnivorous reading are no substitutes for sound historical judgement or simple human wisdom.

And so the autodidact blunders into plausible but false theories that, owing to gaps they aren’t even aware of, become their entire frame of reference. “Everything becomes reduced down to a single question or thesis,” as Milbank puts it. Their worldview is complete, but too small, according to Chesterton. And if, when questioned on their interpretation, they double down, attack their questioners, or begin to distort their evidence, they risk becoming a crank. Once they begin referring to “them” and an undefined “establishment” with knowing contempt, they’re already there.

This is, more than anything, a good example of why education in history and the humanities more broadly still matters.

Recommended reading

Churchill’s memory places him among those very few men—like Lincoln and Napoleon—who inspire a continuous flow of books. The following are those I most often recommend:

  • Churchill, by Sir John Keegan—An excellent and approachable short biography from a great military historian.

  • The Duel: The Eighty-Day Struggle Between Churchill and Hitler, by John Lukacs—A good look at a specific episode of Churchill’s life, from his appointment as Prime Minister in May 1940 into the summer, with Hitler’s activities at the same time told in quite revealing parallel.

  • Winston’s War: Churchill 1940-1945, by Max Hastings—An excellent study of Churchill’s time as Prime Minister, with a lot of attention devoted to his frustrating relationship with the United States. A good antidote to at least one of Cooper’s claims.

  • Churchill: Walking with Destiny, by Andrew Roberts—The big one, a massive and deeply researched comprehensive biography by an expert who, as I said above, has done the work. It shows.

  • Moral Combat: Good and Evil in World War II, by Michael Burleigh—If you’re interested in the moral and ethical questions raised by the war, this is a more serious and better researched consideration of them than you’ll get from the Carlson interview.

I’d recommend any one of these for a more detailed and nuanced grasp of a great man than any podcast or social media interview can possibly provide.

Summer reading 2024

Though I’m thankful to say that, compared to where I was at the end of the spring, I’ve wrapped up the summer and begun the fall semester feeling refreshed and rejuvenated, my reading has still been unusually fiction-heavy. That’s not necessarily a bad thing—all work and no play, after all—but I do mean to restore some balance. I look forward to it.

“Summer,” for the purposes of this post, runs from approximately the first week or two of summer classes to today, Labor Day. Since there’s a lot more fiction and non-fiction this time around, I mean to lead off with the smattering of non-fiction reading that I enjoyed. And so, my favorites, presented as usual in no particular order:

Favorite non-fiction

While I read only a handful of non-fiction books of any kind—history, biography, philosophy, theology, you name it—almost all of them proved worthwhile. They also make an unusually idiosyncratic list, even for me:

An Illustrated History of UFOs, by Adam Allsuch Boardman—A fun, wonderfully illustrated picture book about UFOs and all sorts of UFO-adjacent phenomena. Not deep by any means and only nominally skeptical, this book is surprisingly thorough, with infographic-style tables of dozens of different purported kinds of craft, aliens both cinematic and purportedly real, and brief accounts of some legendary incidents from Kenneth Arnold and the Roswell crash to Betty and Barney Hill’s abduction, Whitley Strieber’s interdimensional communion, and the USS Nimitz’s “flying Tic Tac.” If you grew up terrified to watch “Unsolved Mysteries” but also wouldn’t think of missing it, this should be a fun read. See here for a few sample illustrations.

Histories and Fallacies: Problems Faced in the Writing of History, by Carl Trueman—A concise, sensible, and welcoming guide to some of the pitfalls of historical research and writing. Trueman is especially good on the dangers of historical theories, which naturally incline the historian to distort his evidence the better to fit the theory. There are more thorough or exhaustive books on this topic but this is the one I’d first recommend to a beginning student of history. I mentioned it prominently in my essay on historiography at Miller’s Book Review back in July.

Three Philosophies of Life, by Peter Kreeft—A short, poetic meditation on three Old Testament wisdom books: Ecclesiastes and Job, two of my favorite books of the Bible, and Song of Songs, a book that has puzzled me for years. Kreeft presents them as clear-eyed dramatizations of three worldviews, the first two of which correctly observe that life is vain and full of suffering, with the last supplying the missing element that adds meaning to vanity and redemption to pain: God’s love. An insightful and encouraging short book.

Homer and His Iliad, by Robin Lane Fox—Who was Homer and what can we know about him? Was he even a real person? And what’s so great about his greatest poem? This is a wide-ranging, deeply researched, and well-timed expert examination of the Iliad and its author, thoroughly and convincingly argued. Perhaps the best thing I can say about Lane Fox’s book is that it made me fervently want to reread the Iliad. My favorite non-fiction read of the summer. Full-length review forthcoming.

Always Going On, by Tim Powers—A short autobiographical essay with personal stories, reminiscences of Philip K Dick, nuts and bolts writing advice, aesthetic observations, and philosophical meditations drawing from Chesterton and CS Lewis, among others. An inspiring short read.

The Decline of the Novel, by Joseph Bottum—An excellent set of literary essays on the history of the novel in the English-speaking world; case studies of Sir Walter Scott, Dickens, Thomas Mann (German outlier), and Tom Wolfe; and a closing meditation on popular genre fiction—all of which is only marginally affected by a compellingly argued but unconvincing thesis. I can’t emphasize enough how good those four case study chapters are, though, especially the one on Dickens. Full dual review alongside Joseph Epstein’s The Novel, Who Needs It? on the blog here.

Favorite fiction

Again, this was a fiction-heavy summer in an already fiction-heavy year, which was great for me while reading but should have made picking favorites from the long list of reads more difficult. Fortunately there were clear standouts, any of which I’d recommend:

On Stranger Tides, by Tim Powers—An uncommonly rich historical fantasy set in the early years of the 18th century in the Caribbean, where the unseen forces behind the New World are still strong enough to be felt and, with the right methods, used by new arrivals from Europe. Chief among these is Jack Chandagnac, a former traveling puppeteer who has learned that a dishonest uncle has cheated him and his late father of a Jamaican fortune. After a run-in with seemingly invincible pirates, Jack is inducted into their arcane world as “Jack Shandy” and slowly begins to master their arts—and not just knot-tying and seamanship. A beautiful young woman menaced by her own deranged father, a trip to Florida and the genuinely otherworldly Fountain of Youth, ships crewed by the undead, and Blackbeard himself further complicate the story. I thoroughly enjoyed On Stranger Tides and was recommending it before I was even finished. That I read it during a trip to St Augustine, where there are plenty of little mementos of Spanish exploration and piracy, only enriched my reading.

Journey into Fear, by Eric Ambler—An unassuming commercial traveler boards a ship in Istanbul and finds himself the target of a German assassination plot. Who is trying to kill him, why, and will he be able to make it to port quickly enough to survive? As much as I loved The Mask of Dimitrios back in the spring, Journey into Fear is leaner, tighter, and more suspenseful. A wonderfully thrilling read.

The Kraken Wakes, by John Wyndham—Another brilliant classic by Wyndham, an alien invasion novel in which we never meet or communicate with the aliens and the human race always feels a step behind. Genuinely thrilling and frightening. Full review on the blog here.

Mexico Set, by Len Deighton—The second installment of Deighton’s Game, Set, Match trilogy after Berlin Game, this novel follows British agent Bernard Samson through an especially tricky mission to Mexico City and back as he tries to “enroll” a disgruntled KGB agent with ties to an important British defector. Along with some good globetrotting—including scenes in Mexico reminiscent of the world of Charles Portis’s Gringos—and a lot of tradecraft and intra-agency squabbling and backstabbing, I especially appreciated the more character-driven elements of this novel, which help make it not only a sequel but a fresh expansion of the story begun in Berlin Game. Looking forward to London Match, which I intend to get to before the end of the year.

LaBrava, by Elmore Leonard—A former Secret Service agent turned Miami photographer finds himself entangled in an elaborate blackmail scheme. The mark: a former Hollywood femme fatale, coincidentally his childhood favorite actress. The blackmailers: a Cuban exile, a Florida cracker the size of a linebacker, and an unknown puppet master. Complications: galore. Smoothly written and intricately plotted, with a vividly evoked big-city setting and some nice surprises in the second half of the book, this is almost the Platonic ideal of a Leonard crime novel, and I’d rank only Rum Punch and the incomparable Freaky Deaky above it.

Night at the Crossroads, by Georges Simenon, trans. Linda Coverdale—Two cars are stolen from French country houses at a lonely crossroads and are returned to the wrong garages. When found, one has a dead diamond smuggler behind the wheel. It’s up to an increasingly frustrated Inspector Maigret to sort through the lies and confusion and figure out what happened. An intricate short mystery that I don’t want to say much more about, as I hesitate to give anything at all away.

Epitaph for a Spy, by Eric Ambler—A stateless man spending some hard-earned cash at a Riviera hotel is, through a simple mix-up, arrested as a German spy. When the French police realize his predicament and his need to fast-track his appeal for citizenship, they decide to use him to flush out the real spy. Well-plotted, suspenseful, and surprising, with a great cast of characters. My favorite Ambler thriller so far this year. There’s also an excellent two-hour BBC radio play based on the book, which Sarah and I enjoyed on our drive back from St Augustine.

The Light of Day, by Eric Ambler—One more by Ambler, which I also enjoyed. Arthur Simpson, a half-English, half-Egyptian small-time hood involved in everything from conducting tours without a license to smuggling pornography, is forced to help a band of suspicious characters drive a car across the Turkish border. He’s caught—and forced to help Turkish military intelligence find out what the group is up to. Published later in Ambler’s long career, The Light of Day is somewhat edgier, but also funnier. It’s more of a romp than a heavy spy thriller, with wonderfully sly narration by Arthur himself. I greatly enjoyed it. Do yourself a favor, though, and read it without looking at any summaries, even the one on the back of the book. My Penguin Modern Classics paperback gave away a major plot revelation. I still enjoyed it, but have to wonder how much more I might have with that important surprise left concealed.

Runner-up:

Swamp Story, by Dave Barry—A wacky crime novel involving brothers who own a worthless Everglades bait shop, potheads trying to make their break into the world of reality TV, a disgraced Miami Herald reporter turned birthday party entertainer, a crooked businessman, Russian mobsters, gold-hunting ex-cons, and a put-upon new mom who finds herself trying to survive all of them. It’s fun and diverting but not especially funny, something some of Barry’s other crime thrillers have managed to be even while going darker. I still enjoyed reading it.

John Buchan June

The third annual John Buchan June included five novels, a short literary biography of one of Buchan’s heroes, and Buchan’s posthumously published memoirs. Here’s a complete list with links to my reviews here on the blog:

Of this selection, my favorite was almost certainly The Free Fishers, a vividly imagined and perfectly paced historical adventure with a nicely drawn and surprising cast of characters. “Rollicking” is the word Ursula Buchan uses to describe it in her biography. An apt word for a wonderfully fun book. A runner-up would be Salute to Adventurers, an earlier, Robert Louis Stevenson-style tale set in colonial Virginia.

Rereads

I revisited fewer old favorites this season than previously, but all of those I did revisit were good. As usual, audiobook “reads” are marked with an asterisk.

Conclusion

I’m looking forward to more good reading this fall, including working more heavy non-fiction back into my lineup as I settle into the semester. I’m also already enjoying a couple of classic rereads: Pride and Prejudice, which I’ve been reading out loud to my wife before bed since early June, and Shadi Bartsch’s new translation of the Aeneid. And, of course, there will be fiction, and plenty of it.

I hope my summer reading provides something good for y’all to read this fall. As always, thanks for reading!

Introducing historiography at Miller’s Book Review

Earlier this month I was humbled to be asked to contribute to Miller’s Book Review, an outstanding and wide-ranging Substack run by Joel Miller. Joel asked that I put together an essay on the nuts and bolts aspects of historiography, one of my favorite subjects and a regular topic on this blog. After a few abortive attempts to summarize everything (“It is a great mistake to include everything,” the late John Lukacs once said, accurately) I turned in an essay organized around a few of the books I like to recommend to students who are curious about how history works as a discipline.

I’m pleased to say that the essay is now available! Read the whole thing here. Expect some Herodotus, some basic research questions, some philosophy of history, some theory, some deadball era baseball, a warning or two, one salvaged reputation, a little dunking on Ridley Scott, a whole lot of Hitler, and several books I heartily recommend.

And be sure to subscribe to Joel’s reviews. I’ve added many more titles to my to-read list thanks to him. I’m grateful to him for the invitation to write—and to learn a little about Substack at last—and hope that y’all will enjoy the finished product! Thanks for reading.

Maturity and evolution in military history

A friend with a deep interest in Celtic and specifically Welsh history recently shared this passage from a popular book on ancient Celtic warfare, in which the author tries to see through legendary material relating to Irish warbands:

If the Fianna of the Irish epics are actually celebrated in epic verse as a heroic archetype, an in-depth and disillusioned examination can recognize their historical characters as unruly elements and promoters of endemic political unrest, taking part in conflict only for the sake of conflict and, due to the absence of alternative adversaries, maintaining an obsolete, un-evolving developmental phase of warfare.

Elsewhere in the same book the author describes Celtic warfare in the British Isles as not “mature” compared to the warfare of their Continental cousins. My friend was puzzled by this passage (and wryly noted that it “sounds like it was written by a Roman colonial governor”) and its suggestion that geographic isolation left British Celtic warfare moribund and pointless.

That language of maturity and evolution and development—even the simple noun “phase”—is a giveaway. There is a whiggish approach to military history that views warfare as progressing linearly, from the primitive, ritualized fighting of the tribe to the pragmatic modern professional army in the employ of a nation-state pursuing rational material objectives. As Jeremy Black puts it in his introduction to The Age of Total War: 1860-1945, which I serendipitously picked up just after seeing my friend’s posts on this topic, this “teleological” approach describes history as “mov[ing] in a clear direction, with developments from one period to another, and particular characteristics in each. This approach is an aspect of modernization theory.”

I’ve written on this topic before, and with reference to another book by Black, coincidentally, but what I didn’t get into as much in that post were the dangers of this view of linear historical progress.

There are two big problems with this approach. The first is that it encourages an assessment of historical subjects as good or bad, better or worse, primitive or modern, depending on how closely they approximate what a modern person recognizes as warfare. A culture’s warfare, in this view, is “mature” insofar as it resembles us, the implicitly assumed endpoint. Judgments according to modern standards are sure to follow.* The condemnation of “endemic political unrest” gives away the author’s assumption that “rest,” so to speak, is the norm. Ancient people didn’t see it that way.

The second, related problem is that, with this viewpoint in place, you need not actually understand a given culture and why it would fight the way it did on its own terms. You can simply slot it into place in a linear scheme of technical and/or tactical evolution and ignore its own viewpoint on the subject.

The result, which has been pointed out as far back as Herbert Butterfield’s Whig Interpretation of History, is that you train yourself either to dismiss or simply not to see anything falling outside the thread of development you’ve chosen to follow, and you blind yourself to what’s actually going on with that culture. The search for through lines and resemblances warps the overall view. This is, at base, a form of presentism.

There’s quite a lot of this in the older historiography of Anglo-Saxon warfare. Like the ancient Britons and Irish, the Anglo-Saxons were geographically isolated from related cultures like the Franks for centuries following the Migration Period and continued to fight in recognizably older ways than their cousins. So a common whiggish approach to the story of the Conquest was that the outdated (notice the use of “obsolete” in the quotation we started with) infantry levy of Harold Godwinson was quite naturally defeated by the combined arms of the Normans, who deployed infantry, cavalry, and dedicated archers at Hastings. It’s a step in evolution, you see, the end of a “phase.” It’s easy to detect a faint tone of contempt for the Anglo-Saxons in a lot of those old books.

This is, of course, to ignore the entire history of this culture, its past enemies and conflicts,** and the good reasons they had to develop and use the military institutions and methods that they did. And so a historian can blithely describe a culture’s unique response to the situations it had found itself in as simply stuck in a rut—until the inevitable triumph of something more modern. No further investigation needed.

Not only is this approach presentist, it fosters an incuriosity that is the bane of good history.

* And the modern always gets the benefit of the doubt, which is morally questionable. Tribal warriors fighting for prestige on behalf of their king is “primitive” and bad but a state nuking civilians in the name of democracy is “modern” and therefore good.

** As well as the fact that William the Conqueror’s victory was down more to luck than to battlefield performance.

Credential envy

I’m currently reading Histories and Fallacies: Problems Faced in the Writing of History, by Carl R Trueman, a good introduction to the historiographical traps laid in the way of students of the past.

In his first full chapter, which covers Holocaust denial (“HD” below), Trueman briefly explores a side-topic he calls “the aesthetic fallacy”—the assumption that if something looks scholarly and scientific (by some subjective image of what “scholarly” and “scientific” should look like) it must be. This, Trueman notes, is more a fallacy of the reader of history than the historian, but bad historians often tailor their work and images with this in mind.

Trueman looks specifically at the case of Fred Leuchter, who undertook a chemical study of one gas chamber at Auschwitz and claimed to have found little or no evidence of Zyklon-B residues in the bricks. After picking apart Leuchter’s study, which was methodologically unsound but provided a seemingly scientific talking point for certain audiences, Trueman makes an important side observation:

On close examination, we can easily see that his method is so flawed that it is not really scientific at all, but it has all the appearance of being scientific. He uses all the right words, even down to his claim in the title that he is an engineer. In fact, he is not; he is a designer of execution machines. Indeed, he has been barred from using the title “engineer” with reference to himself because of his lack of formal qualifications. The title gave him weight and plausibility; he presumably hoped that it would provide him with the credibility to have a seat at the table and be taken seriously in discussions. One could say that the scientific form of his writing, or perhaps better (though slightly more pretentiously), the scientific aesthetics of his work gave his arguments credibility. For this reason, I am always suspicious of books that print “PhD” on the cover after the author’s name. Why do they need to do this? The person has written a book, so surely her competence can be judged by the volume’s contents? Perhaps, after all, many books are judged at least somewhat by their covers as well as what is printed on the inside.

The phenomenon Trueman describes here is common across self-published crank literature (just look through the Goodreads giveaways sometime), but a lot of people seem to sense it instinctively. I call it “credential envy.” It has a few iterations:

  • Insisting on a title that is irrelevant to the topic under discussion

  • Claiming a title one is not legitimately entitled to

  • A version of both the former and the latter: insisting on being called doctor for an unearned doctorate

  • Pure fraud

The fundamental quality of credential envy is a craving for legitimacy—or, per Trueman’s “aesthetic fallacy,” the appearance of legitimacy. There’s a defensive, chip-on-the-shoulder aspect to credential envy. People who insist on impressive titles want to preempt criticism through intimidation or grandeur. And this attitude only becomes more apparent when the credentials are false or irrelevant or when they’re being used to mislead, as Leuchter’s appropriation of “engineer” was.

Credentials and qualifications matter enormously. But, like Trueman, I become warier the more someone insists on their credentials and titles. Real expertise is effortlessly confident and worn lightly. Or should be. Perhaps the behavior of some real experts today is part of the reason the broader public increasingly finds it hard to distinguish them from the cranks.

The furtive fallacy

Some years ago I wrote here about “the fallacy of the universal man,” the assumption that all people everywhere are “intellectually and psychologically the same.” The term and definition come from David Hackett Fischer’s 1970 book Historians’ Fallacies: Toward a Logic of Historical Thought. I concluded that post by mentioning “the furtive fallacy.” Here’s Fischer on that error:

The furtive fallacy is the erroneous idea that facts of special significance are dark and dirty things and that history itself is a story of causes mostly insidious and results mostly invidious. It begins with the premise that reality is a sordid, secret thing; and that history happens on the back stairs a little after midnight, or else in a smoke-filled room, or a perfumed boudoir, or an executive penthouse or somewhere in the inner sanctum of the Vatican, or the Kremlin, or the Reich Chancellery, or the Pentagon. It is something more, and something other than merely a conspiracy theory, though that form of causal reduction is a common component. The furtive fallacy is a more profound error, which combines a naïve epistemological assumption that things are never what they seem to be, with a firm attachment to the doctrine of original sin.

There is a little of the furtive fallacy in us all . . . And when there is much of it, we are apt to summon a psychiatrist. In an extreme form, the furtive fallacy is not merely an intellectual error but a mental illness which is commonly called paranoia.

History afflicted with the furtive fallacy is warped by the endless search for the ulterior motive and the hidden hand.

This is not a new problem. Fischer names as one of the earliest practitioners Algie Simons, a socialist reporter who was possibly the first to spin the Constitution as a conspiracy of the wealthy to exploit and disenfranchise.

But furtive history’s greatest and most influential example is certainly Charles Beard, whom Fischer investigates in some detail. Beard made his name by imputing purely economic motives to the framers of the Constitution (“Beard . . . several times insisted that his thesis was misunderstood. But in fact it was misconceived.”) and ended his career with a book arguing a thesis popular among the latter-day furtive: that FDR had deliberately maneuvered the United States into participation in WWII.

Interestingly, Fischer notes the same paranoid-leaning mindset at work in critics of Beard, namely the conservative historian Forrest McDonald, whose account of the drafting and ratification of the Constitution deliberately targets Beard’s and provides instead “a rum and strumpet history” of backroom deals and smoke-filled rooms, different in degree—and political angle—but not in kind. Whether left-wing, right-wing, or politically indiscriminate, in history marked by furtiveness “[r]eality is reduced to a set of shadows, flickering behind a curtain of flimsy rhetoric.”

As Fischer notes near the beginning of this section, the furtive fallacy is not the same thing as a conspiracy theory, but conspiracy theories seldom lack this hermeneutic of paranoia. Put another way, you can be paranoid without drifting into conspiracism, but not vice versa. Understandably, since if you already believe all true motives are base but hidden, it’s not a difficult step to find spectral evidence for these assumptions everywhere.

In fact, it was Fischer’s description of furtive history, driven by “causes mostly insidious and results mostly invidious,” that caught my attention and reminded me of one of my favorite short documentaries: “The Umbrella Man,” a six-minute film by Errol Morris. In this film, private investigator Tink Thompson, himself a JFK conspiracy theorist, tells the story of a mysterious man spotted in film and photographs from Dealey Plaza. He wore a suit and stood holding up an open umbrella—despite the brilliant fall weather—as JFK’s motorcade passed by.

Thompson summarizes the suspicions surrounding the Umbrella Man thus: “The only person under any umbrella in all of Dallas standing right at the location where all the shots come into the limousine. Can anyone come up with a non-sinister explanation for this? Hm? Hm?”

I don’t want to give the documentary away—seriously, take six minutes and watch the film—but Thompson does tell the satisfactory but wholly, totally unexpected story of who the Umbrella Man was and why he did what he did that day, a solution “just wacky enough it has to be true!” Thompson concludes:

What it means is, if you have any fact which you think is really sinister, that is really obvious a fact which can only point to some sinister underpinning, hey, forget it man, because you can never, on your own, think of all the non-sinister, perfectly valid explanations for that fact. A cautionary tale.

Food for thought and a useful rule of thumb, especially given that even much of the non-conspiratorial history produced today revels in and even demands the furtive perspective.

Eisensteinian historical montage

Today Medievalists.net shared a good summary of a 2006 article by Donald Ostrowski in which he examines the actual historical evidence for the Battle of Lake Peipus and finds that the one fact everyone “knows” about the battle is almost certainly made up.

The Battle of Lake Peipus was fought in April 1242 between a Crusader coalition led by a suborder of the Teutonic Knights and a Russian force from Novgorod led by Prince Alexander Nevsky. After an initial cavalry assault by the Knights, Alexander drove them back, winning the battle and thwarting the attempt to conquer Novgorod and bring the Orthodox Christians there under the authority of the Latin or Catholic Church.

The “one fact everyone ‘knows’” that I mentioned above concerns the way Alexander was able to win and the fate of the Teutonic Knights. Look the Battle of Lake Peipus up and you’ll certainly find descriptions of the way the Knights, charging across and even fighting on the frozen lake, drowned in large numbers when the overstressed late spring ice broke up beneath them in the latter stages of the battle. Hence the battle’s better-known name: “The Battle on the Ice.”

But it turns out that most of the details related to the frozen lake date from much later than the battle itself, with—in a process that will be familiar to anyone who has had to work with medieval chronicles—more and more detailed and elaborate accounts being recorded later, often much later. And the breaking up of the ice specifically originates not in any historical source but in a movie: Sergei Eisenstein’s 1938 propaganda epic Alexander Nevsky.

Eisenstein was a Russian filmmaker who worked for decades making historical dramas for the Stalinist Soviet state. He was also a film theorist, experimenting with intellectual montage techniques to convey story and meaning and—most importantly for a propagandist—evoke emotional reactions. He had a good eye for an exciting sequence, and Alexander Nevsky’s battle on the frozen lake, with the wicked Germans plunging into the icy depths, is among his best. But not his most famous.

That Eisenstein invented this vision of the battle isn’t exactly news, at least to anyone who has studied this region and period. Note that Ostrowski’s Russian History article dates from 2006. William Urban, in The Teutonic Knights: A Military History, first published in 2003, is also circumspect about anything ice-related, and quotes part of the Livonian Rhymed Chronicle which describes the dead and dying lying “on the grass” after the battle. No frozen sinking corpses here.

But there’s another dimension of the gradual elaboration and fabrication of the story. Urban:

The battle has become undeservedly famous, having been endowed—for twentieth-century political considerations—with much more significance than it merited in itself, through Sergei Eisenstein’s 1938 film Alexander Nevsky, and the stirring music of Sergei Prokofiev. Indeed, although this movie is a reasonably accurate portrayal of some aspects of the battle, especially the costumes and tactics, and gives us an impressive sense of the drama of medieval combat, other aspects are pure propaganda. Certainly the ancestors of today’s Estonians and Latvians were not dwarfs, as the movie suggests, nor were they serfs. Master Andreas was in Riga, and thus could not have been taken prisoner by Alexander himself and ransomed for soap. The Russian forces were mainly professionals, not pre-Lenin Communist peasants and workers facing the equivalent of German armoured columns; the Germans were not proto-Nazis, blonde giants who burned babies alive. In short, many scenes in Alexander Nevsky tell us much more about the Soviet Union just before Hitler’s invasion than about medieval history.

Alexander Nevsky is a great movie, though, and, as Urban notes, Prokofiev’s score is fantastic. I have it on CD. Here’s a sample from the scene in question.

But this isn’t the only historical myth created by Eisenstein and spread with the imprimatur of the Comintern. By far his most famous film, the silent propaganda classic Battleship Potemkin, which depicts a 1905 mutiny of Russian sailors in the Ukrainian port of Odessa as a proto-Soviet uprising crushed by the cold-blooded Tsarists, features as its climactic sequence a massacre of newly liberated and class-conscious proles on a long elegant staircase. “The Odessa Steps” is one of the most famous scenes in cinema history, a continuous series of stunning, unforgettable images, and has been imitated and alluded to many, many times.

But the massacre never happened. Per Roger Ebert, in a “Great Movies” essay on Battleship Potemkin:

That there was, in fact, no czarist massacre on the Odessa Steps scarcely diminishes the power of the scene. The czar's troops shot innocent civilians elsewhere in Odessa, and Eisenstein, in concentrating those killings and finding the perfect setting for them, was doing his job as a director. It is ironic that he did it so well that today, the bloodshed on the Odessa Steps is often referred to as if it really happened.

Both of these myths—the breakup of the ice under the Teutonic Knights and the massacre on the Odessa Steps—illustrate the unique power and danger of historical cinema. These are inventions by a director following the rule of cool, which, as Ebert notes, is a director’s job. But as Urban suggests above, there is plenty of shady ideology working alongside those artistic considerations. More importantly, these made-up stories are now the entire story for many people. As Chesterton put it in a line I’ve shared here before, “A false film might be refuted in a hundred books, without much affecting the million dupes who had never read the books but only seen the film.”

Medievalists.net’s summary post caught my eye not only because I love the subject and period as well as Eisenstein, but because matters of historical truth in filmmaking are always on my mind. After all, think about the Battle on the Ice sequence in Alexander Nevsky and how influential it was, then watch—or perhaps rewatch—this scene from last year’s Napoleon.

Falsehood, if introduced through film, can have a very long life.

Agatha Christie on historical perspective

Coincident with my recent posts about the “right side” of history and how our understanding of what happened in the past changes and, ideally, grows more thorough and accurate as time passes, here’s Agatha Christie in the short story “The Coming of Mr Quin,” which I’m reading in the collection Midwinter Murder: Fireside Tales from the Queen of Mystery.

Briefly, a New Year’s Eve party at a comfortable home is interrupted just after midnight by the arrival of a Mr Harley Quin, whose car has broken down. Quin says that he knew the house’s former owner, one Derek Capel, who unexpectedly killed himself a decade prior. Notice how Quin invites the partygoers to revisit what they know about the incident:

‘A very inexplicable business,’ said Mr Quin, slowly and deliberately, and he paused with the air of an actor who has just spoken an important cue.

‘You may well say inexplicable,’ burst in Conway. ‘The thing's a black mystery—always will be.’

‘I wonder,’ said Mr Quin, non-committally. ‘Yes, Sir Richard, you were saying?’

‘Astounding—that's what it was. Here's a man in the prime of life, gay, light-hearted, without a care in the world. Five or six old pals staying with him. Top of his spirits at dinner, full of plans for the future. And from the dinner table he goes straight upstairs to his room, takes a revolver from a drawer and shoots himself. Why? Nobody ever knew. Nobody ever will know.’

‘Isn’t that rather a sweeping statement, Sir Richard?’ asked Mr Quin, smiling.

Conway stared at him.

‘What d’you mean? I don't understand.’

‘A problem is not necessarily unsolvable because it has remained unsolved.’

‘Oh! Come, man, if nothing came out at the time, it's not likely to come out now—ten years afterwards?’

Mr Quin shook his head gently.


‘I disagree with you. The evidence of history is against you. The contemporary historian never writes such a true history as the historian of a later generation. It is a question of getting the true perspective, of seeing things in proportion. If you like to call it so, it is, like everything else, a question of relativity.’

Alex Portal leant forward, his face twitching painfully.

‘You are right, Mr Quin,’ he cried, ‘you are right. Time does not dispose of a question—it only presents it anew in a different guise.’

Evesham was smiling tolerantly.

‘Then you mean to say, Mr Quin, that if we were to hold, let us say, a Court of Inquiry tonight, into the circumstances of Derek Capel’s death, we are as likely to arrive at the truth as we should have been at the time?’

‘More likely, Mr Evesham. The personal equation has largely dropped out, and you will remember facts as facts without seeking to put your own interpretation upon them.’

Evesham frowned doubtfully.

‘One must have a starting point, of course,’ said Mr Quin in his quiet level voice. ‘A starting point is usually a theory. One of you must have a theory, I am sure. How about you, Sir Richard?’

Simple and tailored to the mystery genre, but not a bad explanation of how the greater perspective afforded by historical distance can lead to a more accurate understanding of important events. There are, certainly, parts of my own life I understand much better now than when I was an eyewitness living through them.

I’ve been trying to read more of Agatha Christie the last year or so after having made it to my late thirties with Murder on the Orient Express as my sole experience of her storytelling. My wife, on the other hand, has read a lot of Christie, and has done so over many years. But even she was unfamiliar with Christie’s Mr Quin, who is the subject of several short stories collected as The Mysterious Mr Quin. I’m enjoying him in this story so far—especially with this kind of sharp historical aside—and plan to check that out.

History has no sides

History, a mosaic by Frederick Dielman in the Library of Congress

I started this post some weeks ago, but sickness—mine and others—intervened. Fortuitously so, since it seems appropriate to finish and post this as a New Year’s Eve reflection, a reminder as 2023 gives way, irretrievably, to 2024.

Writing in Law & Liberty a few weeks ago, Theodore Dalrymple takes the recent conflict between Venezuela and Guyana, a large part of whose territory Venezuela is now claiming as its own, as an opportunity to consider an idea invoked by Guyana’s rightly aggrieved foreign minister: “the right side of history.”

This is now a common term for an idea that was already fairly widespread, a sort of popularized Whig or Progressive view of history’s supposed outworkings that, as Dalrymple notes, “implies a teleology in history, a pre-established end to which history is necessarily moving.” History has a goal, an ultimate good toward which societies and governments are moving, a goal that offers an easy moral calculus: if a thing helps the world toward that goal, it is good, and if it hinders or frustrates movement toward that goal, it is bad. This is how history comes to have “sides.”

As worldviews go, this is relatively simple, easily adaptable—whiggishness, as I’ve noted, tends to be its conservative form, and Progressivism or doctrinaire Marxism to be its liberal form—and offers a clarity to thorny questions that may have no easy answer. This is why people who believe in “the right side of history” are so sure both of themselves and of the perversity and evil of anyone who disagrees with them.

But “the right side of history” has one problem: it doesn’t exist. Dalrymple:


But history has no sides and evaluates nothing. We often hear of the “verdict of history,” but it is humans, not history, that bring in verdicts, and the verdicts that they bring in often change with time. The plus becomes a minus and then a plus again. As Chou En-Lai famously said in 1972 when asked about the effect of the French Revolution, “It is too early to tell.” It is not merely that moral evaluations change; so do evaluations of what actually happened and the causes of what actually happened. We do not expect a final agreement over the cause or causes of the First World War. That does not mean that no rational discussion of the subject is possible—but finality on it is impossible.

“It is true,” he continues, “that there are trends in history, but they do not reach inexorable logical conclusions.” This is the false promise of Hegel or, further back, the Enlightenment. Outcomes are not moral judgements, and victories of one side over another are not proof of rightness. Dalrymple:

History is not some deus ex machina, or what the philosopher, Gilbert Ryle, called the ghost in the machine; it is not a supra-human force, a kind of supervisory demi-urge acting upon humans as international law is supposed to act upon nations. . . . Are we now to say that authoritarianism is on the right side of history, as recently liberal democracy was only thirty years ago, because so much of the world is ruled by it?

To equate victory with goodness or to view success as superiority—the inescapable but usually unstated Darwinian element in “the right side of history”—is, as CS Lewis put it, to mistake “the goddess History” for “the strumpet Fortune.”

Dalrymple concludes with an important question, one he is unusually reticent in answering:


Does it matter if we ascribe right and wrong sides to history? I think it could—I cannot be more categorical than that. On the one hand, it might make us complacent, liable to sit back and wait for History to do our work for us. Perhaps more importantly, History might excuse our worst actions, justifying grossly unethical behaviour as if we were acting as only automaton midwives of a foreordained denouement. But if history is a seamless robe, no denouement is final.

I’m going to be more categorical and say that it certainly matters whether we believe history has sides, and for the latter of the two reasons Dalrymple lays out. History—with a right and wrong side and a capital H—offers a rationalization, a handy excuse. Armed with an ideology and a theory of history’s endpoint and the post-Enlightenment cocksureness that society is malleable enough to submit to scientific control in pursuit of perfection, group after group of idealists has tried to shove, whip, or drag the world forward into the light. And when the world proves intractable, resistant to “the right side of history,” it is easy to treat opponents as enemies, blame them for failure, and eradicate them.

This is true even, and perhaps especially, of groups that start off making pacifist noises and decrying the violence and oppression of the status quo. The Jacobins and the Bolsheviks are only the most obvious examples, though our world in this, the year of our Lord 2023, is full of groups that have granted themselves permission to disrupt and destroy because they are on “the right side of history.” What do your puny laws, customs, and scruples matter in the face of History?

That’s the extreme danger, but a real one as the last few centuries have shown. Yet the first danger Dalrymple describes is even more insidious because it is so common as to become invisible—the smug complacency of the elect.

What kind of grim New Year’s Eve message is this? It’s a denunciation of a false idea, sure, but also a plea to view the change from 2023 to 2024 as no more than that—the change of a date. Year follows year. Time gets away from us. Everything changes without progress, things neither constantly improving nor constantly worsening and with no movement toward a perfect endpoint of anyone’s choosing.

Unless, of course, something from outside history intervenes. History, like war, like gravity, like death, is a bare amoral fact in a fallen world. If it is to have meaning and moral import at all it must come from somewhere other than itself. For those of us who believe in God, this is his providence. He has an endpoint and a goal and a path to get there but, tellingly, though he has revealed his ends he has kept his means, the way there, hidden. Based on what I’ve considered above, this is for our own good. The temptation not only to divine his hand in our preferred outcomes but to seize control of history and improve the world is powerful. We haven’t reached the end of it yet.

Until then, if history has sides at all, they are only the two sides of Janus’s face—looking behind and ahead, observing but never reaching either past or future. The more clearly we see this, the more deliberately we can dispel the luminous intellectual fog of thinking about the movement of History with a capital H, the more we can focus on the things nearest and most present with us. Celebrate the New Year, pray for your children, and get to work on the little patch that belongs to you, uprooting evil in the fields you know. That’s my goal, at least.

Thanks as always for reading. Happy New Year, and best wishes to you for 2024!

More if you’re interested

Dalrymple’s entire essay is worth your while. Read it at Law & Liberty here. The sadistic violence of the ostensibly pacifist French Revolutionaries is fresh on my mind because of David A Bell’s excellent book The First Total War, which I plan to write more about in my reading year-in-review. For CS Lewis on the false idea of “the judgement of history,” see here. And for one of my favorite GK Chesterton lines on progress, see here. For a view of history and progress and the pursuit of human perfectibility that closely aligns with my own, see Edgar Allan Poe here. Let me also end the year with another recommendation of Herbert Butterfield’s classic study The Whig Interpretation of History, the fundamental text in rebuking ideas of progress.

That's not how any of this works

Director Ridley Scott talks with Dan Snow about Scott’s forthcoming film Napoleon

Yesterday History Hit released a 16-minute talk with Ridley Scott covering some aspects of his epic drama Napoleon, which comes out in three weeks. The interview is mostly interesting even if host Dan Snow doesn’t dig very deep, but Scott got strangely testy when Snow—over a clip of cannonballs smashing up the ice of a frozen pond beneath the feet of retreating Russian infantry at Austerlitz—raised the question of historical accuracy:

Snow: What about historical accuracy? When a historian says, “Uh, sorry, Sir Ridley, it didn’t quite happen like that,” you say, “Listen, I’ve done enough with you.” You have to have artistic license, right?

Scott: You know, I would say, “How would you know? Were you there?”

Snow: [laughs]

Scott: They go, “Oh, no, right.” I say, “Exactly.” So I said, “You know, Napoleon [?] had four hundred books written about him.” So it means, maybe the first was the most accurate. The next one is already doing a version of the writer. By the time you get to 399, guess what—a lot of speculation.

Oof. That’s not how this works. That’s not how any of this works.

Historians don’t know things because they were there; they know things because they study. It’s work. They’ve read and researched and compared notes and argued and walked the ground. Scott’s rejoinder is surprisingly childish for such a sharp and accomplished man.

Further, his breezy explanation of how history works as a discipline and a profession is simply bizarre. The implication of what he says about how books cover a subject over time is that historical facts are established at the beginning, and the rest is just eggheads batting ever more intricate theoretical interpretations back and forth.

The truth is that, as I’ve had cause to reflect here recently, the first accounts of an event are fragmentary or partial even if they’re accurate. It takes diligent study, the perspective of time, the synthesis of all available sources, and a good bit of luck to piece together a big-picture account of what actually happened. And with big, heavily documented subjects—like, say, a French emperor—new material is being discovered all the time. There is no substitute for a primary source or eyewitness account, but if you want accuracy qua accuracy, you will absolutely want a secondary source, a book written later.

I’m all for allowing responsible artistic license—I’m always interested to hear filmmakers explain how and why they choose to change what they change—but Scott doesn’t stop at artistic license. His arrogant dismissiveness toward truth in historical storytelling is breathtaking. Maybe he picked up more from Napoleon than he’s aware.

To be fair, Scott was speaking off the cuff, and he is 85 years old. I’m not even absolutely certain he said “Napoleon” when he cited the figure of 400 books, because he was mumbling. (The real figure, if he was talking about Napoleon, is tens of thousands, more than 300,000 by one old estimate.) But given his track record of using history for his own purposes—I stand by my thoughts on Kingdom of Heaven from the early days of this blog—and the forcefulness with which he said this, I have to assume he means it. I can’t say I’m surprised.

At any rate, I’m cautiously optimistic about Napoleon, but I’m not hoping for much more than interesting performances and exciting spectacle.

The fog of war is no excuse

Speaking of John Keegan, here’s a passage from the chapter on Waterloo from The Face of Battle that I’d like to enlarge upon. Regarding the way the Battle of Waterloo is traditionally described as unfolding—in five “phases” of engagement—Keegan writes:

It is probably otiose to point out that the ‘five phases’ of the battle were not perceived at the time by any of the combatants, not even, despite their points of vantage and powers of direct intervention in events, by Wellington and Napoleon. The ‘five phases’ are, of course, a narrative convenience.

A narrative convenience, he might have added, laboriously gathered and constructed after the fact and over many years. He goes on to describe “how very partial indeed was the view of most of” the participants, beginning with distraction and proceeding to visibility:

There were other causes, besides the preoccupation of duty, which deprived men of a coherent or extended view of what was going on around them. Many regiments spent much of their time lying down, usually on the reverse slope of the position, which itself obscured sight of the action elsewhere. . . . A few feet of elevation, therefore, made the difference between a bird’s-eye and a worm’s-eye view . . . But even on the crest of a position, physical obstacles could limit the soldier’s horizon very sharply. In many places, at least at the beginning of the battle, the crops of wheat and rye stood tall enough for the enemy to approach to within close musket shot undetected. . . . [T]he men in the rear or interior of dense columnar formations, of the type adopted by the Guard in their advance, would have glimpsed little of the battle but hats, necks and backs, and those at a distance of a few inches, even when their comrades at the front were exchanging fire with the enemy. And almost everyone, however well-positioned otherwise for a view, would for shorter or longer periods have been lapped or enveloped by dense clouds of gunpowder smoke.

And those are just problems affecting vision. The other senses have equally severe limitations and are just as susceptible to illusion. Look up acoustic shadow sometime. Keegan: “To have asked a survivor . . . what he remembered of the battle, therefore, would probably not have been to learn very much.”

Now compound these limitations and frequent misperceptions and misunderstandings by passing them through reporters. But at least reporters are impartial, right?

Visit the New York Times’s complete online digital archive—or the archive of any old newspaper—and look up the earliest possible reporting on a conflict you know a lot about. You’ll be amazed at how much is simply wrong. And that’s not even allowing for spin, for bias, for lies, for manifold other motivated errors.

What we know about battles and wars and other conflicts we know because of that laborious process I mentioned above, of gathering, compiling, organizing, and collating sources and information, and then study and study and more study, not to mention walking the ground. There are things happening now that we will never—none of us in our own lifetimes—have the perspective, much less the information, to understand completely. Even then, there will still be unanswered questions, or questions answered after years, even centuries of uncertainty.

So my rule of thumb: Assume that everything you hear or read about a current conflict is wrong, incomplete, made up, or the precise opposite of the truth. And wait. And don’t get emotionally invested in what’s happening, especially if your sense of moral worth depends upon viewing yourself as on The Right Side and raging against a barbarous enemy.

War is tragic, and people will suffer. That’s guaranteed. But there is no reason to compound those facts with ignorant and impotent rage.

If you slow down, you won’t beclown yourself the way certain institutions have this past week. Many of these have now, suddenly, discovered the concept of “fog of war,” which has been dusted off to provide a sage reminder to readers instead of a mea culpa. Look here and here for samples, and here for well-earned mockery.

Per Alan Jacobs, who wrote excellently and succinctly on this topic over the weekend:

The more unstable a situation is, the more rapidly it changes, the less valuable minute-by-minute reporting is. I don’t know what happened to the hospital in Gaza, but if I wait until the next issue of the Economist shows up I will be better informed about it than people who have been rage-refreshing their browser windows for the past several days, and I will have suffered considerably less emotional stress. . . .

“We have a responsibility to be informed!” people shout. Well, maybe . . . But let me waive the point, and say: If you’re reading the news several times a day, you’re not being informed, you’re being stimulated.

To the New York Times’s credit, it has offered an editorial apology, but, as Jeff Winger once put it, “Be sorry about this stuff before you do it, and then don’t do it!”

I’ll end with a reflection from CS Lewis, in a passage from his World War II radio talks eventually incorporated into Mere Christianity, a passage that was going the rounds late last week:

Suppose one reads a story of filthy atrocities in the paper. Then suppose that something turns up suggesting that the story might not be quite true, or not quite so bad as it was made out. Is one's first feeling, ‘Thank God, even they aren't quite so bad as that,’ or is it a feeling of disappointment, and even a determination to cling to the first story for the sheer pleasure of thinking your enemies are as bad as possible? If it is the second then it is, I am afraid, the first step in a process which, if followed to the end, will make us into devils. You see, one is beginning to wish that black was a little blacker. If we give that wish its head, later on we shall wish to see grey as black, and then to see white itself as black. Finally we shall insist on seeing everything . . . as bad, and not be able to stop doing it: we shall be fixed for ever in a universe of pure hatred.

Let the reader understand.

We already have something approaching Screwtape’s universe of pure noise. Can we still turn back from a universe of pure hatred?