Category: History

A Hollywood story that deserves its own movie

Fans of Mel Brooks know “The Producers,” in which two producers’ scheme to make money by backing a sure-fire flop is foiled when they accidentally produce a hit.

Proving that real life is stranger than fiction, Buzz Dixon tells this story:

The Mob had a problem: 
Deep Throat was making too much money.

I won’t recount the history of porn in America at this time — it’s fascinating stuff (and not for the reasons you think!) — but it’s too much of a sideshow to what I want to post about.

Suffice it to say this:
The same black market-to-barely legitimate distribution system that made bootlegging not only possible but highly profitable during Prohibition, the same system that got pressed into service to spread comics and pulp magazines far and wide, that same system had a modestly earning sideline in shoveling porn around the country up to the 1960s.

At that point, as more and more adult films were being imported from Europe and American indie producers found more legal tolerance for their grindhouse features, the Perainos, members of the NYC-based Colombo crime family, decided to splurge ($22,500 to $50,000, depending on who tells the story) on a feature-length 35mm full-color porn film that had an actual bona fide (albeit goofy) story and something that could be loosely interpreted as acting by the less discriminating members of its audience.

We’re talking Deep Throat, folks, and I’ll head everyone off at the pass and say Linda Lovelace (née Linda Susan Boreman) was at the very least coerced and intimidated into making the film, so sympathy to her, and a big hearty horselaugh to all the others involved as you read further.

We come not to praise Deep Throat (which in addition to being the first American porn feature with an actual story was also a musical [!] and a borderline sci-fi film [!!]), but in the words of another / later / even more infamous Deep Throat:  “Follow the money.”

. . .

Deep Throat may very well be the biggest return on investment of any movie ever made: basically walking-around pocket change for Mob wiseguys turned into a picture that grossed $250 million!  (And that’s just the lowest generally agreed-upon estimate of the film’s gross; nobody really knows for sure.  Gawd only knows what they could have done with Lucasfilm’s marketing team.)

Of course the Mob harbored absolutely no desire to let the Feds have any of that, and so for ideas on how to hide it, they turned to a bigger / badder / even more financially corrupt institution:  Hollywood.

I’ve posted elsewhere about the financial shenanigans the Hollywood studio system employs to hide its loot.  One of its mainstays is cross-collateralization.

It works like this:
Say a studio releases six movies in a three-month period:  one smash hit, one modest success, two that break even, one mild disappointment, one bomb.

The studio takes money from the smash hit and the modest success and applies it to the losses of the bottom two films.  With any luck all six films barely break even, and as such the studio keeps all the revenue and the profit participants (har!) get bupkis.

They call it “standard industry practices”.
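To make the arithmetic concrete, here is a minimal sketch of how pooling a slate can make individual profits disappear on paper. Every title and dollar figure below is invented purely for illustration; none of them are real studio or Bryanston numbers.

```python
# Hypothetical illustration of cross-collateralization.
# All titles and figures are made up; nothing here is a real studio number.

slate = {
    # name: (revenue, cost) in millions of imaginary dollars
    "smash hit":           (60, 20),
    "modest success":      (28, 20),
    "break-even #1":       (20, 20),
    "break-even #2":       (18, 18),
    "mild disappointment": (12, 25),
    "bomb":                (5, 40),
}

# Film by film, the winners show profits the participants could claim a share of.
per_film = {name: revenue - cost for name, (revenue, cost) in slate.items()}
for name, profit in per_film.items():
    print(f"{name:22s} {profit:+d}")

# Cross-collateralized, the slate is reported as one pot: the hits' gains are
# applied against the flops' losses, and the combined "profit" lands near zero,
# which is the number the profit participants actually get paid on.
print(f"slate-wide result: {sum(per_film.values()):+d}")
```

Each hit looks profitable on its own; pooled, the whole slate “barely breaks even,” which is exactly the outcome the accountants want.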

And that’s what the Mob wanted to do with Deep Throat.

. . .

Problem:
Deep Throat wasn’t conceived of as a franchise tentpole; it was just a standalone stroke film.

So they cobbled together a distribution company called Bryanston Films (presumably because it was the least Mafioso-sounding name they could think of) and, like the ill-fated schemers in Mel Brooks’ The Producers, went out in search of the worst movies they could lay their hands on so they could “lose” money on them and siphon off that sweet, sweet Deep Throat cash (more on why they wanted to do that in a bit).

Among the very first films they distributed was Dark Star, a low budget sci-fi movie shot mostly on 16mm as a student film by two USC classmates:  John Carpenter and Dan O’Bannon.

Budgeted at a final grand total of $60,000, it looked a helluva lot more polished and professional than Deep Throat.  Still, Bryanston expected to lose money on it and was surprised when word of mouth among sci-fi fans earned the film a cult reputation that edged it into break-even territory.

Oh, well, you can’t lose ‘em all, can you?

. . .

So they tried again, picking up a couple of Hong Kong imports to cash in on the kung fu craze they knew next to nothing about.

They figured by overpaying for a film, it would be easy to claim they lost money on it, and normally that would be true…

…unless one of the Hong Kong films you pick up is The Way Of The Dragon with Bruce Lee, and you release it just as a larger studio announces its (relatively) big-budget Bruce Lee epic, Enter The Dragon.

Well, lightning can’t strike three times, can it?

Wanna bet?

They opted for something safe and crappy, absolutely guaranteed not to make any money.  A film made by some punks from some podunk place down south, shot on 16mm with even lower production values than Deep Throat and arguably far worse acting.  A stupid, ugly, vulgar film about a family of cannibals.

After running it for Bryanston, the screening room projectionist looked them square in the eye and famously said:  “There are a lot of sick bastards in this world and every single one of them will pay five dollars to see this movie.”

Boy howdy, was he ever right!

Ladies and gentlemen, The Texas Chainsaw Massacre.

. . .

Meanwhile, outside the confines of Bryanston’s front offices, virtuous forces were gathering against them.

It’s hard for people to fathom today, but once upon a time hard core porn was illegal in many if not most communities in the United States.

Typically the reason had to do with blue nose morality, but law enforcement also knew the Mob liked to move money around using liquid assets.

That’s why local authorities waged war against pinball machines:  They made it possible for the Mob to hide cash from heroin sales by claiming it was just millions of kids putting their quarters down.

Having a string of failed movies and using Hollywood-style cross-collateralization made it possible for the Perainos to hide a lot of the Colombos’ illicit cash —

— but the movies had to fail massively for the scheme to work.

And try as they might, Bryanston just couldn’t get their movies to fail.

(Well…most of them…)

To help hide money from the Colombos’ other rackets, the Perainos began siphoning some of their Deep Throat cash off to Mob-dominated Las Vegas casinos.

Casinos, like pinball machines, could hide a lot of illicit cash.

Cash the FBI and the IRS would love to be able to trace.

Question:
How do you penetrate many levels of Mob security to get a look at their books?

. . .

By now the Bryanston boys were growing desperate.

Despite their best / worst efforts, their movies kept making money!

It finally dawned on them that their least profitable films (“least” as in “but still profitable”) were the more conventional ones, with names that were recognizable but far from big box office draws.

Finally an independent production showed up that was ideal for their purposes:  A dumb medium-budget horror film with a bunch of has-been stars in it, too well made to appeal to the freaks buying tickets for Blood For Dracula and Flesh For Frankenstein (two Andy Warhol-produced horror films that surprised the hell out of everybody by being modestly successful), too inept for mainstream audiences who came to see The Human Factor or Caravan To Vaccares.

With a cast featuring Ernest Borgnine, Eddie Albert, Ida Lupino, and William Shatner — all woefully miscast — it was sure to turn off younger audiences.

In fact, the only young character in it was a minor supporting role played by a kid who was one of a half-dozen sidekicks on a modestly successful sitcom, a kid whose next role would be in a dumb disco dance-fad movie.

So Bryanston acquired The Devil’s Rain and put it into general release.

And it fared poorly, and it lost money, and the boys at Bryanston smiled because at long last their scheme was working…

…until 18 months later when Saturday Night Fever was released and suddenly theaters were demanding every movie with John Travolta in it get re-released to cash in.

. . .

The Feds finally found a crack in the financial wall surrounding the Mob’s money.

Remember, even in the late 1970s, porn was not legal everywhere in the United States.

On July 7, 1974, the FBI arrested Harry Reems, Ms. Lovelace’s Deep Throat co-star, in New York on a federal obscenity charge filed in Memphis.

Dick Nixon, desperate to distract Americans from his own political scandals, kept pressing for crime bills and prosecution against porn, even though his own commission on pornography saw no societal harm in it.

The FBI, on the other hand, saw the Deep Throat obscenity case from a different perspective:  A chance to finally get their hands on the Mob’s books.

And that’s exactly what happened.

Using their right of discovery from the Reems case and similar indictments in other federal courts around the country, the FBI swooped in on Bryanston and grabbed its financial records.

Sound familiar?

It should.

If you’ve seen Martin Scorsese’s Casino, you know roughly half-way through the film the FBI launches raids across the country to crack the Mob’s money laundering schemes.

Casino clocks in just shy of three hours — and Scorsese spends about a fourth of that time just explaining how the casino business works so the rest of the movie will make sense.

Adding a whole big sub-plot about how Deep Throat and Bryanston led the FBI right to the Mob’s piggy banks would have been fascinating — and incredibly long.

So Scorsese just glosses over how the FBI found its way in and follows what happens after it did.

Wiseguys went to jail, that’s what happened.

. . .

Bryanston’s books provided the loose thread the FBI pulled that unraveled the whole deal.

No, it didn’t eliminate organized crime or shut down Mob influence in Las Vegas, but it sure put a dent in ‘em.

And it put a lot of guys — many named Peraino — behind bars.

Bryanston went inert for some 30 years.  It makes noises now like it wants to come back as a legitimate distribution company, but so far…nothing.

All the original players are pretty much dead and gone, the lucky ones via natural causes, the not-so-lucky ones by other mobsters.

I was sparked to write this because several online friends had shared The Devil’s Rain poster recently without knowing how it fit into the weird history of the Mob and porn and Las Vegas, so I thought I’d write up this summary for them.

Up above I mentioned that almost every film Bryanston released proved at least modestly successful.

The one exception is The Last Porno Flick (a.k.a. Those Mad, Mad Moviemakers).

I knew and worked with the late Larry DiTillio, who wrote the screenplay for the movie.

Without knowing it, he and the other filmmakers pitched a movie to Bryanston that was exactly like what Bryanston was trying to do!

i.e., a movie about some con men trying to make a bad porn movie so they could hide money.

Larry described the horror he and the other members of the production team felt when they realized they were in the Mob’s den, pitching a movie that made fun of what the mobsters were actually trying to do.  He felt sure they were all going to leave with broken kneecaps at the very least —

— but to their surprise the Bryanston boys went for it and not only agreed to distribute the film but financed it as well.

And it flopped.

Larry felt sorry for them.

They had tried so very, very hard to be a failure, but they just kept on succeeding.

And when somebody brought them a film idea that reflected what they had been going through, they probably thought to themselves, “Yeah, let’s do this, let’s show the world what it was like for us.”

And the world didn’t give a crap.


What we’re supposed to be about

Virginia Postrel:

This article, which I wrote five years ago, is as timely now as it was then—and was inspired not by the current examples it cites but by the wisdom of Albert Camus, the subject of a Liberty Fund conference I’d recently attended.

One of the rare feel-good stories of our current political moment is also terribly sad. On a train in Portland, Oregon, three very different men tried to protect two young women, one wearing a hijab, from a ranting white supremacist who turned out to be carrying a knife. The action cost two their lives, while the third is still in the hospital.

“America is about a Republican, a Democrat, and an autistic poet putting their lives on the line to protect young women from a different faith and culture simply because it is the right thing to do. You want diversity and tolerance? We just saw it,” writes Michael Cannon in an especially good appreciation, concluding “America is already great —and so long as we continue to produce men such as Rick Best, Taliesin Namkai-Meche, and Micah Fletcher, it always will be.”

Cultures are held together by stories. We define who we are — as individuals, families, organizations, and nations — by the stories we tell about ourselves. These stories express hopes, fears, and values. They create coherence out of complexity by emphasizing some things and ignoring others. Their moral worth lies not in their absolute truth or falsehood — all narratives simplify reality — but in the aspirations they express and the cultural character they shape.

So, as I’ve recently written, the Chinese government’s Silk Road story provides a positive counterweight to colonial and Maoist narratives that disdained China’s imperial past. It offers a national heritage of enlightenment, progress, and peaceful trading relations. Because of the values it represents, that story, despite its propaganda purposes and distortions of history, is largely admirable.

By contrast, consider the story New Orleans Mayor Mitch Landrieu eloquently repudiated in his May speech about taking down Confederate statues. “The Robert E. Lee, Jefferson Davis, and P.G.T. Beauregard statues were not erected just to honor these men, but as part of the movement which became known as The Cult of the Lost Cause,” he said. “This ‘cult’ had one goal — through monuments and through other means — to rewrite history to hide the truth, which is that the Confederacy was on the wrong side of humanity….These monuments purposefully celebrate a fictional, sanitized Confederacy; ignoring the death, ignoring the enslavement, and the terror that it actually stood for.”

The story of the Lost Cause told white southerners that their black fellow citizens were rightly subordinate and less than fully human. It also told them that the South’s best days were behind it and that it had been deprived of a glorious civilization by an occupying force. It was both morally pernicious and culturally dispiriting.

The sad irony is that Landrieu’s speech was necessary in the 21st century. In the 1880s, an alternative story, of “the New South,” had already emerged, heralded most prominently by Henry Grady, the managing editor of the Atlanta Constitution. “We understand that when Lincoln signed the Emancipation Proclamation, your victory was assured,” he told a New York audience in 1886. “For he then committed you to the cause of human liberty, against which the arms of man cannot prevail.” Freed from its plantation past, the New South would rise as an industrial region in which everyone, not just a “splendid and chivalric oligarchy,” shared in prosperity. The New South narrative told only partial truths, but it pointed in a positive direction.

America is now awash in stories. “We tell each other more stories — and as a result have more opportunities to view the world through eyes other than our own — than any other culture in history,” wrote an astute observer in 1995. Since then, technology has amplified those numbers exponentially. The stories have become so numerous and contradictory — and so often demoralizing — that they’re eroding empathy and shattering social trust.

Sadly, the big stories competing for dominance today are demoralizing ones. They have more in common with the Lost Cause than with the New South or the Silk Road. One, told by the president of the United States, is that the country used to be great but allowed its greatness to be eroded by foreigners and cosmopolitan elites. It is that story, more than any specific policy agenda, that connects Donald Trump to authoritarian rulers — because it is with versions of that story that so many authoritarian regimes begin. The story of diabolical foreigners and perfidious fellow citizens is, at its core, a fable attacking liberal values. It misleadingly divides the nation into patriots and traitors, the latter defined as anyone who bucks the party line.

The competing left-wing story, against which many Trump voters reacted, isn’t much better. It portrays the American story as nothing more than a series of injustices in which every seeming accomplishment hides some terrible wrong and the country’s very existence is a crime against humanity. What begins as a valid historical corrective, like Landrieu’s speech, evolves into a corrosive nihilism. A culture cannot long survive self-hatred.

“It is good for a nation to find in its tradition and its sense of honor enough strength to find the courage to denounce its own errors,” cautioned Albert Camus in the wake of France’s decolonization. “But it should not forget the reasons it may have for continued self-esteem. It is, in any case, dangerous to ask it to confess that it alone is guilty and to dedicate it to perpetual penitence.” It was a wise and prescient observation. The era of colonialism was over, he wrote, but “it is vain to condemn several centuries of European expansion, absurd to include in the same curse Christopher Columbus and [French colonial administrator Hubert] Lyautey.”

Landrieu’s speech was intended for a local audience. It went viral because it did something remarkable and much-needed. It embraced the messiness of history. It made a place for everyone (even George W. Bush). And it acknowledged the importance of stories. “If presented with the opportunity to build monuments that told our story or to curate these particular spaces, would these monuments be what we want the world to see?” asked Landrieu. “Is this really our story?” We choose the stories that define us. And right now America is in great need of new ones.

This is one reason why I believe Donald Trump should not be the Republican presidential nominee in 2024, even though I would vote for Trump over any Democrat. In addition to the fact that Trump (or Biden, if God forbid he is reelected) would become an instant lame duck, Trump promotes fear, not the aspiration to be better than you are. Democrats are wrong about policy, but Trump is no Ronald Reagan.

The president is not your messiah

Jonah Goldberg explores what I consider to be the most deplorable feature (out of a list as long as the Mississippi River) of today’s Democratic and Republican parties:

In 1970, Richard Nixon nominated G. Harrold Carswell to fill Abe Fortas’ seat on the Supreme Court. Critics charged that Carswell was a decidedly mediocre jurist. Sen. Roman Hruska’s defense of Carswell and the nomination is considered a minor classic in political spin. In a TV interview, he said, “Even if he were mediocre, there are a lot of mediocre judges and people and lawyers. They are entitled to a little representation, aren’t they? We can’t have all Brandeises and Frankfurters and Cardozos.”

I like this anecdote for a bunch of reasons. Hruska was a good man and he had a perfectly respectable—at times even laudatory—political career. This episode is the only thing he’s remembered for by those other than his friends and family and some Nebraska political junkies. It got ample space in his obituaries, and it’s a good cautionary tale about how small slips of the tongue can end up defining you.

But what I really like about this story is how it mangles a way of thinking about representation. There’s a category error buried in it.

I don’t like the Stanford Encyclopedia of Philosophy’s entry on category errors because it reduces them to “infelicitous” statements. On the other hand, I do like its examples of the infelicity of category errors: “The number two is blue,” “The theory of relativity is eating breakfast,” or “Green ideas sleep furiously.”

I love statements like that because they expose how language can become “visible” to our brains when it makes connections between things we don’t expect to be connected. For instance, there are a bunch of versions of the following joke:

Q: What is the difference between an orange?

A: A pencil. Because a vest has no sleeves.

If you laugh at this, it’s because your brain can’t make sense of it, so you enjoy the absurdity. And I think part of that enjoyment stems from the recognition of how language drives how we think about stuff. We like to think language is bound up with rationality. The words we use align with reality, and reality is governed by reason in some fundamental sense: 2+2 = 4 because when I take two rocks and add two more rocks, I get four rocks. But language doesn’t have to be bound by reason. I can say “two plus two equals a duck,” but, so far, reality can’t make that happen. In other words, language can put distance between the world and our brains.

A more reliable form of humor points out connections between things we either don’t see or thought we were the only ones to notice. A whole branch of comedy boils down to “Did you ever notice … ?” These jokes work because they confirm pre-rational intuitions or make irrational connections between things like cause and effect. Don’t believe me? Pull my finger and I’ll prove it to you.

Anyway, the reason I don’t like reducing category errors to merely absurd statements is that I think category errors are the bane of politics. Everyone recognizes that “the theory of relativity is breakfast” is nonsense. But when Chris Rock said Barack Obama was the “dad of the country,” lots of very smart people nodded. Of course, lots of conservatives rolled their eyes, but not out of rejection of a category error. Partisan animosity did most of the work getting those eyes to roll. Likewise, when supporters of Trump—or Reagan or Eisenhower or whomever—made similar statements, partisan opponents rolled their eyes. The idea that the president is the father of the “American family” is a bit of political boilerplate going back to George Washington. But at least Washington’s claim to that metaphorical title depended on the act of creating the country in the first place.

But the idea that the president is akin to a parent is a category error. The president is not my boss. He’s definitely not my father. He has no power, moral or legal, to tell me how to live my life beyond the very limited power of persuasion and a few contestable and narrow emergency powers. My dad could tell me to give my seat to a lady on the bus, and he did it many times. The president can’t.

The “body politic”—corpus politicum—is one of the most fraught category errors in history. It was tolerable as a mystical medieval metaphor, but in the 19th and 20th century, intellectuals grabbed all sorts of pseudo-scientific nonsense off the shelf and argued that nation states were organic entities. Herbert Croly, one of the co-founders of The New Republic, said society was just “an enlarged individual.” Edward Alsworth Ross, arguably the most influential sociologist of his day, believed society is “a living thing, actuated, like all the higher creatures, by the instinct for self-preservation.” When Woodrow Wilson rejected the system of “checks and balances” inherent to the Constitution, it was in service to these ideas. He rejected the vision of the Founders as naively Newtonian rather than Darwinian. “The trouble with the [Founders’] theory,” Wilson wrote, “is that government is not a machine, but a living thing. It falls, not under the theory of the universe, but under the theory of organic life. It is accountable to Darwin, not to Newton. It is modified by its environment, necessitated by its tasks, shaped to its functions by the sheer pressure of life. No living thing can have its organs offset against each other, as checks, and live.”

Wilson was wrong in every regard. Government is a machine in the sense that it is technology, a manufactured system designed for specific purposes. It is not in any way a living thing bound by the theory of organic life. Checks and balances work precisely because Congress isn’t like a spleen and the judiciary isn’t like a liver. Moreover, I’m not entirely sure that our organs don’t work “against” each other in a checks-and-balancey kind of way insofar as various organs regulate each other. But I could be wrong about that.

Nazis were obsessed with the idea that the Aryan nation was an organic entity, and that idea gave them permission to see other groups as “parasites.”

Now, some stickler might object to what I’m talking about by arguing that these theories were just bad metaphors and analogies. And that’s fine. But when we don’t consciously recognize that an idea is merely metaphorical—never mind a bad metaphor—we take it to be literal, or close enough to literal to act as if it were.

You could say category errors we like are just called metaphors or analogies. It’s sort of like “censorship.” Pretty much everyone is in favor of censorship, but we only use the word censorship for the kinds of censorship we don’t like. I used to have great fun arguing with libertarians of the right and left about this. They’d say something like, “I’m against all forms of censorship.” And I’d respond, Socratically, “So you think it’s fine for TV networks to replace Saturday morning cartoons with mock snuff films or simulated child pornography?” (I have to insert the “mock” and “simulated” qualifiers to avoid clever “but real snuff films and child pornography are illegal” rejoinders). Eventually, most would end up arguing that censoring that stuff isn’t really censorship, it’s just responsible programming or some other euphemism. Naw, it’s censorship, and I’m fine with that.

Similarly, with metaphors and analogies, if you don’t regularly push back or poke holes in them, people come to accept them as descriptors of reality.

I had no idea I’d be spelunking down this rabbit hole. I planned on writing about the problems with our elites, but I’ll save that for another time. Like the runza peddler said at the Cornhusker game, let’s just circle back to Sen. Hruska.

The other thing I love about Hruska’s representation-for-mediocrities argument is that it mangles the concept of representation. On the surface it kind of makes sense, like an intellectual Potemkin village. For starters, the Supreme Court is not a representative body—or at least it’s not supposed to be. Forget the identity politics arguments about how the court is improved by, say, the presence of a “wise Latina” in ways that it wouldn’t be improved by a “wise Nordic.” Why not put plumbers or electricians on the court? Don’t they deserve representation, too? Although the court has always been top-heavy with Pale Penis People, it’s been utterly monopolized by lawyers.

It’s sort of like the term diversity. Everyone likes to say they’re in favor of diversity, but diversity—much like censorship—is very narrowly defined. We don’t think the NBA would be improved if there was a quota to get more one-legged players or blind people on the court. When I talk to my financial adviser about “diversifying my portfolio,” I never say, “Make sure there’s a healthy balance between good investments and bad investments.” A “balanced diet” doesn’t have a lot of strychnine or razor blades in it.

The idea that the court would be improved by mediocrity takes the familiar political logic of representation and exposes how it can take us in ridiculous directions if we don’t recognize its limitations. It’s funny precisely because it exposes how serious ideas can suddenly become silly by grabbing something from the wrong category and shoving it where it doesn’t belong.

There’s an unwritten rule not to verbalize such things. But a lot of the dysfunction in our politics is Hruskian in reality: Lots of people are fine with mediocrities representing them as long as they “represent” their team. Hruska supported Carswell because he was Nixon’s pick and Nixon deserved a win. Run through the list of politicians garnering passionate support from partisans. Some are smart, many are dumb. Some know how to do their jobs, many don’t have the first clue how policy is made or legislating is done. But the important question is: How often does intelligence or competence even enter into it?

As with diversity and censorship, representation is a broad category that we narrow down in reality—certain kinds of diversity, specific forms of censorship. If we understood representation in its broadest, most categorical sense, Congress should reflect a broad cross section of Americans that would include everything from morons to geniuses, violent criminals to pacifists, physicists to spoken-word poets. But we understand that the filter has to be set with a narrower screen.

The problem is that we have the filter on the wrong settings. If I want to hire an electrician, I might consider all sorts of factors: price, recommendations, availability, etc. But the indispensable qualification would be expertise. I would immediately rule out all people who aren’t electricians. In other words, can they do the job?

Marjorie Taylor Greene—to take a very easy example—is an ignoramus. She doesn’t understand the job she was elected to, but even if she did, she couldn’t do it because she’s not on any committees (because she’s also a bigoted loon). But Republican voters just renominated her, presumably on the grounds that what Congress needs is representation of bigoted lunacy and performative jackassery.

Most other politicians aren’t elected for such ludicrous reasons. But many of them are elected to perform and entertain in ways that have nothing to do with the job itself. Alexandria Ocasio-Cortez is no fool and she has an adequate academic grasp of the job, but she’s also among the least effective members of Congress. She’d have to step up her game to be a mediocre legislator if effective legislating determined the bulk of her grade. But it doesn’t for her voters, or for the media that lavishes attention and praise on her.

When it comes to hiring a politician, there are a bunch of things that can or should be on the checklist: ideological agreement, good character, patriotism, a good work ethic, a record of success, etc. You can even include things like religion, height, attractiveness, or odor. This is a democracy after all, and people can vote for whatever reason they want. But one of the things that should be non-negotiable—not as a matter of law, but as a matter of civic hygiene—is the candidate’s ability to do the job.

But for a lot of voters, the job description has been rewritten without even a minute of debate or discussion. Do they hate the other guys enough? Are they entertaining? Are they angry enough? Are they loyal to my team?

No wonder so few can do the actual job. That’s not what they were hired for.

An American musical icon

The Spy Command:

John Williams told The Associated Press earlier this month that his score for Indiana Jones 5 may be his final movie work.

“I don’t want to be seen as categorically eliminating any activity,” the 90-year-old composer told AP. But a Star Wars score, he said, is a six-month commitment, and “at this point in life is a long commitment to me.”

Williams is known mostly for his film scores, which include 51 Oscar nominations beginning in the 1960s for scores and songs. Williams was the composer of choice for director Steven Spielberg, a collaboration that lasted decades.

However, once upon a time, Williams was known as Johnny Williams and his work was all over television in the 1950s and 1960s.

Williams played piano on the Peter Gunn theme for Henry Mancini. Williams also played as a musician in film scores such as The Magnificent Seven, Sweet Smell of Success and To Kill a Mockingbird, he recalled in a 2002 tribute to composer Elmer Bernstein.

Williams was hired in 1958 by Stanley Wilson, music supervisor for Revue television (later Universal), to score episodes of M Squad, a police drama starring Lee Marvin. At that point, the composer was billed as John T. Williams Jr.

Wilson evidently liked the results and kept bringing Williams back for work. One of Williams’ jobs for Revue was writing the theme for Checkmate, a 1960-62 series created by Eric Ambler.

Checkmate concerned the exploits of two private eyes (Anthony George and Doug McClure) assisted by an academic (Sebastian Cabot). Williams was now billed as Johnny Williams.

Williams also did the theme for (and scored some episodes of) a Revue anthology show, Kraft Suspense Theater. One of the installments he scored, Once Upon a Savage Night, was a particularly tense story about the search by Chicago authorities for a psychopathic killer (Philip Abbott).

https://www.youtube.com/watch?v=mwZcD5gEss0

One of the victims in the aforementioned “Kraft Suspense Theatre” episode was from Madison, Wis., according to investigator Ted Knight.

In his TV days, Williams was versatile. His credits included the odd sitcom, such as the unaired pilot (plus additional episodes) of Gilligan’s Island as well as the theme for The Tammy Grimes Show, a quickly canceled program in the 1966-67 season.

Producer Irwin Allen brought in Williams to work on series such as Lost in Space and The Time Tunnel, which credited Johnny Williams for their themes.

Johnny Williams even showed up on camera in the first episode of Johnny Staccato, a 1959 series starring John Cassavetes and made at Revue. Williams, clean-shaven and with hair, played a jazz pianist. He was listed in the cast as Johnny Williams.

The Johnny Williams era drew to a close by the late 1960s. His credit for the theme of Irwin Allen’s Land of the Giants series listed the composer as John Williams. For Williams, the best was yet to come.

William Jefferson Reagan

William Galston:

During the past decade, a critique of neoliberalism has become widespread in the progressive wing of the Democratic Party. During the 1970s, the argument goes, many Democrats espoused the pro-market, antigovernment views long associated with opposition to the New Deal and the modern welfare state. In the name of efficiency, growth and lower prices, the Carter administration deregulated airlines, trucking and other sectors. The Clinton administration espoused free trade and the unfettered flow of capital across national boundaries. In response to the Great Recession, President Obama’s economic advisers focused on the health of giant banks and tolerated a grindingly slow recovery.

The problem, critics allege, is that these policies ignore disadvantaged Americans who do not benefit from broad market-driven policies. Markets, they say, are indifferent to equitable outcomes. The focus on aggregate growth comes at the expense of fairness, which requires benefits and opportunities targeted to marginalized groups. Through regulations and wealth transfers, government must lean against markets to achieve acceptable results.

In this narrative of the past half-century, critics often mark the Clinton administration as the moment when establishment Democrats capitulated to the ideology of the unfettered market. Poor and working-class Americans paid the price, they charge, with lower pay, diminished job security, and the collapse of entire sectors exposed to trade competition.

The historical record tells a different story.

Begin with the economic aggregates. During eight years of the Clinton administration, annual real growth in gross domestic product averaged a robust 3.8% while inflation was restrained, averaging 2.6%. Payrolls increased by 22.9 million—nearly 239,000 a month, the fastest on record for a two-term presidency. (Monthly job growth during the Reagan administration averaged 168,000.) Unemployment fell from 7.3% in January 1993 to 3.8% in April 2000 before rising slightly to 4.2% at the end of President Clinton’s second term. Adjusted for inflation, real median household income rose by 13.9%.

Mr. Clinton inherited a substantial budget deficit. Despite this, one group of administration officials, headed by Labor Secretary Robert Reich, urged him to propose a major stimulus package to accelerate economic growth and reduce unemployment more quickly. He refused, focusing instead on reducing inflation and interest rates to create the conditions for long-term growth. (I worked in the White House at the time but had no role in economic policy.) During the administration, federal spending as a share of GDP fell from 21.2% to 17.5%, and federal debt as a share of GDP fell from 61.4% to 54.9%.

What about the North American Free Trade Agreement, which Mr. Clinton pushed through Congress over the objections of a majority of his own party in the House? Didn’t it eviscerate the manufacturing sector? No doubt the agreement reduced jobs in some areas, but manufacturing jobs increased during Mr. Clinton’s eight years. The collapse occurred during George W. Bush’s administration, when 4.5 million manufacturing jobs disappeared and have never been regained. (Manufacturing employment in April 2022 is about where it was when Mr. Bush left office 13 years ago.)

What about the poor? The poverty rate declined during the Clinton administration by nearly one quarter, from 15.1% to 11.3%, near its historic low. And it declined even faster among minorities—by 8.1 percentage points for Hispanics and 10.9 points for blacks.

What about the distribution of gains from economic growth? Income gains for working-class households equaled the national average, and gains for the working poor rose even faster. White households gained an average of 13.9%, but minorities gained even more: 22.0% for Hispanics and 31.5% for blacks.

In sum, during the heyday of neoliberalism, Americans weren’t forced to choose between high growth and low inflation or between aggregate growth and fairness for the poor, working class and minorities. This helps explain why Mr. Clinton’s job approval stood at 65% when he left office.

We can’t go back to the 1990s, but there are lessons from the past. Deregulation can go too far, but so can regulation. The market doesn’t automatically produce acceptable results for society, but neither does government. In these and other respects, policy makers need to find a reasonable balance, the location of which depends on ever-changing circumstances. No algorithm can substitute for good judgment guided by study and common sense.

In our effort to respond to the pandemic generously and humanely, we lost our balance. We have learned the hard way that demand doesn’t automatically create its own supply and that bad things happen when too much money chases too few goods. As we struggle to regain equilibrium, the critics of neoliberalism have much to learn from an administration whose economic performance will be hard to beat.

Carter and Biden

The U.S. is now plagued with the highest inflation rate since Jimmy Carter was president, which corresponded with this country’s second energy crisis.

But those aren’t the only similarities between Carter and Biden, as Jonah Goldberg notes:

In his remarks on inflation, the president laid out a series of concrete measures he was undertaking to curb inflation. But, he cautioned, “it is a myth that the government itself can stop inflation. Success or failure in this overall effort will be largely determined by the actions of the private sector of our economy.” A bit further on he proclaimed:

“No act of Congress, no program of our government, no order of mine as president can bring out the quality that we need: to change from the preoccupation with self that can cripple our national will, to a willingness to acknowledge and to sacrifice for the common good.”

In other words, inflation and our other economic woes were downstream of deeper cultural problems with the country.

You shouldn’t feel guilty for missing these remarks or angry at the media for not reporting on them, because the current president didn’t say any of this. These remarks were delivered 44 years ago by President Jimmy Carter to the American Society of Newspaper Editors.

In fairness to Carter, he did offer a number of serious proposals—whether they were all wise or adequate is a debate for another time. He imposed a cap on government worker wages and asked the private sector to do likewise. He sensibly ordered a review of regulations that had the effect of driving up prices. “I’m determined to eliminate unnecessary regulations and to ensure that future regulations do not impose unnecessary costs to the American economy,” he said.

But this idea that inflation was the product of selfishness and widespread moral failings of the American people is what really sticks out. Carter, who famously admonished Americans a year earlier to treat the energy crisis as “the moral equivalent of war,” was a consummate moralist who liked to reduce technical problems to moral failures. He told the newspaper editors, “The problems of this generation are, in a way, more difficult than those of a generation before. We face no sharply focused crisis or threat which might make us forget our differences and rally to the defense of the common good.” A year later, in his even more famous “malaise speech” (which never used the word “malaise”), Carter said, “all the legislation in the world can’t fix what’s wrong with America. What is lacking is confidence and a sense of community.”

There are a lot of similarities between the Carter presidency and the Biden presidency. Both—so far—have been rocked by any number of crises, many of which were not entirely of their own making, but were nonetheless beyond their ability to get control of. Both Biden and Carter won the presidency in large part because voters wanted to rebuke a previous Republican president. (Yes, Gerald Ford was Nixon’s immediate replacement. But were it not for Nixon’s abuse of his office, Ford wouldn’t have been a placeholder and Carter wouldn’t have won.) And both shared the belief that America’s problems were the result of a breakdown in team spirit and selfishness.

The biggest difference between their approach to leadership lay in style. Despite his conviction that America’s problems boiled down to a lack of esprit de corps, Carter was no cheerleader. He was more like an exhausted youth pastor who didn’t know how to talk to young people but thought he knew exactly what was wrong with these damn kids. Biden, meanwhile, often sounds like a high school yearbook editor fighting the lazy senioritis of his staff—“Come on, everybody, if we all work our hardest and come together, we can make this the greatest yearbook ever!” I’ve lost count of how many times he’s said some version of, “If we come together there’s nothing we cannot do,” or, “We have never, ever, ever failed in America when we have acted together.”

Now, longtime readers will not be shocked to learn I think this is profoundly wrong. Unity is a tool, and like any tool it can be used for good or for ill. Lots of terrible things are done under the flag of unity, including war, genocide, lynching, and repression generally. To use a contemporary example, let’s say you passionately believe in a constitutional right to abortion. Do you think that right should evaporate if a large majority of Americans are united in their belief that no such right exists? Even in a democracy, unity alone isn’t always a persuasive argument.

But that’s a familiar refrain of mine. More basically, tools are only good for solving problems they are suited to solve. Screwdrivers are pretty useless for chopping wood and the best scalpels are worse than the crudest rock for pounding nails. And sometimes, the wrong tool is worse than no tool at all.

“We’ve got to amputate his arm!”

“All we’ve got is a mallet!”

“I’ll make it work!”

In Biden’s remarks this week he said:

“My plan is to lower … everyday costs for hardworking families and lower the deficit by asking large corporations and the wealthiest Americans to not engage in price gouging and to pay their fair share in taxes.

The Republican plan is to increase taxes on the middle-class families and let billionaires and large companies off the hook as they raise profit and — raise prices and reap profits at record number — record amounts.

And it’s really that simple.”

Now, I have the pundit’s obligation to note that this is not “the Republican plan.” It’s Sen. Rick Scott’s politically barmy trial balloon that crashed like so much blue ice accidentally jettisoned from a commercial jet. Mitch McConnell’s “plan” is very similar to Michael Corleone’s offer to Sen. Geary: “Nothing.” Like it or not, McConnell’s well-developed attitude is that when your political opponent is smashing himself in the groin, the last thing you want to do is say, “Can I borrow your hammer?”

But the interesting part is that Biden thinks inflation is being driven by corporate greed and “price gouging.” He’s not alone. Elizabeth Warren has introduced economically illiterate legislation to empower the Federal Trade Commission to punish companies guilty of charging “unconscionably excessive” prices. What are “unconscionably excessive” prices? The legislation doesn’t say. She simply trusts that her emissaries on the FTC will know them when they see them. Firms will be “presumed to be in violation” if they use “the effects or circumstances related to the exceptional market shock as a pretext to increase prices.”

“What’s an ‘exceptional market shock?’” you ask. “Any change or imminently threatened (as determined under guidance issued by the Commission) change in the market for a good or service resulting from a natural disaster, failure or shortage of electric power or other source of energy, strike, civil disorder, war, military action, national or local emergency, public health emergency, or any other cause of an atypical disruption in such market.”

With a precise definition like that, what could possibly go wrong?

In short, the FTC would be the economic conscience of the nation and its commissioners would have free rein to let their conscience be their guide. In effect, obscene profits would be determined by Potter Stewart’s standard for obscenity, “I know it when I see it.” The problem is a market system that punishes companies when their investments pay off but doesn’t compensate them when they don’t (or vice versa) isn’t a market system. It’s a form of bureaucratic autocracy.

Now, it’s absolutely true that, say, oil companies are making big profits right now. But in 2020, they suffered what some might call unconscionable losses. ExxonMobil lost $22 billion in 2020 alone. And yet, the people now denouncing Big Oil’s greed didn’t congratulate Big Oil’s “generosity” when it was losing money.

As I’ve written a bunch, the concept of “institutional racism” was invented to explain how undesirable racial outcomes could manifest themselves even when no human actors had racist intent. I think this is a perfectly fine intellectual endeavor so long as those engaging in it hold fast to the part about intent. The problem is that humans aren’t wired that way. The crusaders against “institutional racism” usually can’t help but accuse those who dissent from their analysis as racist. I think it’s because the same part of the brain that drives us toward conspiracy theories needs to assign agency to bad things. For instance, I have no idea whether J.D. Vance is a conspiracy theorist or just plays one for electoral advantage. But the suggestion that Biden is intentionally inviting fentanyl into our country to kill “MAGA voters” is dangerous nonsense.

Similarly, Marxism is mostly garbage as economic theory, but it’s really useful as an illustration of how people can talk a good game about systemic problems—class structure, misallocation of capital, etc.—but invariably get seduced into morality tales about villains and victims. Marx’s labor theory of value was, again, garbage as economic analysis, but man was it awesome for demonizing money-lenders, industrialists, and the bourgeoisie under the rubric of “science.”

We see a version of this kind of thinking all over the place. According to Warren, Kroger, which has very low profit margins, is a monopoly-like malefactor responsible for soaring food prices that must be punished.

There’s this weird irony in progressive rhetoric about corporations. They despise the idea that “corporations are people” but they are the first to anthropomorphize corporations, assigning sinister motives to them. Multifactorial dynamics are reduced to voluntary evil choices.

I don’t want to be accused of perpetuating the naturalistic fallacy, in part because I don’t think the free market is particularly natural. But we all understand that when wolves eat deer, they’re just playing their part in the ecosystem. We don’t denounce “lupine greed” or the depredations of Big Wolf. And when wolves starve because the supply of prey is inadequate for their population, we don’t decry the selfishness of ungulates who run away from the hungry canines. But when it comes to the market system, we routinely assign moral intent and corrupt agency to complicated systemic phenomena.

The baby formula shortage, for instance, is very bad, but it’s not an evil scheme. Scott Lincicome is of course correct that protectionism and certain regulations are partly to blame for this crisis and other supply chain woes. But even Scott, who hates protectionism with the same intensity my old basset hound Norman had for that gray poodle, doesn’t insinuate that protectionists want babies to go hungry.

The projection of simplistic moral categories onto the complicated workings of markets is not always absurd, but it usually is. It’s best understood as our tribal brains rebelling against what we don’t understand.

I know I’m running longer than a Steve Schmidt Twitter vendetta, but I want to make one last point. Last year, Rick Perlstein wrote a much discussed—and mocked—essay arguing that contemporary concerns over inflation are silly. Concern about the possible inflationary effect of Biden’s massive spending proposals, Perlstein wrote last fall, “makes no sense, and no liberal should take it seriously — let alone be seduced by it into balking over Biden’s spending plans.”

But that wasn’t the controversial part (most liberals, including at the White House, were saying similar stuff). Perlstein also argued that the inflation of the 1970s really had little to nothing to do with, well, inflation. He wrote:

The conclusion I’ve drawn is that this was a form of moral panic. The 1970s was when the social transformations of the 1960s worked their way into the mainstream. “Inflation spiraling out of control” was a way of talking about how more permissiveness, more profligacy, more individual freedom, more sexual freedom had sent society spiraling out of control. ‘Discipline’ from the top down was a fantasy about how to make all the madness stop.

I thought that was ridiculous at the time, and I still think it’s substantially wrong. Americans justifiably cared a lot about inflation and the cost of living, and it’s just very weird that a historian would deny that. Here’s a passage from Robert Samuelson, writing in 2009:

“Since 1935, the Gallup Poll has regularly asked respondents, ‘What do you think is the most important problem facing the country today?’ In the nine years from 1973 to 1981, ‘the high cost of living’ ranked No. 1 every year. In some surveys, an astounding 70 percent of the respondents cited it as the major problem. In 1971 it was second behind Vietnam; in 1972 it faded only because wage and price controls artificially and temporarily kept prices in check. In 1982 and 1983, it was second behind unemployment (and not coincidentally: the high joblessness stemmed from a savage recession caused by inflation).”

For Perlstein’s thesis to be correct, not only was Ronald Reagan’s and Paul Volcker’s heroic (and politically risky) effort to wrench inflation out of the economy a mere sideshow, but America’s “moral panic” over progressive change just happened to end around the same time they succeeded. That’s hardly how progressives described Reagan’s America at the time.

That said, I think there’s a point to Perlstein’s argument, just not the one he intended. He got the causality backward. Inflation is panic inducing. It makes people feel like things are out of control, that their leaders are in over their heads, and that their economic future is imperiled. And when you’re all jangly with fear and the sense that powerful forces are buffeting you, you’re more likely to be freaked out by other stuff, too. In other words, the economic panic of the 1970s made moral panics generally feel more justified. The inflation of Weimar Germany (far worse than our current predicament, I should note) made Germans susceptible to other forms of panic. When droughts or other calamities afflicted our ancestors, all sorts of moral panics followed.

Obviously, in politics nothing happens in a vacuum. In the 1970s the very legitimate fear of crime was unsettling, too. The unease caused by skyrocketing crime surely fueled unease about the cost of living and vice versa, particularly among those who felt trapped in neighborhoods they couldn’t afford to move out of. And in that context, it’s surely plausible that middle class anxieties about everything from feminism, to Vietnam, to racial discord, to those damn hippies were made worse by inflation—and vice versa. But that doesn’t change the fact that inflation was a real thing, not some metaphorical catchall for conservative bourgeois reaction.

The polarization and hysteria of the last decade no doubt makes inflation feel even worse. The fact that Biden seems not just inadequate for the job but incapable of describing the problems he faces undoubtedly makes people more anxious about inflation. Similarly, Donald Trump’s inability to talk about the pandemic as something other than a conspiracy against him, a hoax, or a boffo ratings opportunity made people even more anxious about COVID. Leadership—and the lack of it—matters. And we’ve had crappy leadership for quite a while now.

Ronald Reagan said in 1980 that a recession (toward which we are now halfway) is when your neighbor loses his job, a depression is when you lose your job, and recovery is when Carter loses his job. Replace “Carter” with “Biden” and everyone with a D after their names, and the same applies.

 

A constitutional lesson for those who need it

For those who think Roe v. Wade and Planned Parenthood v. Casey must not be overturned by the U.S. Supreme Court because they’re settled law, the National Constitution Center presents a list of Supreme Court decisions that were overturned by the Supreme Court:

In 1992, an opinion from three justices in the Casey decision reinforced the role of stare decisis, or precedent, in the court’s proceedings. “After considering the fundamental constitutional questions resolved by Roe, principles of institutional integrity, and the rule of stare decisis, we are led to conclude this: the essential holding of Roe v. Wade should be retained and once again reaffirmed,” wrote Sandra Day O’Connor, Anthony Kennedy and David Souter.

However, the court doesn’t always follow its precedents. In 1932, Justice Louis Brandeis explained stare decisis in his dissent in Burnet v. Coronado Oil & Gas Co.  “Stare decisis is usually the wise policy, because in most matters it is more important that the applicable rule of law be settled than that it be settled right,” Brandeis wrote. “But in cases involving the Federal Constitution, where correction through legislative action is practically impossible, this Court has often overruled its earlier decisions.”

The Library of Congress tracks the historic list of overruled Supreme Court cases in its report, The Constitution Annotated. As of 2020, the court had overruled its own precedents in an estimated 232 cases since 1810, says the library. To be sure, that list could be subject to interpretation, since it includes the Korematsu case from 1944, which justices have repudiated but never formally overturned. But among scholars, there are a handful of cases seen as true landmark decisions that overturned other precedents.

Here is a short list of those landmark cases, as reported by the Congressional Research Service and Library of Congress:

West Coast Hotel Company v. Parrish (1937). In a 5-4 decision, the Hughes court overturned a decision from the previous year, now stating that the establishment of minimum wages for women was constitutional. The decision was seen as ending the court’s Lochner era.

West Virginia State Board of Education v. Barnette (1943). In a 6-to-3 decision, the Court overruled Minersville School District v. Gobitis (1940). Justice Robert Jackson’s majority opinion affirmed that forcing public school students to salute the American flag was unconstitutional. “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion, or force citizens to confess by word or act their faith therein,” Jackson famously wrote.

Brown v. Board of Education of Topeka (1954). A unanimous Warren Court decided that a separate but equal policy of educational facilities for racial minorities, consistent with Plessy v. Ferguson (1896), violated the 14th Amendment’s Equal Protection Clause.

Mapp v. Ohio (1961). Overruling Wolf v. Colorado (1949), the court said in a 6-3 decision that evidence gathered by authorities through searches and seizures that violated the Fourth Amendment could not be presented in a state court—otherwise known as the “exclusionary rule.”

Gideon v. Wainwright (1963). Justice Hugo Black’s unanimous opinion invalidated Betts v. Brady (1942) and required state courts to appoint attorneys for defendants who cannot afford to retain lawyers on their own.

Miranda v. Arizona (1966). In a 5-4 opinion, Chief Justice Earl Warren concluded that police violated Ernesto Miranda’s rights by not informing Miranda that he could remain silent and also ask for an attorney during interrogations. The ruling invalidated two 1958 decisions: Crooker v. California and Cicenia v. Lagay.

Katz v. United States (1967). In a 7-1 decision (Justice Thurgood Marshall did not take part in the case), the court determined that a man in a phone booth could not be wiretapped by authorities without a warrant from a judge. The decision overturned two prior Supreme Court decisions: Olmstead v. United States (1928) and Goldman v. United States (1942).

Brandenburg v. Ohio (1969). The court decided that Ohio’s criminal syndicalism law, barring public speech calling for illegal activities, was unconstitutional on First and 14th Amendment grounds unless the speech incited “imminent lawless action.” The decision overruled Whitney v. California (1927).

Gregg v. Georgia (1976). In a 7-2 decision from Potter Stewart, the court ruled that Georgia’s capital punishment laws didn’t violate the Eighth and 14th Amendments’ prohibitions on cruel and unusual punishment. The court invalidated McGautha v. California (1971), a prior death-penalty case.

Planned Parenthood of Southeastern Pennsylvania v. Casey (1992). A divided court invalidated parts of two prior decisions, Thornburgh and Akron I, as inconsistent with Roe v. Wade.

Atkins v. Virginia (2002). The Supreme Court held that executions of intellectually challenged criminals were “cruel and unusual punishments” barred by the Eighth Amendment. The decision overturned Penry v. Lynaugh (1989).

Lawrence v. Texas (2003). Justice Anthony M. Kennedy, in a 6-3 ruling, cited the Due Process Clause and invalidated a Texas law making it a crime for two persons of the same sex to engage in sexual conduct. The decision overturned Bowers v. Hardwick (1986).

Citizens United v. FEC (2010). In a 5-4 decision, Justice Anthony M. Kennedy wrote for the majority that the First Amendment did not permit the government to ban corporate funding of independent political broadcasts during election cycles. The decision overturned Austin v. Michigan Chamber of Commerce (1990) and parts of McConnell v. FEC (2003).

Obergefell v. Hodges (2015). In a 5-4 opinion, Justice Kennedy said the 14th Amendment’s Due Process Clause guaranteed the right to marry as a fundamental liberty that applied to couples regardless of their sex. The decision overruled a one-sentence ruling in Baker v. Nelson (1972).

South Dakota v. Wayfair (2018). In another 5-4 decision from Justice Kennedy, the court said sellers who engage in significant business within a state may be required to collect sales taxes for that state, even if the business does not have a physical presence in the taxing state. The ruling overturned Quill Corp. v. North Dakota (1992).

Janus v. American Federation of State, County, and Municipal Employees (2018). In a 5-4 opinion from Justice Samuel Alito, the court said the state of Illinois violated the First Amendment by extracting agency fees from nonconsenting public-sector employees. The decision overturned Abood v. Detroit Bd. of Education (1977).

Anyone think “separate but equal” is acceptable?

 

The “S” stood for “Scam”

Today in 1945:

There is a myth about Harry S. Truman as the last common-man president. Jeff Jacoby punctures that myth:

When the Washington Post reported in 2007 that Bill Clinton had pocketed nearly $40 million in speaking fees since leaving the White House six years earlier, I wrote a column regretting that yet another former chief executive had proved so eager “to leverage the prestige of the presidency for big bucks.”

Well, not every former chief executive. While Clinton followed in the footsteps of George H.W. Bush, Ronald Reagan, Jimmy Carter, and Gerald Ford, one president had been different. Citing historian David McCullough, I noted that Harry Truman had left the White House in such straitened circumstances that he and his wife Bess needed a bank loan to pay their bills.

Nonetheless, Truman insisted he would not cash in on his name and influence. As McCullough recounted in his Pulitzer Prize-winning biography, Truman’s only intention “was to do nothing — accept no position, lend his name to no organization or transaction — that would exploit or commercialize the prestige and dignity of the office of the President.”

Moved by Truman’s seeming financial distress, Congress eventually passed the Former Presidents Act, which provides former presidents with a lavish pension, furnished offices, and lucrative staff and travel allowances.

There’s just one thing wrong with that oft-repeated narrative about Truman’s economic desperation. According to law professor and journalist Paul Campos, it is completely false. In a bombshell article in New York Magazine, Campos shows that Truman lied shamelessly and repeatedly about the state of his finances in order to guilt-trip Congress into passing the Former Presidents Act, which would provide him with taxpayer-funded benefits for which he had no need.

Campos’s findings are jaw-dropping. The story of Truman’s post-presidential penury has long been taken as undoubted fact, and his self-denying refusal to trade on his public legacy for private gain doubtless contributed to his dramatic rebound in public esteem. Truman left the White House with the lowest approval rating in modern presidential history; today he is ranked among the best presidents ever.

But Campos brings the receipts. With the cooperation of the Harry S. Truman Presidential Library, he spent months examining the 33rd president’s financial records, many of which became available only with the 2011 release of Bess Truman’s personal papers.

“Harry Truman was a very rich man on the day he left the White House,” writes Campos, “and he became a good deal richer in the five and a half years between that day and the passage of the FPA.”

The evidence for those jolting assertions comes from none other than Truman himself. In a will drafted in his own hand and kept with Bess Truman’s papers, Truman estimated that his net worth at the end of his presidency was $650,000 — a sum comprising $250,000 in savings bonds, $150,000 in cash, and land worth an estimated $250,000. Adjusted for inflation, $650,000 in 1953 is the equivalent of $6.6 million in 2021.

Far from being one step from the poorhouse on his return to private life, Campos writes, Truman’s own private calculations show that his income was among the top 1 percent of American households. Which, in hindsight, makes sense: As president, he received one of the most generous salaries in America — in 1949, presidential pay was raised to $100,000 annually, an amount worth more than $1.1 million today.
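For readers who want to check that inflation math themselves, here is a minimal sketch in Python. The CPI-U annual averages below are approximate figures I am assuming (roughly in line with published BLS data), not numbers taken from Campos’s article, so treat the output as a rough cross-check rather than an exact restatement of his calculations.

# Rough cross-check of the inflation adjustments quoted above.
# The index values are assumed approximate CPI-U annual averages, not figures from the article.
CPI = {1949: 23.8, 1953: 26.7, 2021: 271.0}

def adjust(amount: float, from_year: int, to_year: int) -> float:
    # Scale a dollar amount by the ratio of the two years' price indexes.
    return amount * CPI[to_year] / CPI[from_year]

# Truman's self-estimated net worth at the end of his presidency:
print(f"$650,000 in 1953 is roughly ${adjust(650_000, 1953, 2021):,.0f} in 2021 dollars")
# The presidential salary set in 1949:
print(f"$100,000 in 1949 is roughly ${adjust(100_000, 1949, 2021):,.0f} in 2021 dollars")

Under those assumed index values the script prints roughly $6.6 million and $1.14 million, which is consistent with the figures Jacoby quotes.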

9/11

Sept. 11, 2001 started out as a beautiful day in Wisconsin, New York City and Washington, D.C.

I remember almost everything about the entire day. Sept. 11, 2001 is to my generation what Nov. 22, 1963 was to my parents and Dec. 7, 1941 was to my grandparents.

I had dropped off our oldest son at Ripon Children’s Learning Center. As I was coming out, the mother of one of his group told me to find a good radio station; she had heard as she was getting out with her son that a plane had hit the World Trade Center.

I got in my car and turned it on in time to hear, seemingly live, a plane hit the WTC. But it wasn’t the first plane; it was the second plane hitting the other tower.

As you can imagine, my drive to Fond du Lac took unusually long that day. I tried to call Mrs. Presteblog, who was working at Ripon College, but she didn’t answer because she was in a meeting. I had been at Marian University as their PR director for just a couple months, so I didn’t know for sure who the media might want to talk to, but once I got there I found a couple professors and called KFIZ and WFDL in Fond du Lac and set up live interviews.

The entire day was like reading a novel, except that there was no novel to put down and no nightmare from which to wake up. A third plane hit the Pentagon? A fourth plane crashed somewhere else? The government was grounding every plane in the country and closing every airport?

I had a TV in my office, and later that morning I heard that one of the towers had collapsed. So as I was talking to my wife on the phone, NBC showed a tower collapsing, and I assumed that was video of the first tower collapse. But it wasn’t; it was the second tower collapse, and that was the second time that replay-but-it’s-not thing had happened that day.

Marian’s president and my boss (a native of a Queens neighborhood who grew up with many firefighter and police officer families, and who by the way had a personality similar to Rudy Giuliani) had a brief discussion about whether or not to cancel afternoon or evening classes, but they decided (correctly) to hold classes as scheduled. The obvious reasons were (1) that we had more than 1,000 students on campus, and what were they going to do if they didn’t have classes, and (2) it was certainly more appropriate to have our professors leading a discussion over what had happened than anything else that could have been done.

I was at Marian until after 7 p.m. I’m sure Marian had a memorial service, but I don’t remember it. While I was in Fond du Lac, our church was having a memorial service with our new rector (who hadn’t officially started yet) and our interim priest. I was in a long line at a gas station, getting gas because the yellow low-fuel light on my car was on, not because of panic over gas prices, although I recall that one Fond du Lac gas station had increased its prices that day to a ridiculous $2.299 per gallon. (I think my gas was around $1.50 a gallon that day.)

Two things I remember about that specific day: It was an absolutely spectacular day. But when the sun set, it seemed really, really dark, as if there was no light at all outside, from stars, streetlights or anything else.

For the next few days, since our son was at the TV-watching age, we would watch the ongoing 9/11 coverage in our kitchen while Michael was watching the 1-year-old-appropriate stuff or videos in our living room. That Sunday, one of the people who was at church was Adrian Karsten of ESPN. He was supposed to be at a football game working for ESPN, of course, but there was no college football that Saturday (though high school football was played that Friday night), and there was no NFL football Sunday. Our organist played “God Bless America” after Mass, and I recall Adrian clapping with tears running down his face; I believe he knew some people who had died or been injured.

Later that day was Marian’s Heritage Festival of the Arts. We had record attendance since there was nothing going on, it was another beautiful day, and I’m guessing after five consecutive days of nonstop 9/11 coverage, people wanted to get out of their houses.

In the 20 years since then, a comment from New York City Mayor Rudy Giuliani has stuck in my head. He was asked a year or so later whether the U.S. was more or less safe since 9/11, and I believe his answer was that we were safer because we knew more than we did on Sept. 10, 2001. That, along with the fact that we haven’t been subject to another major terrorist attack since then, is the good news.

Osama bin Laden (who I hope is enjoying Na’ar, Islam’s hell) and others in Al Qaeda apparently thought that the U.S. (despite the fact that citizens from more than 90 countries died on 9/11) would be intimidated by the 9/11 attacks and cower on this side of the Atlantic Ocean, allowing Al Qaeda to operate with impunity in the Middle East and elsewhere. (Bin Laden is no longer available for comment.) If you asked an American who paid even the slightest attention to world affairs where a terrorist attack would be most likely before 9/11, that American would have replied either “New York,” the world’s financial capital, or “Washington,” the center of the government that dominates the free world. A terrorist attack farther into the U.S., even in a much smaller area than New York or Washington, would have delivered a more chilling message, that nowhere in the U.S. was safe. Al Qaeda didn’t think to do that, or couldn’t do that. The rest of the Middle East also did not turn on the U.S. or on Israel (more so than already is the case with Israel), as bin Laden apparently expected.

The bad news is all of the other changes that have taken place that are not for the better. Bloomberg Businessweek asks:

So was it worth it? Has the money spent by the U.S. to protect itself from terrorism been a sound investment? If the benchmark is the absence of another attack on the American homeland, then the answer is indisputably yes. For the first few years after Sept. 11, there was political near-unanimity that this was all that mattered. In 2005, after the bombings of the London subway system, President Bush sought to reassure Americans by declaring that “we’re spending unprecedented resources to protect our nation.” Any expenditure in the name of fighting terrorism was justified.

A decade later, though, it’s clear this approach is no longer sustainable. Even if the U.S. is a safer nation than it was on Sept. 11, it’s a stretch to say that it’s a stronger one. And in retrospect, the threat posed by terrorism may have been significantly less daunting than Western publics and policymakers imagined it to be. …

Politicians and pundits frequently said that al Qaeda posed an “existential threat” to the U.S. But governments can’t defend against existential threats—they can only overspend against them. And national intelligence was very late in understanding al Qaeda’s true capabilities. At its peak, al Qaeda’s ranks of hardened operatives numbered in the low hundreds—and that was before the U.S. and its allies launched a global military campaign to dismantle the network. “We made some bad assumptions right after Sept. 11 that shaped how we approached the war on terror,” says Brian Fishman, a counterterrorism research fellow at the New America Foundation. “We thought al Qaeda would run over the Middle East—they were going to take over governments and control armies. In hindsight, it’s clear that was never going to be the case. Al Qaeda was not as good as we gave them credit for.”

Yet for a decade, the government’s approach to counterterrorism has been premised in part on the idea that not only would al Qaeda attack inside the U.S. again, but its next strike would be even bigger—possibly involving unconventional weapons or even a nuclear bomb. Washington has appropriated tens of billions trying to protect against every conceivable kind of attack, no matter the scale or likelihood. To cite one example, the U.S. spends $1 billion a year to defend against domestic attacks involving improvised-explosive devices, the makeshift bombs favored by insurgents in Afghanistan. “In hindsight, the idea that post-Sept. 11 terrorism was different from pre-9/11 terrorism was wrong,” says Brian A. Jackson, a senior physical scientist at RAND. “If you honestly believed the followup to 9/11 would be a nuclear weapon, then for intellectual consistency you had to say, ‘We’ve got to prevent everything.’ We pushed for perfection, and in counterterrorism, that runs up the tab pretty fast.”

Nowhere has that profligacy been more evident than in the area of homeland security. “Things done in haste are not done particularly well,” says Jackson. As Daveed Gartenstein-Ross writes in his new book, Bin Laden’s Legacy, the creation of a homeland security apparatus has been marked by waste, bureaucracy, and cost overruns. Gartenstein-Ross cites the Transportation Security Administration’s rush to hire 60,000 airport screeners after Sept. 11, which was originally budgeted at $104 million; in the end it cost the government $867 million. The homeland security budget has also proved to be a pork barrel bonanza: In perhaps the most egregious example, the Kentucky Charitable Gaming Dept. received $36,000 to prevent terrorists from raising money at bingo halls. “If you look at the past decade and what it’s cost us, I’d say the rate of return on investment has been poor,” Gartenstein-Ross says.

Of course, much of that analysis has the 20/20 vision of hindsight. It is interesting to note as well that, for all the campaign rhetoric from candidate Barack Obama that we needed to change our foreign policy approach, President Obama changed almost nothing, including our Afghanistan and Iraq involvements. It is also interesting to note that the supposed change away from President George W. Bush’s us-or-them foreign policy approach hasn’t changed the world’s view, and particularly the Middle East’s view, of the U.S. Someone years from now will have to determine whether homeland security, military and intelligence improvements prevented Al Qaeda from carrying out another 9/11-style attack, or whether Al Qaeda wasn’t capable of more than just one 9/11-style U.S. attack.

Hindsight makes one realize how much of the 9/11 attacks could have been prevented, or at least their worst effects lessened. The book 102 Minutes: The Untold Story of the Fight to Survive Inside the Twin Towers, written by two New York Times reporters, points out that eight years after the 1993 World Trade Center bombing, New York City firefighters and police officers still could not communicate with each other, which led to most of the police and fire deaths in the WTC collapses. Even worse, the book revealed that the buildings did not meet New York City fire codes when they were designed because they didn’t have to, since they were under the jurisdiction of the Port Authority of New York and New Jersey. And more than one account shows that, had certain people at the FBI and elsewhere been listened to by their bosses, the 9/11 attacks wouldn’t have caught our intelligence community dumbfounded. (It does not speak well of our government to note that no one appears to have paid any kind of political price for the 9/11 attacks.)

I think, as Bloomberg Businessweek argued, our approach to homeland security (a term I loathe) has overdone much and missed other threats. Our approach to airline security — which really seems like the old error of generals’ fighting the previous war — has made air travel worse but not safer. (Unless you truly believe that 84-year-old women and babies are terrorist threats.) The incontrovertible fact is that every 9/11 hijacker fit into one gender, one ethnic group and a similar age range. Only two reasons exist to not profile airline travelers — political correctness and the assumption that anyone is capable of hijacking an airplane, killing the pilots and flying it into a skyscraper or important national building. Meanwhile, the U.S. spends about $1 billion each year trying to prevent improvised explosive device attacks, but what is this country doing about something that would be even more disruptive, yet potentially easier to pull off — an electromagnetic pulse attack, which would fry every computer within the range of the device?

We have at least started to take steps like drilling our own continent’s oil and developing every potential source of electric power, ecofriendly or not, to make us less dependent on Middle East oil. (The Middle East, by the way, supplies only one-fourth of our imported oil. We can become less dependent on Middle East oil; we cannot become less dependent on energy.) But the government’s response to 9/11 has followed, as B follows A, the approach our culture has taken to risk of any sort, as if covering ourselves in bubble wrap, or, even better, cowering in our homes, will make the bogeyman go away. Are we really safer because of the Patriot Act?

American politics was quite nasty in the 1990s. For a brief while after 9/11, we had impossible-to-imagine moments like this:

And then within the following year, the political beatings resumed. Bush’s statement, “I ask your continued participation and confidence in the American economy,” was deliberately misconstrued as Bush telling Americans to go out and shop. Americans were exhorted to sacrifice for a war unlike any we’ve ever faced, exhorted by people who would never have to deal with the sacrifices of, for instance, gas prices far beyond $5 per gallon, or mandatory national service (a bad idea that rears its ugly head in times of anything approaching national crisis), or substantially higher taxes.

Then again, none of this should be a surprise. Other parts of the world hate Americans because we are more economically and politically free than most of the world. We have graduated from enslaving those whose skin color differs from the majority’s, and we have progressed beyond assigning different societal rights to each gender. We tolerate different political views and religions. To the extent the 9/11 masterminds could be considered Muslims at all, they supported — and radical Muslims support — none of the values that are based on our certain inalienable rights. The war between our world, flawed though it is, and a world based on sharia law is a war we had better win.

And winning that war does not include withdrawal. Whether or not Donald Trump was right about leaving Afghanistan, Joe Biden screwed up the withdrawal so badly that everyone with a memory compared it to our withdrawal from South Vietnam. The obviously incomplete vetting of the Afghan refugees now at Fort McCoy, and our leaving billions of dollars of our military equipment in Afghanistan, all but guarantee that we will be back, and that more American soldiers, and perhaps non-soldiers, will die.

In one important sense, 9/11 changed us less than it revealed us. America can be both deeply flawed and a special place, because human beings are both deeply flawed and nonetheless special in God’s eyes. Jesus Christ is quoted in Luke 12:48 as saying that “to whomsoever much is given, of him shall be much required.” As much as Americans don’t want to be the policeman of the world, or the nation most responsible for protecting freedom worldwide, there it is.