Category: Culture

The Wisconsin religious right

Peter W. Woolf:

The New York Times has re-discovered the religious right. In a front-page story, we learn the awful truth that there is a “right-wing political movement powered by divine purpose, whose adherents find spiritual sustenance in political action.” They sing hymns; they pray; they burn candles. They import “their worship of God, with all its intensity, emotion and ambitions, to their political life.” Quite a few support Trump and also protest “against Covid restrictions,” among other unspeakable acts.

Once, long ago, I ventured into this dark territory, not armored by the shield of New York Times-style contempt for the deplorables, but like Marlow heading upriver into the Heart of Darkness. It was a hard-won lesson.

In February 1949, a forty-year-old farmwife in rural Wisconsin had a vision of the Virgin Mary appearing in her bedroom. Mrs. Mary Ann Van Hoof kept this secret for a while, but Mary reappeared to her in her garden in May, and then started making more frequent visits. Word got around, crowds gathered, and on August 15, 1950, some 100,000 people made their way to the Van Hoof farm near Necedah (Na-SEE-dah), Wisconsin.

A sermon you will not hear Sunday

Michael Smith:

Something you should know about me is that I am endlessly fascinated by the things humans will invent to justify pretty much any human behavior, no matter how bad it is.

That is why I am doing a deep dive into several social theories, Queer Theory among them.

One commonality in the subjects I am examining is this: postmodernist theorists and philosophers not only object to anyone drawing a line, they do not believe a line even exists.

There was a philosophical movement with roots in the 16th and 17th centuries, consisting mainly of Italian and French erudite cultural and philosophical thought, that sought to establish reason and nature as the criteria of morality, politics, and law, thus questioning transcendental sources of truth and authority. Called libertinism, it celebrated an authority of nature that debunked societal prohibitions as religious superstition and argued for the value of immediate physical pleasure rather than some heavenly reward later.

It gained new-found adherents in later centuries, particularly in France and Great Britain. Notable among these were John Wilmot, 2nd Earl of Rochester, in the seventeenth century, and the Marquis de Sade in the eighteenth.

A libertine is often defined as “one devoid of most moral or sexual restraints, which are seen as unnecessary or undesirable, especially one who ignores or even spurns accepted morals and forms of behavior sanctified by the larger society.”

Libertinism is rightly described as an extreme form of individualist hedonism and, as such, puts primary value on sensual or physical pleasures. Libertinism also necessarily requires the rejection of any religious stigma, moral code, or social mores that argue against the attainment of such pleasures.

Libertinism supposedly rests on a foundation of “reason and nature as the criteria of morality, politics, and law” but modern “libertines” reject both reason and nature for concocted fairy tales that substantiate their actions.

They will only “reason” themselves to a point of emotional satisfaction rather than to a logical endpoint.

The more I studied libertinism, the more I saw the common thread between it and the modern sexual philosophies: most seem little more than excuses and defenses for desires and behaviors that contradict established social mores and religious beliefs.

As previously noted, I’ve been studying Queer Theory, which, in my opinion, is just an extreme form of libertinism.

Nothing new under the sun.

Libertinism ultimately fails, as will any school of thought sharing its roots.

Like libertinism, many of the modern variants resemble an addiction (a porn addiction is a pretty good analog): when satisfaction is attained by one thing, reaching satisfaction the next time requires a more extreme approach, until the person is completely consumed, his very existence bounded and defined by the addiction.

The addict’s identity is ultimately destroyed by the very pleasure by which he seeks to define himself.

Ros Ballaster, Professor of 18th Century Studies at Oxford’s Mansfield College, noted:
“Libertinism, rather than reinforcing the natural elements of the self, creates a void within humanity, exposing man as a passionless, diseased ‘non-entity’.”

I have formed a theory that the beliefs that have the potential to unify us in a free society are also the same beliefs that have the potential to divide that same society.

For example, America is a nation founded on individual liberty, individual rights, and rugged individualism – but individualism can take on many forms. What happens when there are individuals who act in opposition to the traditional social mores? If we prohibit those actions, how do we reconcile that with our belief in individualism?

How about philosophies that directly contradict a couple centuries of general social cohesion?

I think the real question is whether we, in our pursuit of a more civil society, are better off for such inquiries or whether we would be better served to ignore them.

I go back to the Enlightenment philosopher Immanuel Kant, who argued that it does not matter whether the existence of God can ever be empirically proven: people are justified in believing in His existence and following His laws if that belief forms the basis for reason and supports a civil society.

Belief in God produces exclusionary behaviors: to follow God’s laws means some things are allowed and some are forbidden. It would follow that forbidding individual behaviors that cut against the grain of social cohesion in a civil society is justifiable in the quest to maintain that society.

Therefore, the question in any civil society becomes not whether it is appropriate and necessary to draw a line, but where that line is to be drawn and who is to draw it.

America’s governance by a “moral and religious” people has done a damn good job of balancing individual liberty with social cohesion.

Whether it can resist the forces of contemporary libertinism seems an open question.

Against “woke”

John Kass passes on Pat Hickey:

I taught English Literature and Composition, including Honors and Advanced Placement, at Catholic schools from 1975 until I retired in 2017. Since then my hours have been filled as a substitute teacher in Northwest Indiana, and I now work as a Jobs Coach for Special Education students at a large public high school. I take troops of students to workplace locations (Al’s Grocery, WINN Machines, the LaPorte County Animal Shelter, an elementary school where they wash dishes) and teach these young men and women how to comport themselves in a workplace.

My charges are mostly autistic and Down syndrome youngsters, and they are sensational workers. Their General Education contemporaries go to college and vocational preparatory classes. I think the Gen Ed kids get the short end of the stick. The Special Education kids go right into the workplace, and their talents and work ethic give them a leg up on their classmates. The students are not paid for their labors; the time (usually 90 minutes a couple of days a week) gets credited toward their graduation certificate. From the Workplace Program, seventeen-, eighteen-, and nineteen-year-old workers very often get recruited directly into jobs. Three of my students now hold salaried after-school jobs.

These young people are not dependents; they are workers.

They will not go on to Valparaiso, DePauw, Purdue or Notre Dame, nor will their parents be saddled with hundreds of thousands of dollars in college debt. They will not read The Canterbury Tales, the Aeneid, The Virginian, Ethan Frome, Henry V, A Tale of Two Cities, Jane Eyre or Invisible Man; but neither will their General Education counterparts. That is a huge problem.

The current secondary school English canon is dumbed down. It seems to me that everything of value went to hell when we politely considered the opinion of dim bulbs who interrogate with “Well, who’s to say?”  People who know something, Karen.

The Who’s to Sayers have screwed up religion, politics, and sports. Keep reading, gentle folks, because at the end of my jeremiad I post a list of essential works of literature.

What were once essential readings have disappeared from high school curricula almost universally. As a substitute teacher I was shocked to learn that texts once deemed essential to one’s intellectual, ethical, and civic growth are no longer taught, offered, or considered. Young people have no connection to the great conversation anymore. Thousands of years of shared thought have been cast aside in favor of graphic novels, critical race theory, or books related to movies.

Vanity Fair, Tom Jones, The Mysterious Stranger and Moby Dick have been erased in favor of books selected by Oprah, or by room-temperature I.Q.s like former Chicago Mayor Richard M. Daley, who placed his political imprimatur on Harper Lee’s To Kill a Mockingbird.

That tome was considered a ‘nice story’ but not remotely on a par with Jude the Obscure, much less George Eliot’s Middlemarch. After the City That Used to Work gushed over the canonization of Truman Capote’s BFF, Ms. Lee, To Kill a Mockingbird climbed to the top of the academic pyramid. It’s a nice story, but it is no David Copperfield. It was once considered young adult fiction; now it is mentioned in hushed tones.

Whenever I run into classmates from the 1960s, we tend to talk about the sad world the young are forced to wade through: a swamp of tepid experiences without any sense of common struggles and shared joys. This is a mean and humorless age that celebrates the balkanization of races, religions, and classes. Literature mirrors the music of the times. Chief Keef is today’s Ol’ Blue Eyes and Amanda Gorman the Robert Frost. In the 1960s our common culture shared the magic of Sam Cooke, Marvin Gaye and Dusty Springfield along with Joe Williams, Sarah Vaughan and Ella Fitzgerald, as well as Shostakovich and Schubert, all through television and radio. “The sound must seem an echo to the sense.” That line is from Alexander Pope’s An Essay on Criticism, which these days gets a nod only in an Advanced Placement English class. It is a musical essay that once set the canons of taste.

Books and tunes make little sense these days and essential key codes to leading a vital life are not available to students today. Let me explain.

Herman Melville’s novella Bartleby the Scrivener is a warning about the dire consequence of copying the words of others. Most people, other than the 46th President of the United States, know that imitation is flattery, but plagiarism is soul-sucking theft. The character Bartleby refuses to write or do anything other than die. He sheds his mortal husk and finally communicates with his employer, who laments, “Ah, Bartleby! Ah, humanity!”

That apostrophe (in poetry, an address to a dead or absent person) sums up our copy-cat culture that churns out formulaic novels, stories and ‘spoken word’ screeds that pass for poetry.

The classics were artifacts of truth. Practice, or genuine imitation, was the only passage to genius. Samuel Johnson, Alexander Pope and John Dryden imitated, not duplicated, Horace, Virgil, and Juvenal. Real poets like Seamus Heaney imitated the greats and wrote the greatest translation of Beowulf, which celebrates courage and commitment. Amanda Gorman is celebrated for eschewing meter and rhyme scheme in favor of odd pauses. Seamus Heaney? Not in our high schools.

The virtues teach us to be morally excellent by taking the golden mean. Courage, for example, stands squarely between two vices (the deficiency and the excess: cowardice and rashness) and should never seem ambiguous or ironic. Shakespeare taught Aristotle to his audiences better than the Stagirite might have done himself. Richard III is a monster and Richard II is a vacillating whiner, but Henry V is what kingship is all about. Julius Caesar offers a simple seminar on political rhetoric: Brutus the Attic, or closed fist; Antony the African, or open palm. The Attic school of rhetoric came from Greece and was governed by logic and cold reason, while the African school came from Egypt and appealed to the heart. Attic says, “Do what I say!” African says, “Hey guys, give me a hand!”

The Attic style works for autocrats like our shut-down elected officials. Mandates are neither suggestions nor invitations to debate. They are an exercise of power.

The African rhetorical style worked nicely with people who were shown respect and allowed to exercise their civic duties as free men and women.

Brutus speaks to logic and Antony to emotion. Know your audience. The Roman plebs are angry that their champion Julius Caesar had been butchered by the woke senators who wanted to maintain the oligarchs’ power of the Senate over the common folks of Rome. Antony, a masker and reveler, understood the common man. Brutus, an honorable man, was convinced that his class should always lord it over the unwashed mob. Brutus’s closed fist: Romans, countrymen: Be patient till the last. Hear me for my cause and be silent that you may hear. Believe me for mine honor and have respect to mine honor that you may believe. Censure me in your wisdom and awake your senses that you may the better judge.

Yeah, right.

Antony offers an open palm:  Friends, Romans, countrymen, lend me your ears; I come to bury Caesar, not to praise him. The evil that men do lives after them; The good is oft interred with their bones; So, let it be with Caesar. The noble Brutus Hath told you Caesar was ambitious: If it were so, it was a grievous fault, And grievously hath Caesar answer’d it.

Well, the folks rioted and the noble Romans had to beat it out of town, and fast! Between Antony’s rash emotionalism and Brutus’s cold Attic arrogance lies Octavian, the true heir to Caesar. A common Shakespearian device in tragedy and history is to have the last speech go to the person meant to rule–the last word: According to his virtue let us use him, With all respect and rites of burial. Within my tent his bones tonight shall lie Most like a soldier, ordered honorably. So, call the field to rest, and let’s away to part the glories of this happy day.

The great ruler is fair to all. The golden mean requires it.

Do you think that our elected officials have any sense of fairness? Our young people should be introduced to eternal values and virtues. Instead, they mask up as advocates and slogan-slinging cranks.

These are empirical observations based on what I witnessed. Who’s to say?  In this case, me.

I could not be an English teacher in 2022. The Woke culture would cancel me immediately. But I was able to impart eternal truths and basic virtues via literature for four decades. Helping Special Education youngsters learn to bag groceries at Al’s in LaPorte is far more important than trying to convince young minds that Amanda Gorman is a poet. Not gonna happen.

These are a few essential readings that young people once had presented to them. N.B., I find Joseph Conrad, a Polish sailor who could write in three languages and produce the most beautiful English prose about honor, duty, dignity, and compassion, to be the most important. Herman Melville and Ralph Ellison wrote the two greatest American novels: Moby Dick by the former and Invisible Man by the latter. If I had one book to save from extinction to prove that humanity is the work of God, it would be John Milton’s Paradise Lost, which seems destined to be known as a weak joke in Animal House. The greatest comic novel is A Confederacy of Dunces, John Kennedy Toole’s posthumous indictment of cant, ignorance and pretense.

Ladies and gents, my promised list:

The N***** of the Narcissus – Joseph Conrad – also titled “The Children of the Sea: A Tale of the Forecastle”

The Secret Sharer – Joseph Conrad

Lord Jim – Joseph Conrad

The Man Who Would be King – Rudyard Kipling

Invisible Man – Ralph Ellison

Barnaby Rudge – Charles Dickens

Jane Eyre – Charlotte Bronte

Paradise Lost – John Milton

The Canterbury Tales – Geoffrey Chaucer

Henry V – William Shakespeare

Sonnets – John Donne

Moby Dick – Herman Melville

Bartleby the Scrivener – Herman Melville

The Red Badge of Courage – Stephen Crane

The Virginian – Owen Wister

Big Blonde – Dorothy Parker

Poems of Emily Dickinson

The Man Without a Country – Edward Everett Hale

Aeneid – Virgil

The Odyssey – Homer

The Greek Passion – Nikos Kazantzakis

The Informer – Liam O’Flaherty

Short Stories of Bret Harte

Huckleberry Finn – Mark Twain

U.S.A. Trilogy – John Dos Passos

The Day of the Locust – Nathanael West

Catch-22 – Joseph Heller

The Caine Mutiny – Herman Wouk

The Continental Op – Dashiell Hammett

The Little Sister – Raymond Chandler

The Sign of Four – Arthur Conan Doyle

The Napoleon of Notting Hill – G.K. Chesterton

A Confederacy of Dunces – John Kennedy Toole

Wise Blood – Flannery O’Connor

The 2021 Presteblog Christmas album

Starting shortly after my birth, my parents purchased Christmas albums for $1 apiece from an unlikely place by today’s standards: tire stores.

(That’s as seemingly outmoded as getting, for instance, glasses every time you filled up at your favorite gas station, back in the days when gas stations were usually part of a car repair place, not a convenience store. Of course, go to a convenience store now, and you can probably find CDs, if not records, and at least plastic glasses such as Red Solo Cups and silverware. Progress, or something.)

The albums featured contemporary artists from the ’60s, plus opera singers and other artists.

These albums were played on my parents’ wall-length Magnavox hi-fi player.

Playing these albums was as annual a ritual as watching “How the Grinch Stole Christmas!”, “A Charlie Brown Christmas,” or other holiday-season appointment TV.

Those albums began my, and then our, collection of Christmas music.

You may think some of these singers are unusual choices to sing Christmas music. (This list includes at least six Jewish singers.)

Of course, Christians know that Jesus Christ was Jewish. (And faithful to his faith.)

And I defy any reader to find anyone who can sing “Silent Night” like Barbra Streisand did in the ’60s.

These albums are available for purchase online, but record players are now as outmoded as, well, getting glasses with your fill-up at the gas station. (Though note what I previously wrote.)

But thanks to YouTube and other digital technology, aficionados of this era of Christmas music can now have it preserved for their current and future enjoyment.

The tire-store-Christmas-album list has been augmented by both earlier and later works.

In the same way I think no one can sing “Silent Night” like Barbra Streisand, I think no one can sing “Do You Hear What I Hear” (a song written during the Cuban Missile Crisis, believe it or not) like Whitney Houston:

This list contains another irony — an entry from “A Christmas Gift for You,” Phil Spector’s Christmas album. (Spector’s birthday is Christmas.)

The album should have been a bazillion-seller, and perhaps would have been had it not been for the date of its initial release: Nov. 22, 1963.

Finally, here’s the last iteration of one of the coolest TV traditions — “The Late Show with David Letterman” and its annual appearance of Darlene Love (from the aforementioned Phil Spector album), which started in 1986 on NBC …

… and ended on CBS:

Merry Christmas. (To play this whole thing as a YouTube playlist, click here.)


Philip Klein:

Most people have at some point in their lives been asked to entertain a version of the cheesy question, “If you knew you had one day to live, what would you do?” It’s often posed as a playful game or essay topic or used by self-help gurus to prod people into trying to get a deeper sense of their priorities. But it’s time for everybody to start asking themselves a different question: If COVID-19 will be here forever, is this what you want the rest of your life to look like? In this case, it’s not an idle or theoretical exercise. It will be central to how we choose to live and function as a society for years or even decades to come.

Ever since the onset of COVID-19, we have more or less been living under an illusion. That illusion was that it would reach some sort of natural endpoint — a point at which the pandemic would be declared “over,” and we could all more or less go back to normal. The original promise of taking “15 days to slow the spread” or six weeks to “flatten the curve” has long since been reduced to a punchline.

In March of 2020, the outside estimates were that this coronavirus period would come to an end when safe and effective vaccines became widely available. Even the infamous Imperial College London report, viewed as draconian at the time for its estimate of up to 2.2 million deaths in the U.S. absent sustained intervention, predicted that its mitigation strategies “will need to be maintained until a vaccine becomes available.” Yet vaccines have been available for anybody who wants one for nearly six months, and our leaders have ignored the obvious off-ramp. The CDC backtracked on guidance and said that vaccinated people must wear masks in public, and many people and jurisdictions have listened. For example, Montgomery County, Md., has an extraordinarily high vaccination rate — with 96 percent of the eligible over-twelve population having received at least one dose and 87 percent of them being fully vaccinated. By its own metrics, the county has “low utilization” of hospital beds. Yet the county requires masks indoors — including in schools. In Oregon, vaccinated people are required to wear masks even outdoors. And it isn’t just liberal enclaves. A new Economist/YouGov poll found that eight in ten Americans report having worn a mask in the past week at least “some of the time” when outside their homes, with 58 percent masking “always” or “most of the time.” If masking has remained so widespread among adults months after vaccines became widely available, why will it end in schools after vaccines become available for children?

When operating under the assumption that there is a time limit on interventions, it’s much easier to accept various disruptions and inconveniences. While there have been ferocious debates over whether various mitigation strategies have ever been necessary, we should at least be able to agree that the debate changes the longer such restrictions are required. People making sacrifices for a few weeks, or even a year, under the argument that doing so saves lives is one thing. But if those sacrifices are indefinitely extended, it’s a much different debate.

There are many Americans who willingly locked themselves down and who still favor some restrictions. But what if this were to drag on for five years? Ten years? Twenty years? Do you want your children to be forced to wear masks throughout their childhoods? Do you want to bail on weddings if some guests may be unvaccinated? Skip future funerals? Ditch Thanksgiving when there’s a winter surge? Keep grandparents away from their grandkids whenever there’s a new variant spreading? Are you never going to see a movie in a theater again?

These are not wild scenarios. The Delta variant has led to surges throughout the world months after vaccines became widely available. Despite being a model of mass vaccination, Israel has been dealing with a significant Delta spike. To be clear, vaccines still appear to be quite effective at significantly reducing the risk of hospitalization and death. But if the virus continues to adapt and people need to get booster shots every six months or so, it seems there’s a good chance that the coronavirus will continue to spread for a very long time. So the question is how we, as individuals, and society as a whole, should adapt to this reality. Instead of thinking in terms of policies that may be tolerable for a very short period of time, it’s time to consider what would happen if such policies had to continue forever.

Whatever arguments were made to justify interventions early on in the pandemic, post-vaccine, we are in a much different universe. There is a negligible statistical difference in the likelihood of severe health consequences between vaccinated people who go about their business without taking extra precautions, and those who take additional precautions. Yet having to observe various protocols in perpetuity translates into a reduced quality of life. Put another way, the sort of question we need to start asking ourselves is not whether we can tolerate masking for one trip to the grocery store, but whether we want to live in a society in which we can never again go shopping without a mask.

People may ultimately come to different conclusions about the amount of restrictions they want to accept, regardless of the time frame. But at a minimum, we need to dispense with the framework that assumes the end of COVID-19 is just around the corner and instead recognize that it’s likely here to stay.

Worshiping Gaia instead of God

Ryan M. Yonk and Jessica Rood:

Dramatic headlines and images of a deteriorating environment are deployed to demand swift, decisive, and large-scale action. We saw this approach in the 1960s, when the first made-for-TV environmental crises showed oil-drenched seabirds on the California coast, and more recently in depressing videos depicting starving polar bears. Dramatic imagery has become the norm when discussing environmental issues.

We also see these trends in editorial writing, discussions among political groups, changing business practices, and, increasingly, scholarly claims that use the same dramatic imagery. At face value, these trends could indicate that the public demands dramatic governmental action on environmental issues. Some scholars, however, see this as more than mere increased public demand for government intervention, and they highlight similarities between environmentalism and religious movements. For example, Laurence Siegel states:

In the decades since modern environmentalism began, the movement has taken on some of the characteristics of a religion: claims not backed by evidence, self-denying behavior to assert goodness, (and a) focus on the supposed end of times.

Some scholars have tuned into the general public’s zealous interest in the environment and, more importantly, its emphasis on government action, to push forward their own ideological goals under the guise of scholarship. Whereas the stated goal of such scholarship is to mitigate climate change and improve sustainability, the reality is instead corrupted by thinly veiled ideology masquerading as scholarship, which is sure to distort any useful policy recommendations.

This phenomenon is illustrated by a recent study making the rounds in Science Daily and The Climate News Network. The authors, Vogel et al., claim that the world must decrease energy use to 27 gigajoules (GJ) per person in order to keep average global temperature increases to 1.5 degrees Celsius, a recommendation included in the Paris Agreement. Our current reality illustrates the outlandish nature of this suggestion. We are a far cry from this goal both in 2012, the year chosen for this study, as well as in 2019, the most recent year for available data. …

Using these data, the authors pair what they view to be excessive energy use with a failure to meet basic human needs worldwide. In their own argument, they acknowledge that of the 108 countries studied, only 29 reach sufficient need-satisfaction levels. In each case where need satisfaction is met, the country uses at least double the 27 GJ/cap of sustainable energy use, thereby creating a conundrum for those concerned about both the environment and human well-being.

The authors, however, offer a solution, arguing that their research shows that a complete overhaul of “the current political-economic regime” would allow countries to meet needs at sustainable energy levels. Their recommendations include universal basic services, minimum and maximum income thresholds, and higher taxes on wealth and inheritance.

These policy recommendations are not supported by the research and directly contradict a body of literature that argues economic growth, not government redistribution, is our way forward. Vogel et al. argue against the necessity for economic growth and even go as far as to support degrowth policies on the grounds that their model finds no link between economic growth and maximizing human need satisfaction and minimizing energy use.

In short, their proposed solution would punish affluent countries and favor a collective misery in which any market-driven environmental improvements are crushed under the promise of equality and sustainable energy use.

Conversely, Laurence Siegel in Fewer, Richer, Greener: Prospects for Humanity in an Age of Abundance and the 2020 Environmental Performance Index (EPI) argue that economic prosperity allows countries to invest in new technologies and policies that improve not only environmental health but also the well-being of the people. Thus, if we want to continue to improve our relationship with the environment and human progress, we should be more supportive of economic growth and the entrepreneurship that drives it.

If the above relationship between economic prosperity, environmental health, and human well-being is the case, how can these authors claim the opposite? The most likely conclusion is that the authors allow an ideological bias to drive their research, a claim that is supported by their normative descriptions of affluent countries as examples of planned obsolescence, overproduction, and overconsumption as well as the authors’ obvious demonization of profit-making.

As Vogel et al. demonstrate, environmental issues can be exploited through the drama and religious nature of the movement. Unfortunately, academics such as Vogel et al. have learned to use these tools to stretch their limited findings into a full-blown rallying cry for their own preferred policies; in this case, socialism on a global scale.

An unlikely sermon subject for Sunday

Mark Malvasi:

My uncle made book for a living. That is, he took money from those who wagered on sporting events, presidential elections, anything whereby they thought they could make a fast and easy dollar. I suppose then it was inevitable that, as a young man, I felt a certain affinity for the thought of Blaise Pascal (1623-1662). Pascal, of course, gambled on stakes much higher than my uncle ever imagined. At the same time, my uncle knew something that Pascal never understood or, in any event, never admitted. You can’t beat the house.

Pascal’s mind was among the finest of the seventeenth century. He was a prodigy, perhaps a genius, who, at fifteen, published a distinguished essay on conic sections. He invented the first calculating machine, which he called La Pascaline, and his experiments with the vacuums that nature abhors led to the creation of the barometer. Pascal was also a first-rate mathematician whose fascination with, and success at, the gaming table enabled him to contribute to the development of probability theory. To test his hypotheses, he devised the roulette wheel.

On November 23, 1654, at the age of thirty-one, Pascal underwent an emotional conversion that stirred him to abandon his worldly metier and to become an apologist for Christianity. He is best remembered today as a religious thinker, which he was, and a mystic, which he was not. Like the nineteenth-century Danish theologian Søren Kierkegaard, Pascal approached God with “fear and trembling.” A mystic seeks and expects union with God. Pascal, by contrast, despaired of ever finding Him. His conversion did not bring him clarity of vision. God remained distant and unfathomable; the will of God was inscrutable and His design for the cosmos mysterious. “For in fact,” Pascal asked, “what is man in nature?” He answered his question, writing:

A Nothing in comparison with the Infinite, an All in comparison with the Nothing, a mean between nothing and everything. Since he is infinitely removed from comprehending the extremes, the end of things and their beginning are hopelessly hidden from him in an impenetrable secret; he is equally incapable of seeing the Nothing from which he is made, and the Infinite in which he is swallowed up.

Yet, alone and without God, humanity was lost, frightened, and miserable in a vast and desolate universe.

To calm his anxiety that God was, at best, remote and, at worst, illusory, Pascal conceived his famous wager. He urged skeptics, atheists, and free-thinkers to live as if they believed in God. Critics then and since have denounced what seemed to be Pascal’s sneering disdain in urging people to affirm that God was real and existence meaningful. To them, it was disingenuous, if not cynical, of Pascal to play the odds and to bet on the reality of God and eternal life when he suspected both were false. The critique, although carefully aimed, misses the target. It is no small irony given Pascal’s attacks on the Jesuits that, like Ignatius Loyola, he rejected predestination, convinced that men and women, through their own efforts, could earn God’s saving grace. Good habits and sincere piety, even in the absence of real belief, thus became indispensable to salvation. “Custom is our nature,” Pascal declared. “He who is accustomed to the faith believes it, can no longer fear hell, and believes in nothing else.” As Augustine taught, the routine practice of faith might in time lead to genuine faith.

Difficulties arise not from Pascal’s intentions but from his premises. Pascal argued that a man, perhaps in utter desperation, must speculate that God exists. If he wins, he wins big and for all eternity. If he loses, he loses almost nothing, since he will be in no worse condition than before. A prudent man thus has no alternative but to roll the dice or to turn over the next card. He’s gambling with house money. But in reality, in history, those who have denied God have often won glory, wealth, and power; according to scripture, they have gained the whole world. Satan took Jesus to a mountain and there “showed him all the kingdoms of the world and the glory of them; and he said to him, ‘All these I will give you, if you will fall down and worship me.’” (Matthew 4:8-9) It is equally mistaken to suppose that a man loses nothing by hazarding that God is real. A man who worships God may sacrifice all he has, all he is, and all he loves in the vindication of his faith. Consider Job.
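Pascal’s reasoning here is, in modern terms, an expected-value argument, and the objection above lands precisely on the payoff entries. As a rough illustration only, with entirely hypothetical finite numbers standing in for the “infinite” reward Pascal actually invoked, the wager’s arithmetic might be sketched as:

```python
# Toy sketch of Pascal's wager as an expected-value comparison.
# All payoff figures are hypothetical stand-ins; Pascal's own argument
# treats the reward of salvation as infinite, which is what makes even
# a tiny probability of God's existence dominate the calculation.

def expected_value(payoffs, p_god):
    """Weight each outcome's payoff by the probability that God exists."""
    return p_god * payoffs["god_exists"] + (1 - p_god) * payoffs["god_absent"]

# Payoffs under each choice, in arbitrary units.
wager_for = {"god_exists": 10**9, "god_absent": -1}      # vast reward vs. small cost
wager_against = {"god_exists": -(10**9), "god_absent": 1}  # vast loss vs. small gain

p = 0.001  # even a tiny probability tips the comparison
ev_for = expected_value(wager_for, p)
ev_against = expected_value(wager_against, p)
assert ev_for > ev_against
```

The essay’s critique amounts to disputing these entries: if denying God can win “the whole world,” the god_absent payoff for wagering against is not small, and if faith can cost everything, as with Job, the god_absent cost of wagering for is not negligible, so the tidy dominance of belief collapses.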

Pascal’s tragedy originated in his embrace of Jansenism, which introduced Calvinist doctrines and attitudes into the Catholic world of seventeenth-century France and Western Europe. The Jansenists had revived the Manichean dualism, which characterized humanity as divided between good and evil. For the Jansenists, every soul was a battleground, its fate determined by whichever conflicting impulse was strongest. The Jansenists insisted, therefore, that virtue must be imposed on rebellious and perverse human beings. Only an exacting and solemn authority could direct individuals toward rectitude and purity. The Jansenists also prescribed such discipline for the churches they controlled and the local governments in France over which they exercised some influence. The flesh must be compelled to yield to the spirit. It takes no great leap of historical imagination to see that the Jansenist admiration for order, management, restraint, bureaucracy, and surveillance could be made to attend the requirements of the totalitarian state. The Jansenists determined to administer the “greatness and misery of man” (“grandeur et misère de l’homme”), which was the foremost theme of Pascal’s work, through compulsion.

Jansenism, asserted Friedrich Heer, endowed Pascal with “an enormous capacity for hatred and over-simplification.” Stressing the enthusiasm and certainty that governed the residents of Port Royal, the spiritual and theological capital of the Jansenist movement, Heer doubtless exaggerates the charges against Pascal. He ignores not the complexity of Pascal’s thought, but the complexity of the man himself. Pascal was both austere and worldly, both rational and intuitive. When he partitioned the mind into l’esprit géométrique and l’esprit de finesse, he was mapping the course that a single mind—his own—could take. Pascal may have felt the zeal of a convert, but he never seems to have acquired the conviction that he possessed absolute truth or a sure method by which to attain it. For Pascal, God alone provided the antidote to the twin maladies of doubt and insecurity.

To alleviate his own misgivings, Pascal set out to compose a systematic defense of Christianity. The Pensées contain the remnants of the greater work that he never lived to complete. If these fragments and aphorisms suggest the character of the volume that Pascal meant to write, then it seems the Pensées would have been less an apologia for Christianity than the spiritual autobiography of a thinker attempting to explain to his intellect how his faith was possible.

In the Pensées, Pascal intimated that skepticism may transcend reason, and the doubts that reason awakens, leading not to certainty but to affirmation. By acknowledging the limits of reason, the thoughtful man, he hoped, could accept the mystery of life without also yielding to its absurdity. “The last proceeding of reason,” he wrote, “is to recognize that there is an infinity of things which are beyond it. It is but feeble if it does not see so far as to know this. But if natural things are beyond it, what will be said of supernatural?” Yet, perhaps at this moment of vital insight, Pascal exhibited some of the odium that Friedrich Heer had detected in his thought and character. Like many intensely passionate and astute natures, Pascal disdained the society in which he lived—a disdain that reinforced his displeasure with his fellow human beings and, at times, with life itself. Most men, he assumed, were intellectually lazy and emotionally tepid. Desultory, incurious, and stupid, they were incapable of profound thought, searching doubt, or vibrant faith. The majority preferred not to bother about any subject, whether intellectual or theological, that would jolt them out of their passivity, lassitude, and indifference. Pascal’s disillusioned analysis of human nature may, as Heer suggests, have issued from the Jansenist view that human beings are both helpless and degraded. He could not avoid exposing the rancor, the insincerity, the conceit, the dishonesty, the self-deception, the cowardice, and the pettiness that circumscribed and disfigured the lives of most ordinary men; that exposure made him despise them.

For Pascal, as for Kierkegaard and other, later existentialist philosophers and theologians, unending dread may well have been the cost of existence. “The eternal silence of these infinite spaces frightens me,” he proclaimed. There is at times the echo of a terrible nihilism that reverberates through the otherwise silent spaces of Pascal’s infinite universe, as he gazed into the abyss. T. S. Eliot wrote that Pascal’s despair is “perfectly objective,” corresponding “exactly to the facts” and so “cannot be dismissed as mental disease.” In the end, Pascal concluded, the rational proof of God’s existence, such as Descartes had tried to construct with the ontological argument, was useless and unconvincing to those disinclined to believe. Essential questions about the meaning and purpose of human existence could never be resolved through the application of reason or logic. In fact, for Pascal, they could not be resolved at all. They could only be felt in all of their contradiction and paradox. The experience of such utter confusion and despair alone made faith possible and necessary, but offered no assurance that it would come.

Voltaire judged Pascal to be a restless soul and a sick mind. Pascal agreed, confirming Voltaire’s assessment long before he had rendered it. During his final illness, Pascal often refused the care of his physician, saying: “Sickness is the natural state of Christians.” He believed that human beings had been created to suffer. Misery was the condition of life in this world. His was a hard doctrine.

But to what end did people suffer? What did their suffering accomplish? Did it exalt the spirit? Were they to suffer unto truth or, as was more likely, did they suffer because the flesh was evil and needed to be punished? Pascal had gambled for ultimate stakes. When he rolled the dice, they came up snake eyes, not once, not the last time, but every time. His tragedy, and potentially ours, is that he could discover no purpose in his encounters with creation, his fellow human beings, life itself. Philosophy, science, and reason offered no assurance of truth, and were of little comfort against anguish and hopelessness. Some could even use elaborate rational arguments to defy the will of God and to excuse sin, as had the Jesuits whom Pascal denounced in The Provincial Letters.

Love was equally vain and worthless. It prompted only deception and contempt for truth. Human beings are so flawed and imperfect that they are wretched and detestable. Loving themselves and desiring others to love them, they conceal their transgressions and deformities. Since no one is inviolate, no one deserves to be loved just as, were strict justice to prevail, no one deserves to be saved. Man, Pascal complained:

cannot prevent this object that he loves from being full of faults and wants. He wants to be great, and he sees himself as small. He wants to be happy, and he sees himself miserable. He wants to be perfect, and he sees himself full of imperfections. He wants to be the object of love and esteem among men, and he sees that his faults merit only their hatred and contempt. This embarrassment in which he finds himself produces in him the most unrighteous and criminal passion that can be imagined; for he conceives a mortal enmity against the truth which reproves him, and which convinces him of his faults. He would annihilate it, but, unable to destroy it in its essence, he destroys it as far as possible in his own knowledge and in that of others; that is to say, he devotes all his attention to hiding his faults both from others and from himself, and he cannot endure that others should point them out to him, or that they should see them.

All “disguise, falsehood, and hypocrisy,” men are ignorant, brazen, and delusional. Preferring lies to truth, they should not be angry at others for pointing out their shortcomings. “It is but right that they should know us for what we are,” Pascal insisted, “and should despise us.”

Elsewhere Pascal acclaimed the dignity of man. He was a reed, but “a thinking reed,” more noble than the insensible universe that would destroy him. But the damage had been done. In the centuries to come, the same revulsion for humanity that Pascal had articulated, the same regimentation and tyranny that the Jansenists had endorsed, transformed life on earth into a living hell. In the early twentieth century, the Roman Catholic philosopher Gabriel Marcel came face to face with the tragedy of the human condition. Shattered by his experiences in the Great War, during which he had served with the French Red Cross identifying the dead and accounting for the missing, Marcel sought an alternative to grief and desolation.

He contended that in the modern world a person was no longer a person, but “an agglomeration of functions.” According to this functional definition, human beings were valued solely for the work they did and the goods they produced. Death became “objectively and functionally, the scrapping of what has ceased to be of use and must be written off as a total loss.” Such a vision of life deprived people of their spirituality and their faith, and robbed them of any joy that they might feel. Consumed by rancor, malice, and ingratitude, they suffered an “intolerable unease,” as they descended into the void that engulfed them.

Love was the answer. If people could overcome selfishness and egocentricity, if they could love one another, Marcel was confident that they could fulfill themselves as human beings. Such involvement with, and such fidelity to, others afforded a glimpse of the transcendent and was, in Marcel’s view, the most persuasive argument for the existence of God. Faith consoled and inspired the downtrodden, the persecuted, the oppressed, and the brokenhearted. It cultivated and enhanced all human relationships. For if people refused any longer to treat others merely as objects performing a function, if they came at last to recognize that all persons, however deficient, imperfect, errant, or sinful, mattered to God, then those persons were also more likely to matter to them.

I have come to the conclusion that each of us is capable of doing the right thing or the wrong thing at any one time. Your ratio of right decisions to wrong decisions shows the type of person you are, and whether or not your life will be successful (as in keeping avoidable bad things from happening to you).


The latest sports controversy

Michael Smith first wrote:

“I say put mental health first, because if you don’t, then you’re not going to enjoy your sport and you’re not going to succeed as much as you want to. It’s OK sometimes to even sit out the big competitions to focus on yourself because it shows how strong of a competitor and a person that you really are, rather than just battle through it.”

Says Simone Biles.

What I’m about to say may not be very popular, but if you have read my stuff for any period of time, you know I tend to say what I am thinking.

I feel sorry for her, but more sorry for her teammates. Her statement just seems to be the antithesis of the traditional American spirit.

Biles is not a kid; she is 24. She is an adult. She indicated in an interview last night that she was feeling fear. Fear of injury, fear of making a mistake, fear of letting her teammates down.

So, she didn’t saddle up. Her mental toughness is gone. I’ve seen this before in people; they become so paralyzed with fear that they can’t even act. It is a form of PTSD, or the old version known as “shell shock” – but the Biles situation isn’t a life-threatening one. This is a sport.

But it isn’t just Biles – maybe she is the most famous example of it, but I have known young adults to not show up for work, calling out sick and claiming they had such a stressful week, they needed a “mental health day.”

Somehow I can’t imagine a Chinese or Russian athlete walking away from a competition. And, you know, those CIA and US Army advertisements celebrating mental weakness and wokeness as desirable characteristics for intelligence employees and soldiers didn’t fill me with confidence.

Biles should not be disparaged for doing what she thought she needed to do for herself, but she damn sure should not be celebrated for it either. She let down her team and an entire country.

I am left to wonder if this is an example of the new America in the hands of the Biles generation, that they just don’t think they should “battle through” when things get tough.

May God have mercy on our souls if that is what we have wrought.

Then Smith wrote:

Many have criticized my comments about Simone Biles. That’s fair – but I have a personal reason for my opinion.

In the late ’90s the business I was part of failed. I was unemployed and dead broke, loaded with debt and with a wife and three kids to feed and literally no food in the house. I can imagine that kind of stress is at least equivalent to what Biles feels. Yet, I didn’t quit.

I couldn’t quit. My family depended on me.
I couldn’t find a job in my field at a comparable level, so I turned to skills I had learned working in construction during college, doing anything I could to make money to feed my kids. I didn’t give up, and I wasn’t embarrassed to take any job I could find.

I clawed my way back over the past 30 years.

That’s real life. I know other people share a similar story. Rare are people who have never faced setbacks in life. Some buckle down and get things done, some people quit – on their families, their teams and themselves.

That has real and immediate consequences, so people will have to pardon me if I am harsh toward an athlete who quits for “mental health” reasons. She has been supported all her life by other people because she can do things nobody else can – and now she can’t (or won’t) do those things.

I don’t know her, and I have said she has the right to do whatever she wants to do and for whatever reason (or no reason at all), but again, she doesn’t just get to escape the consequences – people forming opinions of her or the real impact to her family and her teammates.

We are all entitled to our opinions, but now that she is out of the individual competition as well, her legacy will forever be that she quit.

She doesn’t deserve denigration – but by the same token, she sure as hell shouldn’t be celebrated for quitting. Many are celebrating her for “living her truth” and “taking care of herself.”

I can’t do it. I won’t do it. I’m sorry for her problems, but she performs in a sport that is basically entertainment, as all sports are to one degree or another. All sports, especially the Olympics, are luxury appendages of a prosperous world. They are nice, but not necessary, so maybe this doesn’t deserve the attention it is getting.

Whether Simone Biles ever competes again will not change my world one iota. The fact remains that the behavior of elite athletes in several sports is making me not care about things I once enjoyed and supported (my wife, daughter and I volunteered and worked in the 2002 Salt Lake Games).

A friend once told me that society will forgive anything except going broke. The fact is that I’ll forever carry the stigma of being broke and bankrupt, but I will never be called a quitter.


The real purpose of today’s liberalism

Nathaniel Blake:

They want to make your life worse. They are the various diversity, equity, and inclusion activists and apparatchiks whose obsessions with race, sex, and gender now govern much of American life. They are the nation’s scolds, afraid that someone, somewhere, is having fun in a way that offends the ever-shifting demands of diversity and inclusion.

The latest victims of their killjoy spree are in Virginia, where prom kings and queens and daddy-daughter dances are about to be banned in public schools. By law, all Virginia public schools are required to implement policies that are “consistent with but may be more comprehensive than the model policies developed by the Department of Education.” These model policies are a disaster, requiring schools to “accept a student’s assertion of their gender identity without requiring any particular substantiating evidence, including diagnosis, treatment, or legal documents.”

Thus, on nothing more than a child’s say-so, schools must allow males into female locker rooms and showers, and house boys with girls on overnight trips. School employees are even told to keep a student’s steps toward transition secret from parents deemed insufficiently “supportive,” and to consider calling Child Protective Services on them. But the malice of the trans-kids movement may be most evident in the smaller cruelties of prohibiting prom queens and daddy-daughter dances.

This directive is meant to ensure the inclusion of students who identify as transgender, but it is an inclusion by elimination, achieved by banning sex-specific celebrations and honors. Any sex-specific distinctions, it is feared, might hurt the feelings of those with gender dysphoria.

But by this logic, even a generic gender-neutral parent-child dance should be cut for making children with dead, absent, or terrible parents feel left out. Indeed, dances in general should be cancelled because they might make socially awkward or physically disabled students feel bad. And so on the reasoning goes, cancelling everything that might make someone, somewhere, feel excluded.

This mindset allows misery to take happiness hostage, and it is particularly pernicious for sex and gender. We are embodied, and the reality of biological sex is fundamental to our being. It is also essential to human existence, for new persons are conceived through the union of male and female. That some people are uncomfortable with the sexed reality of physical embodiment is tragic (that cultural and political leaders are trying to instill and encourage such discomfort is wicked), but that is no reason to require everyone else to officially ignore the basic experiential reality of being male and female.

Yet this is precisely what Virginia is doing to the children in its public schools. The cancellations of dances and other events are emblematic of a deeper erasure of identity, in which young men and women have the sexed reality of bodily existence officially erased in the name of inclusion. We are born male and female, but Virginia has decided that helping children grow into men and women is wrong, and that fundamental relational identities such as “father” and “daughter” must be publicly eliminated.

The irony in this is that those who preach most fervently about celebrating diversity seem terrified of acknowledging human differences. If, for instance, they really believe that “trans women are women,” then what is diverse about them? Only by acknowledging the difference between biological women and biologically male trans women can there be any diversity to celebrate. But admitting these differences threatens the entire ideological project.

The difficulty the ideologues face is that their mantra of diversity, equity, and inclusion is borderline incoherent. Diversity means difference, which intrinsically imperils equity and inclusion, for differences are by nature unequal in some way and exclusive of something. To be one thing is to not be another. To prefer one thing is to disfavor another. And so on.

Identifying and responding to important differences are always fundamental social and political tasks, as is finding commonalities that might unite us. These duties require wisdom and discernment, and the advocates and acolytes of diversity, equity, and inclusion are not up to the job. They efface essential differences and magnify those that they should minimize. And they lack a unifying principle that could unite different people and groups. Hence their flailing attempts to reconcile the tensions inherent in their sloganeering.

The Virginia public school system is a case in point, with Loudoun County alone providing a plethora of egregious examples that have provoked a major parental backlash even in this wealthy, solid-blue area. For instance, Loudoun administrators have illegally punished a teacher for stating the truth about biological sex, and they have become racial obsessives who have spent tens of thousands on critical race theory training that denounces colorblindness.

But the distinction between male and female is literally fundamental to human existence, whereas the construction of racial identities has little to no basis in biological reality. Effacing the former while emphasizing the latter is the opposite of what educators should be doing.

This folly arises because they are trying to remake society with a half-baked, incoherent ideology that is enforced by the shifting demands of the woke internet mob. And those people will always find something else to be unhappy about and another ideological envelope to push. Thus, the same ideology that cancels daddy-daughter dances in Virginia is putting men into female prisons in California, with predictably horrifying results.

These policies are the products of unhappy people who have realized that combining ideology and claims of victimhood gives them power, which they can use to hurt others. This is why so much of our public discourse, especially from the Left, amounts to little more than accusations that “the thing you like is bad, and you should feel bad for liking it.”

The truth is that a daddy-daughter dance hurts no one, except those already determined to be miserable. Banning the dance helps no one, except for those eager to punish those who are happy.

The past isn’t what it used to be

Jonah Goldberg:

Joe Biden loves to say, “America is back.” He used it to announce his incoming national security team last November. “It’s a team that reflects the fact that America is back, ready to lead the world, not retreat from it.”

Last February, there were a slew of headlines about his first big foreign policy speech along the lines of this from the Associated Press:

Biden declares ‘America is back’ in welcome words to allies.

In that speech, Biden told diplomats at the State Department, “when you speak, you speak for me. And so—so [this] is the message I want the world to hear today: America is back. America is back. Diplomacy is back at the center of our foreign policy.”

That phrase—as well as those Biden-tells-allies-America-is-back headlines—keeps coming to mind every time I read about the inexorable advance of the Taliban in Afghanistan. For the Afghans, America was “here,” and now it’s leaving. I wonder how “America is back” must sound to the people feeling abandoned by America in general, and the guy saying it in particular.

I’m not trying to pull on heart strings, so I won’t trot out the girls who will be thrown back into a kind of domestic bondage or the translators and aides who rightly fear mass executions may be heading their way. All I’ll say is that their plight does pull on my heart strings.

But let’s get back to this “America is back” stuff. For Biden, it seems to have two meanings. One is his narrow argument that we are rejoining all of the multilateral partnerships and alliances that Trump pulled out of or denigrated. Fair enough. I can’t say this fills me with joy, even though I disliked most of that stuff from Trump (the two obvious exceptions being getting out of the Paris Accord and the Iran deal). I think diplomacy often gets a bad rap. But I also think diplomacy is often seen as an end rather than a means. We want diplomats to accomplish things, not to get along with each other just for the sake of getting along. For too long, Democrats have cottoned to a foreign policy that says it’s better to be wrong in a big group than to be right alone.

But there’s another meaning to “America is back.” It’s an unsubtle dig at Trump and a subtle bit of liberal nostalgia all at once. It’s kind of a progressive version of “Make America Great Again.” It rests on the assumption that one group of liberal politicians speaks for the real America, and now that those politicians are back in power, the real America is back, too. But the problem is, there is no one real America. There are some 330 million Americans and they, collectively and individually, cannot be shoe-horned into a single vision regardless of what labels you yoke to the effort.

Liberals were right to point out that there was a lot of coding in “Make America Great Again.” I think they sometimes overthought what Trump meant by it, because I don’t think he put a lot of thought into it. He heard a slogan, liked the sound of it, and turned it into a rallying cry—just as he did with “America first,” “silent majority,” and “fake news.” Still, when, exactly, was America great in Trump’s vision? The consensus seems to be the 1950s, a time when a lot of good things were certainly happening, but a lot of bad things were going on that we wouldn’t want to restore.

Liberal nostalgia is a funny thing. Conservative nostalgia I understand, because I’m a conservative and I’m prone to nostalgia (even though nostalgia can be a corrupting thing, which is why Robert Nisbet called it “the rust of memory”). Conservatives tend to be nostalgic for how they think people lived. Liberals tend to be nostalgic about times when they had power.

Consider the New Deal. Being nostalgic for the New Deal certainly isn’t about how people lived, not primarily. America was in a deep depression throughout the New Deal. Breadlines and men holding signs saying “will work for food” are probably the most iconic images of that time. Who wants to return to that? And yet, liberals will not banish it from their collective memory, where it endures as something like the high-water mark of American history. That’s why they keep pushing for new New Deals and slapping the label on new programs that consist of spending money we don’t have.

The only thing that competes with the New Deal in the liberal imagination is the 1960s in general and the civil rights movement and Great Society in particular. I’m reminded of a Washington Post interview with Howard Dean in 2003 in which he explained his nostalgia for that era:

“Medicare had passed. Head Start had passed. The Civil Rights Act, the Voting Rights Act, the first African American justice [was appointed to] the United States Supreme Court. We felt like we were all in it together, that we all had responsibility for this country. … [We felt] that if one person was left behind, then America wasn’t as strong or as good as it could be or as it should be. That’s the kind of country that I want back.”

“We felt the possibilities were unlimited then,” he continued. “We were making such enormous progress. It resonates with a lot of people my age. People my age really felt that way.”

That’s not how people his age felt back then. It’s how a certain group of liberals felt because they were winning. The 1960s and the 1930s were times of massive civic strife marked by race riots, domestic bombings, assassinations, and anti-war protests. But liberals were in charge, felt like history was on their side, and they had a lot of “wins” as Donald Trump might say.

The current obsession with the “new Jim Crow” seems like a perfect example of how liberal nostalgia distorts and corrupts. As I write today, I’m not a fan of the arguments coming out of the GOP or the Democrats. But the simple fact is that we don’t live in the 1960s—or 1890s—anymore. Whatever the future holds, it will not be a replay of that past. And that’s overwhelmingly for the good.

I always find it funny that the same people who ridicule “excessive” fidelity to the timeless principles of the Founding as archaic are often also the same ones who worship at the altar of the New Deal and the Great Society. The Founders didn’t know about mobile phones and the internet! Well, neither did the New Dealers or the Johnson administration. But that doesn’t matter because the part they really liked and yearn to restore is timeless: people in Washington deciding how Americans everywhere else should live and work.

I don’t know how the White House’s new collaboration with Facebook to combat “misinformation” will actually play out and I’m not fully up to speed on what the administration really intends to do. Though—given press secretary Jen Psaki’s comment that “you shouldn’t be banned from one platform and not others,” etc.—it doesn’t sound good. But I think David French’s gut check is exactly right: “Moderation is a platform decision, not a White House decision. Trying to force more moderation is as constitutionally problematic as trying to force less moderation.”

The principle at the heart of that speaks not just to social media regulation, but to all of the competing efforts from right and left to throw aside the rules in a thirsty search to rule.

Listeners of The Remnant know that I often find myself suffering from a peculiar form of nostalgia, for want of a better word. The title of my podcast comes from an essay by Albert Jay Nock, who was one of the “superfluous men” of the long Progressive Era that stretched—with a brief, and partial, parenthesis under the sainted Calvin Coolidge—from the end of the Teddy Roosevelt administration to the end of the Franklin Roosevelt administration. I don’t agree with Nock, or the other superfluous men, on everything—they were a diverse lot. But the thing that connected them all—hence their superfluousness—was how they felt that they were standing on the sidelines as the major combatants at home and abroad competed over how best to be wrong, how to stir up populist anger for their agendas, and, most of all, how to use the state to impose their vision on the “masses.” The remnant was the sliver that wanted no part of any of it.

“Taking his inspiration from those Russians who seemed superfluous to their autocratic nineteenth-century society and sought inspiration in the private sphere, even to the point of writing largely for their desk drawers,” writes Robert Crunden, Nock’s biographer, “Nock made the essential point: ransack the past for your values, establish a coherent worldview, depend neither on society nor on government insofar as circumstances permitted, keep your tastes simple and inexpensive, and do what you have to do to remain true to yourself.”

Or as the great superfluous man of the Soviet Empire, Alexander Solzhenitsyn, put it, “You can resolve to live your life with integrity. Let your credo be this: Let the lie come into the world, let it even triumph. But not through me.”

I share this—yet again—as a kind of omnibus response to all of my critics these days and the ones yet to come. I’m lucky that I don’t have to write for my desk drawer, though I am reliably informed — daily — that many people would prefer I did. But I am going to continue to write for the remnant as I see it and those I hope to convince to swell its ranks, and not for those who think that to be against what “they” are doing I must endorse what “we” are doing. Our politics may be a binary system of competing asininities these days, but just because one side of a coin is wrong, that doesn’t mean the other side is right.
