The thoughts of a journalist/libertarian–conservative/Christian husband, father, Eagle Scout and aficionado of obscure rock music. Thoughts herein are only the author’s and not necessarily the opinions of his family, friends, neighbors, church members or past, present or future employers.
The good thing about my job is that while I am required to write objectively about what I cover, I can also add my two cents worth in the appropriate place. Which I did.
Several years ago I passed on an opinion from a Roman Catholic priest who quoted C.S. Lewis as saying that there is no right to happiness, only the right, as the Declaration of Independence puts it, to pursue happiness. He concluded:
So we do not have a right to be happy. The assumption that we do lies behind the utopian turmoil of our times. The attempt to guarantee our right to be happy invariably leads to economic bankruptcy and societal coercion. By misunderstanding happiness and its gift-response condition, we impose on the political order a mission it cannot fulfill. We undermine that limited temporal happiness we might achieve if we are virtuous, prudent, and sensible in this finite world.
The column mentions retired UW Band director Mike Leckrone’s phrase “moments of happiness.” (Which he didn’t come up with until after I graduated. Which makes me think my leaving may have been one of his “moments of happiness.”) The column mentions that in the past few months I’ve had a few, including two sportscasting firsts (announcing the right team in a state football championship game and announcing an Illinois substate game), performing at Leckrone’s final three UW Varsity Band concerts, and seeing Chicago with my trumpet-, trombone- and guitar-playing sons.
I suppose my own definition of temporal happiness could be listening to this and this while driving a Corvette with Mrs. Presteblog as passenger on a beautiful summer day, perhaps on the way to or from eating a bacon cheeseburger, steak or shrimp, on the way to or from announcing a sporting event. But I think the aforementioned priest has it right when he says that there is no guarantee of earthly happiness.
It’s happened again. For the second time in three weeks, a prominent (at least in Evangelical circles) Christian has renounced his faith. In July, it was Josh Harris, a pastor and author of the mega-best-selling purity-culture book I Kissed Dating Goodbye. This month, it’s Hillsong United songwriter and worship leader Marty Sampson.
For those who don’t know, Hillsong United is one of the most popular and influential worship bands of the modern era. It was born at Hillsong Church in Australia and its albums routinely top the Christian charts — in fact, Billboard’s chart history gives it no fewer than eight number-one Christian albums.
It’s a powerhouse in what my former pastor derisively referred to as the “Jesus is my boyfriend” style of worship music. Its songs feature heartfelt, simple lyrics pledging undying Christian love and devotion. They also happen to inspire millions of Christians across the globe.
The relative lack of theological depth to much of Hillsong’s music has brought a predictable response to Sampson’s announcement — shallow songs, shallow theology. But I’m not sure that’s right. Of course only Sampson knows his own heart, but I want to focus on something else. Parts of his Instagram announcement of his change of heart just don’t ring true. I won’t paste the entire statement, but this part stood out to me:
This is a soapbox moment so here I go . . . How many preachers fall? Many. No one talks about it. How many miracles happen. Not many. No one talks about it. Why is the Bible full of contradictions? No one talks about it. How can God be love yet send four billion people to a place, all ‘coz they don’t believe? No one talks about it. Christians can be the most judgmental people on the planet — they can also be some of the most beautiful and loving people. But it’s not for me.
What is he talking about? “No one talks about” preachers falling, miracles, alleged biblical contradictions, or the challenge of hell? I take a backseat to no one in decrying youth ministries that concentrate more on ultimate Frisbee than on catechesis — or pastors who focus on self-help to the exclusion of sound doctrine — but you simply cannot grow up in an Evangelical church without discussing many of these topics incessantly.
Yes, you can pass in and out of church — attend casually without going to Sunday school — and sometimes hear only therapeutic messages from the pulpit, but if you live in the church, as he did, you have real trouble believing his words. You also have seen the same thing many times — adults fall away in the face of the pressures of the world, rationalizing their departure with words that ring true to everyone except Christians who know what the church is really like.
As our culture changes, secularizes, and grows less tolerant of Christian orthodoxy, I’m noticing a pattern in many of the people who fall away (again, only Sampson knows his heart): They’re retreating from faith not because they’re ignorant of its key tenets and lack the necessary intellectual, theological depth but rather because the adversity of adherence to increasingly countercultural doctrine grows too great.
Put another way, the failure of the church isn’t so much one of catechesis as one of fortification — of building the pure moral courage and resolve to live your faith in the face of cultural headwinds.
In my travels around the country, one thing has become crystal clear to me. Christians are not prepared for the social consequences of the profound cultural shifts — especially in more secular parts of the nation. They’re afraid to say what they believe, not because they face the kind of persecution that Christians face overseas but because they’re simply not prepared for any meaningful adverse consequences in their careers or with their peers.
C. S. Lewis famously said that courage is the “form of every virtue at its testing point.” In practical application, this means that no person truly knows if he possesses any virtue until it’s tested. Do you think you’re loving? You’ll know you truly love another person only when loving that person is hard. Do you think you’re truthful? You’ll know only when telling the truth hurts. Soldiers are familiar with this phenomenon — most men who travel to the battlefield believe themselves to be brave, but they know they’re brave only if they do their duty when their life is on the line.
Earlier this summer, I spoke at an event in Georgia and discussed what I called the “courage cure to political correctness.” Are you afraid? Speak anyway, with humility, grace, and conviction. The law protects you, but the culture resists you. After I spoke, a man came up to me and said, “That’s fine for you to say, but you don’t know what corporate America is like.”
I told him that I did know, and that I’ve experienced its bite.
He said no. He said, “It’s like East Germany now.” I asked him if he had tested that proposition, if he’d shared his beliefs in any meaningful way. He said no. He’d preemptively silenced himself.
Are you faithful? I’d submit that you don’t know until that faith is truly tested — either in dramatic moments of crisis or in the slow, steady buildup of worldly pressure and secular scorn. As the worldly pressure and secular scorn continue to mount, expect to see more announcements like Josh Harris’s and Marty Sampson’s. Expect to see more friends and neighbors retreat and conform. The church has its faults, yes, but the blame will lie less with a church that failed to instruct than with a person who didn’t, ultimately, have the courage to believe.
David Fiorazo has a message that applies to Christians and non-Christians:
Two more mass shootings over the weekend sparked another frenzy as liberals in the media renew their calls for gun control, others demand solutions to the mental health crisis, still others say it’s because of the alienation of young men today, and progressive politicians – including former President Barack Obama and many 2020 Democratic hopefuls – blame President Trump.
Few seem to suggest, let alone address, man’s greatest problem: sin.
In a culture where discipline and respect for authority are lacking, where people have bowed to the god of self-fulfillment, where narcissists abound, and where young people are coddled and not given healthy boundaries, it’s no wonder darkness and violence are increasing.
But there is a natural and spiritual law that affects every human being, like it or not: you will reap what you sow.
In the Old Testament of the Bible, the Prophet Jeremiah said,
“The heart is more deceitful than all else and is desperately sick; Who can understand it? “I, the Lord, search the heart, I test the mind, Even to give to each man according to his ways, according to the results of his deeds. (Jeremiah 17:9-10)
One glaring problem in our society is the rejection of any fixed moral anchors with which to guide and govern people. Chaos and godlessness are the result. Another clear consequence of removing the one true God from public places is that the culture of death spreads like a disease. …
This year marks the 50th anniversary of the Woodstock movement, a rebellion aimed at eradicating God and moral authority using music as an escape from reality. Those same ideas and philosophies are popular today: spirituality without religion, sex without consequences, relativism without moral absolutes, and salvation without repentance.
The ongoing, residual effect is hopelessness. The Apostle Paul warned:
“But realize this, that in the last days difficult times will come. For men will be lovers of self, lovers of money, boastful, arrogant, …disobedient to parents, ungrateful, unholy, unloving, …without self-control, …haters of good; conceited, lovers of pleasure rather than lovers of God;” (2 Timothy 3:1-4)
Sounds like American culture and the world today, doesn’t it? The truth is human beings were born with a sin nature and we all fall short of God’s standard.
On Monday, Tucker Carlson of Fox News quoted author and cultural observer James Howard Kunstler on the recent mass shootings and the violence he believes results in part from isolation and the modern void left by the lack of social interaction:
“[T]his is exactly what you get in a culture where anything goes and nothing matters. Extract all the meaning and purpose from being here on earth, and erase as many boundaries as you can from custom and behavior, and watch what happens, especially among young men trained on video slaughter games.”
Last year, during the first weekend of August, 66 people were shot in Chicago, 12 fatally. Shootings across the city this past weekend have left seven people dead and another 52 wounded. This is the unreported new normal for Chicago and many inner cities in America.
When we forget the value of life and what’s important, apathy and hopelessness result.
Depression and suicide rates among teenagers continue to climb, and it seems answers are difficult to come by. According to the Pentagon, military suicides reached an all-time high in 2018.
And now, a majority of Americans support physician-assisted suicide. You can lawfully murder an elderly person in five states so far. It’s time to expand our discussions about mass shootings and guns to include abortion, suicide, and euthanasia, and we need to go deeper than mental health.
But we’re simply unwilling. Evil is something God neither intended nor created. But we’re not robots. If we are truly free, then we have the free will to choose moral evil rather than good. This is why we need laws and morality.
The things we see increasing in America should be a warning for Christians who have found lasting peace and salvation through faith in Jesus Christ. We shouldn’t deny or ignore what’s happening. The Bible says, “because lawlessness is increased, most people’s love will grow cold.” (Matthew 24:12)
We must not be overwhelmed by death and decay or become indifferent to the suffering of others. Control what is within your control. We can’t allow our hearts to grow cold because without love, it is impossible to show compassion to those who are suffering and to obey God’s command to love our neighbor.
As for those who commit lawless deeds such as murder, there will be a day of reckoning.
Solutions aren’t easy, but in the book of Ecclesiastes, one of the wisest men to have lived on earth put it this way:
“The conclusion, when all has been heard, is: fear God and keep His commandments, because this applies to every person. For God will bring every act to judgment, everything which is hidden, whether it is good or evil.” (Ecclesiastes 12:13-14)
God bless you and keep speaking the truth about things that matter!
In the 1980s, Madonna captured the image of one girl’s shallow, self-absorbed life with her pop song, “Material Girl”:
You know that we are living in a material world
And I am a material girl.
The era’s personal materialism of “I like stuff” or “Stuff is all that matters” was also captured in TV teen Alex Keaton of the sitcom Family Ties. Individuals may not be so enamored today of material things, but there’s another kind of collective materialism that holds undue sway in our culture. I’m talking about “materialism” as a philosophy.
Materialism as a philosophy is simply the idea that the material world is all there is. Put differently, materialism is the belief that matter and energy, interacting according to the laws of chemistry and physics, constitute the sum total of reality. Philosophical materialism, then, is a belief about the nature of reality.
Sometimes, we hear it stated overtly, such as when celebrity scientist Carl Sagan intoned, “The cosmos is all that is, or ever was, or ever will be.” Most often, though, it’s subtle. It is assumed but not stated. This is especially true in the realms of the natural sciences. Consider, for example, the children’s book You Are Stardust, which encourages young children to feel good about themselves because the atoms that make up their bodies were forged in the stars. Author Elin Kelsey doesn’t come right out and say, “There is no God” or “The universe is all that exists.” She has simply assumed that materialism is the truth about reality, and then written a whimsical children’s book from that philosophical perspective.
Today, philosophical materialism is almost universally conflated with science. You Are Stardust is categorized as a (what else?) science-based picture book for children. We can also discern this conflation behind statements like, “I don’t believe in God; I believe in science,” as if theistic belief and science are inherently incompatible. But they’re not incompatible, and despite what celebrity scientists like Neil deGrasse Tyson or Bill Nye the Science Guy might say, there’s nothing that says materialism and science necessarily go together.
So, the question thinking people should be asking is, Why should materialism enjoy such a privileged, unquestioned position in our culture? And the answer is, it shouldn’t.
Enter Science Uprising, a project of the Discovery Institute’s Center for Science and Culture. Science Uprising burst onto the scene this past summer with a series of short, edgy videos challenging this materialistic metanarrative on the ground it’s been squatting on for far too long: the natural sciences. The first episode sets things up by explaining what materialism is, demonstrating how its pretensions have become deeply embedded in our culture, and showing how it actually runs counter to many aspects of life we all believe to be true and value. Subsequent episodes look at neuroscience and the reality of the mind, DNA and the reality of coded information in the cell, evolutionary biology and the failure of the neo-Darwinian hypothesis, and more. The upshot of it all is that philosophical materialism fails to adequately explain reality as we know it and live it. Moreover, it fails when put to empirical tests.
How do such concepts as love, compassion, justice and the human soul fit into a narrative that says only matter and energy are real? They don’t. And this should be our first tipoff that maybe materialism isn’t the whole truth about reality. No one–not even materialists themselves–actually lives as if materialism is true.
You don’t have to be a working scientist to think for yourself about science. Research shows that a big reason young people are abandoning Christianity in droves is that they’ve been told it’s incompatible with science, when the truth is, it’s materialism that is incompatible with both Christianity and science. We are instructed in Scripture to “demolish arguments and every pretension that sets itself up against the knowledge of God” (2 Corinthians 10:5), and if ever there were a lofty pretension lifting itself up against theistic belief, then materialism should be crowned as king of the whoppers.
Thankfully, the consumeristic materialism of the 1980s has less appeal to youth today. The task for today is to pull back the curtain on this whopper of a lie about reality, an idol of the mind that is even more destructive to the soul. So, check out Science Uprising here, and let the demolishing begin.
Exactly 20 years ago, in U.S. Senate testimony just weeks after the Columbine High School massacre, I offered these thoughts:
The real problem [of Columbine-like violence in our culture] is in here, in us … In the last four decades we’ve created a culture that markets violence in dozens of different ways, seven days a week. It’s part of our social fabric. When we build our advertising campaigns on consumer selfishness and greed, and when money becomes the universal measure of value, how can we be surprised when our sense of community erodes? When we glorify and multiply guns, why are we shocked when kids use them? …
The Columbine murders will mark my [Denver] community for years to come. They’re a wound felt by the entire country — but I don’t think they’ll be the last. We live in the most violent century in history. Nothing makes us immune from that violence except a relentless commitment to respect the sanctity of each human life, from womb to natural death. The civility and community we’ve built in this country are fragile. We’re losing them. In examining how and why our culture markets violence, I ask you not to stop with the symptoms. Look deeper. The families in Littleton and throughout the country deserve at least that much.
In separate incidents over the past two weeks, gunmen have killed three people and wounded 13 others in Gilroy, CA; killed at least 20 and wounded 26 others in El Paso, TX; and killed at least nine and wounded 27 others in Dayton, OH. These are just the latest in a long pattern of mass shootings that have blood-stained the past two decades, with no end in sight.
Now begins the usual aftermath: expressions of shock; hand-wringing about senseless (or racist, or religious, or political) violence; bitter arguments about gun control; heated editorials, earnest (but brief) self-searching of the national soul, and eventually — we’re on to the next crisis.
I buried some of the young Columbine victims 20 years ago. I sat with their families, watched them weep, listened to their anger, and saw the human wreckage that gun violence leaves behind. The experience taught me that assault rifles are not a birthright, and the Second Amendment is not a Golden Calf. I support thorough background checks and more restrictive access to guns for anyone seeking to purchase them.
But it also taught me that only a fool can believe that “gun control” will solve the problem of mass violence. The people using the guns in these loathsome incidents are moral agents with twisted hearts. And the twisting is done by the culture of sexual anarchy, personal excess, political hatreds, intellectual dishonesty, and perverted freedoms that we’ve systematically created over the past half-century.
So I’ll say it again, 20 years later. Treating the symptoms in a culture of violence doesn’t work. We need to look deeper. Until we’re willing to do that, nothing fundamental will change.
Remember when the scariest kid in your neighborhood was the football jock who terrorized the high school with his minions in tow, and got bailed out by his rich parents when he went too far? Or it was the gothic malcontent with the switchblade and the swagger. Either way, what made these high-status alphas so terrifying was that they came at you in numbers. They travelled in packs. This has been our narrative, in the stories we tell—from Henry Bowers in Stephen King’s It, to Biff Tannen in Back to the Future, to Billy Hargrove in Stranger Things, central-casting bullies attracted followers. They belonged.
As any grade eight schoolgirl who’s been bullied off Instagram can attest, this stereotype still holds. But when it comes to the most dangerous and sociopathic actors, the opposite is true. All three of the young mass shooters who terrorized the United States in recent nationally reported scenes of carnage—Connor Betts in Dayton, Ohio; Patrick Crusius in El Paso, Texas; and Santino William Legan in Gilroy, California—acted alone. The old image of the bully as locker-room alpha or goth leader now seems passé. Often, it is the kid who used to be the fictional protagonist, the social outcast, the member of the Losers Club from It, whose face now appears on our screens with a nightmarish empty stare.
These recent shooters fit a similar profile. They were outsiders, all seemingly socially awkward, who became emboldened through fringe online communities that act as mutual-support societies for violent malcontents. This phenomenon is fuelled by hate, guns, mental illness and ideological extremism. But there is another factor at play here, too. Before a youth makes the decision to murder, before the gun is stashed in his backpack, before his state of mental health is so deteriorated that he commits the unthinkable, what has happened to him? It’s important to remember that these murders are also, in most cases, suicides.
In his 2008 article School Shooting as a Culturally Enforced Way of Expressing Suicidal Hostile Intentions, psychiatrist Antonio Preti summarized existing research on school shootings to the effect that “suicidal intent was found in most cases for which there was detailed information on the assailants.” The research also indicated that “among students, homicide perpetrators were more than twice as likely as their victims to have been bullied by their peers, and also were described as loners and poorly integrated into school activities…In most of the ascertained cases, perpetrators prepared a well-organized plan, and often communicated details about it to acquaintances or friends, who failed to report threats because they did not consider them serious or were embarrassed or ignorant of where to go for help. The most antisocial peers sometimes approved the plan, sharing the same anger against the stated target of violence.”
Preti’s article predated the rise of some of the most notorious web sites—including 8chan, which was shut down this week after several mass shootings were linked to its users. But the nihilistic phenomenon these killers represent predates modern social-media culture. Indeed, it predates digital communication, and even broadcast media more generally.
In 1897, French sociologist Émile Durkheim noted that suicides overall were increasing in society. But there were differences among the affected populations, he noticed. Men were more likely than women to commit suicide—though the chances decreased if the man was married and had children. Durkheim observed that social groups that were more religious exhibited lower suicide rates. (Catholics were less likely to commit suicide than Protestants, for instance.) Durkheim also noted that many people who killed themselves were young, and that the prevalence of such suicides was linked to their level of social integration: When a person felt little sense of connection or belonging, he could be led to question the value of his existence and end his life.
Durkheim labelled this form of suicide as “anomic” (others being “egoistic,” “altruistic” and “fatalistic”). Durkheim believed that these feelings of anomie assert themselves with special force at moments when society is undergoing social, political or economic upheaval—especially if such upheavals result in immediate and severe changes to everyday life.
Durkheim came from a long line of devout Jews. His father, grandfather and great-grandfather had all been rabbis. And so even though he chose to pursue an academic career, his experiences taught him to respect the mental and psychological support that religious communities supplied to their members, as well as the role that ritual plays in the regulation of social behavior. In the absence of such regulation, he believed, individuals and even whole societies were at risk of falling into a state of anomie, whereby common values and meanings fall by the wayside. The resulting void doesn’t provide people with a sense of freedom, but rather rootlessness and despair.
Durkheim’s thesis has largely stood the test of time, though other scholars have reformulated it for modern audiences. In his 1955 book The Sane Society, for instance, Erich Fromm wrote that, “in the nineteenth century, the problem was that God is dead. In the twentieth century, the problem is that man is dead.” He described the twentieth century as a period of “schizoid self-alienation,” and worried that men would destroy “their world and themselves because they cannot stand any longer the boredom of a meaningless life.”
In her 2004 book Rampage: The Social Roots of School Shootings, Katherine Newman described findings gleaned from over 100 interviews in Arkansas and Kentucky. The male adolescent shooters at the center of her study, she concluded, “shared a belief that demonstrating strength by planned attacks on their respective institutions with (too) easily available guns would somehow mitigate their unbearable feelings of inadequacy as males and bring longed-for respect from peers.” Ten years later, in a 2014 article titled The Socioemotional Foundations of Suicide: A Microsociological View of Durkheim’s Suicide, sociologists Seth Abrutyn and Anna Mueller set out to update Durkheim’s theory about how social integration and moral regulation affect suicidality. “The greater degree to which individuals feel they have failed to meet expectations and others fail to ‘reintegrate’ them, the greater the feelings of shame and, therefore, anomie,” they concluded. “The risk of suicidal thoughts, attempts, and completions, in addition to violent aggression toward specific or random others, is a positive function of the intensity, persistence, and pervasiveness of identity, role, or status-based shame and anomie.”
Writing in the 1890s, Durkheim was highly conscious of all the ways that industrial capitalism corroded traditional forms of social regulation in society, often at the expense of religious—and even governmental—authorities. (“For a century, economic progress has mainly consisted in freeing industrial relations from all regulation. Until recent times, a whole system of moral powers had the function of disciplining them…Indeed, religion has lost the greater part of its empire. Governmental power, instead of being the regulator of economic life, has become its instrument and servant.”) But if he were to visit us in 2019, Durkheim would be surprised at the extent to which once-dominant ideas with no connection to economics have been marginalized as regressive and hateful—such as nationalism, patriotism and even masculinity.
This is one reason why so many people now feel unmoored. As Canadian science fiction writer Donald Kingsbury eloquently put it in his novel Courtship Rite, “Tradition is a set of solutions for which we have forgotten the problems. Throw away the solution and you get the problem back.” Faith in god, country and manhood might be seen as regressive by modern lights. But insofar as they were holding back male anomie, we perhaps neglected to consider what damage would be done if we discredited those ideas before finding replacements.
In the history of our species, there has never been (to the knowledge of modern scholars) a human society that did not express belief in some sort of supernatural force—which suggests that we are programmed by a need to believe in something bigger than ourselves. Sociologist Max Weber warned in 1919 that “science deals with facts. It can’t tell us what to do or what’s important.” This is to say that while the scientific revolution did a good job of helping us explain and harness the natural world, it did nothing to fill the god-shaped hole that Blaise Pascal identified in the 17th century: “What else does this craving, and this helplessness, proclaim but that there was once in man a true happiness, of which all that now remains is the empty print and trace? This he tries in vain to fill with everything around him, seeking in things that are not there the help he cannot find in those that are, though none can help, since this infinite abyss can be filled only with an infinite and immutable object; in other words by God himself.”
If we are to resign ourselves to the fact that “God himself” isn’t going to intercede any time soon, then we are left with the ordinary tools of policy, such as Robert Putnam outlined in his famous 2000 book, Bowling Alone: The Collapse and Revival of American Community, in which he pointed to the value of “the connections among individuals’ social networks and the norms of reciprocity and trustworthiness that arise from them.” These connections could be strengthened, Putnam argued, through improved civics education, more extra-curricular activities for youth, smaller schools, family-oriented workplaces, a more enlightened approach to urbanism, technology that reinforces rather than replaces face-to-face interaction, as well as a decentralization of political power. These recommendations were written 19 years ago, before Facebook, Twitter or 4chan existed. It would be interesting to know how he would revise his recommendations now that we have a better appreciation for the massive effects of digital culture on our social dynamics.
In a 2017 article I wrote, titled Towards a Theory of Virtual Sentiments, I argued that real-time empathy generation often requires some degree of eye contact—which is hard to generate through online interaction. Moreover, it is shockingly easy to get worked up into a rage when you are interacting with an online avatar of a person you have never met. Simply put, the more we physically see each other, the less likely we are to be awful to each other. As Louis C.K. said in an interview about youth and technology, “They don’t look at people when they talk to them and they don’t build empathy. You know, kids are mean, and it’s cause they’re trying it out. They look at a kid and they go, ‘You’re fat,’ and then they see the kid’s face scrunch up and they go, ‘Oh, that doesn’t feel good to make a person do that.’ But when they write ‘You’re fat’ [online] then they just go, ‘Mmm, that was fun, I like that.’” Even putting aside the extreme cases of forums that cater to homicidal shooters, I remain unconvinced that any community that exists primarily in online form can be a force for long-term good. Perhaps more time offline is a good start for anyone seeking to enhance “the norms of reciprocity and trustworthiness.”
Do we need a new nationalism? A new religion? What common human project can we collectively embrace that gives a sense of mission to everyone, regardless of skin color, religion, economic class or ideology? It would be presumptuous for me to suggest I have the answers. All I know is that men who see human life as meaningless are symptoms of a larger sense of anomie that, in less dramatic and destructive form, increasingly grips us all.
Gene Veith writes from a Lutheran perspective about a fundamentalist who is now, by his own account, an ex-Christian:
Josh Harris had earlier repudiated his book I Kissed Dating Goodbye, which launched the “courtship movement” and “the purity culture” that influenced a whole generation of Christian young people. A few days ago, he announced that he is divorcing his wife of 22 years and leaving their three children. The next day he announced that he is no longer a Christian.
In his Instagram post in which he renounced his Christianity, Harris quoted Martin Luther on how the whole life of the believer should be repentance. (That is #1 of the 95 theses.) Harris, who was once a megachurch pastor, said that he has “lived in repentance for the last several years,” and that he is now repenting his former opposition to the LGBT movement and to same-sex marriage. Such all-pervasive guilt–which is not what Luther meant by repentance–has led him not to Christ but to running away from Christ. “By all the measurements that I have for defining a Christian,” he concludes, “I am not a Christian.”
Rod Dreher gives him credit for refusing to stretch the Bible and distort Christianity to support his new unbelief. Dreher brings up a recent discussion in which Harris said that it would make more sense to reject Christianity altogether than to accept the contortions of progressive theology, which try to interpret away the sexual prohibitions of the Bible.
Still, I wonder about “all the measurements that I have for defining a Christian.” Such “measurements” may have led to his problems in the first place, defining Christianity first in terms of one’s “purity” and then turning against it when he fails his new purity tests.
David French, responding to Harris’s leaving the faith, recalls his own experience as a youth pastor, working with teenagers influenced by the purity culture. French says that the problem with Harris’s approach is not that it upheld Christian sexual morality–that was the good part about it–but that it put forward a Christianity without Christ:
It worked like this — sexual sin stained young persons, even if Christ forgave them. They would walk into marriage diminished in some crucial ways. The white dress, fundamentally, was a lie. And the message wasn’t confined to sexuality. Did you drink? Did you smoke a joint? Each one of those things altered a person’s self-definition. They were no longer “pure.” They could never be “pure” again.
All too many times, I saw the despair. A young person would come to me and say, “I screwed up.” They would really mean, “I’m ruined.” Their storybook dreams were dead. A 17-year-old with (God willing) 70 years of life ahead of him would approach me carrying the awful burden of thinking that he had defined his life forever. He was no longer — and never would be — the person he wanted to be.
Sometimes the despair would trigger wild rebellion. If they’re “ruined,” then why should they care about obedience? There are two states of being — virgin or not, teetotaler or not — and if you’re not, then you might as well indulge yourself. Other times the despair would trigger constant, nagging guilt and regret. A girl would walk down the aisle to marry a man who loved God and loved her, and she’d feel a shadow on her soul.
In point of fact, the gospel message rests first on bad news, then on indescribably good news. The bad news is simple: You were never “pure.” It’s not as if sex or drink or drugs represent the demarcation line between righteous and unrighteous. They are not and were never the “special” sins that created particularly acute separation from God. Yes, they could have profound earthly consequences, but they did not create unique spiritual separation.
The indescribably good news is that from the moment of the confession of faith, believers are not defined by their sin. They’re not defined even by their own meager virtues. They’re defined by Christ. Moreover, they find that “for those who love God, all things work together for good, for those who are called according to his purpose.” This does not by any stretch mean that past sin wasn’t sin — one of my best friends is an eleven-years-sober addict who did dreadful things during his worst days — but it does mean that their past now gives them a unique ability to reach suffering people. Their terrible stories and past pain have been redeemed, transformed into instruments of grace and mercy.
Joy Pullman, from a Lutheran perspective, shows how legalism (the notion that Christianity is all about what I have to do) leads naturally first to antinomianism (the rejection of morality altogether, since I can’t achieve it) and then to unbelief (giving up on the whole impossible project). This is a theological error, she says, that pervades contemporary evangelicalism:
Is there going to be a public reckoning with evangelicalism’s major heresies that fuel cycles of this kind of legalistic faddishness? As Harris’s experience — and the history of American Christianity (indeed, of the world) — shows, legalism leads inevitably to antinomianism. Antinomianism is the fancy theology term for rebelling against God’s law after observing how hard it is to keep it. It’s how Puritans turn into Social Gospelers. Thus, as is human nature, people ping-pong between opposite sides of the gutter rather than taking a straight course between them. But Christianity delineates the straight course, not the gutters.
The answer to legalism isn’t antinomianism. The answer to finding you can’t keep all God’s laws isn’t to say thus God must not actually have any laws. It isn’t to say “I believed that God has careful designs for sex and marriage, but I and lots of people can’t stay in line with them so I’ll just pretend God isn’t real or maybe none of his rules are.” It’s to receive the truth that God perfectly kept all his laws for you, which prompts such great joy that you actually begin to want to do what is right — which the laws defined in the first place. It’s not law or gospel, legalism or license. It’s both, which is liberty.
How many other tragedies, scandals, and apostasies among contemporary Christians are also due to this kind of theological confusion? Reformed theologian Carl Trueman relates what happened with Harris to the meltdown of other leaders of the “Young, Restless, and Reformed” movement in Calvinist megachurches, of which Harris was a prominent figure (who took over a ministry vacated over a sex scandal). Could such legalism and the consequent antinomianism and unbelief be a factor in the Roman Catholic sex scandals?
We Lutherans are not exempt from scandals like those, and antinomianism is a besetting heresy of ours. But at least we are fixated on the distinction between the Law and the Gospel; how Christianity is not about our works but the work of Christ for us; that we are all “sinful and unclean” but that Christ redeems us “with His holy, precious blood and with His innocent suffering and death, that I may be His own and live under Him in His kingdom” (Small Catechism).
At a time when many evangelical theologians are running away from the doctrines of the Atonement and justification by grace through faith in Christ, this sad business is a reminder of the continuing necessity of a strong understanding of the Gospel. Lutherans can help convey that to the rest of Christendom. Meanwhile, as Joy Pullman concludes, “If you go to church, don’t go to one that consistently gets this basic and important point of theology wrong.”
Some Christians, and some non-Christians, try to criticize the church by telling the Gospel story of the woman about to be stoned, and Jesus Christ’s stepping in and suggesting that he who had no sin should start firing away. The biggest flaw in this is that the storyteller seems to omit Christ’s next five words: “Go and sin no more.”
However, no reading of the Bible I have ever done suggests that Christ expects us to be perfectly sinless. That is impossible, because we are human beings, and as the writer points out we have never been pure. I believe Christ does expect Christians to lead, or at least try to lead, a better life, but we do not lose Christ’s love when we fail to do that.
Here are two sort-of secular examples to prove my religious point. Baseball pitcher Orel Hershiser, who would sing hymns on the mound when he was in a stressful situation, said that he went on the mound trying to give up no hits. When he gave up a hit, he then resolved to give up no hits after that. And when he gave up another hit, he resolved to give up no hits after that. Hershiser never threw a no-hitter in his career, but he was one of the better pitchers in baseball in his day. That seems like a more Christian attitude (if it’s sincere) than the idea that Christians must lead perfect, sinless lives, because we can’t.
One of my two favorite phrases from Vince Lombardi is “If we chase perfection, we can catch excellence.” Lombardi did go to daily Mass, so maybe that’s where it came from. Put the two together and it seems to me that the Christian responsibility includes loving your neighbor (and as a former priest of mine put it, sin is against God, your neighbor and yourself), but when we fail to do that (and at some point we will), try again.
This weekend Chevrolet is bringing a 2020 Corvette to Road America in Elkhart Lake.
I’m not going. I have other plans. Although I’ve always enjoyed Road America since the first time I went there in the 1980s (where there are photos of me appearing to break into a Ferrari and I got one of the worst sunburns of my life), I prefer the July vintage event, during one of which I found this:
Chevrolet also released its dealer tour schedule. The C8 is going to make one appearance in Wisconsin, on Sept. 30. (Which, if you consult your 2019 calendar, is on a Monday.) It will make two in Illinois, and one in Iowa.
The color I would like …
… isn’t offered, of course.
Readers know that I have been skeptical about this Corvette, largely because of its lack of a manual transmission, which is a basic piece of any sports car. The rear/mid-engine placement is also an application of new technology from a company with a historical difficulty in bringing new tech to the public in a form that works as intended all the time.
It has been reported repeatedly that Zora Arkus-Duntov, stepfather of the Corvette (he didn’t create the Corvette, Harley Earl did, but Duntov wrote a detailed letter to GM chronicling everything wrong with the first Corvette, and so GM hired him), thought the Corvette should be mid-engine. (Which the Corvette actually has been for several years. A mid-engine car has its engine either behind the front wheels or ahead of the back wheels. Duntov sought a rear/mid-engine instead of a front/mid-engine.)
Well, with all due respect to Duntov, and not being an automotive engineer myself, I wonder how many rear/mid-engine cars he actually used on a daily basis, or got a dealer to fix, or tried to fix without having actual automotive engineering skills. Those people, not car engineers, are the owners of Corvettes.
The design of the Sting Ray had been the source of many clashes between Bill Mitchell and Zora Arkus-Duntov. Duntov was contemptuous of the car’s nonfunctional styling gimmicks and poor aerodynamics; the C2 had low drag, but an alarming amount of high-speed lift. Duntov was only an engineer, however, while Mitchell was a vice president of one of GM’s most powerful departments. Although Mitchell never enjoyed the almost unquestionable clout of his predecessor, who had had the patronage of GM chairman Alfred P. Sloan, GM’s senior management was well aware that Mitchell’s work was responsible for a great deal of GM’s market domination. In a clash between Duntov and Mitchell, the victor was inevitable.
Duntov wanted the Corvette Sting Ray’s replacement, which originally was slated to appear for the 1967 model year, to be smaller, leaner, and more aerodynamic, ideally with a rear- or mid-mounted engine. Mitchell, for his part, loved to make cars look aerodynamic, but he wasn’t terribly concerned if they actually were or not.
Like Harley Earl before him, Mitchell was a believer in the formula of longer-lower-wider, and he felt sports cars should have long hoods. He was no fan of the rear-engine layout that Duntov wanted, which he thought would be ugly. Mitchell envisioned the third-generation Corvette more like the XP-755 show car, known as Mako Shark.
Contemporary automotive journalists sneered at the many gimmicks of the Mako Shark and its successor, the 1965 Mako Shark II, both of which were the work of stylist Larry Shinoda, designer of the Sting Ray. Duntov didn’t care much for it either, but public reaction was favorable and in short order, the Mako Shark was approved as the basis of the third-generation C3 Corvette.
As for Duntov’s desired mechanical changes, GM senior management had no stomach for an expensive revamp of the Sting Ray platform. With Corvette sales on the upswing, there seemed to be no reason to mess with success.
A repair guy figured out a problem about the engine’s location:
That silly line of buttons down the center console. In person, it’s not nearly as awkward or intrusive as we thought from the photos—it actually looks kind of slick. That is, until you look more closely at the plasticky, cheap-looking buttons that fill it: They’re straight from the corporate parts bin. We understand why, but we can’t say we like it.
No manual transmission option. Yes, we know hardly anyone would buy a manual version. Ain’t care.
The rear end in general. We’re no purists (no specific number of taillamps, or their shape, is essential, for example) but we know a hot mess when we see it. Our design editor feels the same way.
The forthcoming bench racing. The Corvette’s price-to-performance ratio is going to spawn a whole generation’s worth of “just get a Corvette instead of X” posts on every forum we read, and likewise letters to the Automobile editors.
The wait. We still have months and months before we drive it, and before it goes on sale.
Best Things About the C8 Corvette
It’s less than $60,000! That’s Supra money for what is likely to be McLaren 570S-like performance. Even if “less than” means “$59,999” and comes before destination charges, it’s still something special.
Zero to 60 mph takes less than three seconds with the Z51 package and performance exhaust. That’s the best kind of crazy. Did we mention the price for this level of performance?
The engine and transaxle are super, super low in the car. This will certainly aid in handling.
The fit and finish. While the cars at the unveil we attended were hand-built prototypes, the interior materials’ quality and fit and finish are definitely intended to answer 30 or more years of criticism of the Corvette’s cabin. It’s a shockingly nice place to be—as long as you don’t look too closely at those buttons. Also, it’s available with brown paint.
The small, square steering wheel looks like it will be a joy to use. Plus, it leaves enough room for drivers more than six feet tall and of a certain leg diameter to move around as we attempt to tame Chevy’s mid-engine beast.
I’m not sure I agree with at least three of those five points, two of which are contradictory. The chance someone will drive off with a C8 for less than $60,000 is zero, merely due to GM’s destination and other charges and dealer markups, which will be substantial. That doesn’t include one single option — such as the Z51 option, without which there is no claimed 0–60 time, which itself is a Chevy claim unproven by anyone not employed by GM. So you can have a sub-$60,000 Corvette (except you can’t), or you can go 0–60 in 2.8 seconds (though that remains to be seen), but not both.
If steering wheels worked better in a non-round shape, all cars would have non-round steering wheels. The bottom of the steering wheel was squared off on C6s and C7s, and though I don’t like the look, that might be said to have a function. (Except that I have driven legs-only with round steering wheels for years without mishap.)
I was working hard in 1955 on a C2 planned for 1958, but its advanced rear-transaxle chassis finally achieved production only with the 1997 C5. That layout did reach production in 1977—outside General Motors—with the Porsche 928, created in part by Anatole Lapine, who’d worked with me on the stillborn ’50s C2. I know little about behind-the-scenes projects that might have occurred during the 40 years between my departure from GM in 1957 and the arrival of the C5 but I suspect that there were a lot of exciting and highly feasible—but not fundable—projects. I do know that Zora Arkus-Duntov advocated for mid-engine Corvettes at least 60 years ago, and that he built a mid-engine CERV research single-seater in the Fifties with its small block V-8 behind the driver. So this car has come to market extremely late.
Some 1970s mid-engine GM concept cars were built to show off the Wankel rotary engines GM might have built, but they were not specifically Corvette prototypes in name. Which is too bad, because they were better-looking than this actual C8. I am deeply sorry to be severely disappointed by the styling of the C8. I hoped for something really new and exciting, not a boringly generic supercar, mostly indistinguishable from the many and varied unimaginative devices that show up regularly at the Geneva auto show. Its styling is confused—and downright messy in fact. I count a dozen horizontal lines, not to mention four convoluted taillights; four nice rectangular exhaust tips; plus varied slots, vents, grilles, indented surfaces, and wing elements . . . just across the rear fascia. The front is no better, and the profile with its short, stumpy nose is equally surprising. Maybe it’s all meant to look purposeful, but to me it seems just a careless, cluttered graphic composition, not worthy of Corvette history and what we expect of this technically brilliant descendant of the Jaguar-inspired elegant original C1 from 1953.
I have no doubt that this will be a very good car, with truly world class performance coupled with American-style daily usefulness and (perhaps) easy servicing—dry-sump engines are not typical dealer shop fare. But I’d have liked to see some traces of the Astrovette or the four-rotor mid-engine concept from the Bill Mitchell era.
That would be one of these:
Compare and contrast previous Corvettes to the C8 in this magnificent illustration by Paco Ibarra:
The problem with nearly every rear/mid-engine car I have ever seen is that there is usually more car behind the B-pillar (behind the door) than in front of the A-pillar (ahead of the door), which makes it look imbalanced in the wrong direction. As it is, nothing about this C8 screams Corvette to me; it looks like a teenage kid’s dream of a mid-engine car that could be made by anybody.
Another point made elsewhere is that GM is coming out with an exotic car supposed to make people forget about Ferraris and Porsches and Lamborghinis (oh my!), and yet it has the same engine the C7 has — a naturally aspirated overhead-valve V-8 with two valves per cylinder. It is a very good overhead-valve V-8, and it wouldn’t stop me from buying a Corvette, but it seems illogical to feel the need to make it mid-engine with an exotic dual-clutch transmission without, say, a four-valve overhead-cam V-8 similar to the “King of the Hill” C4’s. Anyone snobbish enough to turn up his nose at a front-engine Corvette isn’t going to be more convinced by a mid-20th-century engine design that lacks the exotica of whatever Ferrari is sticking under its hoods now. (Or by an exotic transmission installed in part because of the laziness or inability of potential buyers to shift and use a clutch.)
You might say that the C7 engine is terrific, and it is. You might also recall my earlier point about unproven GM tech. But the supposed point here is to make the Corvette appeal to those who wouldn’t buy Corvettes previously because they’re not supercarish enough (independent of the most important consideration, performance vs. price), and on that score it fails, because it is still a Chevrolet, still a Corvette, and still not a car with a 21st-century engine made of unobtainium. And in the process, GM alienated all the Corvette fans who wanted a better iteration of the previous formula (front-mid-engine, rear-drive, available manual transmission), a formula that produced one of the few profitable cars GM makes.
The worst thing about the C8 actually has nothing to do with the car, and everything to do with people’s reactions to the car. One expects GM to shift the hype machine into overdrive. But one would hope adults, particularly journalists, would be at least somewhat resistant to it. The aforementioned writing is about all I could find from the auto-enthusiast publications that is remotely critical of the C8.
In 1968 Car & Driver tested the first C3 Corvette and pronounced it undrivable because it was put together so poorly. Even after GM figured out how to put it together correctly, auto magazines pointed out, correctly, that the C3 was simultaneously a bigger car than the C2 and one with less passenger and luggage space. Road & Track was particularly critical of the Corvette for decades, perhaps concluding it should have been more like a Jaguar E-Type (while ignoring British cars’ hideous reputation for quality). Dissing the home-team product wasn’t necessarily easy to do given GM’s advertising dollars. Now apparently they’re all sellouts.
The bigger issue, though, is that reaction to this new Corvette mirrors everything else in the sewer of our public discourse, on politics, sports teams, music preferences, what you watch (or don’t) on TV including iterations of “Star Trek,” food choices and everywhere else. We are supposed to believe, according to its uncritical fanboys, that the C8 is better than sex, chocolate chip cookies, sunny summer days and puppies, and how dare anyone express a contrary opinion.
I have read accusations that those who are not unalloyed fans of the C8 are Neanderthals stuck in the last century who can’t afford to buy one anyway, because insulting someone for their different opinion is so effective in changing opinions. (Not.) Someone actually bothered to create a Corvette owner stereotype that skipped past the usual midlife crisis trope to specifically include not gold chains and bad combovers, but jean shorts and white New Balance shoes.
Certainly, except possibly for the C2, every generation has been controversial for those who believe no Corvette but their favorite is really a Corvette. The C3 was way out there in appearance compared with the C2. The C4 had two horrible-looking instrument panels and was hard to get into and out of. The C5 looked blah. The C6 dumped the hidden headlights. The C7 got rid of a bunch of gauges and looked like a rearward-stretched C6.
For at least the last three generations (plus the King of the Hill C4) the Corvette has, however, been the best performance bargain on the planet, regardless of whether front-engine and rear-drive is the apotheosis of vehicular technology. GM, which has proven less than competent at big technological risks, has taken another one by selling its halo car — which has made money for GM for decades, unlike most of its current cars — with technology GM hasn’t used before and hasn’t adequately tested before it hits the market next year (there is no substitute for the real world), in a quest for buyers who don’t own Corvettes because, in their misguided opinion, Corvettes lack panache.
GM’s claim that they’re almost sold out needs a reminder that GM has not sold a single C8 Corvette. Not one. (I am highly skeptical of all the online claims of people ordering them. I could state that I own one of every generation Corvette, and no one reading this could prove otherwise.) And until they’re actually on the road, none of GM’s claims about the Corvette have proof.
GM has been one of the poorer-run megacorporations for decades. (The conditions that resulted in the GM bailout far predated the Great Recession.) So maybe I shouldn’t suggest that GM could have kept building the C7, or updated it, while also selling the C8 as the Corvette Zora or something like that. The C7 makes money for GM. There is no guarantee the C8 will, and if it goes away, so will Corvette.
My position on cars and driving has always been that driving represents transportation freedom — the ability to go where you want to go when you want to go.
That cannot be said about any other form of transportation, including airplanes, trains and mass transit.
There is another thing about driving, though, noted in The Shop:
Countless millions of Americans find relief from their over-connected, stressed-out lives in the simple pleasures of yoga and meditation.
Then there are car lovers.
“What I remember most are those precious times I fired up my car with no particular place to go and no precise timetable, owing my punctuality to no one and my presence only to myself,” auto journalist Jack Baruth writes in a new book on the relationship many Americans feel between the cars they love and their peace of mind.
The book, titled Never Stop Driving: A Better Life Behind the Wheel, features essays and musings on the driving life by some of the nation’s leading automotive journalists and an array of celebrity car fans, including Jay Leno, Mario Andretti, Patrick Dempsey and others.
Why this book now?
“The book is essentially a love letter to the art and act of driving,” said Larry Webster, the editor and lead author of the book. “With driverless cars on the horizon, it’s worth celebrating the fact that, for many people, there are enormous benefits to simply taking a drive in the country or getting dirty under the hood.”
Packed with photos that complement the writing, Never Stop Driving: A Better Life Behind the Wheel is available through The Shop by Hagerty and via retailers nationwide. All proceeds from books purchased through The Shop by Hagerty will fund driver’s education scholarships for young drivers through Hagerty’s License to the Future initiative.
The company’s stated ongoing mission is to Save Driving in the coming age of autonomy and make sure that people who choose to continue to drive themselves always have a share of the road.
“People who love cars aren’t against driverless cars—far from it. They’re going to do a lot of good for society,” Webster said. “But we do want to protect something that also means a lot, and that’s driving yourself when you want to. I hope we never lose that. That’s what the Save Driving campaign is all about.”
Democratic candidates for president, in their impressive expansiveness, are promising free college. Some limit their proposals to community colleges, others to state-run schools, and a few, going for broke, want also to forgive student debt for private-college tuition. Since no realm of American life has undergone greater inflation in recent decades than higher education, this is no piddling promise. The cost to taxpayers could be in the trillions, though the prospect would please a nephew of mine who this autumn is sending a son to Dartmouth at the annual price of $76,000.
If government is going to pay for college, at least it ought to try to bring down the cost. I taught at a university for 30 years and have a few suggestions. Start at the top: I would reduce the salaries of university presidents by, say, 90%. (At the institution where I taught, the president made more than $2 million when last I checked.) I would also evict them from their rent-free mansions and remove their cadres of servants. The contemporary university president, after all, has little or nothing to do with education, but is chiefly occupied with fundraising and public relations. If universities were restaurants, the president would be a maître d’. To encourage their fundraising skills, perhaps they could be paid a small commission on the money they bring into their schools—cash, so to speak, and carry—excepting that on money used to erect more otiose buildings filled with treadmills, computers and condom machines.
The next big cut in the cost of higher education would be in superfluous administrative jobs, for the contemporary university is nothing if not vastly overstaffed. All those assistant provosts for diversity, those associate deans presiding over sensitivity programs, those directors for student experience—out, out with them. I would also suggest dispensing with courses that specialize exclusively in victimology, the history of victim groups told from the point of view of the victims. Young men and women do not need reinforcement in their already mistaken belief that they are victims because of their skin color, ethnicity or sexuality.
Another place serious money could be saved is college athletics. I’ve read that the highest-paid public employee in most states is the state-university football coach. The school at which I taught is not a state school, but its reasonably successful football coach earned $3.3 million in 2017, ranking him only 32nd among all college football coaches.
Nick Saban, the football coach at the University of Alabama, earns $8.3 million a year. Mike Krzyzewski, the basketball coach at Duke, earns $7 million. The argument for these astonishing figures is that football at Alabama and basketball at Duke more than pay for themselves. The Alabama football “program,” as they like to refer to this most brutal of sports, with its postseason games and television fees, brings in nearly $100 million a year. Duke’s perpetually winning basketball teams doubtless result in more student applications and alumni donations.
Under pure capitalism, Messrs. Saban and Krzyzewski might be said to earn their pay. But if higher education is to be free, as Bernie Sanders and Elizabeth Warren would have it, we are no longer talking about capitalism. Coaches’ salaries could be greatly reduced and the money earned by college sports—which means chiefly football and basketball—would need to be turned over to the federal government to help pay the cost of education itself.
Which brings us to the faculty. Faculty jobs in American universities have risen well in excess of any visible improvement in the quality of university teachers: $200,000-a-year-or-more professorships are now not uncommon. When I began teaching in my mid-30s, an older friend, long resident at the same university, said to me, “Welcome to the racket.” What he meant is that I would be getting a full-time salary for what was essentially a six-month job, and without ever having to put in an eight-hour day. At the tonier universities, professors in the humanities and social sciences might teach as few as three or four courses a year, the remainder of their time supposedly devoted to research. Like the man said, a sweet racket.
Under free higher education, perhaps it would make sense to pay university teachers by the hour, with raises in the wage awarded by seniority. Surely they could not complain. After all, the two most common comments (some would say the two biggest lies) about university teaching are, “I learn so much from my students” and “It’s so inspiring, I’d do it for nothing.” A strict hourly wage for teachers, as free university education may require, would nicely test the validity of that second proposition.
Free higher education—what a splendid ring it has, sufficient tintinnabulation to cause one to forget the old axiom that you get what you pay for.
With excessive heat warnings and heat advisories in southern Wisconsin today and Saturday, surely Penelope Green of the New York Times knows better than we do:
Modernity was born 116 years, 11 months, two weeks and two days ago, at a printing plant in the East Williamsburg section of Brooklyn, when a junior engineer named Willis Carrier devised a contraption that blew air over water-filled pipes to dry out the humidity that was gumming up the pages of a humor magazine called Judge.
And in that moment (well, within a few decades), entire industries and geographies were transformed, and new technologies made possible, including, terribly, the internet: Without cooling, there would be no server farms.
Nearly 90 percent of American households now have some form of air-conditioning, more than any other country in the world except Japan, though that will change as global warming alters more temperate zones, and swelling populations and rising incomes in hot zones mean the folks there will clamor for AC, too.
On an overheated planet, air-conditioning becomes more and more desirable, solving in the short term the problem it helped create.
I was running today and was streaming iHeart radio on my phone and heard a statistic I had to check out.
There was a discussion of how our society and culture have lost the ability to laugh, and it was suggested that one measure of that could be the share of box-office revenue earned by comedy movies versus all other types.
I didn’t hear the number they quoted, so I went to look for myself and dug up some interesting numbers.
The 2018 figure was 7.24%.
Versus 12.5% in 2016, 13.5% in 2008 and 19.6% in 2000.
Of course, some of the action and adventure movies had comedy in them, but by the strict definition, 92.76% of 2018 box-office revenue came from movies NOT classified as comedies.
And it looks like America is about 63% less humorous than we were in 2000. In 18 years, we have lost nearly two-thirds of our funny.
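For the curious, here is a quick sanity check on that arithmetic, using only the percentages quoted above (the year-to-figure pairings are as reported; I haven’t independently verified the underlying box-office data):

```python
# Comedy's share of annual box-office revenue (%), as quoted above.
comedy_share = {2000: 19.6, 2008: 13.5, 2016: 12.5, 2018: 7.24}

# Relative decline from 2000 to 2018.
decline = (comedy_share[2000] - comedy_share[2018]) / comedy_share[2000]
print(f"Relative decline since 2000: {decline:.0%}")  # about 63%
```

(19.6 − 7.24) ÷ 19.6 works out to roughly 63%, which is where the “nearly two-thirds” comes from.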
I was surprised but not surprised – surprised that the number was that low but not really after looking at the comedy movies that were released in 2018.
They all pretty much sucked.
The leading revenue generator was “Night School” with Kevin Hart.
Given that there are almost no TV comedies that are even mildly interesting or funny (at least none that don’t bash hetero men, feature a stereotypical gay character or push a political agenda), and that SNL is truly awful in so many ways, there is a distinct lack of comedic production in the United States.
I didn’t think a lot about how to quantify it but I could feel it. Back about 6 months or so ago, I wrote what follows, titled “What Happened to the Happy?”:
“Over the past couple of weeks, a few random thoughts and observations have been ricocheting around in my quite spacious empty skull like a marble in an empty paint can.
– If America is not to be allowed to judge the cultures of others, then other cultures are not to be allowed to judge America.
– If you think policies pursued by former presidents are now bad because they are pursued by the current president, the problem isn’t the current president, it’s you.
– America seems to be losing its sense of humor, and while it is appropriate to be serious about truly serious things, what many in America consider serious are ridiculous. It seems a minority of our country believes they have a solution and spend all their waking hours looking for problems that solution can solve…and in the process, making most Americans 100% miserable.
To me, the first two are sort of basic logic and reason. It’s the last one that really bothers me – and at the risk of a double entendre – it’s not even funny.
Losing our sense of humor is something that seems unusual in American history. One of the interesting aspects of the most difficult and dangerous times in our history, wars, and particularly WWII, is that they gave rise to great comedians, actors and musicians: Bob Hope, George Burns, Red Buttons, Red Skelton, Jack Benny, Glenn Miller, Tommy Dorsey, Artie Shaw, the Andrews Sisters, Vera Lynn, Bing Crosby, Danny Kaye, Richard Burton, Kirk Douglas, Clark Gable, Audrey Hepburn, Jimmy Stewart, and Ronald Reagan, to name a few…
Our entertainment industry has bought into the idea that they need to push the postmodernist agenda…and therein lies the problem. If everything is serious enough to be an issue, then nothing is funny. Many established comedians have stopped playing college venues for exactly this reason, and as a result, the comedy institutions are producing young comedians who just aren’t funny, at least not to the majority of America.
Where are people like the original SNL cast, “The Not Ready For Prime-Time Players” – Laraine Newman, John Belushi, Jane Curtin, Gilda Radner, Dan Aykroyd, Garrett Morris, and Chevy Chase (even though Chevy Chase has turned into a bitter old man) or the original SCTV cast – John Candy, Joe Flaherty, Eugene Levy, Andrea Martin, Catherine O’Hara, Harold Ramis, and Dave Thomas? Where are comedians like Cheech and Chong, Richard Pryor, and Eddie Murphy?
Where is the new Mel Brooks? For Christ’s sake, this man made arguably the funniest movie of all time, “Blazing Saddles”, a movie based on lampooning racial stereotypes. He even made a movie called “The Producers”, whose central plot revolved around putting on a Broadway musical titled “Springtime for Hitler”.
Can you even imagine a film like “Blazing Saddles” getting greenlit by Hollywood today?
As I told one of my kids, when you succumb to the postmodernist idea that there are no objective standards, that truth is relative and that opinions are equal in weight to facts, it should come as no surprise that people will be offended by anything, and that everything can be construed to fit any narrative. It just so happens that most of the narratives today are negative and designed to punish.
This is not to say America doesn’t have serious problems – it does – but the attention given to issues created by the social justice postmodernists takes time away from working on the real ones, and without humor, the relief needed to deal with true seriousness is missing.
What we need is a healthy dose of MAFA – Make America Funny Again.”
It seems the quantification agrees with the feeling I had last year.