Category: Culture

Catholics and being catholic, or not

It may surprise some people who pay attention to such things that there are members of the Roman Catholic Church who are not necessarily fans of Pope Francis.

You might be able to tell from a blog’s naming the pope “Chaos Frank” that the Novus Ordo Watch is not part of the Franciscan Fan Club:

Every day we are being drowned in news about “Pope” Francis and the Vatican machinery. The incessant flood of information is becoming increasingly difficult for everyone to process, which means it is easy for stories to get missed.

Such was apparently the case with a real bombshell Francis dropped on February 26, 2017 while visiting an Anglican parish church in Rome. Virtually everyone seems to have missed it. What happened? During a Q&A session in which Francis was answering people’s questions off the cuff, he related an anecdote about ecumenical practice with Anglicans in his homeland of Argentina.

Have a look at what Francis said, and don’t forget to close your mouth afterwards:

And then, there is my experience. I was very friendly with the Anglicans at Buenos Aires, because the back of the parish of Merced was connected with the Anglican Cathedral. I was very friendly with Bishop Gregory Venables, very friendly. But there’s another experience: In the north of Argentina there are the Anglican missions with the aborigines, and the Anglican Bishop and the Catholic Bishop there work together and teach. And when people can’t go on Sunday to the Catholic celebration they go to the Anglican, and the Anglicans go to the Catholic, because they don’t want to spend Sunday without a celebration; and they work together. And here [at the Vatican], the Congregation for the Doctrine of the Faith knows this. And they engage in charity together. And the two Bishops are friends and the two communities are friends.

I think this is a richness [treasure] that our young Churches can bring to Europe and to the Churches that have a great tradition. And they give to us the solidity of a very, very well cared for and very thought out tradition. It’s true, — ecumenism in young Churches is easier. It’s true. But I believe that – and I return to the second question – ecumenism is perhaps more solid in theological research in a more mature Church, older in research, in the study of history, of Theology, of the Liturgy, as the Church in Europe is. And I think it would do us good, to both Churches: from here, from Europe to send some seminarians to have pastoral experience in the young Churches, so much is learned. We know [that] they come, from the young Churches, to study at Rome, at least the Catholics [do]. But to send them to see, to learn from the young Churches would be a great richness in the sense you said. Ecumenism is easier there, it’s easier, something that does not mean [it’s] more superficial, no, no, it’s not superficial. They don’t negotiate the faith and [their] identity. In the north of Argentina, an aborigine says to you: “I’m Anglican.” But the bishop is not here, the Pastor is not here, the Reverend is not here . . . “I want to praise God on Sunday and so I go to the Catholic Cathedral,” and vice versa. They are riches of the young Churches. I don’t know, this is what comes to me to say to you.

(“Pope’s Q & A at Anglican All Saints Church”, Zenit, Feb. 27, 2017; underlining added. Original Italian at Vatican web site here.)

Wow. Anglicans worship with “Catholics” and “Catholics” with Anglicans because they “want a celebration”, as though sacred worship were about them and not about God primarily. (To see what God thinks of unauthorized worship, even if not heretical, have a look at the demise of Core in Numbers 16; cf. Jude 11.)

Does Francis condemn this practice? Does he denounce it as offensive to God, dangerous, and favoring the heresy of indifferentism? Of course not. No, it is clear from the words, the context, and the absence of a condemnation that he is effectively endorsing it, using it as an example of ecumenically “working together”, which he calls a “richness” (or “treasure”) that churches in Latin America can give to Europe! The man is an indifferentist and a Modernist through and through. This should make it even more clear now why Francis couldn’t have had the slightest bit of a problem with the Anglican evensong service that was recently performed in the Vatican’s St. Peter’s Basilica. …

Notice also that he speaks of “church” and “churches” entirely without qualification, refusing to distinguish the true Church from Protestant sects. He does not have the Catholic Faith, which is why he cannot possibly be the “rock” on which Jesus Christ built His one and only true Church, “the pillar and ground of the truth” (1 Tim 3:15; cf. Mt 16:18-19) — the rock whose purpose is to confirm the brethren in the faith (cf. Lk 22:32), and who will never himself suffer shipwreck in it:

This gift of truth and never-failing faith was therefore divinely conferred on Peter and his successors in this see so that they might discharge their exalted office for the salvation of all, and so that the whole flock of Christ might be kept away by them from the poisonous food of error and be nourished with the sustenance of heavenly doctrine. Thus the tendency to schism is removed and the whole church is preserved in unity, and, resting on its foundation, can stand firm against the gates of hell.

(Vatican Council, Dogmatic Constitution Pastor Aeternus, Ch. 4; underlining added.)

By the way: In 1868, Pope Pius IX had something to say about the true Church of Christ versus the false churches of the Protestants:

Now, whoever will carefully examine and reflect upon the condition of the various religious societies, divided among themselves, and separated from the Catholic Church, which, from the days of our Lord Jesus Christ and his Apostles has never ceased to exercise, by its lawful pastors, and still continues to exercise, the divine power committed to it by this same Lord; cannot fail to satisfy himself that neither any one of these societies by itself, nor all of them together, can in any manner constitute and be that One Catholic Church which Christ our Lord built, and established, and willed should continue; and that they cannot in any way be said to be branches or parts of that Church, since they are visibly cut off from Catholic unity. For, whereas such societies are destitute of that living authority established by God, which especially teaches men what is of Faith, and what the rule of morals, and directs and guides them in all those things which pertain to eternal salvation, so they have continually varied in their doctrines, and this change and variation is ceaselessly going on among them. Every one must perfectly understand, and clearly and evidently see, that such a state of things is directly opposed to the nature of the Church instituted by our Lord Jesus Christ; for in that Church truth must always continue firm and ever inaccessible to all change, as a deposit given to that Church to be guarded in its integrity, for the guardianship of which the presence and aid of the Holy Ghost have been promised to the Church for ever. No one, moreover, can be ignorant that from these discordant doctrines and opinions social schisms have arisen, and that these again have given birth to sects and communions without number, which spread themselves continually, to the increasing injury of Christian and civil society.

(Pope Pius IX, Apostolic Letter Iam Vos Omnes)

A few years prior, the Holy Office under the same Pope had written a letter to the Puseyite Anglicans and reminded them that “all groups entirely separated from external and visible communion with and obedience to the Roman Pontiff cannot be the Church of Christ, nor in any way whatsoever can they belong to the Church of Christ” (Instruction Ad Quosdam Puseistas Anglicos, Nov. 8, 1865; italics added). So much for the Vatican II doctrine of “ecclesial elements” and “imperfect communion” that supposedly exists between the Church of God and the sects of man — but that’s another issue.

Assisting at the liturgical services of non-Catholics is a mortal sin and makes anyone who does so suspect of heresy. This is clear from the Church’s Code of Canon Law (1917) and her moral theology:

It is not licit for the faithful by any manner to assist actively or to have a part in the sacred [rites] of non-Catholics.

(Canon 1258 §1)

Whoever in any manner willingly and knowingly helps in the promulgation of heresy, or who communicates in things divine [=assists at sacred rites] with heretics against the prescription of Canon 1258, is suspected of heresy.

(Canon 2316)

It is unlawful for Catholics in any way to assist actively at or take part in the worship of non-Catholics (Canon 1258). Such assistance is intrinsically and gravely evil; for (a) if the worship is non-Catholic in its form (e.g., Mohammedan ablutions, the Jewish paschal meal, revivalistic “hitting the trail,” the right hand of fellowship, etc.), it expresses a belief in the false creed symbolized; (b) if the worship is Catholic in form, but is under the auspices of a non-Catholic body (e.g., Baptism as administered by a Protestant minister, or Mass as celebrated by a schismatical priest), it expresses either faith in a false religious body or rebellion against the true Church.

(Rev. John A. McHugh, O.P. & Rev. Charles J. Callan, O.P., Moral Theology: A Complete Course Based on St. Thomas Aquinas and the Best Modern Authorities, vol. I [New York, NY: Joseph F. Wagner, 1958], n. 964)

The Catholic prohibition against worship with non-Catholics is clear, then, both from a legal-canonical as well as a moral perspective.

In 1948, this prohibition was underscored once more through a canonical warning issued by the Holy Office specifically in the context of a rising interest in ecumenical (ha!) religious gatherings, which for Catholics were (and still are) strictly forbidden:

Mixed gatherings of non-Catholics with Catholics have been reportedly held in various places, where things pertaining to the Faith have been discussed against the prescriptions of the Sacred Canons and without previous permission of the Holy See. Therefore all are reminded that according to the norm of Canon 1325 § 3 laypeople as well as clerics both secular and regular are forbidden to attend these gatherings without the aforesaid permission. It is however much less licit for Catholics to summon and institute such gatherings. Let therefore Ordinaries urge all to observe these prescriptions accurately.

These are to be observed with even stronger force of law when it comes to gatherings called “ecumenical”, which laypeople and clerics may not attend at all without previous consent of the Holy See.

Moreover, since acts of mixed worship have also not rarely taken place both within and without the aforesaid gatherings, all are once more warned that any communication in sacred affairs is totally forbidden according to the norm of Canons 1258 and 731, § 2.

(Holy Office, Decree Cum Compertum)

In the case of Francis’ practical endorsement of Anglican worship, there is more to it than a “mere” participation in false worship, however, because not only is the worship of Anglicans heretical, schismatic, and unauthorized, and therefore objectively odious in God’s sight (cf. Jn 4:24; Jude 11; Num 16), but any Anglican “Masses” are also invalid because all ordinations performed by the Church of England are “absolutely null and utterly void”, as declared by Pope Leo XIII in 1896:

Wherefore, strictly adhering, in this matter, to the decrees of the pontiffs, our predecessors, and confirming them most fully, and, as it were, renewing them by our authority, of our own initiative and certain knowledge, we pronounce and declare that ordinations carried out according to the Anglican rite have been, and are, absolutely null and utterly void.

(Pope Leo XIII, Bull Apostolicae Curae, n. 36)

Thus, Anglican “priests” are nothing but mere laymen dressed in fancy clerical robes. (The same theological principles which prove Anglican orders invalid, by the way, also prove Novus Ordo ordinations [after 1968] invalid.)

Pope Leo’s pronouncement, we might add, is considered infallible:

It belongs to a class of ex cathedra utterances for which infallibility is claimed on the ground, not indeed of the terms of the Vatican definition, but of the constant practice of the Holy See, the consentient teaching of the theologians, as well as of the clearest deductions from the principles of faith.

(The Catholic Encyclopedia, s.v. “Anglican Orders”)

For all intents and purposes, then, Francis has endorsed active participation in non-Catholic, heretical, schismatic, and even invalid liturgical rites, for he has told his followers that assistance at an Anglican “Mass” is not objectionable but praiseworthy, and is licitly done at least whenever (what he considers to be) a Catholic Mass is not available.

Here we see once again that the real news is much more absurd than any fake news ever could be. You just can’t make this stuff up!

Indeed. So as someone raised Catholic who is now a member of the Episcopal Church, I am a heretic? Cool! (Of course, as a journalist I am a heretic anyway and undoubtedly going to Hell. So I’ve got that going for me too.)

The author of that hate-filled screed is not the reason I left the Catholic Church, but this writer certainly validates my departure. (Along with a certain bishop and his supporters.) I have my differences with the extreme liberalism of the Episcopal Church, but as someone given a brain and free will by God I would be no happier in today’s Roman Catholic Church, which is not really the church I was raised in.

It should be pointed out that there is no Biblical justification for papal infallibility, and that any church document is subordinate to the actual Word of God, which is spelled out quite clearly in the Gospels: (1) Love God, (2) love your neighbor as yourself. The writer may want to familiarize himself or herself with the second Great Commandment.

 

 

The alt-right is the left’s fault

In the same way that Barack Obama caused Donald Trump’s presidency, Owen Strachan identifies the root cause of the alt-right:

Various journalists have helped form a narrative of sorts about the identity of this shadowy, boisterous alt-right movement. The alt-right is childish and vicious, full of sound and fury, signifying nothing other than the message-board histrionics of aggrieved young men in their parents’ basement.

From what I can see, this narrative does apply to a degree. Where various alt-right voices have articulated ethnocentrism, outright racism, misogyny, decadence, and a kind of juvenile hatred, among other vile stances, we should offer condemnation in no uncertain terms.

I do wonder, however, if the media has missed at least one true thing regarding the “alt-right.” The movement (if we can call it that) may often prove inchoate and even inarticulate, but behind the memes and coded language, there seems to be a massed sentiment. It is this: men feel left behind.

America is divided today on this matter and its import. Many folks, particularly those of a more progressive bent, see men as whining over lost cultural capital. Once, men had it good; now they’re forced to compete on an even playing field, and they’re falling on their faces. Sorry for the stacked deck, guys—how does it feel, losers?

Others see men struggling, observe them falling precipitously behind in earning college degrees and other achievements such as earnings for unmarrieds, watch them leaving their wives and children then violently lashing out, and begin to wonder if men need something besides elaborate gender theory or a dismissive long-form hot-take. Maybe men, particularly young men, need help.

This second group does not wish to cut men a blank check for their ill behavior. Actually, this group—a diverse and motley crew of religious groups, libertarians, and people who care about the future of civilization—wishes to hold men to a high standard. In other words, this is the group that most wants to hold men to account, that most takes their failings seriously. It is not this group that dismisses men’s concerns with gentle remonstrance, that accommodates men by dumbing things down for them, and that unwittingly ends up doing them terrific harm.

Because it is not friendly to them, many men do not like postmodern society. They have been taught they have no innate call to leadership of home and church, and accordingly have lost the script for their lives. They have been encouraged to step back from being a breadwinner, and do not know what they are supposed to do with their lives.

They have been told that they talk too loudly and spread their legs too wide, and thus do not fit in with a feminized society. They may be the product of a divorced home, and may have grown up without an engaged father, so possess both pent-up rage and a disappearing instinct. They did nothing to choose their biological manliness, but are instructed to attend sensitivity training by virtue of it. They recognize—rightly—that politically correct culture constrains free thought and free speech, and so they opt out from it.

But here is where the common narrative of the alt-right and related groups makes a major mistake. Men are disappearing, but they are not vanishing. They are moving out of the mainstream, and into the shadows.

Many men do not want this. Many men do not want to fall back. Many men want a challenge. They want to work. It is not in their nature to sit back; men on average have 1,000 percent more testosterone than women. Men know they are not superheroes, but they watch superhero movies because they wish in the quietness of their own lives to be a hero to someone, even just one wife and a few children. Men have a “glory hunger” that is unique and in many cases undeniable. For the right cause, men are not only willing to sacrifice and even die; they are glad to die.

But such discussion is not the lingua franca of our day. Young men have these desires coursing through their blood, but very few outlets in normal American life help them to understand such hard-wired drives. Those voices who do offer such a view face tremendous pushback and retributive hostility.

As a result, many younger men today do not know how to voice their instincts. This is at least partly why so many have adopted ironic signifiers for their frustrated ambitions and impolitic views—frogs, memes, and catchwords like “fail.” What young men cannot say in plain speech they say through an ironic graphic.

It is easy, and right, to identify where aspects of the alt-right are plainly misogynistic. But tying an entire people group to its worst excesses allows for the full-scale dismissal of a diverse array of concerns and experiences. This has happened with Donald Trump’s voters, for example; according to many journalists, they’re all either racist or angry about the loss of the halcyon days. The media executes the same lazy move with the angry young men of the alt-right: they’re idiotic little boys. We have nothing to hear from them, nothing to learn, nothing to consider.

This is a foolish instinct. But it is not only that: it is a dangerous one. It leaves you susceptible to groundswells that sweep over a culture seemingly without warning—the Tea Party, Brexit, Trump. Many folks on the progressive side assume that because they have won the college campus and now dominate the urban centers of power, the cultural game is over.

But what looks like a fortress-grade progressive order is really an unstable element, as we have seen several times over. The ideological insurgency will never have Ivy League degrees to award, coveted Beltway bylines to dole out, or global-power conference invites to issue. But the insurgency is finding its audience, and the audience is destabilizing and even remaking the public square, and all without central coordination or control of leading cultural institutions. …

We can debate the extent to which the perceptions of angry young men are reality. What we cannot debate—if we care about them, that is—is that many men are angry, flailing, and dangerously volatile today.

We will not find an easy solution to this troubled situation. The public square is roiled and shows no signs of calming down soon. True, restoring the family will greatly aid in the nurture and care of young men. Sure, strengthening the economy and putting men to work will help. Yes, tabling the speech codes and thought codes of the secular academy will bring some men back to the table.

But men need a deeper solution than this. They need something more than a message-board movement to join. They need a call to maturity, to repentance, to greatness, to leadership, to courage, to self-sacrifice on behalf of women and children. They need a hero: not a political performance-artist, but a true hero, a savior who, unlike a fallen culture, leaves no repentant man—or woman—behind.

A question for the pulpit today

James Freeman:

Nicholas Kristof of the New York Times is having fun implying that Speaker of the House Paul Ryan’s effort to replace ObamaCare is at odds with Mr. Ryan’s Catholic faith. The column is of a piece with the “Jesus was a socialist” arguments that bounce around the left half of the social media universe.

Without wrestling with any difficult questions of faith or logic, Mr. Kristof simply casts the federal bureaucracy in the role of Jesus. Then the Timesman proceeds to suggest through satire that by seeking to reduce outlays and improve incentives in federal programs, Mr. Ryan is defying the will of his God. Of course if federal agencies were ever actually given the statutory mission to do as Jesus would do, Mr. Kristof would be as horrified as anyone. But this seems to be a political season when people who spend much of the year driving religion out of public life abruptly drag it back in as they attempt to justify big government. It’s not necessarily persuasive.

The ancient book has numerous admonitions to perform charity and various condemnations of greed, but it’s not easy to find a passage in which Jesus says that government is the best vehicle to provide aid, or that anyone should force others to donate.

Even casual readers of the Bible may notice that Jesus doesn’t get along all that well with the political authorities of his time and (spoiler alert!) his relationship with government ends rather badly. Back then, tax collectors were not presumed to be the dedicated public servants that we appreciate so much today. And in our own time, social conservatives who think the U.S. Government has become hostile to religion—Christianity in particular—should consider what Jesus had to put up with.

Fortunately, in part because of the influence of the Bible on America’s founders, we enjoy a form of government that is much more humane than the one that Jesus encountered. This raises the interesting question of what Jesus would say about our contemporary political debates. Perhaps he would gaze approvingly upon the $4 trillion annual federal budget and intone, “Whatever you do to the least of my appropriations, you do to me.” But would he still say that after examining all the line items? Beyond questions of specific allocations for Planned Parenthood and the like, would Jesus see even a relatively benign government like ours as superior to individual acts of charity in comforting the afflicted?

In contrast to Mr. Kristof’s drive-by, John Gehring nicely limns the issues at the heart of this debate in a 2015 piece for the National Catholic Reporter. He notes that the U.S. Conference of Catholic Bishops has condemned GOP attempts at budget-cutting but also mentions that a few church leaders are on Mr. Ryan’s side of the argument.

Mr. Gehring refers to a 2012 lecture Mr. Ryan delivered at Georgetown University. “Simply put, I do not believe that the preferential option for the poor means a preferential option for big government,” said the Speaker. “Look at the results of the government-centered approach to the war on poverty. One in six Americans are in poverty today – the highest rate in a generation. In this war on poverty, poverty is winning.” He concluded that “relying on distant government bureaucracies to lead this effort just hasn’t worked.”

 

Christian capitalists

Today begins the penitential season of Lent in the Christian church. The concept of humans as sinners in need of redemption is as countercultural as it gets in American society.

The Roman Catholic Church (in which I was raised) contains the unusual (for this country) combination of social conservatism (opposition to abortion rights and divorce) and economic liberalism. The latter is a mistake if the church wants to improve the lives of people, according to convert Arthur Brooks:

I fancied myself a social justice warrior and regarded capitalism with a moderately hostile predisposition. I “knew” what everyone knows: Capitalism is great for the rich but terrible for the poor. The natural progression of free enterprise is that the rich and powerful accumulate more and more of the world’s resources while the poor are exploited. That state of affairs might be fine for a follower of Ayn Rand, but it is hardly consistent for a devotee of Our Lady of Guadalupe. Right?

As with most people of my generation, for me the symbol of world poverty was a starving child in Africa. I remember a picture from my childhood—I think it was from National Geographic—of an African boy about my own age. He had a distended belly and flies on his face, and he became for me the human face of true deprivation. As I grew up, I assumed, as do most Americans, that the tragic conditions facing the starving African boy had gotten worse. Today, more than two-thirds of Americans think global poverty has worsened over the past three decades.

This assumption and the attendant beliefs about capitalism hit a snag when I studied economics for the first time. In reality, I learned, humanity has starvation-level poverty on the run. Since 1970, the fraction of the global population that survives on one dollar or less a day (adjusted for inflation) has shrunk by 80 percent. Since 1990, the number of children who die before their fifth birthday has collapsed by more than 50 percent. Life expectancy and literacy rates have steadily climbed.

When faced with suffering, we often ask a conventional question: “Why are some people poor?” But grinding material poverty was the norm for the vast majority of people through the vast majority of human history. Our ancestors had no concept that mass poverty was an acute social problem that cried out for remedies. Deprivation was simply the background condition for everyone.

In just the last few hundred years, that all changed for a few billion people. So the right question today is: “Why did whole parts of the world cease to be poor for the first time in history?” And further: “What can we do to share this ahistorical prosperity with more people?” Economics taught me that two billion of my brothers and sisters had escaped poverty in my own lifetime. This was a modern-day miracle. I had to find its source.

My search for the “why” of this miracle required almost no detective work. Virtually all development economists, across the mainstream political spectrum, agreed on the core explanation. It was not the success of international organizations like the United Nations (as important as they are) nor benevolent foreign aid that pulled billions back from the brink of starvation. Rather, the responsibility lay with five interrelated forces that were in the midst of reshaping the worldwide economy: globalization, free trade, property rights, the rule of law and the culture of entrepreneurship. In short, it was the American free enterprise system, spreading around the world, that had effected this anti-poverty miracle.

Again, this is a mainstream scholarly finding, not some political cliché. Informed people from left to right agree on these basic points. As no less an avowed progressive than President Barack Obama put it in a 2015 public conversation we had together at Georgetown University, the “free market is the greatest producer of wealth in history—it has lifted billions of people out of poverty.”

None of this is to assert that free enterprise is a perfect system—but more on that in a moment. Nor is it to claim that free enterprise is all we need as people. But it has unambiguously improved the lives of billions. It became my view that if I was truly to be a “Matthew 25 Catholic” and live the Lord’s teaching that “whatever you did for one of the least of these brothers and sisters of mine, you did for me,” then my vocation was to defend and improve the system that was achieving this miraculous result.

That is how an unlikely Catholic became an even more unlikely warrior for free enterprise.

My new mission gave meaning to my growing disenchantment with music. I was hungry for work that served vulnerable people more directly. Now I had a roadmap to point me toward that future. I graduated from correspondence college shortly before my 30th birthday. Traditional graduate work in economics followed, and I left music for good to pursue a Ph.D. in policy analysis. That sparked a career as a university professor, teaching economics and social entrepreneurship.

As I taught about the anti-poverty properties of free enterprise, a common objection—especially among my Catholic friends—remained. “Okay,” many said, “I see that markets have pulled up the living standards of billions, and that’s great. But they haven’t pulled people up equally. In fact, capitalism has created more inequality than we have ever seen.” This spawns ancillary concerns about the rich getting richer at the expense of the poor, and the rising inequality of opportunity. My challenge as a Catholic economist was to answer these questions in good faith.

The evidence on income inequality seems to be all around us and irrefutable, particularly in the United States. From 1979 to today, the income won by the “top 1 percent” of Americans has surged by roughly 200 percent, while the bottom four-fifths have seen income growth of only about 40 percent. Today, the share of income that flows to the top 10 percent is higher than it has been at any point since 1928, the peak of the bubble in the Roaring Twenties. And our lackluster “recovery” following the Great Recession likely amplified these long-run trends. Emmanuel Saez, a University of California economist, estimates that 95 percent of all the country’s income growth from 2009 to 2012 wound up in the hands of the top 1 percent.

Taking this evidence on its face, it is easy to conclude that our capitalist system is hopelessly flawed. Digging deeper, however, produces a more textured story.

To begin with, we should remember that inequality is not necessarily a bad thing when the alternative is the equality of grinding poverty, which was the case in previous centuries. Few would prefer a nation of equal paupers to modern-day America. But in any case, the notion that global income inequality has been rising inexorably is incorrect. From 1988 to 2008, a key era in the continued worldwide spread of market systems, economists have shown that the worldwide Gini index—a common measure of inequality—at worst has stayed level and has most likely fallen.
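For readers unfamiliar with the measure Brooks cites, the Gini index summarizes how far an income distribution departs from perfect equality (0) toward complete concentration in one person (1). Here is a minimal sketch of the standard calculation, using invented incomes purely for illustration; nothing below comes from the data referenced in the excerpt:

```python
# Minimal sketch of a Gini index calculation (illustrative only; the
# income lists below are invented, not data cited in the excerpt).
def gini(incomes):
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # Standard formula for sorted values: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

print(gini([10, 10, 10, 10]))   # 0.0  -> perfect equality
print(gini([0, 0, 0, 100]))     # 0.75 -> highly concentrated
```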

The personal is political, consumer edition

Virginia Postrel:

After 20 years, the big Outdoor Retailer trade show is leaving Salt Lake City — not because it ran out of space or got a better deal elsewhere but because Utah lawmakers opposed an expansion of the industry’s biggest federal subsidy.

To most Americans, national parks and monuments are places to enjoy the outdoors while preserving natural and historical treasures. To the Outdoor Industry Association, they’re also a business necessity. It calls public lands “the backbone of the industry’s sales.”

“Utah elected officials do not support public lands conservation nor do they value the economic benefits — $12 billion in consumer spending and 122,000 jobs — that the outdoor recreation industry brings to their state,” Rose Marcario, the president and chief executive of Patagonia, declared in a statement announcing that her company would no longer attend the Salt Lake City show. Other industry leaders, including Polartec LLC and Arc’teryx Equipment Inc., quickly joined in. Then last Thursday, after a conference call with Utah Governor Gary Herbert that ended on a “curt” note, the show’s organizers said they’ll go elsewhere when their contract expires next year. Colorado is campaigning for their business.

At issue is the December designation of 1.35 million acres of federal land as Bears Ears National Monument. How to preserve the area has long been a contentious subject in Utah. President Barack Obama’s late-term action thrilled environmentalists and tribal leaders, upset ranchers and other rural residents, and thwarted oil and mineral development and the blue-collar jobs it might mean. The Republican governor, legislature, and congressional delegation all opposed the designation. In response, the legislature passed a resolution calling on the Trump administration to reverse Obama’s decision. When the governor signed it, calls for the boycott began.

Since it frankly acknowledges that its sales depend on public lands, is the outdoor industry applying a moral veneer to its quest for profits? Or is it using a corporate fig leaf to promote managers’ political views? The two motives are in fact impossible to separate.

When it comes to public lands, the outdoor industry shares the view of pioneering labor leader Samuel Gompers: “We do want more, and when it becomes more, we shall still want more.” As with the union movement, the industry’s financial interests are inextricable from its social values.

As I’ve previously written, the industry is one example of a much larger cultural and economic phenomenon: the shift from function to meaning as a source of economic value and, with it, the melding of consumption, politics and identity. What we buy increasingly expresses who we are.

Brands built on specific political or cultural values will inevitably take public stances, using their economic clout to influence public policy, whether out of genuine conviction, cold-eyed market positioning, or both. It’s not surprising that Patagonia Inc., the outdoor apparel brand most prominently built on its political stances, led the anti-Utah charge.

The bigger question is whether in the Trump era brands that aren’t traditionally political will feel forced to choose sides. Overtly political shopping is on the rise. Every week seems to bring a new boycott: against North Carolina over its bathroom bill, Nordstrom Inc. stores because they carry — or discontinue — Ivanka Trump’s merchandise, Kellogg Company for dropping ads on Breitbart News, Under Armour Inc. for its chief executive’s nice words about Donald Trump, Starbucks Corporation for pledging to hire refugees, and on and on. Wegmans Food Markets Inc. recently sold out of Trump Winery products in Virginia. The reason: calls for the chain to drop the wines, which produced a pro-Trump backlash.

Wine, breakfast cereal, workout clothes and business apparel aren’t inherently political goods. Brand choices may reflect largely unconscious tribal affinities, but they allow some play. You can eat organic food and vote Republican or drive an SUV and vote Democratic. Conservatives can enjoy Meryl Streep and liberals can esteem Clint Eastwood. “Vote right, live left,” an urbanite conservative advised me many years ago. Despite Trump’s frequent attacks on Jeff Bezos, Americans of all stripes like Amazon.com Inc.

Now, however, that pluralism is at risk. We seem headed toward an economy of red brands and blue brands, red employers and blue employers, with no common ground. In this context, the outdoor industry’s action is a disturbing bellwether, as is the increasing partisanship of once-evenhanded fashion magazines like Vogue. Outdoor activity appeals to Americans of all political persuasions, and the country’s western landscape has long helped define the national identity. People can disagree over how best to enjoy and protect that landscape, and how to weigh preservation against other values, while still sharing much in common. Enforcing the party line by declaring an entire state off limits is an extreme step.

Writing with the memory of religious wars, Voltaire in 1733 offered a peaceful alternative. “Go into the London Stock Exchange — a more respectable place than many a court — and you will see representatives from all nations gathered together for the utility of men,” he wrote:

Here Jew, Mohammedan and Christian deal with each other as though they were all of the same faith, and only apply the word infidel to people who go bankrupt. Here the Presbyterian trusts the Anabaptist and the Anglican accepts a promise from the Quaker. On leaving these peaceful and free assemblies some go to the Synagogue and others for a drink, this one goes to be baptized in a great bath in the name of Father, Son and Holy Ghost, that one has his son’s foreskin cut and has some Hebrew words he doesn’t understand mumbled over the child, others go to their church and await the inspiration of God with their hats on, and everybody is happy.

Once the great solvent of difference, commerce threatens to become its enforcer. And everyone is unhappy.

Wisconsinites know we’ve already had that here. Read the list from the Scott Walker Watch website of companies whose employees and/or management committed the unforgivable crime of giving money to Walker’s campaign, including Kwik Trip, Johnsonville Sausage and Georgia-Pacific, owned by The Evil Koch Brothers. At least two advocated boycotting the entire state.

That prompted an Isaac Newton-like “buycott,” in which Walker supporters encouraged one another to buy from companies whose employees and/or management had contributed to Walker’s campaign. Walker has been reelected twice since the boycott attempt, no company on the boycott list has gone out of business as far as I know, and Wisconsin appears to have survived. The political fortunes of those supporting boycotts have sunk like a Chicago Bears football season.

This is nearly all the fault of liberals. The phrase “the personal is political” came from neither conservatives nor libertarians; it came from a feminist, Carol Hanisch. There are a few liberals (this writer and the late Christopher Hitchens, for two) who understand how vapid that assertion is, but in our hyperpolitical times some conservatives are now touting boycotts of New Glarus Brewing, Penzeys Spices, Madison and the musical “Hamilton.”

Part of the problem is that many people don’t grasp that for nearly every business (and I have yet to find one beyond one or two employees for which this is not the case) employee pay (including benefits) far exceeds that business’ profits. So if you think your not purchasing something from a company will hurt the owners, you’re wrong; it will hurt that company’s employees first and foremost. The American Enterprise Institute provides this chart …

… that shows how ignorant Americans are about business.
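To make the arithmetic behind that claim concrete, here is a small illustration with invented numbers; the revenue, payroll share and profit margin below are assumptions for the sake of the example, not figures from the AEI chart:

```python
# Illustrative only: the figures below are invented, not taken from the AEI chart.
revenue = 1_000_000          # annual sales for a hypothetical small firm
payroll = 0.30 * revenue     # wages and benefits, assumed at 30% of sales
profit  = 0.08 * revenue     # net profit, assumed at an 8% margin

print(f"Payroll: ${payroll:,.0f}")                    # $300,000
print(f"Profit:  ${profit:,.0f}")                     # $80,000
print(f"Payroll is {payroll / profit:.1f}x profit")   # about 3.8x
```

Even with a healthy margin, payroll dwarfs the profit line, which is why a boycott’s first casualty tends to be hours and jobs rather than owners’ income.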

 

When things are rotten

Nicholas Eberstadt:

Whatever else it may or may not have accomplished, the 2016 election was a sort of shock therapy for Americans living within what Charles Murray famously termed “the bubble” (the protective barrier of prosperity and self-selected associations that increasingly shield our best and brightest from contact with the rest of their society). The very fact of Trump’s election served as a truth broadcast about a reality that could no longer be denied: Things out there in America are a whole lot different from what you thought. 

Yes, things are very different indeed these days in the “real America” outside the bubble. In fact, things have been going badly wrong in America since the beginning of the 21st century.

It turns out that the year 2000 marks a grim historical milestone of sorts for our nation. For whatever reasons, the Great American Escalator, which had lifted successive generations of Americans to ever higher standards of living and levels of social well-being, broke down around then—and broke down very badly.

The warning lights have been flashing, and the klaxons sounding, for more than a decade and a half. But our pundits and prognosticators and professors and policymakers, ensconced as they generally are deep within the bubble, were for the most part too distant from the distress of the general population to see or hear it. (So much for the vaunted “information era” and “big-data revolution.”) Now that those signals are no longer possible to ignore, it is high time for experts and intellectuals to reacquaint themselves with the country in which they live and to begin the task of describing what has befallen the country in which we have lived since the dawn of the new century.

Consider the condition of the American economy. In some circles people still widely believe, as one recent New York Times business-section article cluelessly insisted before the inauguration, that “Mr. Trump will inherit an economy that is fundamentally solid.” But this is patent nonsense. By now it should be painfully obvious that the U.S. economy has been in the grip of deep dysfunction since the dawn of the new century. And in retrospect, it should also be apparent that America’s strange new economic maladies were almost perfectly designed to set the stage for a populist storm.

Ever since 2000, basic indicators have offered oddly inconsistent readings on America’s economic performance and prospects. It is curious and highly uncharacteristic to find such measures so very far out of alignment with one another. We are witnessing an ominous and growing divergence between three trends that should ordinarily move in tandem: wealth, output, and employment. Depending upon which of these three indicators you choose, America looks to be heading up, down, or more or less nowhere.

From the standpoint of wealth creation, the 21st century is off to a roaring start. By this yardstick, it looks as if Americans have never had it so good and as if the future is full of promise. Between early 2000 and late 2016, the estimated net worth of American households and nonprofit institutions more than doubled, from $44 trillion to $90 trillion. (SEE FIGURE 1.)

Although that wealth is not evenly distributed, it is still a fantastic sum of money—an average of over a million dollars for every notional family of four. This upsurge of wealth took place despite the crash of 2008—indeed, private wealth holdings are over $20 trillion higher now than they were at their pre-crash apogee. The value of American real-estate assets is near or at all-time highs, and America’s businesses appear to be thriving. Even before the “Trump rally” of late 2016 and early 2017, U.S. equities markets were hitting new highs—and since stock prices are strongly shaped by expectations of future profits, investors evidently are counting on the continuation of the current happy days for U.S. asset holders for some time to come.
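The “over a million dollars for every notional family of four” figure is simple division. A quick back-of-the-envelope check, where the U.S. population of roughly 320 million is my assumption rather than a number from the essay:

```python
# Quick check of the per-family wealth average (the population figure is an assumption).
net_worth  = 90e12        # ~$90 trillion in household and nonprofit net worth
population = 320e6        # ~320 million U.S. residents (assumed)

per_person = net_worth / population
print(f"Per person: ${per_person:,.0f}")              # $281,250
print(f"Per family of four: ${4 * per_person:,.0f}")  # $1,125,000
```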

A rather less cheering picture, though, emerges if we look instead at real trends for the macro-economy. Here, performance since the start of the century might charitably be described as mediocre, and prospects today are no better than guarded.

The recovery from the crash of 2008—which unleashed the worst recession since the Great Depression—has been singularly slow and weak. According to the Bureau of Economic Analysis (BEA), it took nearly four years for America’s gross domestic product (GDP) to re-attain its late 2007 level. As of late 2016, total value added to the U.S. economy was just 12 percent higher than in 2007. (SEE FIGURE 2.) The situation is even more sobering if we consider per capita growth. It took America six and a half years—until mid-2014—to get back to its late 2007 per capita production levels. And in late 2016, per capita output was just 4 percent higher than in late 2007—nine years earlier. By this reckoning, the American economy looks to have suffered something close to a lost decade.

But there was clearly trouble brewing in America’s macro-economy well before the 2008 crash, too. Between late 2000 and late 2007, per capita GDP growth averaged less than 1.5 percent per annum. That compares with the nation’s long-term postwar 1948–2000 per capita growth rate of almost 2.3 percent, which in turn can be compared to the “snap back” tempo of 1.1 percent per annum since per capita GDP bottomed out in 2009. Between 2000 and 2016, per capita growth in America has averaged less than 1 percent a year. To state it plainly: With postwar, pre-21st-century growth rates for the years 2000–2016, per capita GDP in America would be more than 20 percent higher than it is today.
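The “more than 20 percent higher” claim is just compound growth applied over 16 years. A rough check using the growth rates quoted above (treating them as exact, which they are not):

```python
# Compound the quoted per capita growth rates over the 16 years from 2000 to 2016.
postwar_rate = 0.023      # ~2.3% per year, the 1948-2000 average
actual_rate  = 0.010      # ~1% per year, the 2000-2016 average (quoted upper bound)

gap = (1 + postwar_rate) ** 16 / (1 + actual_rate) ** 16 - 1
print(f"Per capita GDP would be about {gap:.0%} higher")   # roughly 23% higher
```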

The reasons for America’s newly fitful and halting macroeconomic performance are still a puzzlement to economists and a subject of considerable contention and debate. Economists are generally in consensus, however, in one area: They have begun redefining the growth potential of the U.S. economy downwards. The U.S. Congressional Budget Office (CBO), for example, suggests that the “potential growth” rate for the U.S. economy at full employment of factors of production has now dropped below 1.7 percent a year, implying a sustainable long-term annual per capita economic growth rate for America today of well under 1 percent.

Then there is the employment situation. If 21st-century America’s GDP trends have been disappointing, labor-force trends have been utterly dismal. Work rates have fallen off a cliff since the year 2000 and are at their lowest levels in decades. We can see this by looking at the estimates by the Bureau of Labor Statistics (BLS) for the civilian employment rate, the jobs-to-population ratio for adult civilian men and women. (SEE FIGURE 3.) Between early 2000 and late 2016, America’s overall work rate for Americans age 20 and older underwent a drastic decline. It plunged by almost 5 percentage points (from 64.6 to 59.7). Unless you are a labor economist, you may not appreciate just how severe a falloff in employment such numbers attest to. Postwar America never experienced anything comparable.

From peak to trough, the collapse in work rates for U.S. adults between 2008 and 2010 was roughly twice the amplitude of what had previously been the country’s worst postwar recession, back in the early 1980s. In that previous steep recession, it took America five years to re-attain the adult work rates recorded at the start of 1980. This time, the U.S. job market has as yet, in early 2017, scarcely begun to claw its way back up to the work rates of 2007—much less back to the work rates from early 2000.

As may be seen in Figure 3, U.S. adult work rates never recovered entirely from the recession of 2001—much less the crash of ’08. And the work rates being measured here include people who are engaged in any paid employment—any job, at any wage, for any number of hours of work at all.

On Wall Street and in some parts of Washington these days, one hears that America has gotten back to “near full employment.” For Americans outside the bubble, such talk must seem nonsensical. It is true that the oft-cited “civilian unemployment rate” looked pretty good by the end of the Obama era—in December 2016, it was down to 4.7 percent, about the same as it had been back in 1965, at a time of genuine full employment. The problem here is that the unemployment rate only tracks joblessness for those still in the labor force; it takes no account of workforce dropouts. Alas, the exodus out of the workforce has been the big labor-market story for America’s new century. (At this writing, for every unemployed American man between 25 and 55 years of age, there are another three who are neither working nor looking for work.) Thus the “unemployment rate” increasingly looks like an antique index devised for some earlier and increasingly distant war: the economic equivalent of a musket inventory or a cavalry count.

By the criterion of adult work rates, by contrast, employment conditions in America remain remarkably bleak. From late 2009 through early 2014, the country’s work rates more or less flatlined. So far as can be told, this is the only “recovery” in U.S. economic history in which that basic labor-market indicator almost completely failed to respond.

Since 2014, there has finally been a measure of improvement in the work rate—but it would be unwise to exaggerate the dimensions of that turnaround. As of late 2016, the adult work rate in America was still at its lowest level in more than 30 years. To put things another way: If our nation’s work rate today were back up to its start-of-the-century highs, well over 10 million more Americans would currently have paying jobs.
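The “well over 10 million” figure follows from multiplying the roughly five-point drop in the 20-and-over work rate by the size of that population. A rough check, where the adult population of about 250 million is my assumption rather than a number from the essay:

```python
# Rough check of the implied jobs gap (the adult population figure is an assumption).
rate_2000 = 0.646         # early-2000 work rate, civilians age 20 and over
rate_2016 = 0.597         # late-2016 work rate, civilians age 20 and over
adults = 250e6            # roughly 250 million civilians age 20 and over (assumed)

jobs_gap = (rate_2000 - rate_2016) * adults
print(f"Implied jobs gap: {jobs_gap / 1e6:.0f} million")   # roughly 12 million
```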

There is no way to sugarcoat these awful numbers. They are not a statistical artifact that can be explained away by population aging, or by increased educational enrollment for adult students, or by any other genuine change in contemporary American society. The plain fact is that 21st-century America has witnessed a dreadful collapse of work.

For an apples-to-apples look at America’s 21st-century jobs problem, we can focus on the 25–54 population—known to labor economists for self-evident reasons as the “prime working age” group. For this key labor-force cohort, work rates in late 2016 were down almost 4 percentage points from their year-2000 highs. That is a jobs gap approaching 5 million for this group alone.
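The same arithmetic applies to the prime-age numbers quoted above; here the cohort size of roughly 125 million is my assumption:

```python
# Same check for the 25-54 "prime working age" cohort (cohort size is an assumption).
rate_drop = 0.04          # roughly 4 percentage points below the 2000 high
prime_age_pop = 125e6     # roughly 125 million Americans aged 25-54 (assumed)

print(f"Prime-age jobs gap: {rate_drop * prime_age_pop / 1e6:.0f} million")   # about 5 million
```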

It is not only that work rates for prime-age males have fallen since the year 2000—they have, but the collapse of work for American men is a tale that goes back at least half a century. (I wrote a short book last year about this sad saga.) What is perhaps more startling is the unexpected and largely unnoticed fall-off in work rates for prime-age women. In the U.S. and all other Western societies, postwar labor markets underwent an epochal transformation. After World War II, work rates for prime women surged, and continued to rise—until the year 2000. Since then, they too have declined. Current work rates for prime-age women are back to where they were a generation ago, in the late 1980s. The 21st-century U.S. economy has been brutal for male and female laborers alike—and the wreckage in the labor market has been sufficiently powerful to cancel, and even reverse, one of our society’s most distinctive postwar trends: the rise of paid work for women outside the household.

In our era of no more than indifferent economic growth, 21st-century America has somehow managed to produce markedly more wealth for its wealthholders even as it provided markedly less work for its workers. And trends for paid hours of work look even worse than the work rates themselves. Between 2000 and 2015, according to the BEA, total paid hours of work in America increased by just 4 percent (as against a 35 percent increase for 1985–2000, the 15-year period immediately preceding this one). Over the 2000–2015 period, however, the adult civilian population rose by almost 18 percent—meaning that paid hours of work per adult civilian have plummeted by a shocking 12 percent thus far in our new American century.
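The 12 percent decline is simply the ratio of the two growth figures just quoted. A quick check, treating the 4 percent and 18 percent as exact:

```python
# Paid hours per adult: total hours grew ~4% while the adult population grew ~18%.
hours_growth = 1.04
population_growth = 1.18

change = hours_growth / population_growth - 1
print(f"Change in paid hours per adult: {change:.0%}")   # about -12%
```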

This is the terrible contradiction of economic life in what we might call America’s Second Gilded Age (2000—). It is a paradox that may help us understand a number of overarching features of our new century. These include the consistent findings that public trust in almost all U.S. institutions has sharply declined since 2000, even as growing majorities hold that America is “heading in the wrong direction.” It provides an immediate answer to why overwhelming majorities of respondents in public-opinion surveys continue to tell pollsters, year after year, that our ever-richer America is still stuck in the middle of a recession. The mounting economic woes of the “little people” may not have been generally recognized by those inside the bubble, or even by many bubble inhabitants who claimed to be economic specialists—but they proved to be potent fuel for the populist fire that raged through American politics in 2016.

So general economic conditions for many ordinary Americans—not least of these, Americans who did not fit within the academy’s designated victim classes—have been rather more insecure than those within the comfort of the bubble understood. But the anxiety, dissatisfaction, anger, and despair that range within our borders today are not wholly a reaction to the way our economy is misfiring. On the nonmaterial front, it is likewise clear that many things in our society are going wrong and yet seem beyond our powers to correct.

Some of these gnawing problems are by no means new: A number of them (such as family breakdown) can be traced back at least to the 1960s, while others are arguably as old as modernity itself (anomie and isolation in big anonymous communities, secularization and the decline of faith). But a number have roared down upon us by surprise since the turn of the century—and others have redoubled with fearsome new intensity since roughly the year 2000.

American health conditions seem to have taken a seriously wrong turn in the new century. It is not just that overall health progress has been shockingly slow, despite the trillions we devote to medical services each year. (Which “Cold War babies” among us would have predicted we’d live to see the day when life expectancy in East Germany was higher than in the United States, as is the case today?)

Alas, the problem is not just slowdowns in health progress—there also appears to have been positive retrogression for broad and heretofore seemingly untroubled segments of the national population. A short but electrifying 2015 paper by Anne Case and Nobel Economics Laureate Angus Deaton talked about a mortality trend that had gone almost unnoticed until then: rising death rates for middle-aged U.S. whites. By Case and Deaton’s reckoning, death rates rose slightly over the 1999–2013 period for all non-Hispanic white men and women 45–54 years of age—but they rose sharply for those with high-school degrees or less, and for this less-educated grouping most of the rise in death rates was accounted for by suicides, chronic liver cirrhosis, and poisonings (including drug overdoses).

Though some researchers, for highly technical reasons, suggested that the mortality spike might not have been quite as sharp as Case and Deaton reckoned, there is little doubt that the spike itself has taken place. Health has been deteriorating for a significant swath of white America in our new century, thanks in large part to drug and alcohol abuse. All this sounds a little too close for comfort to the story of modern Russia, with its devastating vodka- and drug-binging health setbacks. Yes: It can happen here, and it has. Welcome to our new America.

In December 2016, the Centers for Disease Control and Prevention (CDC) reported that for the first time in decades, life expectancy at birth in the United States had dropped very slightly (to 78.8 years in 2015, from 78.9 years in 2014). Though the decline was small, it was statistically meaningful—rising death rates were characteristic of males and females alike; of blacks and whites and Latinos together. (Only black women avoided mortality increases—their death levels were stagnant.) A jump in “unintentional injuries” accounted for much of the overall uptick.

It would be unwarranted to place too much portent in a single year’s mortality changes; slight annual drops in U.S. life expectancy have occasionally been registered in the past, too, followed by continued improvements. But given other developments we are witnessing in our new America, we must wonder whether the 2015 decline in life expectancy is just a blip, or the start of a new trend. We will find out soon enough. It cannot be encouraging, though, that the Human Mortality Database, an international consortium of demographers who vet national data to improve comparability between countries, has suggested that health progress in America essentially ceased in 2012—that the U.S. gained on average only about a single day of life expectancy at birth between 2012 and 2014, before the 2015 turndown.

The opioid epidemic of pain pills and heroin that has been ravaging and shortening lives from coast to coast is a new plague for our new century. The terrifying novelty of this particular drug epidemic, of course, is that it has gone (so to speak) “mainstream” this time, effecting breakout from disadvantaged minority communities to Main Street White America. By 2013, according to a 2015 report by the Drug Enforcement Administration, more Americans died from drug overdoses (largely but not wholly opioid abuse) than from either traffic fatalities or guns. The dimensions of the opioid epidemic in the real America are still not fully appreciated within the bubble, where drug use tends to be more carefully limited and recreational. In Dreamland, his harrowing and magisterial account of modern America’s opioid explosion, the journalist Sam Quinones notes in passing that “in one three-month period” just a few years ago, according to the Ohio Department of Health, “fully 11 percent of all Ohioans were prescribed opiates.” And of course many Americans self-medicate with licit or illicit painkillers without doctors’ orders.

In the fall of 2016, Alan Krueger, former chairman of the President’s Council of Economic Advisers, released a study that further refined the picture of the real existing opioid epidemic in America: According to his work, nearly half of all prime working-age male labor-force dropouts—an army now totaling roughly 7 million men—currently take pain medication on a daily basis.

We already knew from other sources (such as BLS “time use” surveys) that the overwhelming majority of the prime-age men in this un-working army generally don’t “do civil society” (charitable work, religious activities, volunteering), or for that matter much in the way of child care or help for others in the home either, despite the abundance of time on their hands. Their routine, instead, typically centers on watching—watching TV, DVDs, Internet, hand-held devices, etc.—and indeed watching for an average of 2,000 hours a year, as if it were a full-time job. But Krueger’s study adds a poignant and immensely sad detail to this portrait of daily life in 21st-century America: In our mind’s eye we can now picture many millions of un-working men in the prime of life, out of work and not looking for jobs, sitting in front of screens—stoned.

But how did so many millions of un-working men, whose incomes are limited, manage en masse to afford a constant supply of pain medication? Oxycontin is not cheap. As Dreamland carefully explains, one main mechanism today has been the welfare state: more specifically, Medicaid, Uncle Sam’s means-tested health-benefits program. Here is how it works (we are with Quinones in Portsmouth, Ohio):

[The Medicaid card] pays for medicine—whatever pills a doctor deems that the insured patient needs. Among those who receive Medicaid cards are people on state welfare or on a federal disability program known as SSI. . . . If you could get a prescription from a willing doctor—and Portsmouth had plenty of them—Medicaid health-insurance cards paid for that prescription every month. For a three-dollar Medicaid co-pay, therefore, addicts got pills priced at thousands of dollars, with the difference paid for by U.S. and state taxpayers. A user could turn around and sell those pills, obtained for that three-dollar co-pay, for as much as ten thousand dollars on the street.

In 21st-century America, “dependence on government” has thus come to take on an entirely new meaning.

You may now wish to ask: What share of prime-working-age men these days are enrolled in Medicaid? According to the Census Bureau’s SIPP survey (Survey of Income and Program Participation), as of 2013, over one-fifth (21 percent) of all civilian men between 25 and 55 years of age were Medicaid beneficiaries. For prime-age people not in the labor force, the share was over half (53 percent). And for un-working Anglos (non-Hispanic white men not in the labor force) of prime working age, the share enrolled in Medicaid was 48 percent.

By the way: Of the entire un-working prime-age male Anglo population in 2013, nearly three-fifths (57 percent) were reportedly collecting disability benefits from one or more government disability programs. Disability checks and means-tested benefits cannot support a lavish lifestyle. But they can offer a permanent alternative to paid employment, and for growing numbers of American men, they do. The rise of these programs has coincided with the death of work for larger and larger numbers of American men not yet of retirement age. We cannot say that these programs caused the death of work for millions upon millions of younger men. What is incontrovertible, however, is that they have financed it—just as Medicaid inadvertently helped finance America’s immense and increasing appetite for opioids in our new century.

It is intriguing to note that America’s nationwide opioid epidemic has not been accompanied by a nationwide crime wave (excepting of course the apparent explosion of illicit heroin use). Just the opposite: As best can be told, national victimization rates for violent crimes and property crimes have both reportedly dropped by about two-thirds over the past two decades. The drop in crime over the past generation has done great things for the general quality of life in much of America. There is one complication to this story, however, that inhabitants of the bubble may not be aware of, even though it is all too well known to a great many residents of the real America. This is the extraordinary expansion of what some have termed America’s “criminal class”—the population sentenced to prison or convicted of felony offenses—in recent decades. This trend did not begin in our century, but it has taken on breathtaking proportions since the year 2000.

Most well-informed readers know that the U.S. currently has a higher share of its populace in jail or prison than almost any other country on earth, that Barack Obama and others talk of our criminal-justice process as “mass incarceration,” and that well over 2 million men were in prison or jail in recent years. But only a tiny fraction of all living Americans ever convicted of a felony is actually incarcerated at this very moment. Quite the contrary: Maybe 90 percent of all sentenced felons today are out of confinement and living more or less among us. The reason: the basic arithmetic of sentencing and incarceration in America today. Correctional release and sentenced community supervision (probation and parole) guarantee a steady annual “flow” of convicted felons back into society to augment the very considerable “stock” of felons and ex-felons already there. And this “stock” is by now truly enormous.

One forthcoming demographic study by Sarah Shannon and five other researchers estimates that the cohort of current and former felons in America very nearly reached 20 million by the year 2010. If its estimates are roughly accurate, and if America’s felon population has continued to grow at more or less the same tempo traced out for the years leading up to 2010, we would expect it to surpass 23 million persons by the end of 2016 at the latest. Very rough calculations might therefore suggest that, at this writing, America’s population of non-institutionalized adults with a felony conviction somewhere in their past has almost certainly broken the 20 million mark. A little more rough arithmetic suggests that about 17 million men in our general population have a felony conviction somewhere in their CV. That works out to one of every eight adult males in America today.
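To spell out the rough arithmetic behind that extrapolation (an illustrative sketch using only the figures cited above; the annual increment of roughly half a million is inferred from those figures rather than separately reported):

\[
20\ \text{million (2010)} \;+\; 6\ \text{years} \times 0.5\ \text{million per year} \;\approx\; 23\ \text{million by the end of 2016}.
\]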

We have to use rough estimates here, rather than precise official numbers, because the government does not collect any data at all on the size or socioeconomic circumstances of this population of 20 million, and never has. Amazing as this may sound and scandalous though it may be, America has, at least to date, effectively banished this huge group—a group roughly twice the total size of our illegal-immigrant population and an adult population larger than that in any state but California—to a near-total and seemingly unending statistical invisibility. Our ex-cons are, so to speak, statistical outcasts who live in a darkness our polity does not care enough to illuminate—beyond the scope or interest of public policy, unless and until they next run afoul of the law.

Thus we cannot describe with any precision or certainty what has become of those who make up our “criminal class” after their (latest) sentencing or release. In the most stylized terms, however, we might guess that their odds in the real America are not all that favorable. And when we consider some of the other trends we have already mentioned—employment, health, addiction, welfare dependence—we can see the emergence of a malign new nationwide undertow, pulling downward against social mobility.

Social mobility has always been the jewel in the crown of the American mythos and ethos. The idea (not without a measure of truth to back it up) was that people in America are free to achieve according to their merit and their grit—unlike in other places, where they are trapped by barriers of class or the misfortune of misrule. Nearly two decades into our new century, there are unmistakable signs that America’s fabled social mobility is in trouble—perhaps even in serious trouble.

Consider the following facts. First, according to the Census Bureau, geographical mobility in America has been on the decline for three decades, and in 2016 the annual movement of households from one location to the next was reportedly at an all-time (postwar) low. Second, as a study by three Federal Reserve economists and a Notre Dame colleague demonstrated last year, “labor market fluidity”—the churning between jobs that among other things allows people to get ahead—has been on the decline in the American labor market for decades, with no sign as yet of a turnaround. Finally, and not least important, a December 2016 report by the “Equal Opportunity Project,” a team led by the formidable Stanford economist Raj Chetty, calculated that the odds of a 30-year-old’s earning more than his parents at the same age were now just 51 percent, down from 86 percent 40 years ago. Other researchers who have examined the same data argue that the odds may not be quite as low as the Chetty team concludes, but agree that the chances of surpassing one’s parents’ real income have been on the downswing and are probably lower now than ever before in postwar America.

Thus the bittersweet reality of life for real Americans in the early 21st century: Even though the American economy still remains the world’s unrivaled engine of wealth generation, those outside the bubble may have less of a shot at the American Dream than has been the case for decades, maybe generations—possibly even since the Great Depression.

The funny thing is, people inside the bubble are forever talking about “economic inequality,” that wonderful seminar construct, and forever virtue-signaling about how personally opposed they are to it. By contrast, “economic insecurity” is akin to a phrase from an unknown language. But if we were somehow to find a “Google Translate” function for communicating from real America into the bubble, an important message might be conveyed:

The abstraction of “inequality” doesn’t matter a lot to ordinary Americans. The reality of economic insecurity does. The Great American Escalator is broken—and it badly needs to be fixed.

With the election of 2016, Americans within the bubble finally learned that the 21st century has gotten off to a very bad start in America. Welcome to the reality. We have a lot of work to do together to turn this around.

Driven by hate

This song came to mind …

… when reading Kurt Schlichter:

Leftists don’t merely disagree with you. They don’t merely feel you are misguided. They don’t think you are merely wrong. They hate you. They want you enslaved and obedient, if not dead. Once you get that, everything that is happening now will make sense. And you will understand what you need to be ready to do.

You are normal, and therefore a heretic. You refuse to bow to their idols, to subscribe to their twisted catechisms, to praise their false gods. This is unforgivable. You must burn.

Crazy talk? Just ask them. Go ahead. Go on social media. Find a leftist – it’s easy. Just say something positive about America or Jesus and they’ll come swarming like locusts. Engage them and very quickly they will drop their masks and tell you what they really think. I know. I keep a rapidly expanding file of Twitter leftist death wish screenshots.

They will tell you that Christians are idiots and vets are scum.

That normals are subhumans whose role is to labor as serfs to subsidize the progressive elite and its clients.

That you should die to make way for the New Progressive Man/Woman/Other.

Understand that when they call Donald Trump “illegitimate,” what they are really saying is that our desire to govern ourselves is illegitimate. Their beef isn’t with him – it’s with us, the normal people who dared rise up and demand their right to participate in the rule of this country and this culture.

They hate you, because by defying them you have prevented them from living up to the dictates of their false religion. Our rebelliousness has denied them the state of grace they seek, exercising their divine right to dictate every aspect of our puny lives. Their sick faith gives meaning to these secular weirdos, giving them something that fills their empty lives with a messianic fervor to go out and conquer and convert the heathens.

And the heathens are us.

Oh, there are different leftist sects. There are the social justice warriors who have manufactured a bizarre mythology and scripture of oppression, privilege, and intersectionality. Instead of robes, they dress up as genitals and kill babies as a blasphemous sacrament. Then there are the pagan weather religion oddballs convinced that the end is near and that we must repent by turning in our SUVs. Of course, the “we” is really “us” – high priests of the global warming cult like Leonardo DiCaprio will still jet around the world with supermodels while we do the ritual sacrificing of our modern comforts. Then there are the ones who simply worship themselves, the elitists who believe that all wisdom and morality has been invested in them merely because they went to the right college, think the right thoughts, and sneer at anyone living between I-5 and I-95.

But all the leftist sects agree – they have found the revealed truth, and imposing it upon the benighted normals like us is so transcendently important that they are relieved of any moral limitations. They are ISIS, except with hashtags instead of AKs, committed to the establishment of a leftist caliphate.

You wonder why the left is now justifying violence? Because they think that helps them right now. Today it’s suddenly OK to punch a “Nazi.” But the punchline is that anyone who opposes them is a “Nazi.”

You wonder why they ignore the rule of law, why they could switch on a dime from screaming at Trump for refusing to preemptively legitimize a Hillary win and then scream that he is illegitimate the moment she lost? Because their only principle is what helps the left win today. That’s why the media gleefully, happily lies every single day about every single thing it reports. Objectivity? When that stopped being a useful thing, it stopped being a thing at all.

They are fanatics, and by not surrendering, by not kneeling, and by not obeying, you have committed an unpardonable sin. You have defied the Left, and you must be broken. They will take your job, slander your name, even beat or kill you – whatever it takes to break you and terrify others by making you an example. Your defiance cannot stand; they cannot allow this whole Trump/GOP majority thing to get out of control. They must crush this rebellion of the normal, and absolutely nothing is off the table.

Schlichter’s premise requires believing that Trump is actually the savior of the “normal,” as opposed to yet another political opportunist. But Wisconsin’s own Walker Derangement Syndrome, which predates Trump’s arrival on the scene, proves that many of Wisconsin’s leftists are indeed motivated by hatred of the other side. Perhaps you’ve noticed how well that’s worked out for Democrats in the past four statewide elections, not even counting Recallarama.

Two ways Trump gains supporters

The New York Times’ Frank Bruni shows one …

You know how Donald Trump wins? I don’t mean a second term or major legislative victories. I’m talking about the battle between incivility and dignity.

He triumphs when opponents trade righteous anger for crude tantrums. When they lose sight of the line between protest and catcalls.

When a writer for “Saturday Night Live” jokes publicly that Trump’s 10-year-old son has the mien and makings of a killer.

“Barron will be this country’s first home-school shooter,” the writer, Katie Rich, tweeted. I cringe at repeating it. But there’s no other way to take proper note of its ugliness.

That tweet ignited a firestorm — and rightly so — but it didn’t really surprise me. It was just a matter of time. This is the trajectory that we’re traveling. This, increasingly, is what passes for impassioned advocacy.

Look elsewhere on Twitter. Or on Facebook. Or at Madonna, whose many positive contributions don’t include her turn at the microphone at the Women’s March in Washington, where she said that she’d “thought an awful lot about blowing up the White House,” erupted into profanity and tweaked the lyrics to one of her songs so that they instructed Trump to perform a particular sex act.

What a sure way to undercut the high-mindedness of most of the women (and men) around her on that inspiring day. What a wasted opportunity to try to reach the many Americans who still haven’t decided how alarmed about Trump to be. I doubt that even one of them listened to her and thought: To the barricades I go! If Madonna’s dropping the F bomb, I must spring into action.

All of this plays right into Trump’s hands. It pulls eyes and ears away from the unpreparedness, conflicts of interest and extreme conservatism of so many of his cabinet nominees; from the evolving explanations for why he won’t release his tax returns; from his latest delusion or falsehood, such as his renewed insistence that illegally cast ballots cost him the popular vote; from other evidence of an egomania so profound that it’s an impediment to governing and an invitation to national disaster.

There’s so much substantive ground on which to confront Trump. There are acres upon acres. Why swerve into the gutter? Why help him dismiss his detractors as people in thrall to the theater of their outrage and no better than he is?

And why risk that disaffected Americans, tuning in only occasionally, hear one big mash of insults and insulters, and tune out, when there’s a contest — over what this country stands for, over where it will go — that couldn’t be more serious.

After Rich’s tweet, “Saturday Night Live” suspended her, and she was broadly condemned, by Democrats as well as Republicans, for violating the unofficial rule against attacks on the young children of presidents. Chelsea Clinton, on her Facebook page, urged people to give Barron space and peace — something that wasn’t always done for her, for George W. Bush’s daughters or for Barack Obama’s.

But the treatment of presidential progeny isn’t the real story here. And that’s a complicated saga anyway, because so many presidents and candidates try to have things both ways, putting family on display when it suits them and then declaring them off limits when it doesn’t.

The larger, more pressing issues are how low we’re prepared to sink in our partisan back-and-forth and what’s accomplished by descending to Trump’s subterranean level. His behavior has been grotesque, and it’s human nature to want to repay him in kind. It feels good. It sometimes even feels right.

Many people I know thrilled to the viral footage a few days ago of the vile white supremacist Richard Spencer being punched in the head during a television interview. But that attack does more to help him than to hurt him.

Many people I know thrilled to BuzzFeed’s publication of a dossier with unsubstantiated allegations about Trump. But that decision bolstered his ludicrous insistence that journalists are uniquely unfair to him. It gave him a fresh weapon in his war on the media.

If Trump’s presidency mirrors its dangerous prelude, one of the fundamental challenges will be to respond to him, his abettors and his agenda in the most tactically prudent way and not just the most emotionally satisfying one. To rant less and organize more. To resist taunts and stick with facts. To answer invective with intelligence.

And to show, in the process, that there are two very different sets of values here, manifest in two very distinct modes of discourse. If that doesn’t happen, Trump may be victorious in more than setting newly coarse terms for our political debate. He may indeed win on many fronts, over many years.

It’s later than you think

Joy Pullmann:

So on the last day in December, my uncle died, and the early part of January this year meant addressing that. I heard of his death from two brothers who don’t know if they believe in God, and thus had a much more difficult and uncertain time processing this death in the family. Unlike me, but like the largest proportion of young Americans in history, my brothers don’t have faith in life after death. So, rather than an opportunity for peace as it presents itself to Christians who thrill with the assurance that to be absent from this life is, for believers, to be perfected and present with Love itself, death is for them a terror. An abyss.

This is an abyss more millennials need to stare into. The truth is, we will all die some day. What then? If you never ask yourself, you can’t be ready. In that moment, you will very much want to be, and despise yourself for not having prepared. And if you have nothing to die for, you have no reason to live.

For centuries Western civilization has collectively brooded over this reality, embodied in the Latin phrase “memento mori.” It means “Remember that you will die.” Legend says the ancient Romans tasked a servant with whispering this in the ear of a victorious general as he paraded through town. This sentiment has a long tradition not only in art but in festivals of the dead, such as Halloween and Los Días de los Muertos.

In many famous actors’ interpretations, Hamlet stares at a skull that once belonged to the king’s jester, Yorick, and says: “Here hung those lips that I have kissed I know / not how oft. Where be your gibes now? Your / gambols? your songs? your flashes of merriment, / that were wont to set the table on a roar? Not one / now, to mock your own grinning? quite chap-fallen.” He is at the point of self-discovery, something his character crucially needs to ward off tragedy.

Yet Hamlet, like millennials and a major strain of our Western ethos, is extremely self-centered and self-referential, but all his navel-gazing never lifts him out of himself into something greater, which should be the end of introspection. It’s one of his key character defects, and it contributes to the despair and mayhem that ultimately concludes the play and his life. Rather than pondering something fruitful inside his friend’s skull, at this point in the dialogue he executes a turn to foolishness to overlay his despair: “Now get you to my lady’s chamber, and tell her, let / her paint an inch thick, to this favour she must / come; make her laugh at that.” In this, Hamlet is us. Reading of his folly may help prevent our own.

A college history professor once shocked our class by noting that for centuries families used to regularly visit and picnic on cemetery grounds among their ancestors. They felt a kinship with their dead, and a sense of their reality.

We don’t do much of this anymore. I’ve been to many funerals despite a young life, because my others-oriented father taught me to “always attend weddings and funerals.” People practically tiptoe in and tiptoe out, and very quickly, although nowadays we say we don’t believe in ghosts or spooky stuff. I think it’s because we are scared to confront the truth a dead body slaps us with: Someday, that will be us. Because life is unpredictable, it could be a lot sooner than we want to think. Am I going where he’s gone? And where is that, exactly?

This came home to me on the day my younger brother died. It was also my nineteenth birthday, a cold and rainy November day that settled for me the month’s cruelty. The sudden death of a healthy boy on the cusp of manhood, a boy who had always been so alive to me, indelibly impressed on me the truth that any one of us could die at any moment.

It’s why I sometimes kiss my husband as if it were the last time, because I know it could be, and I want to remember him and his kisses forever. It’s why sometimes when he closes the front door to run out for an errand I anxiously wonder why I didn’t send a smile with him, in case his last view of me was of self-absorption rather than love.

It’s why in the evenings I go by the children’s bedrooms and softly touch the doors, trying to steel myself against the possibility that in the morning they will not all come tumbling in upon me with tousled hair, hungry bellies, and hangry arguments.

Millennials like me have had good lives. In all the statistical respects it is the best set of lives ever lived. Violent crimes against children such as child rape and kidnapping have been declining for decades and are at record lows. Never in history have there been lower infant and child mortality rates. For centuries no civilization has boasted longer life averages. No people have ever had such broad access to such a broad array of lifesaving and life-prolonging treatments. Polio is almost eradicated from the earth, for heaven’s sake!

So why are we millennials so afraid? Why are we so lame, so tentative, so stuck in utero? I think it’s partly because our easy lives have not prepared us for a good death. If we never emerge into adulthood, perhaps we’ll never have to die. Some millennials take this to ridiculous extremes by entombing themselves in infantile actions like drinking breastmilk, signing up for adult preschool, jumping in grown-up ball pits, and wearing onesies. We’re pretending we’ll be young forever, and therefore impervious to death and every other serious pursuit in life that prepares one for it.

Our ease of life sings, sweetly like Sirens, to lay down and sleep, for evil perishes in proportion to our own enlightenment. Don’t worry. Be happy. Pay no mind to the man behind the curtain. Feel no guilt over your desires or what they suggest about the human condition, for there will be no reckoning.

Yet we have this uneasy feeling that, at the end of our days, we will look back on an endless row of trips to Costa Rica and the Himalayas; journeys to find ourselves and screw any variety of exotic people, animals, and plants; self-gratifying therapy courses in and out of institutions of “higher” education; and see we have nowhere learned what it means to face death like a man or woman. We’ll find our legacy on earth is one of endless self-gratification that has meant nothing eternally good for one single other soul, not even ours. On the day this is all you have to look back on, it will bite you—ferociously. Indelibly.

[T.S.] Eliot says “I will show you fear in a handful of dust.” That handful of dust is a literary reference to each man and woman. What confers it dignity and eternal transcendence is its acknowledgement of God, and God’s claims on each person he has made. Another Eliot poem is titled “Ash Wednesday,” the day when faithful Christians attend church and the pastor wipes the sign of the cross on their foreheads in ashes, saying “Remember, O man, that you are dust, and to dust you shall return.”

Only inside Christianity are these words a sign of hope, because they tell us amid our despair that someone has saved us. To all else, they merely confirm despair. Yet these words will come to every man and woman some time or another, and always much sooner than we think and many hope, even if we have a fairy-story nightingale that can send the Grim Reaper packing for a time. Death is inescapable, and unpredictable, even for emperors.

Memento mori. Remember, millennials, and all others, that you shall die. What will your life have meant then? Who will save you when you cannot save yourselves?

The quotable King

My favorite Martin Luther King quotes, some of which you may not read or hear on Martin Luther King Jr. Day:

A genuine leader is not a searcher for consensus but a molder of consensus.

A man who won’t die for something is not fit to live.

A nation or civilization that continues to produce soft-minded men purchases its own spiritual death on the installment plan.

All labor that uplifts humanity has dignity and importance and should be undertaken with painstaking excellence.

Freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed.

He who passively accepts evil is as much involved in it as he who helps to perpetrate it. He who accepts evil without protesting against it is really cooperating with it.

Human progress is neither automatic nor inevitable … Every step toward the goal of justice requires sacrifice, suffering, and struggle; the tireless exertions and passionate concern of dedicated individuals.

Human salvation lies in the hands of the creatively maladjusted.

I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin, but by the content of their character. … I have a dream that one day every valley shall be exalted, every hill and mountain shall be made low, the rough places will be made straight and the glory of the Lord shall be revealed and all flesh shall see it together.

If we are to go forward, we must go back and rediscover those precious values — that all reality hinges on moral foundations and that all reality has spiritual control.

Never forget that everything Hitler did in Germany was legal.

Our lives begin to end the day we become silent about things that matter.

Rarely do we find men who willingly engage in hard, solid thinking. There is an almost universal quest for easy answers and half-baked solutions. Nothing pains some people more than having to think.

Science investigates; religion interprets. Science gives man knowledge which is power; religion gives man wisdom which is control.

The function of education is to teach one to think intensively and to think critically. Intelligence plus character — that is the goal of true education.

The quality, not the longevity, of one’s life is what is important.

The ultimate measure of a man is not where he stands in moments of comfort and convenience, but where he stands at times of challenge and controversy.

Whatever your life’s work is, do it well. A man should do his job so well that the living, the dead, and the unborn could do it no better.