Category: History

A constitutional lesson for those who need it

For those who think Roe v. Wade and Planned Parenthood v. Casey must not be overturned by the U.S. Supreme Court because they’re settled law, the National Constitution Center presents a list of Supreme Court decisions that the court itself later overturned:

In 1992, an opinion from three justices in the Casey decision reinforced the role of stare decisis, or precedent, in the court’s proceedings. “After considering the fundamental constitutional questions resolved by Roe, principles of institutional integrity, and the rule of stare decisis, we are led to conclude this: the essential holding of Roe v. Wade should be retained and once again reaffirmed,” wrote Sandra Day O’Connor, Anthony Kennedy and David Souter.

However, the court doesn’t always follow its precedents. In 1932, Justice Louis Brandeis explained stare decisis in his dissent in Burnet v. Coronado Oil & Gas Co. “Stare decisis is usually the wise policy, because in most matters it is more important that the applicable rule of law be settled than that it be settled right,” Brandeis wrote. “But in cases involving the Federal Constitution, where correction through legislative action is practically impossible, this Court has often overruled its earlier decisions.”

The Library of Congress tracks the historic list of overruled Supreme Court cases in its report, The Constitution Annotated. As of 2020, the court had overruled its own precedents in an estimated 232 cases since 1810, says the library. To be sure, that list could be subject to interpretation, since it includes the Korematsu case from 1944, which justices have repudiated but never formally overturned. But among scholars, there are a handful of cases seen as true landmark decisions that overturned other precedents.

Here is a short list of those landmark cases, as reported by the Congressional Research Service and Library of Congress:

West Coast Hotel Company v. Parrish (1937). In a 5-4 decision, the Hughes court overruled Adkins v. Children’s Hospital (1923), which it had reaffirmed just a year earlier in Morehead v. New York ex rel. Tipaldo (1936), holding that the establishment of minimum wages for women was constitutional. The decision was seen as ending the court’s Lochner era.

West Virginia State Board of Education v. Barnette (1943). In a 6-to-3 decision, the Court overruled Minersville School District v. Gobitis (1940). Justice Robert Jackson’s majority opinion affirmed that forcing public school students to salute the American flag was unconstitutional. “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion, or force citizens to confess by word or act their faith therein,” Jackson famously wrote.

Brown v. Board of Education of Topeka (1954). A unanimous Warren Court decided that the separate-but-equal doctrine of Plessy v. Ferguson (1896), as applied to educational facilities for racial minorities, violated the 14th Amendment’s Equal Protection Clause.

Mapp v. Ohio (1961). Overruling Wolf v. Colorado (1949), the court said in a 6-3 decision that evidence gathered by authorities through searches and seizures that violated the Fourth Amendment could not be presented in a state court—otherwise known as the “exclusionary rule.”

Gideon v. Wainwright (1963). Justice Hugo Black’s unanimous opinion invalidated Betts v. Brady (1942) and required state courts to appoint attorneys for defendants who cannot afford to retain lawyers on their own.

Miranda v. Arizona (1966). In a 5-4 opinion, Chief Justice Earl Warren concluded that police violated Ernesto Miranda’s rights by not informing Miranda that he could remain silent and also ask for an attorney during interrogations. The ruling overruled two 1958 decisions: Crooker v. California and Cicenia v. Lagay.

Katz v. United States (1967). In a 7-1 decision (Justice Thurgood Marshall did not take part in the case), the court determined that a man in a phone booth could not be wiretapped by authorities without a warrant from a judge. The decision overturned two prior Supreme Court decisions: Olmstead v. United States (1928) and Goldman v. United States (1942).

Brandenburg v. Ohio (1969). The court decided that Ohio’s criminal syndicalism law, barring public speech calling for illegal activities, was unconstitutional on First and 14th Amendment grounds unless the speech incited “imminent lawless action.” The decision overruled Whitney v. California (1927).

Gregg v. Georgia (1976). In a 7-2 decision announced by Justice Potter Stewart, the court ruled that Georgia’s capital punishment laws didn’t violate the Eighth and 14th Amendments’ prohibitions on cruel and unusual punishment. The court invalidated McGautha v. California (1971), a prior death-penalty case.

Planned Parenthood of Southeastern Pennsylvania v. Casey (1992). A divided court invalidated parts of two prior decisions, Thornburgh and Akron I, as inconsistent with Roe v. Wade.

Atkins v. Virginia (2002). The Supreme Court held that executions of intellectually disabled criminals were “cruel and unusual punishments” barred by the Eighth Amendment. The decision overturned Penry v. Lynaugh (1989).

Lawrence v. Texas (2003). Justice Anthony M. Kennedy, in a 6-3 ruling, cited the Due Process Clause and invalidated a Texas law making it a crime for two persons of the same sex to engage in sexual conduct. The decision overturned Bowers v. Hardwick (1986).

Citizens United v. FEC (2010). In a 5-4 decision, Justice Anthony M. Kennedy wrote for the majority that the First Amendment did not permit the government to ban corporate funding of independent political broadcasts during election cycles. The decision overturned Austin v. Michigan Chamber of Commerce (1990) and parts of McConnell v. FEC (2003).

Obergefell v. Hodges (2015). In a 5-4 opinion, Justice Kennedy said the 14th Amendment’s Due Process Clause guaranteed the right to marry as a fundamental liberty that applied to couples regardless of their sex. The decision overruled a one-sentence ruling in Baker v. Nelson (1972).

South Dakota v. Wayfair (2018). In another 5-4 decision from Justice Kennedy, the court said sellers who engage in significant business within a state may be required to collect sales taxes, even if the business does not have a physical presence in the taxing state. The ruling overturned Quill Corp. v. North Dakota (1992).

Janus v. American Federation of State, County, and Municipal Employees (2018). In a 5-4 opinion from Justice Samuel Alito, the court said the state of Illinois violated the First Amendment by extracting agency fees from nonconsenting public-sector employees. The decision overturned Abood v. Detroit Bd. of Education (1977).

Anyone think “separate but equal” is acceptable?


The “S” stood for “Scam”

Today in 1945:

There is a myth about Harry S. Truman as the last common-man president. Jeff Jacoby punctures that myth:

When the Washington Post reported in 2007 that Bill Clinton had pocketed nearly $40 million in speaking fees since leaving the White House six years earlier, I wrote a column regretting that yet another former chief executive had proved so eager “to leverage the prestige of the presidency for big bucks.”

Well, not every former chief executive. While Clinton followed in the footsteps of George H.W. Bush, Ronald Reagan, Jimmy Carter, and Gerald Ford, one president had been different. Citing historian David McCullough, I noted that Harry Truman had left the White House in such straitened circumstances that he and his wife Bess needed a bank loan to pay their bills.

Nonetheless, Truman insisted he would not cash in on his name and influence. As McCullough recounted in his Pulitzer Prize-winning biography, Truman’s only intention “was to do nothing — accept no position, lend his name to no organization or transaction — that would exploit or commercialize the prestige and dignity of the office of the President.”

Moved by Truman’s seeming financial distress, Congress eventually passed the Former Presidents Act, which provides former presidents with a lavish pension, furnished offices, and lucrative staff and travel allowances.

There’s just one thing wrong with that oft-repeated narrative about Truman’s economic desperation. According to law professor and journalist Paul Campos, it is completely false. In a bombshell article in New York Magazine, Campos shows that Truman lied shamelessly and repeatedly about the state of his finances in order to guilt-trip Congress into passing the Former Presidents Act, which would provide him with taxpayer-funded benefits for which he had no need.

Campos’s findings are jaw-dropping. The story of Truman’s post-presidential penury has long been taken as undoubted fact, and his self-denying refusal to trade on his public legacy for private gain doubtless contributed to his dramatic rebound in public esteem. Truman left the White House with the lowest approval rating in modern presidential history; today he is ranked among the best presidents ever.

But Campos brings the receipts. With the cooperation of the Harry S. Truman Presidential Library, he spent months examining the 33rd president’s financial records, many of which became available only with the 2011 release of Bess Truman’s personal papers.

“Harry Truman was a very rich man on the day he left the White House,” writes Campos, “and he became a good deal richer in the five and a half years between that day and the passage of the FPA.”

The evidence for those jolting assertions comes from none other than Truman himself. In a will drafted in his own hand and kept with Bess Truman’s papers, Truman estimated that his net worth at the end of his presidency was $650,000 — a sum comprising $250,000 in savings bonds, $150,000 in cash, and land worth an estimated $250,000. Adjusted for inflation, $650,000 in 1953 is the equivalent of $6.6 million in 2021.

Far from being one step from the poorhouse on his return to private life, Campos writes, Truman’s own private calculations show that his income was among the top 1 percent of American households. Which, in hindsight, makes sense: As president, he received one of the most generous salaries in America — in 1949, presidential pay was raised to $100,000 annually, an amount worth more than $1.1 million today.
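
A quick arithmetic check on those inflation conversions (my own back-of-the-envelope math, using approximate annual-average Consumer Price Index values of 26.7 for 1953, 23.8 for 1949 and 271 for 2021; the CPI figures are my assumptions, not numbers from Campos or Jacoby):

\[ \$650{,}000 \times \tfrac{271}{26.7} \approx \$6.6\text{ million}, \qquad \$100{,}000 \times \tfrac{271}{23.8} \approx \$1.1\text{ million} \]

Both results line up with the figures quoted above.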

9/11

Sept. 11, 2001, started out as a beautiful day in Wisconsin, New York City and Washington, D.C.

I remember almost everything about the entire day. Sept. 11, 2001, is to my generation what Nov. 22, 1963, was to my parents and Dec. 7, 1941, was to my grandparents.

I had dropped off our oldest son at Ripon Children’s Learning Center. As I was coming out, the mother of one of his group told me to find a good radio station; she had heard as she was getting out with her son that a plane had hit the World Trade Center.

I got in my car and turned it on in time to hear, seemingly live, a plane hit the WTC. But it wasn’t the first plane; it was the second plane hitting the other tower.

As you can imagine, my drive to Fond du Lac took unusually long that day. I tried to call Mrs. Presteblog, who was working at Ripon College, but she didn’t answer because she was in a meeting. I had been at Marian University as their PR director for just a couple months, so I didn’t know for sure who the media might want to talk to, but once I got there I found a couple professors and called KFIZ and WFDL in Fond du Lac and set up live interviews.

The entire day was like reading a novel, except that there was no novel to put down and no nightmare from which to wake up. A third plane hit the Pentagon? A fourth plane crashed somewhere else? The government was grounding every plane in the country and closing every airport?

I had a TV in my office, and later that morning I heard that one of the towers had collapsed. So as I was talking to my wife on the phone, NBC showed a tower collapsing, and I assumed that was video of the first tower collapse. But it wasn’t; it was the second tower collapse, and that was the second time that replay-but-it’s-not thing had happened that day.

Marian’s president and my boss (a native of a Queens neighborhood who grew up with many firefighter and police officer families, and who by the way had a personality similar to Rudy Giuliani) had a brief discussion about whether or not to cancel afternoon or evening classes, but they decided (correctly) to hold classes as scheduled. The obvious reasons were (1) that we had more than 1,000 students on campus, and what were they going to do if they didn’t have classes, and (2) it was certainly more appropriate to have our professors leading a discussion over what had happened than anything else that could have been done.

I was at Marian until after 7 p.m. I’m sure Marian had a memorial service, but I don’t remember it. While I was in Fond du Lac, our church was having a memorial service with our new rector (who hadn’t officially started yet) and our interim priest. I was in a long line at a gas station, getting gas because the yellow low fuel light on my car was on, not because of panic over gas prices, although I recall that one Fond du Lac gas station had increased their prices that day to the ridiculous $2.299 per gallon. (I think my gas was around $1.50 a gallon that day.)

Two things I remember about that specific day: It was an absolutely spectacular day. But when the sun set, it seemed really, really dark, as if there was no light at all outside, from stars, streetlights or anything else.

For the next few days, since our son was at the TV-watching age, we would watch the ongoing 9/11 coverage in our kitchen while Michael was watching the 1-year-old-appropriate stuff or videos in our living room. That Sunday, one of the people who was at church was Adrian Karsten of ESPN. He was supposed to be at a football game working for ESPN, of course, but there was no college football Saturday (though high school football was played that Friday night), and there was no NFL football Sunday. Our organist played “God Bless America” after Mass, and I recall Adrian clapping with tears down his face; I believe he knew some people who had died or been injured.

Later that day was Marian’s Heritage Festival of the Arts. We had record attendance since there was nothing going on, it was another beautiful day, and I’m guessing after five consecutive days of nonstop 9/11 coverage, people wanted to get out of their houses.

In the 20 years since then, a comment of New York City Mayor Rudy Giuliani has stuck in my head. He was asked a year or so later whether the U.S. was more or less safe since 9/11, and I believe his answer was that we were more safe because we knew more than on Sept. 10, 2001. That and the fact that we haven’t been subject to another major terrorist attack since then is the good news.

Osama bin Laden (who I hope is enjoying Na’ar, Islam’s hell) and others in Al Qaeda apparently thought that the U.S. (despite the fact that citizens from more than 90 countries died on 9/11) would be intimidated by the 9/11 attacks and cower on this side of the Atlantic Ocean, allowing Al Qaeda to operate with impunity in the Middle East and elsewhere. (Bin Laden is no longer available for comment.) If you asked an American who paid even the slightest attention to world affairs where a terrorist attack would be most likely before 9/11, that American would have replied either “New York,” the world’s financial capital, or “Washington,” the center of the government that dominates the free world. A terrorist attack farther into the U.S., even in a much smaller area than New York or Washington, would have delivered a more chilling message, that nowhere in the U.S. was safe. Al Qaeda didn’t think to do that, or couldn’t do that. The rest of the Middle East also did not turn on the U.S. or on Israel (more so than already is the case with Israel), as bin Laden apparently expected.

The bad news is all of the other changes that have taken place that are not for the better. Bloomberg Businessweek asks:

So was it worth it? Has the money spent by the U.S. to protect itself from terrorism been a sound investment? If the benchmark is the absence of another attack on the American homeland, then the answer is indisputably yes. For the first few years after Sept. 11, there was political near-unanimity that this was all that mattered. In 2005, after the bombings of the London subway system, President Bush sought to reassure Americans by declaring that “we’re spending unprecedented resources to protect our nation.” Any expenditure in the name of fighting terrorism was justified.

A decade later, though, it’s clear this approach is no longer sustainable. Even if the U.S. is a safer nation than it was on Sept. 11, it’s a stretch to say that it’s a stronger one. And in retrospect, the threat posed by terrorism may have been significantly less daunting than Western publics and policymakers imagined it to be. …

Politicians and pundits frequently said that al Qaeda posed an “existential threat” to the U.S. But governments can’t defend against existential threats—they can only overspend against them. And national intelligence was very late in understanding al Qaeda’s true capabilities. At its peak, al Qaeda’s ranks of hardened operatives numbered in the low hundreds—and that was before the U.S. and its allies launched a global military campaign to dismantle the network. “We made some bad assumptions right after Sept. 11 that shaped how we approached the war on terror,” says Brian Fishman, a counterterrorism research fellow at the New America Foundation. “We thought al Qaeda would run over the Middle East—they were going to take over governments and control armies. In hindsight, it’s clear that was never going to be the case. Al Qaeda was not as good as we gave them credit for.”

Yet for a decade, the government’s approach to counterterrorism has been premised in part on the idea that not only would al Qaeda attack inside the U.S. again, but its next strike would be even bigger—possibly involving unconventional weapons or even a nuclear bomb. Washington has appropriated tens of billions trying to protect against every conceivable kind of attack, no matter the scale or likelihood. To cite one example, the U.S. spends $1 billion a year to defend against domestic attacks involving improvised-explosive devices, the makeshift bombs favored by insurgents in Afghanistan. “In hindsight, the idea that post-Sept. 11 terrorism was different from pre-9/11 terrorism was wrong,” says Brian A. Jackson, a senior physical scientist at RAND. “If you honestly believed the followup to 9/11 would be a nuclear weapon, then for intellectual consistency you had to say, ‘We’ve got to prevent everything.’ We pushed for perfection, and in counterterrorism, that runs up the tab pretty fast.”

Nowhere has that profligacy been more evident than in the area of homeland security. “Things done in haste are not done particularly well,” says Jackson. As Daveed Gartenstein-Ross writes in his new book, Bin Laden’s Legacy, the creation of a homeland security apparatus has been marked by waste, bureaucracy, and cost overruns. Gartenstein-Ross cites the Transportation Security Administration’s rush to hire 60,000 airport screeners after Sept. 11, which was originally budgeted at $104 million; in the end it cost the government $867 million. The homeland security budget has also proved to be a pork barrel bonanza: In perhaps the most egregious example, the Kentucky Charitable Gaming Dept. received $36,000 to prevent terrorists from raising money at bingo halls. “If you look at the past decade and what it’s cost us, I’d say the rate of return on investment has been poor,” Gartenstein-Ross says.

Of course, much of that analysis has the 20/20 vision of hindsight. It is interesting to note as well that, for all the campaign rhetoric from candidate Barack Obama that we needed to change our foreign policy approach, President Obama changed almost nothing, including our Afghanistan and Iraq involvements. It is also interesting to note that the supposed change away from President George W. Bush’s us-or-them foreign policy approach hasn’t changed the world’s view, including particularly the Middle East’s view, of the U.S. Someone years from now will have to determine whether homeland security, military and intelligence improvements prevented Al Qaeda from another 9/11 attack, or if Al Qaeda wasn’t capable of more than just one 9/11-style U.S. attack.

Hindsight makes one realize how much of the 9/11 attacks could have been prevented, or at least their worst effects lessened. Years after 9/11, the book 102 Minutes: The Untold Story of the Fight to Survive Inside the Twin Towers, by two New York Times reporters, pointed out that eight years after the 1993 World Trade Center bombing, New York City firefighters and police officers still could not communicate with each other, which led to most of the police and fire deaths in the WTC collapses. Even worse, the book revealed that the buildings did not meet New York City fire codes when they were designed because they didn’t have to, since they were under the jurisdiction of the Port Authority of New York and New Jersey. And more than one account shows that, had certain people at the FBI and elsewhere been listened to by their bosses, the 9/11 attacks wouldn’t have caught our intelligence community dumbfounded. (It does not speak well of our government to note that no one appears to have paid any kind of political price for the 9/11 attacks.)

I think, as Bloomberg Businessweek argued, our approach to homeland security (a term I loathe) has overdone much and missed other threats. Our approach to airline security — which really seems like the old error of generals fighting the previous war — has made air travel worse but not safer. (Unless you truly believe that 84-year-old women and babies are terrorist threats.) The incontrovertible fact is that every 9/11 hijacker fit into one gender, one ethnic group and a similar age range. Only two reasons exist to not profile airline travelers — political correctness and the assumption that anyone is capable of hijacking an airplane, killing the pilots and flying it into a skyscraper or important national building. Meanwhile, while the U.S. spends about $1 billion each year trying to prevent Improvised Explosive Device attacks, what is this country doing about something that would be even more disruptive, yet potentially easier to do — an Electromagnetic Pulse attack, which would fry every computer within the range of the device?

We have at least started to take steps like drilling our own continent’s oil and developing every potential source of electric power, ecofriendly or not, to make us less dependent on Middle East oil. (The Middle East, by the way, supplies only one-fourth of our imported oil. We can become less dependent on Middle East oil; we cannot become less dependent on energy.) But the government’s response to 9/11 has followed like B follows A the approach our culture has taken to risk of any sort, as if covering ourselves in bubblewrap, or even better cowering in our homes, will make the bogeyman go away. Are we really safer because of the Patriot Act?

American politics was quite nasty in the 1990s. For a brief while after 9/11, we had impossible-to-imagine moments of national unity.

And then within the following year, the political beatings resumed. Bush’s statement, “I ask your continued participation and confidence in the American economy,” was deliberately misconstrued as Bush saying that Americans should go out and shop. Americans were exhorted to sacrifice for a war unlike any war we’ve ever faced by those who wouldn’t have to deal with the sacrifices of, for instance, gas prices far beyond $5 per gallon, or mandatory national service (a bad idea that rears its ugly head in times of anything approaching national crisis), or substantially higher taxes.

Then again, none of this should be a surprise. Other parts of the world hate Americans because we are more economically and politically free than most of the world. We have graduated from using those of different skin color from the majority as slaves, and we have progressed beyond assigning different societal rights to each gender. We tolerate different political views and religions. To the extent the 9/11 masterminds could be considered Muslims at all, they supported — and radical Muslims support — none of the values that are based on our certain inalienable rights. The war between our world, flawed though it is, and a world based on sharia law is a war we had better win.

And winning that war does not include withdrawal. Whether or not Donald Trump was right about leaving Afghanistan, Joe Biden screwed up the withdrawal so badly that everyone with a memory compared it to our withdrawal from South Vietnam. The obviously incomplete vetting of Afghan refugees, who are now at Fort McCoy, and our leaving billions of dollars of military equipment in Afghanistan all but guarantee that the terrorists will be back, and more American soldiers, and perhaps non-soldiers, will die.

In one important sense, 9/11 changed us less than it revealed us. America can be both deeply flawed and a special place, because human beings are both deeply flawed and nonetheless special in God’s eyes. Jesus Christ is quoted in Luke 12:48 as saying that “to whomsoever much is given, of him shall be much required.” As much as Americans don’t want to be the policeman of the world, or the nation most responsible for protecting freedom worldwide, there it is.

Simba

MLB.com:

Ted Simmons’ upcoming induction to the Baseball Hall of Fame has robust support from the record book. He has the most hits in Major League history among switch-hitting catchers. His career OPS+ is higher than that of fellow Hall of Famers Carlton Fisk, Gary Carter, and Iván Rodríguez.

But to fully measure Simmons’ legacy requires a different sort of story, one that unfolded subtly over the 15,000-plus innings he caught for the Cardinals, Brewers, and Braves.

While plenty of statistics classify Simmons as an all-time great, his peers rarely allude to them. Instead, they speak about his passion, his intellect, and his unwavering focus. He never coasted through a ballgame. And by all accounts, he’s riding a lifelong streak of 72 years without a perfunctory conversation.

“He got the most out of his ability because of his mental approach,” said Bill Schroeder, a teammate in Milwaukee during the early 1980s. “He out-thought everyone.”

Simmons is a scholar of baseball, art, history — and life. He earned his undergraduate degree from the University of Michigan in 1996, nearly three decades after attending his first college class. He completed his coursework in Ann Arbor on trips to Detroit while scouting for the Cleveland Indians, because, as he said, there are certain things in life that a person is supposed to finish.

And he is willing to share his wisdom, provided the interlocutor is prepared with the right questions.

“You’d better be thinking it out for yourself,” Simmons told me earlier this year. “If you’re not taking this game real seriously and examining everything you see — incorporating it into some form, like an essay, a notebook, or a booklet — then you’re never going to understand baseball the way a professional should.

“If someone who is really serious about the game asks you a thoughtful question, I’m inclined to say, ‘Come here and sit down, and I’ll tell you about what you’re seeing.’ If they have thought about the game, you can help to illuminate it for them.

“What I tell people, in the simplest form, is this: Anytime you see something happen on the field that strikes you as unusual, go there. Tear that situation apart, inside-out, upside-down and backwards. There’s going to be insight in there.”

To think deeply while playing freely is baseball’s essential riddle. Over 21 Major League seasons, Simmons came closer to solving it than most mortals have, before or since. He found a way for painstaking contemplation to enhance, rather than compromise, his natural athleticism.

And he did it all with magnetism that was evident while growing up in the Detroit suburb of Southfield, Mich. His older brothers encouraged him to switch-hit. His mother, Bonnie Sue Webb-Simmons, modeled the determined work ethic that became the backbone of Ted’s career. He drew attention as a Big Ten football recruit while starring for the A&B Brokers amateur baseball team.

Oh, and have you heard the story about the time Ted and his wife, Maryanne Ellison Simmons, hitchhiked in Michigan with an aspiring rock star named Bob Seger?

Sources confirm: It’s true.

Simmons’ long, flowing hair earned him a memorable nickname: Simba. He planned to play baseball at the University of Michigan, until the St. Louis Cardinals selected him with the 10th overall pick in the 1967 Draft. He was 17 when he signed his first professional contract.

One year later, the Cardinals met his hometown Tigers in the World Series. The Cardinals arranged for Ted and Maryanne — then his girlfriend, now his wife of 51 years — to attend the games at Tiger Stadium. More than a half-century later, Simmons remembers the “horrible” internal conflict he felt. The Tigers were his boyhood team. Al Kaline was his favorite player. But now he was a professional. Baseball remained the game he loved — but now it was his livelihood, too.

“I was sitting in the upper deck, watching the game with all of the Cardinals’ front-office people,” Simmons recalled. “They knew I was from Detroit. I’d played that season in Modesto. At one point, Al Kaline got a big base hit to put the Tigers ahead. I did everything I could to prevent myself from cheering along with the rest of the crowd. I realized quickly enough where I was sitting and who was responsible for my tickets. I kept in my seat.”

Simmons spent 14 years in the Cardinals organization, absorbing the traditions and teachings of St. Louis baseball oracle George Kissell. Simmons had made six All-Star appearances by the time he was dealt to Milwaukee after the 1980 season. He earned two more selections as a Brewer.

At first, Brewers players weren’t sure how to approach their new, serious-minded teammate. Simmons would read books in the clubhouse. He also enjoyed playing bridge, which evolved into a point of connection — and instruction — for his teammates.

Simmons revealed the nuances of bridge to Schroeder, a fellow catcher nine years his junior. In cards, as in baseball, Simmons demanded accountability and honesty.

“He had such passion for everything he did,” Schroeder said. “When we were playing bridge, he’d jump on people if he thought they were trying to cheat. We’d be having a laidback card game in the clubhouse, but if someone wasn’t acting appropriately, Ted would be the first one to go over to you and say, ‘We can’t operate that way. That’s not how a big leaguer acts.’”

After games, the daily seminar met at Simmons’ locker. Attendance was required for younger catchers, including Schroeder and future MLB manager Ned Yost.

Questions were specific. Answers had to be, too.

In the fourth inning, you called a 2-1 changeup to Ripken. Why?

“Baseball was a chess game for him,” Schroeder said. “One pitch set up another, within the at-bat or later in the game. I got a sense for that through Ted.”

So substantive were those conversations, so enduring the lessons, that Yost hired Simmons as his bench coach in Milwaukee for the 2008 season. And when Yost won back-to-back pennants with the Royals in 2014 and ’15, he publicly cited Simmons’ influence on his managerial approach.

By the time Yost won the World Series, Simmons was well into his decades-long MLB front office and coaching career. He was general manager of the Pirates in 1992 and ’93, before resigning from the position for health reasons after suffering a heart attack and undergoing an angioplasty. Simmons worked as an executive and scout for the Cardinals, Indians, Padres, Mariners and Braves.

Simmons also spent the 2009 and ’10 seasons as the bench coach in San Diego, where he mentored catcher Nick Hundley during a crucial period early in his career. Hundley, now 37, remembers how Simmons helped him to see the game holistically, in the context of a roadmap to 27 outs.

One example: It’s the eighth inning. The Padres are winning by two. The other team’s star — Buster Posey, let’s say — is the eighth hitter due up. With six outs to get, every pitch must be called with the goal of not allowing Posey to bat as the tying run.

“Even though that’s not something that would happen until the ninth inning, any 2-0 pitch in the eighth needs to be a strike,” Hundley explained in an interview earlier this year. “If you give up a base hit or a double, you can live with that. But he has to earn it. Otherwise, if you walk that guy, you’re one step closer to facing [Posey] as the tying run.”

Here’s another Simmons story — about decorum, more so than strategy: Rick Renteria, then a Padres coach, was acting manager for a split-squad game in Mesa, Ariz., during Spring Training in 2010. Hundley had a 3-0 count against Cubs starter Carlos Silva. Renteria gave him the take sign. Hundley saw it, but the urgency to prepare for the season compelled him to swing away. He rocketed an RBI triple off the wall.

Simmons was furious.

It didn’t matter that the result was favorable. Hundley had disregarded a sign. Simmons asked Hundley if he would have swung away had the sign come from Padres manager Bud Black, instead of a coach. Hundley said he didn’t know. And that was the point.

Simmons didn’t speak with Hundley for two days.

“He held me accountable for it,” Hundley said. “I remember that to this day. That’s the kind of impact he has on people. He makes sure you do things the right way. You knew he was coming from a place of love, because he would do anything to help our team.

“One of the biggest things he did for us in 2010 was he would always build people up, no matter what the score was. We’d be down in a game, and he’d go up and down the bench, saying, ‘Just get the tying run up!’ And if it happened, when that batter was on deck, he would be so fired up. He’d yell, ‘That’s him! Right there!’ Even if we lost, he’d tell us afterward, ‘Hey, we got that guy up. We got the guy we wanted to the plate.’ It was such unbelievable perspective. He showed me how someone can really impact a game without playing in it.”


Simmons could be described as an adherent to baseball analytics — before the field existed. He once noted that the batting average for pinch-hitters is substantially lower than that of Major League hitters overall. From a pitching standpoint, he said, the goal should be to turn every batter into a “pinch-hitter.”

How could a team accomplish that? Well, by ensuring no pitcher faces the same hitter twice. Once through the lineup, per pitcher, at the most. If that reminds you of the Rays’ or Brewers’ savvy approach to recent postseasons, it should. And Simmons contemplated the proposal during his playing career, which ended in 1988.

Fortunately, Simmons kept a record of his baseball experiences and reflections — an unpublished journal that hasn’t been seen by the public and probably never will be. According to legend, the magnum opus has philosophical paragraphs and diagrams of where every defensive player should be on a relay throw. Only a few copies exist. Simmons has the original, of course. Pete Vuckovich, his close friend and former teammate, has one edition.

“It might be 500 pages,” Vuckovich said earlier this year. “He put it into thick notebooks. Three-ring binders. There are probably two or three of them. He’s got everything in there: how to handle a bullpen, the things you need to do if you’re starting up an organization, what you look for in a starting pitcher.

“It might be boring to some people, but in terms of pure, old-school baseball, it’s very on the money.”

‘A LIFE APART FROM BASEBALL’

Simmons did something else that is acknowledged as crucial now but wasn’t discussed as often when he played: He sought balance in life. And he found it.

Art has been a passion in Ted Simmons’ life since his playing days. (Photo: Simmons family)

Maryanne is an accomplished artist with a bachelor of fine arts degree from the University of Michigan and a master’s from Washington University in St. Louis. The couple has collected contemporary American art for decades. Earlier this year, the St. Louis Post-Dispatch reported that the St. Louis Art Museum acquired 833 works of art from the Simmons family — a “partial gift, partial purchase,” for which “the museum paid just over $2.3 million, about half of the collection’s worth.”

Ted and Maryanne still make their home in the St. Louis area, where Maryanne owns Wildwood Press LLC, which specializes in custom papermaking and the printing and publishing of contemporary art. The couple’s sons have careers that have taken them around the world — Jonathan to Australia and Matthew to San Francisco.

“You’ve got to be a human being first,” Simmons said. “That’s how I’ve lived my life, how I’ve kept everything compartmentalized and separated. By doing that, I was able to create a life apart from baseball. That’s very difficult for a Major League player to do — not only for himself, but his [family] too.

Ted and Maryanne Ellison Simmons; Ted with sons Jonathan and Matthew. (Photos: Simmons family)

“You have to carve out your own place and be the human being you want to be. Whether it’s going to the art museum, collecting furniture, contemporary art, works on paper — all of these things are more of what there is in this thing called life, that everybody has a responsibility to do for themselves. … I don’t believe anybody is entitled to anything. Everybody has an obligation to earn it.”

Ted Simmons earned it. He is a Hall of Famer.

The Modern Baseball Era Committee took a while to acknowledge that reality, but the length of its review has no bearing on the righteousness of the result.

Besides, they’ve elected a man who sees virtue, and perhaps a little art, in taking time to think it all through.

An unlikely sermon subject for Sunday

Mark Malvasi:

My uncle made book for a living. That is, he took money from those who wagered on sporting events, presidential elections, anything whereby they thought they could make a fast and easy dollar. I suppose then it was inevitable that, as a young man, I felt a certain affinity for the thought of Blaise Pascal (1623-1662). Pascal, of course, gambled on stakes much higher than my uncle ever imagined. At the same time, my uncle knew something that Pascal never understood or, in any event, never admitted. You can’t beat the house.

Pascal’s mind was among the finest of the seventeenth century. He was a prodigy, perhaps a genius, who, at fifteen, published a distinguished essay on conic sections. He invented the first calculating machine, which he called La Pascaline, and his experiments with the vacuums that nature abhors led to the creation of the barometer. Pascal was also a first-rate mathematician whose fascination with, and success at, the gaming table enabled him to contribute to the development of probability theory. To test his hypotheses, he devised the roulette wheel.

On November 23, 1654, at the age of thirty-one, Pascal underwent an emotional conversion that stirred him to abandon his worldly metier and to become an apologist for Christianity. He is best remembered today as a religious thinker, which he was, and a mystic, which he was not. Like the nineteenth-century Danish theologian Søren Kierkegaard, Pascal approached God with “fear and trembling.” A mystic seeks and expects union with God. Pascal, by contrast, despaired of ever finding Him. His conversion did not bring him clarity of vision. God remained distant and unfathomable; the will of God was inscrutable and His design for the cosmos mysterious. “For in fact,” Pascal asked, “what is man in nature?” He answered his question, writing:

A Nothing in comparison with the Infinite, an All in comparison with the Nothing, a mean between nothing and everything. Since he is infinitely removed from comprehending the extremes, the end of things and their beginning are hopelessly hidden from him in an impenetrable secret; he is equally incapable of seeing the Nothing from which he is made, and the Infinite in which he is swallowed up.

Yet, alone and without God, humanity was lost, frightened, and miserable in a vast and desolate universe.

To calm his anxiety that God was, at best, remote and, at worst, illusory, Pascal conceived his famous wager. He urged skeptics, atheists, and free-thinkers to live as if they believed in God. Critics then and since have denounced what seemed to be Pascal’s sneering disdain in urging people to affirm that God was real and existence meaningful. It was disingenuous, if not cynical, of Pascal to play the odds and to bet on the reality of God and eternal life when he suspected both were false. The critique, although carefully aimed, misses the target. It is no small irony given Pascal’s attacks on the Jesuits that, like Ignatius Loyola, he rejected predestination, convinced that men and women, through their own efforts, could earn God’s saving grace. Good habits and sincere piety, even in the absence of real belief, thus became indispensable to salvation. “Custom is our nature,” Pascal declared. “He who is accustomed to the faith believes it, can no longer fear hell, and believes in nothing else.” As Augustine taught, the routine practice of faith might in time lead to genuine faith.

Difficulties arise not from Pascal’s intentions but from his premises. Pascal argued that a man, perhaps in utter desperation, must speculate that God exists. If he wins, he wins big and for all eternity. If he loses, he loses almost nothing, since he will be in no worse condition than before. A prudent man thus has no alternative but to roll the dice or to turn over the next card. He’s gambling with house money. But in reality, in history, those who have denied God have often won glory, wealth, and power; according to scripture, they have gained the whole world. Satan took Jesus to a mountain and there “showed him all the kingdoms of the world and the glory of them; and he said to him, ‘All these I will give you, if you will fall down and worship me.’” (Matthew 4:8-9) It is equally mistaken that a man loses nothing by hazarding that God is real. A man who worships God may sacrifice all he has, all he is, and all he loves in the vindication of his faith. Consider Job.
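
For readers who want the wager spelled out, here is the standard decision-theory gloss (my notation, not Pascal’s or Malvasi’s): let p be any nonzero probability that God exists and c the finite cost of living as a believer. Then

\[ \mathbb{E}[\text{wager for God}] = p \cdot \infty - (1 - p)\,c = \infty \]

so wagering on God dominates for every p > 0. Malvasi’s point is that the loss term c is not actually negligible: in history it can include glory, wealth, and power, and, as with Job, everything a man has and loves.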

Pascal’s tragedy originated in his embrace of Jansenism, which introduced Calvinist doctrines and attitudes into the Catholic world of seventeenth-century France and Western Europe. The Jansenists had revived the Manichean dualism, which characterized humanity as divided between good and evil. For the Jansenists, every soul was a battleground, its fate determined by whichever conflicting impulse was strongest. The Jansenists insisted, therefore, that virtue must be imposed on rebellious and perverse human beings. Only an exacting and solemn authority could direct individuals toward rectitude and purity. The Jansenists also prescribed such discipline for the churches they controlled and the local governments in France over which they exercised some influence. The flesh must be compelled to yield to the spirit. It takes no great leap of historical imagination to see that the Jansenist admiration for order, management, restraint, bureaucracy, and surveillance could be made to attend the requirements of the totalitarian state. The Jansenists determined to administer the “greatness and misery of man” (“grandeur et misère de l’homme”), which was the foremost theme of Pascal’s work, through compulsion.

Jansenism, asserted Friedrich Heer, endowed Pascal with “an enormous capacity for hatred and over-simplification.” Stressing the enthusiasm and certainty that governed the residents of Port Royal, the spiritual and theological capital of the Jansenist movement, Heer doubtless exaggerates the charges against Pascal. He ignores not the complexity of Pascal’s thought, but the complexity of the man himself. Pascal was both austere and worldly, both rational and intuitive. When he partitioned the mind into l’esprit géométrique and l’esprit de finesse, he was mapping the course that a single mind—his own—could take. Pascal may have felt the zeal of a convert, but he never seems to have acquired the conviction that he possessed absolute truth or a sure method by which to attain it. For Pascal, God alone provided the antidote to the twin maladies of doubt and insecurity.

To alleviate his own misgivings, Pascal set out to compose a systematic defense of Christianity. The Pensées contain the remnants of the greater work that he never lived to complete. If these fragments and aphorisms suggest the character of the volume that Pascal meant to write, then it seems the Pensées would have been less an apologia for Christianity than the spiritual autobiography of a thinker attempting to explain to his intellect how his faith was possible.

In the Pensées, Pascal intimated that skepticism may transcend reason, and the doubts that reason awakens, leading not to certainty but to affirmation. By acknowledging the limits of reason, the thoughtful man, he hoped, could accept the mystery of life without also yielding to its absurdity. “The last proceeding of reason,” he wrote, “is to recognize that there is an infinity of things which are beyond it. It is but feeble if it does not see so far as to know this. But if natural things are beyond it, what will be said of supernatural?” Yet, perhaps at this moment of vital insight, Pascal exhibited some of the odium that Friedrich Heer had detected in his thought and character. Like many intensely passionate and astute natures, Pascal disdained the society in which he lived—a disdain that reinforced his displeasure with his fellow human beings and, at times, with life itself. Most men, he assumed, were intellectually lazy and emotionally tepid. Desultory, incurious, and stupid, they were incapable of profound thought, searching doubt, or vibrant faith. The majority preferred not to bother about any subject, whether intellectual or theological, that would jolt them out of their passivity, lassitude, and indifference. Pascal’s disillusioned analysis of human nature may, as Heer suggests, have issued from the Jansenist view that human beings are both helpless and degraded. He could not avoid exposing the rancor, the insincerity, the conceit, the dishonesty, the self-deception, the cowardice, and the pettiness that circumscribed and disfigured the lives of most ordinary men, and made him despise them.

For Pascal, as for Kierkegaard and other, later existentialist philosophers and theologians, unending dread may well have been the cost of existence. “The eternal silence of these infinite spaces frightens me,” he proclaimed. There is at times the echo of a terrible nihilism that reverberates through the otherwise silent spaces of Pascal’s infinite universe, as he gazed into the abyss. T. S. Eliot wrote that Pascal’s despair is “perfectly objective,” corresponding “exactly to the facts” and so “cannot be dismissed as mental disease.” In the end, Pascal concluded, the rational proof of God’s existence, such as Descartes had tried to construct with the ontological argument, was useless and unconvincing to those disinclined to believe. Essential questions about the meaning and purpose of human existence could never be resolved through the application of reason or logic. In fact, for Pascal, they could not be resolved at all. They could only be felt in all of their contradiction and paradox. The experience of such utter confusion and despair alone made faith possible and necessary, but offered no assurance that it would come.

Voltaire judged Pascal to be a restless soul and a sick mind. Pascal agreed, confirming Voltaire’s assessment long before he had rendered it. During his final illness, Pascal often refused the care of his physician, saying: “Sickness is the natural state of Christians.” He believed that human beings had been created to suffer. Misery was the condition of life in this world. His was a hard doctrine.

But to what end did people suffer? What did their suffering accomplish? Did it exalt the spirit? Were they to suffer unto truth or, as was more likely, did they suffer because the flesh was evil and needed to be punished? Pascal had gambled for ultimate stakes. When he rolled the dice, it came up snake eyes, not once, not the last time, but every time. His tragedy, and potentially ours, is that he could discover no purpose in his encounters with creation, his fellow human beings, life itself. Philosophy, science, and reason offered no assurance of truth, and were of little comfort against anguish and hopelessness. Some could even use elaborate rational arguments to defy the will of God and to excuse sin, as had the Jesuits whom Pascal denounced in The Provincial Letters.

Love was equally vain and worthless. It prompted only deception and contempt for truth. Human beings are so flawed and imperfect that they are wretched and detestable. Loving themselves and desiring others to love them, they conceal their transgressions and deformities. Since no one is inviolate, no one deserves to be loved just as, were strict justice to prevail, no one deserves to be saved. Man, Pascal complained:

cannot prevent this object that he loves from being full of faults and wants. He wants to be great, and he sees himself as small. He wants to be happy, and he sees himself miserable. He wants to be perfect, and he sees himself full of imperfections. He wants to be the object of love and esteem among men, and he sees that his faults merit only their hatred and contempt. This embarrassment in which he finds himself produces in him the most unrighteous and criminal passion that can be imagined; for he conceives a mortal enmity against the truth which reproves him, and which convinces him of his faults. He would annihilate it, but, unable to destroy it in its essence, he destroys it as far as possible in his own knowledge and in that of others; that is to say, he devotes all his attention to hiding his faults both from others and from himself, and he cannot endure that others should point them out to him, or that they should see them.

All “disguise, falsehood, and hypocrisy,” men are ignorant, brazen, and delusional. Preferring lies to truth, they should not be angry at others for pointing out their shortcomings. “It is but right that they should know us for what we are,” Pascal insisted, “and should despise us.”

Elsewhere Pascal acclaimed the dignity of man. He was a reed, but “a thinking reed,” more noble than the insensible universe that would destroy him. But the damage had been done. In the centuries to come, the same revulsion for humanity that Pascal had articulated, the same regimentation and tyranny that the Jansenists had endorsed, transformed life on earth into a living hell. In the early twentieth century, the Roman Catholic philosopher Gabriel Marcel came face to face with the tragedy of the human condition. Shattered by his experiences in the Great War, during which he had served with the French Red Cross identifying the dead and accounting for the missing, Marcel sought an alternative to grief and desolation.

He contended that in the modern world a person was no longer a person, but “an agglomeration of functions.” According to this functional definition, human beings were valued solely for the work they did and the goods they produced. Death became “objectively and functionally, the scrapping of what has ceased to be of use and must be written off as a total loss.” Such a vision of life deprived people of their spirituality and their faith, and robbed them of any joy that they might feel. Consumed by rancor, malice, and ingratitude, they suffered an “intolerable unease,” as they descended into the void that engulfed them.

Love was the answer. If people could overcome selfishness and egocentricity, if they could love one another, Marcel was confident that they could fulfill themselves as human beings. Such involvement with, and such fidelity to, others afforded a glimpse of the transcendent and was, in Marcel’s view, the most persuasive argument for the existence of God. Faith consoled and inspired the downtrodden, the persecuted, the oppressed, and the brokenhearted. It cultivated and enhanced all human relationships. For if people refused any longer to treat others merely as objects performing a function, if they came at last to recognize that all persons, however deficient, imperfect, errant, or sinful, mattered to God, then those persons were also more likely to matter to them.

I have come to the conclusion that each of us is capable of doing the right thing or the wrong thing at any one time. Your ratio of right decisions to wrong decisions shows the type of person you are, and whether or not your life will be successful (as in keeping controllable bad things from happening to you).


From the GREEN Bay Packers

Packers.com:

The Green Bay Packers on Thursday introduced their new, history-inspired third uniform: the 50s Classic Uniform. The new uniforms will debut at Lambeau Field on Oct. 24 against Washington.

The 50s Classic Uniform is inspired by the team’s uniforms from 1950-1953, which was the second time the team wore green and gold in its history. The Packers first wore green in the mid-to-late 1930s.

The uniforms are all green, with gold numbers and stripes similar to the jerseys worn in the 1950s. In those days, the green was a Kelly green and the team alternated between wearing it with green or gold pants. This alternate jersey, which is the Packers’ traditional green color, with gold numbers and stripes, will be worn with matching green pants with gold stripes, and matching green socks.

“The 1950s were one of the most interesting times in our organization’s rich history, creating the bridge between two of the greatest eras in pro football,” said Packers President/CEO Mark Murphy. “With the NFL growing rapidly, this time period set the stage for the construction of Lambeau Field and for the team’s success in the 1960s and beyond. We hope our fans enjoy celebrating our history with this new alternate uniform.”

First: How does it look? The new uniform is based on the old early-1950s uniform (seen in Green Bay Press-Gazette photos of the era), which, though the new uniforms are forest green, not the kelly green of the earlier uniforms, makes it a departure from the Packers’ previous throwbacks that were based on their navy-blue-and-gold days from founder (and one-year Notre Dame student) Curly Lambeau. Once the Packers unveiled green uniforms in 1935, they went back and forth between blue and green until Vince Lombardi said the Packers were the GREEN Bay Packers and would remain as such.
As someone who hates the Blue Bay Packers look, not to mention the White Bay Packers look (for the Nike-mandated Color Rush, though white isn’t really a color for purposes of clothing), I believe these are superior for that reason alone, though I am not usually a fan of monochrome uniforms. The Packers should, in fact, redesign their road look to replace the gold pants with green pants when wearing the white jersey.

I assert this (because I’m always right in this blog, and if you agree with me you’re right too) as someone who is not necessarily enamored with the green and gold look — specifically the “gold” part, which is more accurately described as “athletic gold” or “yellowgold,” basically a little bit darker than yellow. During the early 1950s the Packers apparently used a more metallic gold, which I prefer to their current yellowgold.


(A Twin Cities sportswriter once described the Packers’ look as lemon and spinach. I have no problem with either description, but the writer should have included the Vikings colors — bruises and pus.)

Other than the monochrome look, I have another issue:

While the early 1950s were not a particularly successful time for the Packers on the field, it was the dawn of an extraordinarily eventful decade off the field, a decade that began with the departure of the team’s founder Curly Lambeau and ended with the arrival of Vince Lombardi. In the 1950s, the NFL was growing quickly and gaining nationwide interest through television exposure. The Packers organization was at a turning point and a franchise-saving stock sale helped lay the groundwork for the eventual construction of Lambeau Field and set up the team to stay in Green Bay through modern times.

The first two seasons of green ended with the Packers’ 1936 NFL title. The four seasons of the 1950–53 look that the Packers will debut against the don’t-call-them-Redskins-anymore Washington were 3–9, 3–9, 6–6 and 2–9–1, which seems an era not worth commemorating. The rule of caring about how your team looks is that quality of look and quality of play are inextricably linked.

Five decades later, everything’s going to hell again

Gerard Baker:

There’s a consoling thought as we descend deeper into the socially disintegrating, culturally self-loathing, economically stalling dystopia of contemporary America: We’ve been here before.

The hegemony of today’s left-wing radicals, pursuing their ambitions to repudiate America’s historical values and remake the country in the image of some purified version of a big government, equity-enforcing, social-democratic paradise, recalls the 1970s. That decade culminated in the unique combination of economic ruin and international humiliation that defined a one-term Democratic presidency—and we know what happened next. Wait a while, the optimists say. The next Reagan Revolution is at hand.

History doesn’t repeat itself, despite what Marx said, but there is a pattern in the ebb and flow of historical tides. Extreme lurches in one direction tend to be self-correcting, especially when they push a nation as successful as America close to the abyss.

But conservatives should defer the optimism. There are surely similarities between today’s conditions and that benighted decade of 50 years ago, and you don’t have to have a wild imagination to see the Joe Biden-Jimmy Carter parallels. But there are important differences that should temper any confident predictions of an imminent new era of conservative ascendancy.

The 1970s were probably the last decade when existential doubts about the American project were as pronounced and debilitating as they are now. The advances of the 1960s in civil rights and economic prosperity collapsed into a tumult of social unrest and, to coin a phrase, national malaise. The racial strife that closed the previous decade continued to define much of the next one. There are echoes of today’s woke revolutionaries in the 1968 Summer Olympics, when black athletes demonstrated their antipathy to the flag and what it stood for in their own Black Power salute from the medal podium.

The surge in homicides in the past year is a flashback to the decade when American cities were hellscapes—as is the flight of many Americans from those cities to suburbs and beyond. Back then Democratic politicians blamed it on systemic injustice and racist policing and seemed to favor criminals over their victims. Sound familiar?

Then as now there was an existential sense of peril and failure. In the 1970s the nation was haunted by a widespread fear that America was losing the great ideological struggle of the time to the communist superpower. The U.S. retreat from Vietnam, the tightening Soviet grip on Eastern Europe, and Marxist advances in Latin America had at least American progressive elites convinced of ultimate decline and fall. More than 40 years later, American elites are convinced another communist power is eclipsing the U.S. and the civilization it has led.

The 1970s gave us stagflation—immortalized in the popularization of the “misery index”—the sum of the unemployment and inflation rates. While today’s number remains well shy of the peak it reached in 1980, it has doubled in the past two years—a feat last performed in the mid-1970s. Other echoes resonate across the half century: unaccustomed military misadventures, in Vietnam then and Iraq and Afghanistan now; presidential infamy in Richard Nixon and Donald Trump.

For all the similarities, though, there’s at least one big political difference—rooted in an economic one—that suggests reason for pessimism.

Today, unlike then, almost the entire American establishment lines up on one side. The progressive revolution is much more deeply embedded in the nation’s institutions than it ever was in the 1970s. It was still possible then to find conservatives on campuses—it was the intellectual revolution of Milton Friedman and the Chicago school that presaged Ronald Reagan’s political version. Friedman would probably be canceled today. The permanent government wasn’t steeped as it is now in the social and political orthodoxy that thwarts efforts to undo it.

But the biggest difference of all is the investment by America’s corporate leadership in the dominant progressive ideology.

By the late 1970s U.S. financial markets had been in a decade-long bear market. In 1979 the Dow Jones Industrial Average was where it had been in 1965. Since then, and thanks in great part to the global economic liberalization unleashed by the Reagan-Thatcher years, today’s American corporations have enjoyed a bull run like no other.

Which leaves us with one of the strangest alliances in history: a dominant political class that argues America is a fundamentally flawed society in need of complete transformation, in coalition with a dominant capitalist class that reaps unprecedented riches from investors’ convictions that things have never been better.

Barring an epic financial collapse or some improbable early cultural counterrevolution, the coalition that helped elect Ronald Reagan isn’t coming back. Any reversal of the tide of progressive hegemony will have to be achieved from the bottom up.