The terrorist attacks of September 11, 2001 cast a long shadow over American life. Twenty years later the world is more chaotic and less free because the U.S. government exploited fear to erode liberty and launch two disastrous wars. Now, yet another crisis, the COVID-19 pandemic, creates new opportunities to restrict freedom in the name of protecting us from a threat. President Joe Biden’s new vaccine mandate is just the latest round of pandemic restrictions and government by executive fiat that started in early 2020. The pandemic, too, threatens to leave an authoritarian legacy.
The best-known policy result of the 9/11 attacks is pervasive surveillance. Edward Snowden showed how the National Security Agency (NSA) used powers acquired after 9/11 to collect communications data from innocent people at home and abroad. But the government didn’t act on its own; it also conscripted communications companies into monitoring customers and installed NSA equipment in AT&T facilities.
In a disturbing parallel, early in the pandemic, then-President Donald Trump invoked the Defense Production Act to compel businesses to produce ventilators and other supplies for combatting the virus on his preferred terms. Once again, government officials turned to force to bend private parties to their will.
After 9/11, Congress created the Department of Homeland Security and nationalized airport security under the Transportation Security Administration (TSA). Since then, the TSA has become known for groping fliers, delaying travelers, and failing to actually make anyone safer despite the ordeal.
“Undercover investigators were able to smuggle mock explosives or banned weapons through checkpoints in 95 percent of trials,” ABC News reported in 2015.
Panic-fueled pandemic policy has also brought us restrictions on movement. Early on that included overtly unconstitutional limits on traveling between states and cities.
“Freedom of movement within and between states is constitutionally protected” but “the constitutional model is losing right now,” Meryl Justin Chertoff, executive director of the Georgetown Project on State and Local Government Policy and Law, wrote last year.
Perhaps more permanently, we’ve also seen the proliferation of vaccine passports as yet another document requirement for travel, and as a means for turning once-routine activities into conditional privileges.
“The Key to NYC Pass will be a first-in-the-nation approach,” New York City Mayor Bill de Blasio boasted last month. “It will require vaccination for workers and customers in indoor dining, in indoor fitness facilities, indoor entertainment facilities.”
Politicians seem to instinctively understand that a crisis is an opening to push freedom-eroding policies. For then-Senator Biden, the 9/11 attacks provided an opening for touting a bill he had drafted years earlier but couldn’t get passed. It became the Patriot Act.
“I drafted a terrorism bill after the Oklahoma City bombing,” Biden told The New Republic in 2001. “And the bill John Ashcroft sent up was my bill.”
That bill “turns regular citizens into suspects,” in the words of the American Civil Liberties Union.
Twenty years later, even while admitting that “the bulk of the constitutional scholarship says that it’s not likely to pass constitutional muster” and correctly predicting that the Supreme Court would overturn the order, now-President Biden extended an emergency federal eviction moratorium that violated the property rights of millions of landlords.
The fear-fueled post-9/11 culture also gave politicians free rein to spend vast amounts of money on initiatives that had little to do with keeping Americans safe. A 2005 60 Minutes report found “counterterrorism” funding being used to supply small police departments with ATVs and defibrillators.
“A substantial portion of new homeland security spending is being used for politically motivated items—outlays that are unlikely to have any impact on terrorism,” an AEI paper found the next year.
The spending did make many local law-enforcement agencies dependent on federal funds and aligned with the federal government’s priorities at the expense of their communities’ more mundane concerns.
Today’s public health crisis has led to trillions in passed or proposed spending for so-called “recovery” and “infrastructure.” Somehow, that’s been defined to include internet access, child tax credits, electric-vehicle charging stations, corporate subsidies, and “buy American” mandates.
“Paid leave is infrastructure. Child care is infrastructure. Caregiving is infrastructure,” Sen. Kirsten Gillibrand (D-NY) insisted in a revealingly opportunistic tweet.
Opportunity also knocked in 2001 when it came to invading Iraq, which had nothing to do with 9/11. Regime change was on the Bush administration wish list, and the terrorist attack created an opening.
“I am often asked why we are in Iraq when Saddam Hussein was not responsible for the 9/11 attacks,” President George W. Bush commented in 2006. “The answer is that the regime of Saddam Hussein was a clear threat.”
Afghanistan’s Taliban regime was implicated in harboring Osama bin Laden, the mastermind of the 9/11 attacks. But, after 20 years of war, an estimated 157,000 lives lost, and a puppet regime that couldn’t survive the withdrawal of western forces, the Taliban is back in control.
Fortunately, the reaction to COVID-19 hasn’t resulted in warfare. But it has further divided the world, disrupted trade, and encouraged conflict—to the benefit of government officials, as after 9/11.
“The rich nations of Europe and North America are liberal democracies, but their governments are also ferociously efficient repression machines. The surveillance tools at their disposal have never been more powerful,” Thomas Hegghammer, a senior research fellow at the Norwegian Defence Research Establishment, writes in the current issue of Foreign Affairs about the aftermath of the War on Terror.
“As recorded in the Democracy Index in recent years, democracy has not been in robust health for some time,” The Economist’s Democracy Index 2020 observed earlier this year. “In 2020 its strength was further tested by the outbreak of the coronavirus (Covid-19) pandemic… Across the world in 2020, citizens experienced the biggest rollback of individual freedoms ever undertaken by governments during peacetime (and perhaps even in wartime).”
The 20th anniversary of September 11, 2001 is a day to mourn the nearly 3,000 people who perished—and to reflect on how political leaders used fear to steal our liberties. Because history is already repeating itself in ways that we, and our kids, will live to regret.
-
Sept. 11, 2001 started out as a beautiful day in Wisconsin, New York City and Washington, D.C.
I remember almost everything about the entire day. Sept. 11, 2001 is to my generation what Nov. 22, 1963 was to my parents and Dec. 7, 1941 was to my grandparents.
I had dropped off our oldest son at Ripon Children’s Learning Center. As I was coming out, the mother of one of his group told me to find a good radio station; she had heard as she was getting out with her son that a plane had hit the World Trade Center.
I got in my car and turned it on in time to hear, seemingly live, a plane hit the WTC. But it wasn’t the first plane; it was the second plane hitting the other tower.
As you can imagine, my drive to Fond du Lac took unusually long that day. I tried to call Mrs. Presteblog, who was working at Ripon College, but she didn’t answer because she was in a meeting. I had been at Marian University as their PR director for just a couple months, so I didn’t know for sure who the media might want to talk to, but once I got there I found a couple professors and called KFIZ and WFDL in Fond du Lac and set up live interviews.
The entire day was like reading a novel, except that there was no novel to put down and no nightmare from which to wake up. A third plane hit the Pentagon? A fourth plane crashed somewhere else? The government was grounding every plane in the country and closing every airport?

I had a TV in my office, and later that morning I heard that one of the towers had collapsed. So as I was talking to my wife on the phone, NBC showed a tower collapsing, and I assumed that was video of the first tower collapse. But it wasn’t; it was the second tower collapse, and that was the second time that replay-but-it’s-not thing had happened that day.
Marian’s president and my boss (a native of a Queens neighborhood who grew up with many firefighter and police officer families, and who by the way had a personality similar to Rudy Giuliani) had a brief discussion about whether or not to cancel afternoon or evening classes, but they decided (correctly) to hold classes as scheduled. The obvious reasons were (1) that we had more than 1,000 students on campus, and what were they going to do if they didn’t have classes, and (2) it was certainly more appropriate to have our professors leading a discussion over what had happened than anything else that could have been done.
I was at Marian until after 7 p.m. I’m sure Marian had a memorial service, but I don’t remember it. While I was in Fond du Lac, our church was having a memorial service with our new rector (who hadn’t officially started yet) and our interim priest. I was in a long line at a gas station, getting gas because the yellow low fuel light on my car was on, not because of panic over gas prices, although I recall that one Fond du Lac gas station had increased their prices that day to the ridiculous $2.299 per gallon. (I think my gas was around $1.50 a gallon that day.)
Two things I remember about that specific day: It was an absolutely spectacular day. But when the sun set, it seemed really, really dark, as if there was no light at all outside, from stars, streetlights or anything else.
For the next few days, since our son was at the TV-watching age, we would watch the ongoing 9/11 coverage in our kitchen while Michael was watching the 1-year-old-appropriate stuff or videos in our living room. That Sunday, one of the people who was at church was Adrian Karsten of ESPN. He was supposed to be at a football game working for ESPN, of course, but there was no college football Saturday (though high school football was played that Friday night), and there was no NFL football Sunday. Our organist played “God Bless America” after Mass, and I recall Adrian clapping with tears running down his face; I believe he knew some people who had died or been injured.
Later that day was Marian’s Heritage Festival of the Arts. We had record attendance since there was nothing going on, it was another beautiful day, and I’m guessing after five consecutive days of nonstop 9/11 coverage, people wanted to get out of their houses.
In the 20 years since then, a comment by New York City Mayor Rudy Giuliani has stuck in my head. He was asked a year or so later whether the U.S. was more or less safe after 9/11, and I believe his answer was that we were safer because we knew more than we did on Sept. 10, 2001. That, and the fact that we haven’t been subject to another major terrorist attack since then, is the good news.
Osama bin Laden (who I hope is enjoying Na’ar, Islam’s hell) and others in Al Qaeda apparently thought that the U.S. (despite the fact that citizens from more than 90 countries died on 9/11) would be intimidated by the 9/11 attacks and cower on this side of the Atlantic Ocean, allowing Al Qaeda to operate with impunity in the Middle East and elsewhere. (Bin Laden is no longer available for comment.) If, before 9/11, you had asked an American who paid even the slightest attention to world affairs where a terrorist attack would be most likely, that American would have replied either “New York,” the world’s financial capital, or “Washington,” the center of the government that dominates the free world. A terrorist attack farther into the U.S., even in a much smaller area than New York or Washington, would have delivered a more chilling message: that nowhere in the U.S. was safe. Al Qaeda didn’t think to do that, or couldn’t do that. The rest of the Middle East also did not turn on the U.S. or on Israel (more so than already is the case with Israel), as bin Laden apparently expected.
The bad news is all the other changes that have taken place, which are not for the better. Bloomberg Businessweek asks:
So was it worth it? Has the money spent by the U.S. to protect itself from terrorism been a sound investment? If the benchmark is the absence of another attack on the American homeland, then the answer is indisputably yes. For the first few years after Sept. 11, there was political near-unanimity that this was all that mattered. In 2005, after the bombings of the London subway system, President Bush sought to reassure Americans by declaring that “we’re spending unprecedented resources to protect our nation.” Any expenditure in the name of fighting terrorism was justified.
A decade later, though, it’s clear this approach is no longer sustainable. Even if the U.S. is a safer nation than it was on Sept. 11, it’s a stretch to say that it’s a stronger one. And in retrospect, the threat posed by terrorism may have been significantly less daunting than Western publics and policymakers imagined it to be. …
Politicians and pundits frequently said that al Qaeda posed an “existential threat” to the U.S. But governments can’t defend against existential threats—they can only overspend against them. And national intelligence was very late in understanding al Qaeda’s true capabilities. At its peak, al Qaeda’s ranks of hardened operatives numbered in the low hundreds—and that was before the U.S. and its allies launched a global military campaign to dismantle the network. “We made some bad assumptions right after Sept. 11 that shaped how we approached the war on terror,” says Brian Fishman, a counterterrorism research fellow at the New America Foundation. “We thought al Qaeda would run over the Middle East—they were going to take over governments and control armies. In hindsight, it’s clear that was never going to be the case. Al Qaeda was not as good as we gave them credit for.”
Yet for a decade, the government’s approach to counterterrorism has been premised in part on the idea that not only would al Qaeda attack inside the U.S. again, but its next strike would be even bigger—possibly involving unconventional weapons or even a nuclear bomb. Washington has appropriated tens of billions trying to protect against every conceivable kind of attack, no matter the scale or likelihood. To cite one example, the U.S. spends $1 billion a year to defend against domestic attacks involving improvised-explosive devices, the makeshift bombs favored by insurgents in Afghanistan. “In hindsight, the idea that post-Sept. 11 terrorism was different from pre-9/11 terrorism was wrong,” says Brian A. Jackson, a senior physical scientist at RAND. “If you honestly believed the followup to 9/11 would be a nuclear weapon, then for intellectual consistency you had to say, ‘We’ve got to prevent everything.’ We pushed for perfection, and in counterterrorism, that runs up the tab pretty fast.”
Nowhere has that profligacy been more evident than in the area of homeland security. “Things done in haste are not done particularly well,” says Jackson. As Daveed Gartenstein-Ross writes in his new book, Bin Laden’s Legacy, the creation of a homeland security apparatus has been marked by waste, bureaucracy, and cost overruns. Gartenstein-Ross cites the Transportation Security Administration’s rush to hire 60,000 airport screeners after Sept. 11, which was originally budgeted at $104 million; in the end it cost the government $867 million. The homeland security budget has also proved to be a pork barrel bonanza: In perhaps the most egregious example, the Kentucky Charitable Gaming Dept. received $36,000 to prevent terrorists from raising money at bingo halls. “If you look at the past decade and what it’s cost us, I’d say the rate of return on investment has been poor,” Gartenstein-Ross says.
Of course, much of that analysis has the 20/20 vision of hindsight. It is interesting to note as well that, for all the campaign rhetoric from candidate Barack Obama that we needed to change our foreign policy approach, President Obama changed almost nothing, including our Afghanistan and Iraq involvements. It is also interesting to note that the supposed change away from President George W. Bush’s us-or-them foreign policy approach hasn’t changed the world’s view, particularly the Middle East’s view, of the U.S. Someone years from now will have to determine whether homeland security, military and intelligence improvements prevented Al Qaeda from carrying out another 9/11-style attack, or whether Al Qaeda simply wasn’t capable of more than one such attack on the U.S.
Hindsight makes one realize how much of the 9/11 attacks could have been prevented, or at least how their worst effects could have been lessened. The book 102 Minutes: The Untold Story of the Fight to Survive Inside the Twin Towers, written by two New York Times reporters, points out that eight years after the 1993 World Trade Center bombing, New York City firefighters and police officers still could not communicate with each other, which led to most of the police and fire deaths in the WTC collapses. Even worse, the book revealed that the buildings did not meet New York City fire codes when they were designed because they didn’t have to; they were under the jurisdiction of the Port Authority of New York and New Jersey. And more than one account shows that, had certain people at the FBI and elsewhere been listened to by their bosses, the 9/11 attacks wouldn’t have caught our intelligence community flat-footed. (It does not speak well of our government that no one appears to have paid any kind of political price for the 9/11 attacks.)
I think, as Bloomberg Businessweek argued, our approach to homeland security (a term I loathe) has overdone much and missed other threats. Our approach to airline security — which really looks like the old error of generals fighting the last war — has made air travel worse but not safer. (Unless you truly believe that 84-year-old women and babies are terrorist threats.) The incontrovertible fact is that every 9/11 hijacker fit into one gender, one ethnic group and a similar age range. Only two reasons exist not to profile airline travelers — political correctness and the assumption that anyone is capable of hijacking an airplane, killing the pilots and flying it into a skyscraper or important national building. Meanwhile, while the U.S. spends about $1 billion each year trying to prevent improvised explosive device (IED) attacks, what is this country doing about something that would be even more disruptive, yet potentially easier to pull off — an electromagnetic pulse (EMP) attack, which would fry every computer within the range of the device?
We have at least started to take steps, like drilling our own continent’s oil and developing every potential source of electric power, ecofriendly or not, to make us less dependent on Middle East oil. (The Middle East, by the way, supplies only one-fourth of our imported oil. We can become less dependent on Middle East oil; we cannot become less dependent on energy.) But the government’s response to 9/11 has followed, as B follows A, the approach our culture has taken to risk of any sort, as if covering ourselves in bubble wrap, or, even better, cowering in our homes, will make the bogeyman go away. Are we really safer because of the Patriot Act?
American politics was quite nasty in the 1990s. For a brief while after 9/11, we had previously impossible-to-imagine moments of national unity.
And then within the following year, the political beatings resumed. Bush’s statement, “I ask your continued participation and confidence in the American economy,” was deliberately misconstrued as Bush saying that Americans should go out and shop. Americans were exhorted to sacrifice for a war unlike any war we’ve ever faced by those who wouldn’t have to deal with the sacrifices of, for instance, gas prices far beyond $5 per gallon, or mandatory national service (a bad idea that rears its ugly head in times of anything approaching national crisis), or substantially higher taxes.
Then again, none of this should be a surprise. Other parts of the world hate Americans because we are more economically and politically free than most of the world. We have graduated from using those of different skin color from the majority as slaves, and we have progressed beyond assigning different societal rights to each gender. We tolerate different political views and religions. To the extent the 9/11 masterminds could be considered Muslims at all, they supported — and radical Muslims support — none of the values that are based on our certain inalienable rights. The war between our world, flawed though it is, and a world based on sharia law is a war we had better win.
And winning that war does not include withdrawal. Whether or not Donald Trump was right about leaving Afghanistan, Joe Biden screwed up the withdrawal so badly that everyone with a memory compared it to our withdrawal from South Vietnam. The obviously incomplete vetting of Afghan refugees, who are now at Fort McCoy, and our leaving billions of dollars of military equipment in Afghanistan guarantee that we will be back, and that more American soldiers, and perhaps non-soldiers, will die.
In one important sense, 9/11 changed us less than it revealed us. America can be both deeply flawed and a special place, because human beings are both deeply flawed and nonetheless special in God’s eyes. Jesus Christ is quoted in Luke 12:48 as saying that “to whomsoever much is given, of him shall be much required.” As much as Americans don’t want to be the policeman of the world, or the nation most responsible for protecting freedom worldwide, there it is.
-
Philip Klein:
Most people have at some point in their lives been asked to entertain a version of the cheesy question, “If you knew you had one day to live, what would you do?” It’s often posed as a playful game or essay topic or used by self-help gurus to prod people into trying to get a deeper sense of their priorities. But it’s time for everybody to start asking themselves a different question: If COVID-19 will be here forever, is this what you want the rest of your life to look like? In this case, it’s not an idle or theoretical exercise. It will be central to how we choose to live and function as a society for years or even decades to come.
Ever since the onset of COVID-19, we have more or less been living under an illusion. That illusion was that it would reach some sort of natural endpoint — a point at which the pandemic would be declared “over,” and we could all more or less go back to normal. The original promise of taking “15 days to slow the spread” or six weeks to “flatten the curve” has long since been reduced to a punchline.
In March of 2020, the outside estimates were that this coronavirus period would come to an end when safe and effective vaccines became widely available. Even the infamous Imperial College London report, viewed as draconian at the time for its estimate of up to 2.2 million deaths in the U.S. absent sustained intervention, predicted that its mitigation strategies “will need to be maintained until a vaccine becomes available.” Yet vaccines have been available for anybody who wants one for nearly six months, and our leaders have ignored the obvious off-ramp. The CDC backtracked on guidance and said that vaccinated people must wear masks in public, and many people and jurisdictions have listened. For example, Montgomery County, Md., has an extraordinarily high vaccination rate — with 96 percent of the eligible over-twelve population having received at least one dose and 87 percent of them being fully vaccinated. By its own metrics, the county has “low utilization” of hospital beds. Yet the county requires masks indoors — including in schools. In Oregon, vaccinated people are required to wear masks even outdoors. And it isn’t just liberal enclaves. A new Economist/YouGov poll found that eight in ten Americans report having worn a mask in the past week at least “some of the time” when outside their homes, with 58 percent masking “always” or “most of the time.” If masking has remained so widespread among adults months after vaccines became widely available, why will it end in schools after vaccines become available for children?
When operating under the assumption that there is a time limit on interventions, it’s much easier to accept various disruptions and inconveniences. While there have been ferocious debates over whether various mitigation strategies have ever been necessary, we should at least be able to agree that the debate changes the longer such restrictions are required. People making sacrifices for a few weeks, or even a year, under the argument that doing so saves lives is one thing. But if those sacrifices are indefinitely extended, it’s a much different debate.
There are many Americans who willingly locked themselves down and who still favor some restrictions. But what if this were to drag on for five years? Ten years? Twenty years? Do you want your children to be forced to wear masks throughout their childhoods? Do you want to bail on weddings if some guests may be unvaccinated? Skip future funerals? Ditch Thanksgiving when there’s a winter surge? Keep grandparents away from their grandkids whenever there’s a new variant spreading? Are you never going to see a movie in a theater again?
These are not wild scenarios. The Delta variant has led to surges throughout the world months after vaccines became widely available. Despite being a model of mass vaccination, Israel has been dealing with a significant Delta spike. To be clear, vaccines still appear to be quite effective at significantly reducing the risk of hospitalization and death. But if the virus continues to adapt and people need to get booster shots every six months or so, it seems there’s a good chance that the coronavirus will continue to spread for a very long time. So the question is how we, as individuals, and society as a whole, should adapt to this reality. Instead of thinking in terms of policies that may be tolerable for a very short period of time, it’s time to consider what would happen if such policies had to continue forever.
Whatever arguments were made to justify interventions early on in the pandemic, post-vaccine, we are in a much different universe. There is a negligible statistical difference in the likelihood of severe health consequences between vaccinated people who go about their business without taking extra precautions, and those who take additional precautions. Yet having to observe various protocols in perpetuity translates into a reduced quality of life. Put another way, the sort of question we need to start asking ourselves is not whether we can tolerate masking for one trip to the grocery store, but whether we want to live in a society in which we can never again go shopping without a mask.
People may ultimately come to different conclusions about the amount of restrictions they want to accept, regardless of the time frame. But at a minimum, we need to dispense with the framework that assumes the end of COVID-19 is just around the corner and instead recognize that it’s likely here to stay.
-
Ryan M. Yonk and Jessica Rood:
Dramatic headlines and images showing a deteriorating environment exist to demand swift, decisive, and large-scale action. We saw this approach in the 1960s when the first made-for-TV environmental crises showed oil-drenched seabirds on the California Coast and more recently in depressing videos depicting starving polar bears. Dramatic imagery has become the norm when discussing environmental issues.
We also see the same trends in editorial writing, discussions among political groups, changing business practices, and, increasingly, scholarly claims that rely on dramatic imagery. At face value, these trends could indicate that the public demands dramatic governmental action on environmental issues. Some scholars, however, see this as more than mere increased public demand for government intervention, and they highlight similarities between environmentalism and religious movements. For example, Laurence Siegel states:
In the decades since modern environmentalism began, the movement has taken on some of the characteristics of a religion: claims not backed by evidence, self-denying behavior to assert goodness, (and a) focus on the supposed end of times.
Scholars have tuned in to the general public’s zealous interest in the environment and, more importantly, its emphasis on government action, and they use both to push forward their own ideological goals under the guise of scholarship. Whereas the stated goal of such scholarship is to mitigate climate change and improve sustainability, the reality is instead corrupted by thinly veiled ideology masquerading as scholarship, which is sure to distort any useful policy recommendations.
This phenomenon is illustrated by a recent study making the rounds in Science Daily and The Climate News Network. The authors, Vogel et al., claim that the world must decrease energy use to 27 gigajoules (GJ) per person in order to keep the average global temperature increase to 1.5 degrees Celsius, the target included in the Paris Agreement. Our current reality illustrates the outlandish nature of this suggestion. We were a far cry from this goal both in 2012, the year chosen for the study, and in 2019, the most recent year for which data are available. …
Using these data, the authors pair what they view to be excessive energy use with a failure to meet basic human needs worldwide. In their own argument, they acknowledge that among the 108 countries studied, only 29 reach sufficient need satisfaction levels. In each case where need satisfaction is met, the country uses at least double the 27 GJ/cap of sustainable energy use, thereby creating a conundrum both for those concerned about the environment and human well-being.
The authors, however, provide a solution, arguing that their research shows a complete overhaul of “the current political-economic regime” would allow countries to meet needs at sustainable energy levels. Some of their recommendations include universal basic services, minimum and maximum income thresholds, and higher taxes on wealth and inheritance.
These policy recommendations are not supported by the research and directly contradict a body of literature that argues economic growth, not government redistribution, is our way forward. Vogel et al. argue against the necessity for economic growth and even go as far as to support degrowth policies on the grounds that their model finds no link between economic growth and maximizing human need satisfaction and minimizing energy use.
In short, their proposed solution would punish affluent countries and favor a collective misery in which any market-driven environmental improvements are crushed under the promise of equality and sustainable energy use.
Conversely, Laurence Siegel in Fewer, Richer, Greener: Prospects for Humanity in an Age of Abundance and the 2020 Environmental Performance Index (EPI) argue that economic prosperity allows countries to invest in new technologies and policies that improve not only environmental health but also the well-being of the people. Thus, if we want to continue to improve our relationship with the environment and human progress, we should be more supportive of economic growth and the entrepreneurship that drives it.
If the above relationship between economic prosperity, environmental health, and human well-being is the case, how can these authors claim the opposite? The most likely conclusion is that the authors allow an ideological bias to drive their research, a claim that is supported by their normative descriptions of affluent countries as examples of planned obsolescence, overproduction, and overconsumption as well as the authors’ obvious demonization of profit-making.
As Vogel et al. demonstrate, environmental issues can be exploited through the drama and religious fervor of the movement. Unfortunately, academics such as Vogel et al. have learned to use these tools to stretch their limited findings into a full-blown rallying cry for their own preferred policies; in this case, socialism on a global scale.
-
You can tell a Democrat is president, because we’re starting to see pieces blaming “us” for his mistakes. In The Atlantic a couple of weeks ago, Tom Nichols wrote that “Afghanistan Is Your Fault.” “American citizens,” Nichols suggested, “will separate into their usual camps and identify all of the obvious causes and culprits except for one: themselves.” Today, Max Boot makes the same argument in the Post. “Who’s to blame for the deaths of 13 service members in Kabul?” he asks. Answer: “We all are.”
This is of a piece with the tendency of journalists and historians to start muttering about how the presidency is “too big for one man” when the bad president in question is a Democrat. Under these terms, Republicans just aren’t up to the job, while Democrats are the victims of design or modernity or of the public being feckless. Last year, coronavirus was Trump’s fault. Now, it’s the fault of Republican governors and the unvaccinated (well, only some of the unvaccinated).
Still, this has happened pretty quickly with Joe Biden. Usually, it takes a couple of years before the press starts to sound like a bunch of hippies sitting around a fire saying, “you know, in a sense, you’re me and I’m you, and all of us are we — and so when the president makes a mistake, it’s really, like, the universe making a mistake, isn’t it? And, y’know, we’re in the universe, so we are the presidency. That’s democracy, man.”
-
Honor has always had an enormous influence on human affairs and the conduct of governments — until, evidently, the advent of President Joe Biden in the year 2021.
There’s no perspective from which his exit from Afghanistan looks good. But abstracting it from any considerations of honor at least takes some of the sting out of a deeply humiliating episode that would have been considered intolerable throughout most of our nation’s history.
It is dishonorable — even if you believe we had to get out — to throw away what we had sacrificed for in Afghanistan in this grotesquely reckless manner.
It is dishonorable to criticize our erstwhile Afghan friends, especially after we pulled the rug out from under them, and kowtow to our current Afghan enemies.
It is dishonorable to do things we told people repeatedly that we wouldn’t.
It is dishonorable to abandon Afghan allies who put it all on the line for us and believed that, if the worst came, we would get them out.
It is especially dishonorable, unfathomably so, to leave Americans behind enemy lines, a potentiality that the administration has been trying to prepare the American public for in recent days (and hopefully somehow won’t come to pass).
A counterexample that reflects a more traditional American approach is President Teddy Roosevelt’s famous handling of the Perdicaris Affair in 1904, which involved the massive deployment of naval firepower over the kidnapping of one American in a faraway land of which we knew nothing.
Roosevelt’s reflexive bellicosity can seem atavistic at a time when national honor has lost a lot of its purchase.
James Bowman, who wrote a book years ago called Honor: A History, argued that the declining influence of honor in our time is a function of the enormous destructiveness of modern warfare and the feminist and psychotherapeutic reactions to it.
But it hasn’t disappeared, and never will. “Honor is the name of one category of concerns and motives that has dominated relations among peoples and states since antiquity,” the great historian and classicist Donald Kagan once noted. “Although concepts of what is honorable and dishonorable can vary over time and place, sometimes superficially and sometimes deeply, and although other people’s ideas of honor, especially those of an earlier time, can seem silly or outmoded, such surface variations often conceal a fundamental similarity or even identity.”
As for TR, his response to the Perdicaris kidnapping combined a sense of outraged honor at the mistreatment of one American with a prudent view of what military force really could achieve. It added up to a successful foray in coercive diplomacy.
Both Jerry Hendrix in his book Theodore Roosevelt’s Naval Diplomacy and Edmund Morris in his biography of TR have good accounts of the episode. Ion Perdicaris was a 64-year-old expat who lived in Tangier. He was a prominent figure in the English-speaking community in the Moroccan town.
The sultan of Morocco had limited control over the country, with bandits running loose in outlying areas, among them the charismatic Moulay Ahmed el Raisuli.
-
Tom Schatz of Citizens Against Government Waste (a redundant term since government wastes money as often as you breathe):
Since the beginning of the Reagan administration, the federal-budget submission to Congress has included a list of proposed budget cuts, terminations, consolidations, and savings, along with a management agenda. Every president since then thought it was important to provide this information, until President Joe Biden decided that there is not a single penny of the taxpayers’ money being misspent throughout the entire federal government.
The Biden-Harris administration’s $6 trillion budget for fiscal year (FY) 2022 is 37 percent greater than the $4.4 trillion spent in FY 2019, which was the last budget before the pandemic. Even with a $4 trillion tax increase, there is still a $1.84 trillion deficit, which is 86 percent greater than the $984 billion deficit in FY 2019. Deficits will exceed $1 trillion for each of the next ten years, pushing the national debt to more than $39 trillion from the current $28.4 trillion.
When he served with President Barack Obama, Mr. Biden was a key part of the administration’s efforts to promote its budget on Capitol Hill. He was involved in the negotiations over the Budget Control Act of 2011, which set spending caps and helped somewhat to restrain the growth of spending until it expired. President Obama tasked the then–vice president with leading his Campaign to Cut Waste, saying, “I know Joe’s the right man to lead it because nobody messes with Joe.”
Mr. Biden called himself “Sheriff Joe” for his work with the Recovery Board, which tracked expenditures under the $831 billion stimulus bill, along with the Government Accountability and Transparency Board, which was established to identify ways agencies could eliminate waste and improve performance. Mr. Biden said the transparency board would be looking at every dollar of government spending.
According to an April 19, 2019, Government Executive article about how then-candidate Joe Biden would approach managing the government as president, he had said the success of the Campaign to Cut Waste “would be measured by results, not rhetoric.” He said it would “restore trust in government” and do “more than just eliminating waste and fraud . . . by instilling a new culture of efficiency in each of our agencies, greater responsibility, responsiveness and accountability.”
In addition to the campaign and the establishment of the Accountability and Transparency Board, all eight Obama-Biden budget submissions included a volume of “terminations, reductions, and savings,” or “cuts, consolidations, and savings.” The administration said in several of these submissions that it had been going through the budget line-by-line to identify ineffective, duplicative, and overlapping programs. They would be recommended for reduction or termination so that the taxpayers’ money would be used for programs that work as intended.
According to the last Obama-Biden budget for FY 2017, that process “identified, on average, more than 140 cuts, consolidations, and savings averaging more than $22 billion each year.” The proposals were included in a separate volume with the budget submission to Congress. The Trump administration’s four budgets included an average of $50 billion in program eliminations and reductions, making it 40 consecutive years of presidential administrations providing such proposals to Congress. But the Biden-Harris budget for FY 2022 has broken that streak by providing no list of program consolidations or terminations, either separately or as part of the total budget submission.
This complete lack of interest in cutting spending began early in the administration, when President Biden sent a letter to Congress on January 31, 2021, withdrawing President Trump’s 73 proposed rescissions that would have saved taxpayers $27.4 billion, including several that were included in the Obama-Biden budgets for terminations, reductions, and savings, such as the Commission on Fine Arts, the East–West Center, the McGovern-Dole Food for Education program, and Presidio Trust. During his April 28, 2021, address to Congress, President Biden did not say a single word about wasteful spending, nor has he made any other comments or issued any executive orders requiring federal agencies to identify or reduce inefficiency. Vice President Harris likewise has said nothing about these issues.
On April 13, 2011, President Obama proposed a “Framework for Shared Prosperity and Shared Fiscal Responsibility,” which proposed $4 trillion in deficit reduction over twelve years. A little more than ten years later, on May 28, 2021, President Biden proposed a $14.5 trillion cumulative increase in deficits over ten years.
After aiming at government waste during the Obama-Biden administration, Sheriff Joe has clearly hung up his badge.
Notice that Obama suddenly got interested in government waste only after the 2010 midterm “shellacking.” Biden won’t get interested unless he’s made to be interested. In 2022?
-
Dr. Richard Menger of the University of South Alabama:
I practice medicine in an emerging Covid-19 hot spot in a state with one of the lowest vaccination rates. Last year I saw Covid at its worst when I deployed to New York to take care of patients in an overflow intensive-care unit. I am vaccinated. I wouldn’t say I “believe in science,” because science doesn’t work that way, but I trust the scientific process. Yet when it comes to trust and persuasion, the medical profession isn’t always winning the Covid-19 battle, and it’s worth understanding why.
The current attempts at persuading people to get the vaccine follow the typical trinity of persuasion put forth by Aristotle: logos, pathos, and ethos. Social media and government platforms focus on data (logos), such as the stark disparities in serious illness between the vaccinated and the unvaccinated. Then come the emotional pleas (pathos), with personal stories of lost patients or loved ones. Finally, advocates invoke a moral duty to get the vaccine (ethos).
But when the desired response doesn’t materialize—when a substantial portion of the country still refuses a shot—the calls devolve into histrionic and condescending pleas. Many people respond by digging in their heels. Plenty of research shows that once people make a decision and attach a strong moral identity to it, they ignore contrary data.
The medical community needs to confront the ugly reality of distrust, especially in my state. The Tuskegee Syphilis Study is a living memory. Between 1932 and 1972, government researchers actively withheld treatment for syphilis while promising free medical care, meals and burial insurance. Some reluctance in blindly trusting a new vaccine is understandable.
It is also cause for pause that the government appears willing to coerce Americans to take a vaccine that doesn’t have full approval from the Food and Drug Administration. President Biden has considered a $100 payment for vaccination. Such fiscal rewards will likely have the biggest sway on the vulnerable populations. But the government can use sticks as well as carrots. Is it morally acceptable to tax the unvaccinated $100? How would you feel if the government applied this tactic to something you strongly disagreed with?
The best way to change minds is to talk to people and treat them with respect and dignity. I understand a lot of my healthcare colleagues are frustrated and tired, but a sensationalist, sanctimonious narrative driven by social media doesn’t help anyone. This is part of our job: persuading people to take medicine they don’t want to take.
Healthcare professionals have a challenging obligation to work to understand where people are coming from, build a relationship, address their fears to help them understand, gently correct information that is wrong, admit when medicine was wrong and medical authorities misled people, motivate them based on their needs, and develop networks of support in the community.
Using this approach and more, Jacqueline Brooks, superintendent of the Macon County, Ala., School District, helped lead the charge that resulted in universal vaccination among the district’s custodians, bus drivers, and lunchroom workers. Macon County includes the city of Tuskegee.
Ms. Brooks engaged in personal conversations, reduced barriers to appointments, formed a partnership with a local medical center, made sure people were comfortable with the decision, and praised them for making a “sacrifice” and “taking on risk” for the community. Most important, when an initial cohort was in a “wait-and-see” mode, she acknowledged the risk, didn’t pressure them, and offered reassurance and data as more people they knew became vaccinated. The results speak for themselves.
-
Another government failure, another outrage. This time the scandal is brought on by the less-than-orderly withdrawal of U.S. troops from Afghanistan and the realization that 20 years of military presence in the country achieved nothing but death and chaos. Observing another instance of large-scale mismanagement, I can’t help being surprised that anyone is still surprised.
One needn’t be a foreign policy expert to recognize that something in Afghanistan went terribly wrong. While many will blame the Biden administration for a fiasco that will have horrifying humanitarian consequences for the Afghan people, the failure also belongs to those who made the decision to go and remain there for two decades. These American officials argued that a continuing U.S. military presence there was important for achieving several goals, like training the Afghan army to resist the Taliban. Yet, today, the almost-immediate collapse of the U.S.-backed Afghan government makes it clear that whatever our strategy was, it failed.
Unfortunately, it’s unlikely that those who believed in nation building in the first place will realize from this dreadful episode that it never works as well as planned, even though the tragic scenario now unfolding before our eyes isn’t the first U.S. government foreign policy disaster. And it won’t be the last. People never seem to learn. Making matters worse is the fact that this sad state of affairs isn’t limited to foreign policy. It exists everywhere and throughout all levels of federal, state, and local government.
During the pandemic, for instance, I was baffled to see Congress put the Small Business Administration (SBA) in charge of dispensing unprecedented disaster relief. This agency has a disastrous record of extending the suffering of small business owners after disasters like Hurricane Katrina. Having written extensively on that issue, I knew that this time around would be no different. It wasn’t.
A few weeks ago, the Wall Street Journal published an investigative piece about the performance of the SBA’s COVID-19 disaster loan program. The conclusion is that it was terrible. The report is packed with examples of the ordeal that small business owners went through, many of which were carbon copies of stories that I reported on during other disasters. One of the recipients of the loans described dealing with the SBA: The agency “puts you in a state of confusion and doesn’t allow you to focus on what you should be doing, and that is to continue to rebuild your business after a pandemic.”
The article also includes an admission by a former SBA regional administrator who helped with the agency’s response. “On the disaster side,” he admitted, “we did a terrible job.” However, it doesn’t quite matter, as there will be no consequences for the agency. And the next time around, whether for a pandemic or a hurricane, the SBA will be called to the front lines yet again. And when it fails again, people will be outraged and wonder how this could have happened again.
I’m picking on the SBA, but the same criticism applies to other agencies and many other government efforts. Remember the flaw-filled rollout of the Obama-era Healthcare.gov? Remember the invasion of Iraq and discovering that there weren’t weapons of mass destruction there after all? Remember former President Donald Trump’s trade war, which was supposed to bring jobs back to the United States?
Books will be written for years to come about the utter failure of the Centers for Disease Control and Prevention (CDC) to perform the most basic of responsibilities—preparing for a COVID-19-like pandemic. Authors will comment on all the CDC’s well-documented fiascoes, like the unwillingness to use existing COVID-19 tests and the failure to recommend that schools be opened like they are, successfully, in many other countries. Similar books will be written about the Food and Drug Administration’s poor handling of the crisis. The best of these books will even examine the agencies’ pasts and note that the recent mismanagements are just more of the same.
Unfortunately, the incentives within government are such that this pattern won’t change. After all, the same institution that’s unable to run the Postal Service or Amtrak without being in the red orchestrated the withdrawal from Afghanistan and our previous stay there for 20 years. The only thing that will make a difference is if we, the American people, start demanding accountability and reform. That may include the termination of a few—or perhaps many—agencies and programs.
