If you have a life, you probably missed the Republican presidential candidate debate Monday night.
Which means you missed the flap over Texas’ brief mandate of vaccination against the human papillomavirus, instituted by executive order of Gov. Rick Perry. The mandate was brief because Texas’ legislature blocked it.
I think the charge of crony capitalism against Perry is valid generally and looks on target in this case in particular. The issue isn’t just that he got $5,000 from Merck. It’s that his former chief of staff was a lobbyist for Merck. I think Perry’s partial apology is heartfelt. He did it the wrong way and has said so. …
Meanwhile, I think Michele Bachmann’s attacks on Perry are irresponsible and borderline demagogic. References to the “government needle” being “pushed into innocent girls” sound paranoid and exploitative to me. And fueling anti-vaccine fears to score political points against Perry is beneath her. I think Fox or some other news outlet should investigate Bachmann’s claim last night on Greta Van Susteren’s show. Bachmann said that a member of the audience came up to her and told her, with tears in her eyes, that Gardasil caused “mental retardation” in her daughter. I’m not doubting that someone told Bachmann that, but it’s a pretty serious — and unusual — claim. Regardless, the suggestion that Rick Perry is in any way responsible for it is ludicrous.
About Gardasil, the HPV vaccine, Henry I. Miller, M.D., of the Hoover Institution, points out:
Bachmann alluded to Perry’s executive order mandating the exposure of young girls to a “dangerous” vaccine and tried to distinguish Gardasil from other required pediatric vaccines that prevent infectious diseases. Note to Bachmann: The vaccine, Merck’s Gardasil, prevents infection with the most common strains of human papilloma virus. Once established, these viruses can ultimately cause genital warts as well as cervical, anal, vulvar, and vaginal cancers. Thus, by preventing the infection, the vaccine prevents all those sequelae.
In the extensive clinical studies (on more than 20,000 girls and women) that were performed prior to the FDA’s licensing of the vaccine, the vaccine was 100 per cent effective, a virtually unprecedented result. How safe is the vaccine? No serious side effects were detected; the most common side effect is soreness, redness and swelling in the arm at the site of the injection.
In summary, Gardasil has one of the most favorable risk-benefit ratios of any pharmaceutical. …
I am discouraged by politicians who not only don’t know much about science, technology, or medicine (which is perhaps understandable) but also don’t know what they don’t know (which is unacceptable).
“Mental retardation” typically results from a pre- or neonatal event. Autism becomes apparent in the first couple of years of life — and primarily affects boys. Gardasil vaccinations are given to girls between 9 and 12 years of age. Even if this anecdote were true, the condition wouldn’t be either “mental retardation” or autism, but brain damage.
The FDA has received no reports of brain damage as a result of the HPV vaccines Gardasil and Cervarix. Among the reports that correlate serious adverse reactions with either vaccine, the FDA lists blood clots, Guillain-Barre Syndrome, and 68 deaths during the entire run of the drugs. The FDA found no causal connection to any of these serious adverse events, found plenty of contributing factors in all of them — and all of the events are exceedingly rare.
The “mental retardation” argument is a rehash of the thoroughly discredited notion that vaccines containing thimerosal caused a rapid increase in diagnosed autism cases. That started with a badly botched report in The Lancet that allowed one researcher to manipulate a ridiculously small sample of twelve cases in order to reach sweeping conclusions about thimerosal. That preservative hasn’t been included in vaccines for years, at least not in the US, and the rate of autism diagnoses remains unchanged.
Someone on Twitter pointed out that there is a long list of vaccines that are required for children before they enter school. And vaccines work less because you get one than because most of the population gets one. (Note that no one gets polio or smallpox anymore.) Moreover, teaching abstinence is important, but children have a habit of not following their parents’ advice. Can you guarantee that your child will fall in love with someone as abstinent as your child?
The better debate point, particularly for those skeptical of Perry’s conservatism, is Perry’s attempt to mandate the vaccine in the first place. Perry pointed out that the mandate had an opt-out provision; the mandate apparently served to require health insurers to cover the vaccine, since under an opt-in approach insurers reportedly would not have had to cover it, saddling parents with the $300 cost. But again, the opt-out option did exist, and ignorance of the regulation is not much of an excuse.
To the extent voters are even paying attention, though, this kerfuffle may make nonaligned voters assume that the GOP is anti-science and anti-medicine. Independent voters are more likely to be impressed by medical experts’ positions on the HPV vaccine, as listed by the Los Angeles Times, than by the presidential candidates’ positions:
Several health and medical organizations have released position statements or recommendations for the HPV vaccine. The American Academy of Pediatrics’ 2011 policy statement on childhood and adolescent immunization schedules recommends both the quadrivalent vaccine (otherwise known as Gardasil, which acts on four types of HPV) and the bivalent vaccine (also known as Cervarix, which protects against two HPV types) to guard against cervical cancer and genital warts. The AAP adds that the quadrivalent vaccine may be given to boys age 9 to 18 to help prevent genital warts.
Both Gardasil and Cervarix are also recommended for girls by the CDC in its 2011 interim Vaccine Information Statement. It stresses that the vaccine is important because “it can prevent most cases of cervical cancer in females” if given before exposure to the virus.
The American Cancer Society recommends the HPV vaccine for girls 11 to 12 years old, adding that girls as young as age 9 can get the vaccination as well. It holds off on suggesting women age 19 to 26 get the vaccine, citing a lack of evidence on its effectiveness in that group.
Opponents of mandatory vaccination include social conservatives who believe the vaccine will increase promiscuity, though we suspect watching MTV is a greater spur to teen sex. Opposition to state involvement in treatment decisions has more force: HPV is not casually communicable like polio or measles. Yet the executive order included a clause that allowed families to opt out for “reasons of conscience” or “to protect the right of parents to be the final authority on their children’s health care.” At a certain point, the distinction between “opt in” and “opt out” becomes academic when the violation of liberty is filling out minor paperwork.
The larger opportunity here is to eradicate a potentially terminal disease that has huge economic, social and other costs. Such progress is especially welcome when other government trends—the FDA’s cancer drug approvals, the eventual treatment restrictions inherent in national health care—are running in the opposite direction. …
Mrs. Bachmann’s vaccine demagoguery is another matter. After the debate the Minnesotan has been making the talk show rounds implying that HPV vaccines cause “mental retardation” on the basis of no evidence. This is the kind of know-nothingism that undermines public support for vaccination altogether and leads to such public health milestones as California reporting in 2010 the highest number of whooping cough cases in 55 years.
The GOP critique of government in the age of Obama would be more credible if the party’s candidates did not equate trying to save lives with tyranny.
It’s one thing for the voters of one Congressional district to send Bachmann to Congress. (After all, Milwaukee voters have been sending Gwen Moore to Congress and Lena Taylor to the state Senate.) It’s quite another for someone who apparently believes what she’s told without skepticism or even attempting to fact-check (vaccines cause mental retardation?) to get your vote for president. Bachmann wasn’t going to get the GOP nomination anyway, but one issue has demonstrated she doesn’t deserve it.
And, of course, there is this irrefutable point from Twitter:
I see Gardasil is still being debated. I’m sure the millions of unemployed are really worried about that.
Today in 1956, Elvis Presley had his first number one song:
Today in 1965, Ford Motor Co. began offering eight-track tape players in their cars. Since eight-track tape players for home audio weren’t available yet, car owners had to buy eight-track tapes at auto parts stores.
Today in 1970, Vice President Spiro Agnew said in a speech that the youth of America were being “brainwashed into a drug culture” by rock music, movies, books and underground newspapers.
I would say something about how all that ended, but that’s way too easy.
The number one single today in 1973:
The short list of birthdays begins with Ola Brunkert, who played drums for ABBA:
First (and items are not necessarily in order of the headline), ponder the irony of this:
Wisconsin’s largest teachers union has a problem.
A union problem.
This week, National Support Organization, which bills itself as the world’s largest union of union staffers, posted an online notice discouraging its members from seeking work with the Wisconsin Education Association Council.
“Don’t apply for WEAC vacancies!” screams the headline.
The reason for the boycott?
Chuck Agerstrand, president of the National Support Organization, is accusing WEAC officials of “breaching staff contracts and destroying any working relationship with its employees.”
“WEAC management is taking a page out of Gov. (Scott) Walker’s playbook and making up new employment rules not in the (United Staff Union) contract,” Agerstrand said on the labor group’s website. “They should be looking to the 42 employees they laid off to fill vacancies before they go outside the state.” …
Agerstrand said in his online post that WEAC management recently came up with a new rule for employees who have been laid off. According to Agerstrand, the rule says “an employee must have successfully passed a year’s probation in the job he/she wants to bump into or the employee has no recall rights.” He said this rule is contrary to WEAC’s contract with the United Staff Union, which is challenging the provision.
WEAC apparently needs layoff rules because it is laying off employees (instead of adjusting the pay of its overcompensated management), since teachers, who now have the choice of whether to pay WEAC dues, are choosing not to.
This is not the first example of unions or union supporters saying one thing and doing the contrary. President Obama’s comments about collective bargaining in his Thursday stimulus speech might make you think he favors collective bargaining for federal employees. Don’t bet that proposal will get to Congress, because, among other things, every Democratic president since Wisconsin public employees gained collective bargaining rights in the late 1950s has declined to propose the same for federal workers.
The most vile thing you will read about 9/11 that does not have a Middle Eastern source comes from New York Times columnist Paul Krugman:
What happened after 9/11–and I think even people on the right know this, whether they admit it or not–was deeply shameful. The atrocity should have been a unifying event, but instead it became a wedge issue. Fake heroes like Bernie Kerik, Rudy Giuliani, and, yes, George W. Bush raced to cash in on the horror. And then the attack was used to justify an unrelated war the neocons wanted to fight, for all the wrong reasons.
The Wall Street Journal’s James Taranto demolishes Krugman’s entire argument:
Krugman goes on to observe that beside Bush, Giuliani and Kerik, “a lot of other people behaved badly. How many of our professional pundits–people who should have understood very well what was happening–took the easy way out, turning a blind eye to the corruption and lending their support to the hijacking of the atrocity?”
He has half a point here. We remember one professional pundit who behaved quite badly, writing on Sept. 14, 2001: “It seems almost in bad taste to talk about dollars and cents after an act of mass murder,” he observed, then went ahead and did so: “If people rush out to buy bottled water and canned goods, that will actually boost the economy. . . . The driving force behind the economic slowdown has been a plunge in business investment. Now, all of a sudden, we need some new office buildings.”
That was former Enron adviser Paul Krugman, who added that “the attack opens the door to some sensible recession-fighting measures,” by which he meant “the classic Keynesian response to economic slowdown, a temporary burst of public spending. . . . Now it seems that we will indeed get a quick burst of public spending, however tragic the reasons.” He went on to denounce the “disgraceful opportunism” of those who “would try to exploit the horror to push their usual partisan agendas”–i.e., conservatives who he said were doing exactly what he was doing.
What Krugman wrote (which was only online, not in the printed Times) is bad enough, although in a free society he has the right to hold scumbag opinions. But try to find this on the New York Times website. You can’t, because after announcing “I’m not going to allow comments on this post, for obvious reasons,” the Times erased the post. The Times should erase Krugman’s employment too.
Fortunately, better things are happening. For instance, Forbes covers the on- and off-field juggernaut that is the Green Bay Packers:
With some 112,000 shareholders, the Packers are the only publicly owned team in America. Add to that Green Bay’s distinction as the country’s smallest major league sports market and they seem a nostalgic aberration amid megamoneyed rivals like the Dallas Cowboys and New England Patriots. The longstanding line among football aficionados pegs the Packers as a charming welfare case that exists thanks only to the sufferance of other, richer NFL franchises. They allow the team to stay put in tiny Green Bay as an emblem of the league’s working-class roots.
The problem with that story: It isn’t true.
In reality the Green Bay Packers are an emerging financial power in the NFL. Despite their minuscule market, revenues for fiscal 2010 hit an all-time high of $259 million, 11th out of 32 teams and well above major-market franchises like the San Francisco 49ers ($234 million) and the Atlanta Falcons ($233 million). The Packers are regularly one of the 15 teams that pay into the league’s reserve fund rather than draw from it (so much for welfare). Their Super Bowl win, coming enhancements at the stadium and the league’s new collective bargaining agreement with players will make them stronger still.
“They’re an anomaly,” says Andrew Brandt, president of the website National Football Post. “They’re clearly the smallest-market team in all of professional sports, yet they’re a high-revenue team with no debt. There are a lot of big-market teams that wish they had that kind of financial situation.”
I’ve followed the Packers long enough to remember when Lambeau Field was one of the smallest stadiums in the NFL. By the time construction on the south end zone expansion is completed, it will be the fourth largest stadium in the NFL — smaller only than MetLife Stadium in the New York area, Cowboys Stadium, and FedEx Field — in the smallest market in major professional sports.
Major professional sports have demonstrated that teams in the smallest markets do well only with superior management. The Bears and Vikings have access to the same league-wide resources the Packers have, and the Bears are in a much larger market. Yet both teams (the Vikings in the final year of the lease on their stadium) significantly underperform.
Today in 1968, ABC-TV premiered “The Archies,” with music masterminded by Don Kirshner, the man behind the Monkees’ hits:
The number one single today in 1974 is a confession and correction:
Stevie Wonder had the number one album today in 1974, “Fulfillingness’ First Finale,” which wasn’t a finale at all:
Today in 1979, the film “Quadrophenia,” based on The Who’s rock opera, premiered:
Paul Young hadn’t had a very long career when he released “From Time to Time — The Singles Collection,” and yet he still had the number one British album today in 1991:
Today in 1994, Steve Earle was sentenced to a year in jail not for shooting the sheriff, but for selling crack cocaine:
Birthdays start with Pete Agnew of Nazareth:
Steve Gaines of Lynyrd Skynyrd:
Paul Kossoff of Free:
Barry Cowsill of the Cowsills:
Steve Berlin of Los Lobos:
Morten Harket of A-Ha:
Amy Winehouse, whose biggest hit turned out to be prophetic:
I didn’t get around to commenting on President Obama’s latest attempt to get the economy going until now because (1) more important things were going on, (2) it seemed like an exercise in futility, and, by the way, (3) the bill that Obama wants passed didn’t exist.
Waiting a few days meant that at least there now is an actual bill. The bill did not improve from concept to introduction, but it did get more expensive, from the $300 billion Obama claimed Thursday to the $447 billion announced Monday.
National Review’s Jim Geraghty asked a few inconvenient questions Friday:
I didn’t think this was the worst speech Obama gave. It’s not even that all of the ideas in it are all that terrible. It’s just that they’re reheated leftovers, reruns, small-ball initiatives that are likely to be as effective as every other stimulus program that repaves sidewalks or funds research on exotic ants. We’re a $14 trillion economy that makes everything from timber to jumbo jets to firearms to smart-phone apps to Hollywood movies to every food product under the sun. The notion that some grab bag of tax credits and federal grants is going to kick-start a hiring binge to put 14 million Americans back to work or that the economy is one tax credit for hiring veterans away from recovery is laughable.
The recession we’ve endured for the past three years is far from normal, and yet we keep getting the normal Keynesian responses. I realize I’m about to offer blasphemies and shockers on par with Rick Perry’s Ponzi-scheme comparison, but what if Obama was wrong last night, and a big issue is that some of the people of this country do not, in fact, work hard to meet their responsibilities? What if decades of a lousy education system have left us with a workforce that has too many members with no really useful skills for a globalized economy? What if way too many college students majored in liberal arts and are entering the workforce looking for jobs that will never exist? What if the massive housing bubble got Americans to condition themselves to work in an economy that’s never coming back? (How many realtors are unemployed right now?) What if we have good workers who can’t move to take new jobs because they’re underwater on their mortgages and can’t sell their house?
… How many Americans can argue that we indisputably provide the best value as an employee compared to any other group of workers in the world? Are we still the smartest? Are we still the hardest-working? Are we still the most innovative?
Instead, [Thursday] night we were assured that “tax breaks for millionaires and billionaires” were preventing us from “put[ting] teachers back to work so our kids can graduate ready for college and good jobs.” Sigh. As Michael Barone scored it, “Straw men took a terrible beating from the president.”
A lot of people liked this succinct Yuval Levin assessment: “Spend $450 billion dollars now, it will create jobs, and I’ll tell you how I’m going to pay for it a week from Monday. If you disagree, you want to expose kids to mercury. That about sums up the Obama years.”
The difference between the American Recovery and Reinvestment Act of early 2009 and the American Jobs Act is about $340 billion. Criticism of Stimulus I was mostly in two camps: (1) it wasn’t going to work, or (2) it wasn’t large enough. If the second camp was correct, then a smaller Stimulus II isn’t going to work either.
The first camp would include those who noticed in retrospect that the term “shovel-ready” applied to almost nothing besides the U.S. 41 project between Neenah and Oshkosh. That group also would point out the number of units of government that, two years after ARRA became law, had to make substantial budget cuts (such as this state) because those one-time payments were indeed one-time payments.
Moreover, the minimal tax cuts designed to compel employers to hire aren’t likely to work. Employers hire employees because the employer has business that needs to be done. With the economy still sluggish at best, employers aren’t going to hire more people than they need to conduct today’s business, regardless of tax cuts.
Commentary’s Jim Pethokoukis has another inconvenient observation:
The American Recovery and Reinvestment Act was Barack Obama’s signature achievement in dealing with the most worrisome set of economic conditions since the Great Depression. It was how Obama, to use a pair of his now seemingly abandoned metaphors, sought to drag the economy out of the ditch while the Republicans were standing around sipping Slurpees.
As Obama said on the first anniversary of signing the bill, “It is largely thanks to the Recovery Act that a second Depression is no longer a possibility.” Economic analysis from the White House credits the Recovery Act with having saved or created between 2.4 million and 3.6 million jobs by the end of March, 2011. …
But Republicans have a competing argument. Instead of saving us from a Greater Depression, the Obama stimulus (together with his health-care plan and financial reforms) was a two-year waste of precious time and money that may actually have impeded economic growth. The evidence for their proposition comes in part from the White House itself; its own economists predicted the stimulus would prevent the unemployment rate from hitting 8 percent. But the rate actually rose as high as 10.1 percent, has settled in above 9 percent now, and even Obama’s own team currently hopes for a rate of, at best, 8.25 percent by the end of 2012—if nothing else goes wrong.
To be sure, the economic disaster that led to the longest recession the United States has ever suffered was something Obama inherited, but there is no question everyone (on all sides of the aisle) believed that natural cyclical forces would have led to recovery long before now. Natural cyclical forces were not given a chance to work themselves out. Far from it. In addition, Republicans can argue that regulatory uncertainty and fear over the rising national debt—debt that Obama’s Recovery Act helped intensify—have chilled American business. …
Economist Brian Wesbury of First Trust Portfolios thinks the huge increase in government spending under the Obama and Bush administrations has hurt the economy. Cutting it back would boost growth. His economic model suggests that without the large increase in government spending that occurred in recent years, “real GDP would be 3.2 percent larger today than it is, the unemployment rate would be 7.6 percent, the U.S. would have 2.5 million more jobs, and the stock market would be 24 percent higher.” …
Did Obama make it worse? It is certainly the case that he only deepened a long-term trend that threatens American prosperity more than any other. The events of 2008–2009 exposed a truth about the U.S. economy from which we had shielded ourselves: economic growth has been slowing in a worrisome way throughout the decade. The nation’s GDP has averaged 3.3 percent annual growth for the past half century. But from 2001 to 2007—before the recession hit—it averaged only 2.6 percent. Going forward, growth might be even slower due to the aftermath of the financial crisis and the aging of the population. The Congressional Budget Office, for instance, pegs long-term growth at just 2 percent or so.
But that downshift isn’t fated. The McKinsey Global Institute thinks a higher retirement age and smarter immigration policy could make the labor force grow more quickly, while smarter tax and regulatory policy could boost worker productivity. Replacing the income tax with a consumption tax, for instance, would likely make the economy grow faster over the long run by increasing investment.
The other absurdity was Obama’s claim that Stimulus II was fully paid for. It was “paid for” only in the sense that the deficit supercommittee, tasked with finding $1.5 trillion in cuts, was going to have to find nearly $2 trillion in cuts, since Obama just added the Stimulus II cost to the supercommittee’s task. I of course was and am skeptical that the supercommittee will find that $1.5 trillion, and if that’s the case it certainly won’t be able, politically speaking, to find another $447 billion.
But Monday the Obama administration had an answer: Increase taxes (by eliminating tax breaks) on single people making more than $200,000, families making more than $250,000, oil companies and hedge funds. Obama’s speech Thursday suggested it’s crazy to increase taxes during a recession, and yet the American Jobs Act increases taxes during a recession, apparently to assuage the class warrior now residing at 1600 Pennsylvania Ave.
Today in Great Britain in the first half of the 1960s was a day for oddities.
Today in 1960, a campaign began to ban the Ray Peterson song “Tell Laura I Love Her” (mentioned here Friday) on the grounds that it was likely to inspire a “glorious death cult” among teens. (The song was about a love-smitten boy who decides to enter a car race to earn money to buy a wedding ring for his girlfriend. To sum up, that was his first and last race.)
The anti-“Tell Laura” campaign apparently was not based on improving traffic safety. We conclude this from the fact that three years later, Graham Nash of the Hollies leaned against a van door at 40 mph after a performance in Scotland to determine if the door was locked. Nash determined it wasn’t locked on the way to the pavement.
One year later, a concert promoter hired two dozen rugby players to form a human chain around the stage at a Rolling Stones concert at the Empire Theatre in Liverpool. Rugby players are tough, but not tough enough to take on 5,000 spectators.
The number one album today in 1980 was Jackson Browne’s “Hold Out,” Browne’s only number one album:
Birthdays begin with a pair of horn rock legends — David Clayton-Thomas of Blood, Sweat & Tears …
… and Peter Cetera of Chicago:
Producer Don Was, who formed Was (Not Was) …
… was born the same day as Randy Jones of the Village People:
Since 9/11, it’s been said that Americans can be divided into two groups depending on whether they’ve learned any lessons from 9/11 — 9/10 Americans or 9/12 Americans.
Michelle Malkin claims the wrong lessons are being taught, and the right lessons haven’t been learned:
“Know your enemy, name your enemy” is a 9/11 message that has gone unheeded. Our immigration and homeland security policies refuse to profile jihadi adherents at foreign consular offices and at our borders. Our military leaders refuse to expunge them from uniformed ranks until it’s too late (see: Fort Hood massacre). The j-word is discouraged in Obama intelligence circles, and the term “Islamic extremism” was removed from the U.S. national security strategy document last year.
Similarly, too many teachers refuse to show and tell who the perpetrators of 9/11 were and who their heirs are today. My own daughter was one year old when the Twin Towers collapsed, the Pentagon went up in flames and Shanksville, Pa., became hallowed ground for the brave passengers of United Flight 93. In second grade, her teachers read touchy-feely stories about peace and diversity to honor the 9/11 dead. They whitewashed Osama bin Laden, militant Islam and centuries-old jihad out of the curriculum. Apparently, the youngsters weren’t ready to learn even the most basic information about the evil masterminds of Islamic terrorism. …
A decade after the 9/11 attacks, Blame America-ism still permeates classrooms and the culture. A special 9/11 curriculum distributed in New Jersey schools advises teachers to “avoid graphic details or dramatizing the destruction” wrought by the 9/11 hijackers, and instead focus elementary school students’ attention on broadly defined “intolerance” and “hurtful words.”
No surprise: Jihadist utterances such as “Kill the Jews,” “Allahu Akbar” and “Behead all those who insult Islam” are not among the “hurtful words” studied.
The Wall Street Journal conducted a “symposium” asking the provocative question of whether the U.S. overreacted, or mis-reacted, to the attacks.
Former deputy defense secretary Paul Wolfowitz, a favorite 2000s target of Democrats and anti-Semites, drew a parallel between 9/11 and World War II:
Preventing further attacks required the U.S. to drop its law-enforcement approach to terrorism and recognize that we were at war. Consider the difference between Khalid Sheikh Mohammed—the mastermind of 9/11 who told us much of what we now know about al Qaeda—and his nephew Ramzi Yousef, the mastermind of the 1993 attack on the World Trade Center who can’t be questioned (even most courteously) without his lawyer present and has told us nothing of significance. Or consider the difference between the ineffective retaliatory bombing of Afghanistan in 1998 and the 2001 response that brought down the Taliban regime.
We went to war with Germany in 1941 not because it had attacked Pearl Harbor but because it was dangerous. After 9/11, we had to do more to deal with state sponsors of terrorism than simply place them on a prohibited list, especially if they had connections to biological, chemical or nuclear weapons. Saddam Hussein—who was defying numerous United Nations resolutions and was the only head of a government to praise 9/11, warning that Americans should “suffer” so they will “find the right path”—presented such a danger.
That we made mistakes in Afghanistan and Iraq does not prove that we overreacted. (Costly mistakes were also made in World War II: sending poorly prepared troops to North Africa, failing to plan for the hedgerows beyond the beaches at Normandy, failing to anticipate the German counterattack in Belgium.) The real question is whether a significantly different response would have produced a better result.
Author Mark Helprin advocated for a significantly more severe response:
We underreacted in failing to declare war and put the nation on a war footing, and thus overreacted in trumpeting hollow resolution. We underreacted in attempting quickly to subdue and pacify, with fewer than 200,000 soldiers, 50 million famously recalcitrant people in notoriously difficult terrain halfway around the world. We are left with 10,000 American dead here and abroad, a bitterly divided polity, a broken alliance structure, emboldened rivals abroad, and two fractious nations hostile to American interests with little changed from what they were before.
We overreacted by attempting to revolutionize the political culture—and therefore the religious laws with which it is inextricably bound—of a billion people who exist as if in another age. The “Arab Spring” is less a confirmation of this illusion than its continuance. If you think not, just wait. …
Rather than embarking upon the reformation of the Arab world, we should have fully geared up, sacrificed for, and resolved upon war. Then struck hard and brought down the regimes sheltering our enemies, set up strongmen, charged them with extirpating terrorists, and withdrawn from their midst to hover north of Riyadh in the network of bases the Saudis have built within striking distance of Baghdad and Damascus. There we might have watched our new clients do the work that since 9/11 we have only partially accomplished, and at a cost in lives, treasure, and heartbreak far greater than necessary.
Helprin seems to be channeling his inner George Patton, who said, “The object of war is not to die for your country but to make the other bastard die for his.”
New Republic literary editor Leon Wieseltier, who supported and then opposed the Iraq War, does an interesting zigzag:
I was deceived in my support of the Iraq war, but I rejoice in the dictator’s destruction, and in the stirrings of Iraqi democracy despite the best efforts of Islamists and Iranians to thwart it. Good outcomes may come of bad origins.
On another website I wrote: “Since 9/11, I don’t think we’re more safe, but we are less free,” without spelling out how or why. David McElroy does:
Americans are demonstrably less free today than we were 10 years ago this morning. It’s easier for our masters to wiretap us, demand private information about us without warrants and to monitor what we’re doing in our financial transactions. We have police who think it’s a crime to take photos of public buildings from public property — and police officials who think it’s a reasonable policy to violate the rights of peaceful Americans just for taking pictures. We are subjected to treatment at airports that we wouldn’t have put up with 10 years ago. Even with all of this, a determined and decently financed group could easily still pull off a credible attack that would terrify everyone.
According to Justice Department memos released in 2009 by the Obama Administration, “since March 2002, the intelligence derived from CIA detainees has resulted in more than 6,000 intelligence reports and, in 2004, accounted for approximately half of the CTC’s [Counterterrorism Center’s] reporting on al Qaeda. . . . The substantial majority of this intelligence has come from detainees subjected to enhanced interrogation techniques.” Our friends on the left often call these memos the “torture memos.” The real torture is what happens to maimed victims of terrorist atrocities that intelligence agencies failed to prevent.
That’s a lesson the Obama Administration has taken to heart. Though the President came to office promising to undo his predecessor’s antiterror legacy, he has for the most part preserved it. That goes for re-authorizing key provisions of the Patriot Act (including that favorite ACLU bugaboo, the so-called library-records provision); moving forward with military tribunals for Khalid Sheikh Mohammed and other detainees; keeping Guantanamo open (albeit grudgingly), and giving the CIA authority to dramatically increase the use of drones against terrorist leaders. As for some of Mr. Obama’s other promises, such as ending the use of enhanced interrogations or closing down the black sites, these were already accomplished facts well before George W. Bush left office.
Constrained interrogations excepted, these developments not only increase America’s margin of safety against another attack, but also put the Democratic Party’s visible imprimatur on the war on terror, much as Dwight Eisenhower’s foreign policy put the GOP stamp on Harry Truman’s containment policies.
They also expose the accusation that President Bush was trampling America’s civil liberties as a particularly vulgar partisan maneuver—one that magically disappeared the moment Mr. Obama came to office. We certainly don’t like removing our shoes at the airport, but the larger truth is that American civil liberties are as robust today as they were on the eve of 9/11. Then again, we shudder to think of the kinds of measures the American public would have demanded had there been further attacks on the scale of 9/11. The internment of Japanese-Americans during World War II, it’s worth recalling, was mainly the doing of those two great civil libertarians Franklin Roosevelt and Earl Warren.
One does have to point out that nowhere in the Constitution do we have a right to air travel, let alone inconvenience-free air travel. (If that were the case, then we would have an 18-year-old lawsuit to file against Northworst Airlines.) And it’s unclear to me why we are supposed to give constitutional rights to non-Americans, particularly those trying to kill us. To quote former Supreme Court Justice Robert Jackson, one of the Nuremberg war crimes prosecutors, the Constitution is not a suicide pact.
One of the most interesting 9/11 viewpoints comes from author Christopher Hitchens, who was no one’s idea of a conservative before 9/11, but whose eyes opened:
The proper task of the “public intellectual” might be conceived as the responsibility to introduce complexity into the argument: the reminder that things are very infrequently as simple as they can be made to seem. But what I learned in a highly indelible manner from the events and arguments of September 2001 was this: Never, ever ignore the obvious either. To the government and most of the people of the United States, it seemed that the country on 9/11 had been attacked in a particularly odious way (air piracy used to maximize civilian casualties) by a particularly odious group (a secretive and homicidal gang: part multinational corporation, part crime family) that was sworn to a medieval cult of death, a racist hatred of Jews, a religious frenzy against Hindus, Christians, Shia Muslims, and “unbelievers,” and the restoration of a long-vanished and despotic empire.
To me, this remains the main point about al-Qaida and its surrogates. I do not believe, by stipulating it as the main point, that I try to oversimplify matters. I feel no need to show off or to think of something novel to say. Moreover, many of the attempts to introduce “complexity” into the picture strike me as half-baked obfuscations or distractions. These range from the irredeemably paranoid and contemptible efforts to pin responsibility for the attacks onto the Bush administration or the Jews, to the sometimes wearisome but not necessarily untrue insistence that Islamic peoples have suffered oppression. (Even when formally true, the latter must simply not be used as nonsequitur special pleading for the use of random violence by self-appointed Muslims.) …
To begin with, I found myself for the first time in my life sharing the outlook of soldiers and cops, or at least of those soldiers and cops who had not (like George Tenet and most of the CIA) left us defenseless under open skies while well-known “no fly” names were allowed to pay cash for one-way tickets after having done perfunctory training at flight schools. My sympathies were wholeheartedly and unironically (and, I claim, rationally) with the forces of law and order. Second, I became heavily involved in defending my adopted country from an amazing campaign of defamation, in which large numbers of the intellectual class seemed determined at least to minimize the gravity of what had occurred, or to translate it into innocuous terms (poverty is the cause of political violence) that would leave their worldview undisturbed. How much easier to maintain, as many did, that it was all an excuse to build a pipeline across Afghanistan (an option bizarrely neglected by American imperialism after the fall of communism in Kabul, when the wretched country could have been ours for the taking!). …
Ten years ago I wrote to a despairing friend that a time would come when al-Qaida had been penetrated, when its own paranoia would devour it, when it had tried every tactic and failed to repeat its 9/11 coup, when it would fall victim to its own deluded worldview and—because it has no means of generating self-criticism—would begin to implode. The trove recovered from Bin Laden’s rather dismal Abbottabad hideaway appears to confirm that this fate has indeed, with much labor on the part of unsung heroes, begun to engulf al-Qaida. I take this as a part vindication of the superiority of “our” civilization, which is at least so constituted as to be able to learn from past mistakes, rather than remain a prisoner of “faith.”
The battle against casuistry and bad faith has also been worth fighting. So have many other struggles to assert the obvious. Contrary to the peddlers of shallow anti-Western self-hatred, the Muslim world did not adopt Bin-Ladenism as its shield against reality. Very much to the contrary, there turned out to be many millions of Arabs who have heretically and robustly preferred life over death. In many societies, al-Qaida defeated itself as well as underwent defeat.
Finally, one of the fascinating things for media geeks is to watch coverage of breaking news such as 9/11. The Internet Archive has two pages — one that follows American media as the events occurred, and another that shows worldwide TV coverage.
The Media Research Center, usually quite critical of the news media, reposted its 2002 tribute to 9/11 TV news coverage.
And finally, what Ground Zero looked like last night:
Sept. 11, 2001 started out as a beautiful day, in Ripon, in New York City and in Washington, D.C.
I remember almost everything about the entire day. Sept. 11, 2001 is to my generation what Nov. 22, 1963 was to my parents and Dec. 7, 1941 was to my grandparents.
I had dropped off our oldest son, Michael, at Ripon Children’s Learning Center. As I was coming out, the mother of one of the children in Michael’s group told me to find a good radio station; she had heard, as she was getting out of the car with her son, that a plane had hit the World Trade Center.
I got in my car and turned on the radio in time to hear, seemingly live, a plane hit the WTC. But it wasn’t the first plane; it was the second plane hitting the other tower.
As you can imagine, my drive to Fond du Lac took unusually long that day. I tried to call Jannan, who was working at Ripon College, but she didn’t answer because she was in a meeting. I had been at Marian University as its PR director for just a couple of months, so I didn’t know for sure whom the media might want to talk to, but once I got there I found a couple of professors and called KFIZ and WFDL in Fond du Lac and set up live interviews.
The entire day was like reading a novel, except that there was no novel to put down and no nightmare from which to wake up. A third plane hit the Pentagon? A fourth plane crashed somewhere else? The government was grounding every plane in the country and closing every airport?
I had a TV in my office, and later that morning I heard that one of the towers had collapsed. So as I was talking to Jannan on the phone, NBC showed a tower collapsing, and I assumed that was video of the first tower collapse. But it wasn’t; it was the second tower collapse, and that was the second time that replay-but-it’s-not thing had happened that day.
Marian’s president and my boss (a native of a Queens neighborhood who grew up with many firefighter and police officer families) had a brief discussion about whether to cancel afternoon or evening classes, but they decided (correctly) to hold classes as scheduled. The obvious reasons were (1) that we had more than 1,000 students on campus, and what were they going to do if they didn’t have classes, and (2) that it was certainly more appropriate to have our professors leading a discussion of what had happened than anything else that could have been done.
I was at Marian until after 7 p.m. I’m sure Marian had a memorial service, but I don’t remember it. While I was in Fond du Lac, our church was having a memorial service with our new rector (who hadn’t officially started yet) and our interim priest. I was in a long line at a gas station, getting gas because the yellow low-fuel light on my car was on, not because of panic over gas prices, although I recall that one Fond du Lac gas station had increased its prices that day to a ridiculous $2.299 per gallon. (I think my gas was around $1.50 a gallon that day.)
Two things I remember about that specific day: it was an absolutely spectacular day, but when the sun set, it seemed really, really dark, as if there were no light at all outside, from stars, streetlights or anything else.
For the next few days, since Michael was at the TV-watching age, we would watch the ongoing 9/11 coverage in our kitchen while Michael watched 1-year-old-appropriate shows or videos in our living room. That Sunday, one of the people at church was Adrian Karsten of ESPN. He was supposed to be at a football game working for ESPN, of course, but there was no college football that Saturday (though high school football was played that Friday night), and there was no NFL football that Sunday. Our organist played “God Bless America” after Mass, and I recall Adrian clapping with tears running down his face; I believe he knew some people who had died or been injured.
Later that day was Marian’s Heritage Festival of the Arts. We had record attendance: there was nothing else going on, it was another beautiful day, and I’m guessing that after five consecutive days of nonstop 9/11 coverage, people wanted to get out of their houses.
In the decade since then, a comment by New York City Mayor Rudy Giuliani has stuck in my head. He was asked a year or so later whether the U.S. was more or less safe than before 9/11, and I believe his answer was that we were safer because we knew more than we did on Sept. 10, 2001. That, and the fact that we haven’t been subject to another major terrorist attack since then, is the good news.
Osama bin Laden (who I hope is enjoying Na’ar, Islam’s hell) and others in Al Qaeda apparently thought that the U.S. (despite the fact that citizens from more than 90 countries died on 9/11) would be intimidated by the 9/11 attacks and cower on this side of the Atlantic Ocean, allowing Al Qaeda to operate with impunity in the Middle East and elsewhere. (Bin Laden is no longer available for comment.) If you asked an American who paid even the slightest attention to world affairs where a terrorist attack would be most likely before 9/11, that American would have replied either “New York,” the world’s financial capital, or “Washington,” the center of the government that dominates the free world. A terrorist attack farther into the U.S., even in a much smaller area than New York or Washington, would have delivered a more chilling message, that nowhere in the U.S. was safe. Al Qaeda didn’t think to do that, or couldn’t do that. The rest of the Middle East also did not turn on the U.S. or on Israel (more so than already is the case with Israel), as bin Laden apparently expected.
The bad news is all of the other changes that have taken place that are not for the better. Bloomberg Businessweek asks:
So was it worth it? Has the money spent by the U.S. to protect itself from terrorism been a sound investment? If the benchmark is the absence of another attack on the American homeland, then the answer is indisputably yes. For the first few years after Sept. 11, there was political near-unanimity that this was all that mattered. In 2005, after the bombings of the London subway system, President Bush sought to reassure Americans by declaring that “we’re spending unprecedented resources to protect our nation.” Any expenditure in the name of fighting terrorism was justified.
Six years later, though, it’s clear this approach is no longer sustainable. Even if the U.S. is a safer nation than it was on Sept. 11, it’s a stretch to say that it’s a stronger one. And in retrospect, the threat posed by terrorism may have been significantly less daunting than Western publics and policymakers imagined it to be. …
Politicians and pundits frequently said that al Qaeda posed an “existential threat” to the U.S. But governments can’t defend against existential threats—they can only overspend against them. And national intelligence was very late in understanding al Qaeda’s true capabilities. At its peak, al Qaeda’s ranks of hardened operatives numbered in the low hundreds—and that was before the U.S. and its allies launched a global military campaign to dismantle the network. “We made some bad assumptions right after Sept. 11 that shaped how we approached the war on terror,” says Brian Fishman, a counterterrorism research fellow at the New America Foundation. “We thought al Qaeda would run over the Middle East—they were going to take over governments and control armies. In hindsight, it’s clear that was never going to be the case. Al Qaeda was not as good as we gave them credit for.”
Yet for a decade, the government’s approach to counterterrorism has been premised in part on the idea that not only would al Qaeda attack inside the U.S. again, but its next strike would be even bigger—possibly involving unconventional weapons or even a nuclear bomb. Washington has appropriated tens of billions trying to protect against every conceivable kind of attack, no matter the scale or likelihood. To cite one example, the U.S. spends $1 billion a year to defend against domestic attacks involving improvised-explosive devices, the makeshift bombs favored by insurgents in Afghanistan. “In hindsight, the idea that post-Sept. 11 terrorism was different from pre-9/11 terrorism was wrong,” says Brian A. Jackson, a senior physical scientist at RAND. “If you honestly believed the followup to 9/11 would be a nuclear weapon, then for intellectual consistency you had to say, ‘We’ve got to prevent everything.’ We pushed for perfection, and in counterterrorism, that runs up the tab pretty fast.”
Nowhere has that profligacy been more evident than in the area of homeland security. “Things done in haste are not done particularly well,” says Jackson. As Daveed Gartenstein-Ross writes in his new book, Bin Laden’s Legacy, the creation of a homeland security apparatus has been marked by waste, bureaucracy, and cost overruns. Gartenstein-Ross cites the Transportation Security Agency’s rush to hire 60,000 airport screeners after Sept. 11, which was originally budgeted at $104 million; in the end it cost the government $867 million. The homeland security budget has also proved to be a pork barrel bonanza: In perhaps the most egregious example, the Kentucky Charitable Gaming Dept. received $36,000 to prevent terrorists from raising money at bingo halls. “If you look at the past decade and what it’s cost us, I’d say the rate of return on investment has been poor,” Gartenstein-Ross says.
Of course, much of that analysis has the 20/20 vision of hindsight. It is interesting to note as well that, for all the campaign rhetoric from candidate Barack Obama that we needed to change our foreign policy approach, President Obama has changed almost nothing, including our Afghanistan and Iraq involvements. It is also notable that the supposed change away from President George W. Bush’s us-or-them foreign policy approach hasn’t changed the world’s view, including particularly the Middle East’s view, of the U.S. Someone years from now will have to determine whether homeland security, military and intelligence improvements prevented Al Qaeda from mounting another 9/11-style attack, or whether Al Qaeda simply wasn’t capable of more than one.
Hindsight makes one realize how much of the 9/11 attacks could have been prevented, or at least how their worst effects could have been lessened. The book 102 Minutes: The Untold Story of the Fight to Survive Inside the Twin Towers, by two New York Times reporters, points out that eight years after the 1993 World Trade Center bombing, New York City firefighters and police officers still could not communicate with each other, which led to most of the police and fire deaths in the WTC collapses. Even worse, the book revealed that the buildings did not meet New York City fire codes when they were designed because they didn’t have to, since they were under the jurisdiction of the Port Authority of New York and New Jersey. And more than one account shows that, had certain people at the FBI and elsewhere been listened to by their bosses, the 9/11 attacks wouldn’t have caught our intelligence community off guard. (It does not speak well of our government that no one appears to have paid any kind of political price for the 9/11 attacks.)
I think, as Bloomberg Businessweek argues, our approach to homeland security (a term I loathe) has overdone much and missed other threats. Our approach to airline security — which really seems like the old error of generals fighting the last war — has made air travel worse but not safer. (Unless you truly believe that 84-year-old women and babies are terrorist threats.) The incontrovertible fact is that every 9/11 hijacker fit into one gender, one ethnic group and a similar age range. Only two reasons exist not to profile airline travelers: political correctness, and the assumption that anyone is capable of hijacking an airplane, killing the pilots and flying it into a skyscraper or important national building. Meanwhile, while the U.S. spends about $1 billion each year trying to prevent improvised explosive device attacks, what is this country doing about something that would be even more disruptive, yet potentially easier to pull off — an electromagnetic pulse attack, which would fry every computer within the range of the device?
We haven’t taken steps like drilling our own continent’s oil and developing every potential source of electric power, ecofriendly or not, to make us less dependent on Middle East oil. (The Middle East, by the way, supplies only one-fourth of our imported oil. We can become less dependent on Middle East oil; we cannot become less dependent on energy.) And the government’s response to 9/11 has followed, like B follows A, the approach our culture has taken to risk of any sort, as if covering ourselves in bubble wrap, or, even better, cowering in our homes, will make the bogeyman go away. Are we really safer because of the Patriot Act?
American politics was quite nasty in the 1990s. For a brief while after 9/11, we had impossible-to-imagine moments like this:
And then within the following year, the political beatings resumed. Bush’s statement, “I ask your continued participation and confidence in the American economy,” was deliberately misconstrued as Bush telling Americans to go out and shop. Americans were exhorted to sacrifice for a war unlike any we’ve ever faced, by people who wouldn’t themselves have to deal with those sacrifices — for instance, gas prices far beyond $5 per gallon, or mandatory national service (a bad idea that rears its ugly head in times of anything approaching national crisis), or substantially higher taxes.
Then again, none of this should be a surprise. Other parts of the world hate Americans because we are more economically and politically free than most of the world. We have graduated from using those of different skin color from the majority as slaves, and we have progressed beyond assigning different societal rights to each gender. We tolerate different political views and religions. To the extent the 9/11 masterminds could be considered Muslims at all, they supported — and radical Muslims support — none of the values that are based on our certain inalienable rights. The war between our world, flawed though it is, and a world based on sharia law is a war we had better win.
In one important sense, 9/11 changed us less than it revealed us. America can be both deeply flawed and a special place, because human beings are both deeply flawed and nonetheless special in God’s eyes. Jesus Christ is quoted in Luke 12:48 as saying that “to whomsoever much is given, of him shall be much required.” As much as Americans don’t want to be the policeman of the world, or the nation most responsible for protecting freedom worldwide, there it is.