At the end of September, the Global Commission for the Certification of Poliomyelitis Eradication convened in Bali and, after reviewing the reports of its member nations, declared poliovirus type 2 eradicated in the wild. This was really only a bureaucratic stamp on a fact: The last case of type 2 polio was identified in Aligarh, India, in 1999. Thanks in no small part to the initiative of the world’s Rotarians — one of those “little platoons” of which Edmund Burke was so fond — polio has been eradicated everywhere on Earth except for two places where those who would eradicate it are forbidden to operate: Afghanistan and Pakistan. That’s the Taliban’s gift to the Islamic world: paralytic polio.
Despite some recent setbacks, including funding troubles after the financial crisis and the emergence of anti-vaccine nuttery in the United States and elsewhere, measles and rubella are next on the hit list. Those diseases will almost certainly be a thing of the past a decade or two hence.
The Princeton economist Angus Deaton, recently awarded the Nobel prize, has spent much of his career working on how we measure consumption, poverty, real standards of living, etc. It is thanks in part to his work that we can say that the global rate of “extreme poverty,” currently defined as subsistence on less than the equivalent of $1.90 a day, is now the condition of less than 10 percent of the human race. In the 1980s, that number was 50 percent — half the species — and as late as the dawn of the 21st century, one-third of the human race lived in extreme poverty. The progress made against poverty in the past 30 years is arguably the most dramatic economic event since the Industrial Revolution. It did not happen by accident.
Good news abroad, and good news at home: In 1990, there were 2,245 murders in New York City. That number has fallen by 85 percent. Murders are down, often dramatically, in cities across the country. The overall rate of violent crime has fallen by about half in recent decades. U.S. manufacturing output per worker trebled from 1975 to 2005, and our total manufacturing output continues to climb. Despite the know-nothings who go around complaining that “we don’t make things here anymore,” the United States continues to make the very best of almost everything and, thanks to our relatively free-trading ways, to consume the best of everything, too. General-price inflation, the bane of the U.S. economy for some decades, is hardly to be seen. Flexible and effective institutions helped ensure that we weathered one of the worst financial crises of modern times with surprisingly little disruption in the wider economy. Despite politicians who would usurp our rights, our courts keep reliably saying that the First Amendment and the Second Amendment pretty much mean what they say. I just filled up my car for $1.78 a gallon.
The world isn’t ending.
To the economist Tyler Cowen the world is indebted for the phrase “the fallacy of mood affiliation,” which he explains:
It seems to me that people are first choosing a mood or attitude, and then finding the disparate views which match to that mood and, to themselves, justifying those views by the mood. I call this the “fallacy of mood affiliation,” and it is one of the most underreported fallacies in human reasoning. In the context of economic growth debates, the underlying mood is often “optimism” or “pessimism” per se and then a bunch of ought-to-be-independent views fall out from the chosen mood.
This is a more eloquent version of what I sometimes refer to as the black-hats/white-hats school of political analysis. One example: A great many people with an interest in Israeli–Palestinian issues begin and end consideration of any particular fact by asking whose fault it is (in the case of negative developments) or who gets the credit (in the case of positive developments). You know the type: If a hurricane should come crashing into the Holy Land, the imams and the progressive columnists will find a way to blame it on the Jews.
The Right engages in a fair amount of mood affiliation: The country must have suffered ruination, because the Obama administration, abetted by the hated “Republican establishment,” can have done nothing but ruin the country. But then you visit New York City or Los Angeles or Chicago, or you drive across northern Mississippi or the Texas Panhandle and see all those splendid farms and technology companies and factories producing all the best things that mankind can dream of, and, well, it certainly doesn’t look like a ruined country. In the past few years, I’ve been to the Netherlands, Norway, Germany, Switzerland, Spain, Costa Rica, the United Kingdom, Mexico, and a few years further back India, Colombia, the Dominican Republic — it doesn’t look like a ruined world. Of course there are unhappy corners: Haiti, Pakistan.
Francis Fukuyama was mocked for declaring “the end of history” as the Cold War came to a close, but he wasn’t really wrong. Haiti and Pakistan, and the territories currently held by the so-called Islamic State, do not represent the emergence of a credible competitor to liberal democracy; they are only failed states, and failure is something of which there is, alas, to be no end. Even in the case of such deeply illiberal and undemocratic regimes as the one ensconced in Beijing, the drive toward free enterprise, toward higher quality in governance, and even toward accountability (implicit rather than explicit in China) is present. China’s political situation isn’t good; it is, however, better. And, given the institutional failures we have seen in other countries when procedural democracy emerged before effective and accountable institutions — Haiti, again — it may turn out that in 100 years China’s path will, despite the many horrors associated with its rulers’ brutality, turn out to have been something closer to the right one than the alternatives we liberal democrats in Anno Domini 2015 imagined. Even within the relatively narrow world of capitalist democracies, the old debate between the social democrats and the partisans of Anglo-American liberalism includes a great deal more consensus than it did 60 years ago.
The Americans are looking at the Danes and the Danes at the Swiss and the Swiss at the Singaporeans and the Singaporeans at the Koreans and the Koreans at the Americans, and there is a just barely detectable coalescent understanding that while there will always be national and cultural differences (we have different nations and cultures in part because people have genuinely divergent preferences about how to live), the common thread seems to be that effective states are deep but narrow: strong states that can do what needs doing but do so with the understanding that this includes a limited menu of items. Local conditions may vary, but there’s no reason you can’t have free trade and a good metro rail. Works in Hong Kong, works in Copenhagen, works in Zurich. It would work in New York if that city’s Sandinista regime were interested in governmental quality rather than ersatz class warfare. The declines of such scourges as polio and famine provide no neat, satisfying answers either for us classical-liberal/libertarian conservatives or for progressives who prefer a more activist mode of government. Yes, private philanthropists really did take the lead in polio eradication, 1.2 million Rotary Club members around the world singing dopey songs at lunch meetings and raising money and dispatching volunteers all over the world — that was a big, big part of how it was done. But there were also grants and projects from central governments and their public-health agencies, international organizations such as WHO, etc. The key was that each element was permitted to work on the aspect of the problem most suited to its capabilities.
The world is healthier, wealthier, and less hungry mainly because of the efforts of millions of unknown investors, entrepreneurs, farmers, workers, bankers, etc., all working without any central coordinating authority. But the spread of those benefits to places such as India and China was the work of political actors, and the entrenchment of free enterprise will require much more from those same political actors on matters such as infrastructure and education. (Maybe you have some High Rothbardian ideas about why political actors should be irrelevant here, and maybe you aren’t wrong. But should be isn’t is, and the world in your theory relates to the actual world in approximately the same way your Dungeons & Dragons campaign relates to Europe in the Middle Ages.) Ideas are powerful and philosophy matters, but all the real problems and real solutions are terribly specific and particular and, being embedded in real conditions rather than theoretical conditions, resistant to purely ideological management. The world is getting better because real people are doing real work to make it better, not because your political preferences or mine are attached to some sort of Hegelianly inevitably capital-H History.
There is much left to do: We have unsustainable fiscal situations in the Western welfare states, irreconcilable Islamist fanatics originating in points east but spread around the world, environmental challenges, and that tenth of the human race that still needs lifting out of hardcore poverty. But we have achieved a remarkable thing in that unless we mess things up really badly, in 50 years we’ll be having to explain to our grandchildren what a famine was, how it came to be that millions of people died every year for want of clean water — and they will look at us incredulously, wondering what it must have been like to live in the caveman times of the early 21st century.
What was in Williamson’s coffee when he wrote this?
In the run-up to the 2015 U.N. Climate Change Conference in Paris from Nov. 30 to Dec. 11, rich countries and development organizations are scrambling to join the fashionable ranks of “climate aid” donors. This effectively means telling the world’s worst-off people, suffering from tuberculosis, malaria or malnutrition, that what they really need isn’t medicine, mosquito nets or micronutrients, but a solar panel. It is terrible news. …
All these pledges had their genesis in the chaos of the Copenhagen climate summit six years ago, when developed nations made a rash promise to spend $100 billion a year on “climate finance” for the world’s poor by 2020. Rachel Kyte, World Bank vice president and special envoy for climate change, recently told the Guardian (U.K.) newspaper that the $100 billion figure “was picked out of the air at Copenhagen” in an attempt to rescue a last-minute deal. Yet achieving that arbitrary goal is now seen as fundamental to the success of the Paris summit.
This is deeply troubling because aid is being diverted to climate-related matters at the expense of improved public health, education and economic development. The Organization for Economic Cooperation and Development has analyzed about 70% of total global development aid and found that about one in four of those dollars goes to climate-related aid.
In a world in which malnourishment continues to claim at least 1.4 million children’s lives each year, 1.2 billion people live in extreme poverty, and 2.6 billion lack clean drinking water and sanitation, this growing emphasis on climate aid is immoral.
Not surprisingly, in an online U.N. survey of more than eight million people from around the globe, respondents from the world’s poorest countries rank “action taken on climate change” dead last out of 16 categories when asked “What matters most to you?” Top priorities are “a good education,” “better health care,” “better job opportunities,” “an honest and responsive government,” and “affordable, nutritious food.”
According to a recent paper by Neha Raykar and Ramanan Laxminarayan of the Public Health Foundation of India, just $570 million a year—or 0.57% of the $100 billion climate-finance goal—spent on direct malaria-prevention policies like mosquito nets would reduce malaria deaths by 50% by 2025, saving an estimated 300,000 lives a year.
Providing the world’s most deprived countries with solar panels instead of better health care or education is inexcusable self-indulgence. Green energy sources may be good to keep on a single light or to charge a cellphone. But they are largely useless for tackling the main power challenges for the world’s poor.
According to the World Health Organization, three billion people suffer from the effects of indoor air pollution because they burn wood, coal or dung to cook. These people need access to affordable, reliable electricity today. Yet too often clean alternatives, because they aren’t considered “renewable,” aren’t receiving the funding they deserve.
A 2014 study by the Center for Global Development found that “more than 60 million additional people in poor nations could gain access to electricity if the Overseas Private Investment Corporation”—the U.S. government’s development finance institution—“were allowed to invest in natural gas projects, not just renewables.”
Addressing global warming effectively will require long-term innovation that will make green energy affordable for everyone. Rich countries are in a rush to appear green and generous, and recipient countries are jostling to make sure they receive the funds. But the truth is that climate aid isn’t where rich countries can help the most, and it isn’t what the world’s poorest want or need.
Britishers with taste bought this single when it hit the charts today in 1961:
Today in 1965, the four Beatles were named Members of the Order of the British Empire by Queen Elizabeth. Before the ceremony, the Beatles reportedly smoked marijuana in a Buckingham Palace bathroom to calm their nerves.
The Beatles’ receiving their MBEs prompted a number of MBE recipients to return theirs. “Lots of people who complained about us receiving the MBE received theirs for heroism in the war — for killing people,” said John Lennon, previewing the public relations skills he’d show a year later when he would compare the Beatles to Jesus Christ. “We received ours for entertaining other people. I’d say we deserve ours more.”
Lennon returned his MBE in 1969 as part of his peace protests.
Today in 1963, the Beatles played two shows in Sundstavagen, Sweden, to begin their first tour of Sweden. The local music critic was less than impressed, claiming the Beatles should have been happy for their fans’ screaming to drown out the group’s “terrible” performance, asserting that the Beatles “were of no musical importance whatsoever,” and furthermore claiming their local opening act, the Phantoms, “decidedly outshone them.”
Three thoughts: Perhaps the Beatles did have a bad night. But have you heard a Phantoms song recently? It is also unknown whether the Beatles’ “Norwegian Wood” was intended as revenge against the Swedes.
One year later came a demonstration of why you should never say “never”: Today in 1964, the Rolling Stones made their first appearance on CBS-TV’s Ed Sullivan Show.
A riot broke out in the CBS studio, which prompted Sullivan to say, “I promise you they’ll never be back on our show again.” “Never” turned out to be May 2, 1965, when the Stones made the second of their six performances on the rilly big shew.
In the four-year history of this blog, I have written little about clothing except for athletic uniforms.
Then I read this Grantland piece, inspired by the next Star Wars movie:
A long time ago, in a galaxy far, far away, dudes wore dope space jackets. Judging from the just-released and possibly final trailer for Star Wars: The Force Awakens, that tradition — like Stormtroopers that can’t shoot straight — continues. And it’s all for the good. Just because there’s a devastating galactic civil war in progress that has already involved multiple planetary genocides doesn’t mean that a man can’t look his dashing best while bull’s-eyeing womp rats in his T-16, or zipping betwixt the lumbering legs of an AT-AT in a snowspeeder, or flying an X-wing into a superweapon’s utility trench. Say what you will about those scruffy, neosocialist Rebel Alliance hippies, but they understood the important branding message of looking rad. How else are you going to get people to sign up for a suicidal war or fly the Y-wing, the scrub vehicle of the Rebel Alliance?
Cool jackets are integral to Star Wars and the wider sci-fi/fantasy realm. They’re what separates a pop-culturally important work of imaginative fiction from the Star Wars kid; make your characters look cool or they will come off like nerds. From the trailer, it appears J.J. Abrams gets the cool-jacket aspect of Star Wars absolutely right. Which is yet another reason the Episode I-III prequels were unmitigated space trash. Those movies contain zero dope jackets. Because of the overtly wack Jedi focus of those films, every dude was stuck wearing those lame-ass brown monk bathrobes and loose-fitting, rough-spun kimono tunics. Also, pro tip: When a Jedi starts wearing black robes, maybe keep an eye on that person. Just a thought. …
Now that we’re done running through what didn’t work and disqualifying these cosmic affronts to fashion, here are the definitive Star Wars jacket ratings:
1. Luke Skywalker’s Battle of Yavin Medal Ceremony Jacket, A New Hope
LUCASFILM
This look is untouchable. Equally at home in the vast galactic void, the roller rink, or on your princess/sister’s bedroom floor, this maize-colored space-satin-and-polyester lady slayer is the jacket that started it all. Accept no substitutes.
2. Han Solo’s Cloud City Casual, The Empire Strikes Back
LUCASFILM
Han wears this dark navy space-cotton windbreaker for basically the entirety of The Empire Strikes Back. Smart man. When you’re trying to smash with royalty, you want to look cool, of course, but equally important is looking like you don’t really give a shit if you smash or look cool. This jacket says, “I’m awesome, I know it, and so do you.” Han even wears it while being brutally tortured on Darth Vader’s rack-of-random-car-parts machine.
3. Han Solo’s Hoth Parka, The Empire Strikes Back
LUCASFILM
Want to look fresh as uncut conflict diamonds while tucking your too-turnt best dude into the sliced-open stomach cavity of a dead bipedal pack animal? Then this military-style anorak with fur-lined hood is for you. Canada Goose — which accounts for two out of every three winter jackets in New York City — legit charges almost $1,000 for knockoffs of this coat.
4. Han Solo’s Jacket, The Force Awakens
LUCASFILM
Old-ass Han Solo, meanwhile, is — as per usual — still rocking out with a rakish fashion sensibility even if this jacket isn’t quite as awesome as others he’s worn throughout the series.
Han always knew the value of a great jacket. And what are those three metal vials on his left breast? High-caliber bullets? Space whippets? Corellian Viagra? Whatever they are, it’s probably illicit. Smugglers gotta smuggle.
Han-related aside: My low-key favorite part of Return of the Jedi is that everyone in the rebel raiding party sent to Endor, including Luke and Princess Leia, are wearing forest-green camo ponchos and helmets — the better to blend into the sylvan woods — and Han just wears a cowboy-style duster and his regular vest-over-shirt look because, like, whatever. The whole galaxy depends on stealthily turning off the new Death Star’s energy shield? Doesn’t mean you can’t still look great. …
6. Finn’s Bomber, The Force Awakens
LUCASFILM
Take a look at our man Finn (John Boyega) and his possibly Empire-issued leather quasi-bomber jacket, which is pretty OK from a Star Wars outerwear perspective.
Finn is giving off that vibe of like “Hey, this jacket is OK and hopefully I get a doper one for the sequel.” It actually looks cooler from the back.
LUCASFILM
7. Bossk’s Yellow Flight Suit and Greedo’s Biker Jacket, Empire and A New Hope (Tie)
LUCASFILM
You see Bossk for only like 30 seconds in Empire, which meant that owning his action figure was a mark of status among the neighborhood kids. Owning a Bossk figure said “I know Star Wars.” I prefer Bossk’s flight suit to the semi-wack jumper Luke wears in Empire when he goes to Dagobah. Greedo, meanwhile, doesn’t get enough credit for the two-tone biker jacket he wears under his totally unnecessary but very Star Wars–ian vest. The guy — or fish or seahorse or whatever — really knew how to accessorize his skin.
8. Lando’s “I’m a General Now” Officer’s Jacket and Cape, Return of the Jedi
I’ve always been confused at the ease with which various characters got promoted up the ranks of the Rebel Alliance. In Empire, Lando betrayed our heroes to the Empire, which got Han tortured, frozen, and hung on Jabba’s wall like a Rothko. Yes, he had little choice and felt bad about it, and he later helped Leia, Chewie, Luke, and the droids escape, but facts is facts. Then, by the middle of Return of the Jedi, Lando was not only accepted into the rebellion, but he became a general. I guess beggars can’t be choosers; if the rebellion turned away everyone who used to snitch for the Empire, who’d be left to volunteer to fly suicide missions into the new Death Star? …
10. Ponda Baba’s Orange Biker/Bomber Jacket, A New Hope
LUCASFILM
Ponda Baba is an Aqualish pirate and thug who you may remember as the alien who tried beefing with a young Luke Skywalker only to get his arm sliced off by Obi-Wan Kenobi. Which is sad. Not because of the arm — those, as we’ve seen time and again, are easily replaced in the galaxy far, far away — but because that saber slice ruined a perfectly fly jacket.
11. Luke’s Dagobah Jacket, The Empire Strikes Back
LUCASFILM
The Empire Strikes Back came out in 1980, so it’s kind of weird Luke so rarely wore a jacket with a poppable collar. Sadly, it’s the weakest sartorially of his non-Jedi kimono jackets, a tan canvas safari number. Which, yeah, he was in the swamp.
NON–STAR WARS SPECIAL MENTION SECTION
Two other jackets in the wider sci-fi/fantasy realm deserve attention.
Star-Lord’s crimson Han Solo–inspired space jacket:
MARVEL STUDIOS
Jaime Lannister’s “Going to Dorne” jacket:
HBO
Whether it’s in outer space or the Seven Kingdoms, cool dudes wear cool jackets.
Or dope or wack jackets, apparently. But this is not a new development. In the real world, flight jackets date back to the first days of military flight, World War I, though they started to shorten from overcoat length toward World War II.
… World War II fliers wore what came to be called “bomber jackets” because the jackets were warm at high altitude in nonpressurized airplanes. That didn’t mean jackets weren’t worn in merely inclement (as in Great Britain) weather, of course.
The weather for ground troops got dicey as well, and once the Army figured out what it had wasn’t appropriate for a worldwide war, the Army developed …
… a jacket that could be worn underneath a longer wool coat for layering. However, Gen. Dwight Eisenhower requested around the same time a jacket that looked like the British “battle jacket,” though more distinctive. So Eisenhower’s idea for a jacket became known as …
… the Eisenhower jacket, or the “Ike” jacket, in part because of Eisenhower’s popularity among his troops.
After World War II ended, police found that a waist-length jacket allowed better access to guns, batons, clubs, etc., and so police started wearing them:
Yes, that photo depicts fictional officers Malloy and Reed of “Adam-12,” but creator Jack Webb was a stickler for accuracy, much more so than others in Hollywood:
You might reasonably ask why a Navy pilot assigned to San Diego, which has arguably the nicest weather in the entire country, would wear a leather jacket. Because style, man. (There are bigger issues with “Top Gun” than what Tom “Maverick” Cruise is wearing.)
In the post-World War II days leather jackets started showing up in pop culture …
Marlon Brando in “The Wild One” …
… James Dean …
… and Elvis Presley.
I have served in neither the military nor the police (bad eyesight, among other things), and I’m not an actor. But I recognize the value of a stylish, yet usable, jacket. In fact, I have managed to accumulate several leather jackets, though I don’t have the first one I purchased due to its 1970s/1980s reddish-brown color. (I purchased it with $104 of my own money in 1982. One week later for unrelated reasons my first girlfriend broke up with me while I was wearing that jacket. It’s a good thing I bought it then anyway, though, because one week after that my first employer closed its doors.) I do have a black Top Gun-style jacket with zip-liner, a beaten-up-brown bomber jacket, a black leather blazer, and a longer (though not trenchcoat-length) brown leather “car coat.”
After our oldest son was born, I found myself in Sheboygan to do a story near Sheboygan Harley–Davidson. With a bit of time to kill, I walked into the store, not to look at the motorcycles, which I was not about to purchase, but at the clothing section. And there I saw a toddler-sized leather-looking biker jacket. I didn’t have a cellphone with a camera at the time; I just called Mrs. Presteblog and said we have to have this. The three of us wore black leather for a family photo.
The jacket with sentimental value, though, is not leather:
This is my UW Band jacket from my five years in the world’s greatest college marching band. It obviously is similar to a letter jacket (though reversible for going incognito), which I never got to wear because of my athletic suckage. You have to buy the jacket, but the band W goes to those who complete two years in the band, and the pin on the collar goes to three-year marchers. Obviously I have almost no opportunity or reason to wear it, but at least it fits again now that I weigh about what I did when I graduated from UW–Madison.
Back to science fiction: Stylish jackets also can be found in the Star Trek universe, though they are much harder to find …
… and elsewhere in the TV universe:
Col. Ryan’s jacket in the movie “Von Ryan’s Express” …
… ended up on “Hogan’s Heroes” …
… while the A2 was also worn by Steve McQueen, one of the Three Cool Steves. As for the other Steves, I found no photo of the Six Million Dollar Man, Steve Austin, wearing a leather jacket, and why would Steve McGarrett wear leather in Hawaii?
Then there’s “Voyage to the Bottom of the Sea,” where Captain Crane flies the Flying Sub while Admiral Nelson apparently is the navigator.
Starsky and Hutch also wore leather jackets, despite the temperate weather in Los Angeles, or “Bay City.”
As a second-generation (out of three, as you know) musician (if that’s what you want to call it in my case), I had to post this, from MainStreet:
The band geeks are having the last laugh. Considering all forms of music education – whether it was being in a choir, taking formal instrument lessons, or playing gigs in a garage band – American adults say such early experiences pay off later in life.
Seven in ten (71%) adults responding to a Harris poll say that the lessons and habits gained from music education equip people to be better team players in their careers. More than two-thirds say the “Glee” factor provides people with a disciplined approach to solving problems (67%) and prepares someone to manage the tasks of their job more successfully (66%).
And it wasn’t just about being in the marching band. Over three-quarters of Americans (76%) have had some sort of music education during school – half (49%) were in a chorus and more than two in five (43%) took formal instrument lessons. Many (39%) played in a school orchestra or band, while some played in an informal group, such as a garage band (14%) or took formal vocal lessons (13%).
The benefits of a music education may be even more tangible. The U.S. Department of Education compiled data on more than 25,000 secondary school students and found that students with high levels of involvement in instrumental music in their middle and high school years had “significantly higher levels of mathematics proficiency by grade 12.”
A 2007 study from the University of Kansas reported that students in elementary schools with top-quality music education programs scored 19% higher in English than students in schools without a music program – and 17% higher in mathematics.
Those who are looking to relocate, particularly with children, might want to consider somewhere named one of the Best Communities for Music Education. (We moved from one to another one.)
A horrible irony today in 1964: A plane carrying all four members of the group Buddy and the Kings crashed, killing everyone on board. Buddy and the Kings was led by David Box, who replaced Buddy Holly in the Crickets after Holly died in a plane crash in 1959:
Today in 1976, Chicago had its first number one single, which some would consider the start of its downward slope to sappy ballads:
Some people feel the Republican Party is in chaos, with no obvious front-runner for president, no one apparently willing to be Speaker of the House of Representatives, and the party in danger of losing control of the U.S. Senate in an election year that seems to favor Democrats.
But if you think the GOP has it bad, James Taranto reports on a lefty writer who thinks the opposite is the case:
Matt Yglesias delivers this cheery news to readers of the liberal young-adult website Vox.com: “The Democratic Party is in much greater peril than its leaders or supporters recognize, and it has no plan to save itself.” He has a pretty good argument. Look past President Obama’s “victory lap” and the current disarray in the House and the GOP presidential contest, and Republicans dominate elected politics in most of the country.
The GOP holds majorities in both houses of Congress. Thirty-one governors are Republicans, as are majorities of state attorneys general and secretaries of state. Republicans hold majorities in 69 of 99 state legislative chambers (including Nebraska’s unicameral Senate, which is formally nonpartisan) and have “unified control”—governor and legislative majorities—of 24 state governments (Yglesias erroneously says 25).
The most surprising fact in Yglesias’s piece: Apart from California, the most populous state with unified Democratic control is Oregon, which ranks 27th in population. That surprised Yglesias, too. At the end of his piece is this correction: “Earlier versions of this article said that Minnesota or Washington was the biggest non-California Democratic-controlled state, but in fact the Republicans control one legislative house in both of those states.”
Yglesias argues that even the House Republican leadership crisis is a sign of Democratic weakness: It “reflects, in some ways, the health of the GOP coalition. Republicans are confident they won’t lose power in the House and are hungry for a vigorous argument about how best to use the power they have.”
That confidence, in his view, is well-founded. Republicans have a natural geographic advantage in district-based lawmaking bodies—state legislatures and the U.S. House—because their voters are dispersed, whereas Democrats’ tend to be concentrated in big cities. Majorities are self-perpetuating, both because incumbents tend to win and because control of state government usually entails the power to draw district lines with the goal of enhancing the majority party’s advantage.
And “ ‘wave’ elections in which tons of incumbents lose are typically driven by a backlash against the incumbent president. Since the incumbent president is a Democrat, Democrats have no way to set up a wave.” Recent history bears out that point. The House majority last switched in a sitting president’s favor in 1948; since then, the president’s party has lost its majority five times, three of them since the 1990s.
Washington Monthly’s Ed Kilgore offers a rebuttal, the strongest point of which is that “even if you regard the presidency as a thin, fragile thread by which the Democratic Party holds onto a share of power, it’s a pretty damn important thread.”
To which we would add that Yglesias doesn’t dwell much on the Senate, where a Democratic majority after 2016 is well within reach. Republicans will be defending seven seats in states Obama carried in 2012, and a four-seat pickup would be sufficient for a majority assuming the Democrats win the presidency. A Democratic president and Senate could populate both the administrative agencies and the courts with liberal ideologues. With Justices Antonin Scalia and Anthony Kennedy both turning 80 next year, a reliably liberal Supreme Court majority is a serious possibility.
In 1981-86, the situation was reversed: Democrats dominated the down-ballot offices, but Republicans held the White House and Senate majority. In terms of enacting their agenda, Republicans were much better off then than they are now. (Here a caveat: Because the parties were less ideologically polarized in the ’80s, a House from the opposite party was not as great an obstacle to the president’s legislative agenda as it is today.)
Not that Yglesias would disagree with any of this. He leaves no doubt about the urgency, in his view, of electing a Democratic president:
Winning a presidential election would give Republicans the overwhelming preponderance of political power in the United States—a level of dominance not achieved since the Democrats during the Great Depression, but with a much more ideologically coherent coalition. Nothing lasts forever in American politics, but a hyper-empowered conservative movement would have a significant ability to entrench its position by passing a national right-to-work law and further altering campaign finance rules beyond the Citizens United status quo.
A subtext of Yglesias’s argument is a warning to those Democrats who are (and have excellent reasons to be) wary of Mrs. Clinton: If she loses, the consequences would be dire.
That complements another recent Yglesias piece, in which he argued that Mrs. Clinton’s contempt for “procedural niceties”—i.e., laws, rules and customs—would make her an “effective president,” one willing to do whatever it takes to get things done. (As we argued, Yglesias may be engaged in wishful thinking if he expects Mrs. Clinton to employ her amoral tactics in the service of a cause other than herself. Recall that Bill Clinton’s boldest assertion of executive power came in Clinton v. Jones, the 1997 case in which the Supreme Court unanimously rejected his claim that his office immunized him against a private lawsuit for sexual harassment.)
Yglesias’s piece is weaker on the question of just what the Democrats should do about their down-ballot predicament. He accuses them of “complacency and overconfidence” and observes that “the party is marching steadily to the left on its issue positions . . . even though existing issue positions seem incompatible with a House majority or any meaningful degree of success in state politics.” He cites the example of Wendy Davis, the pro-abortion extremist who thrilled national Democrats but was an obvious mismatch for her state. She got trounced in last year’s Texas governor’s race.
But his only real advice is this: “The first step for Democrats is admitting they have a problem.” Gee, thanks.
He is weaker still on the question of how the party ended up in this situation. Indeed, he doesn’t address it at all. But when Obama took office in 2009, his party had large majorities in both House and Senate and was considerably better off by every other measure Yglesias cites. It is probably already accurate to say that no president since Herbert Hoover has overseen such a calamitous down-ballot performance by his party. And Hoover was in office at the time of a financial crisis. Obama was supposed to play the role of FDR.
One word that never appears in the Yglesias essay is “ObamaCare.” (Nor does he refer to it by its euphemistic formal title, the Patient Protection and Affordable Care Act, or by a generic term like “health-care reform.”) Putting aside the question of its merit as policy, can anyone deny that the politics of ObamaCare were and have remained disastrous? Obama’s “signature achievement” was an ideological act of recklessness that put the diminished and discredited GOP back on the right side of public opinion. More than any other factor, it enabled the overwhelming Republican victories in 2010 and, after its effects began becoming clear, in 2014.
The Democrats’ best hope for 2016 is that voters will conclude the Republican nominee is manifestly unqualified. The latest NBC News/Wall Street Journal poll finds Donald Trump still leading the Republican field, with 25%. Ben Carson is second, with 22%. If the Obama years have made Democrats overconfident, they’ve made many Republicans desperate enough to turn to men who’ve never even run for public office. Recklessness can as easily be born of anger as of complacency.
This schizoid view is represented in Wisconsin as well. Republicans control both houses of the Legislature, all but one partisan statewide office, and five of eight congressional seats. On the other hand, a majority of Wisconsin voters hasn’t backed a Republican for president since Ronald Reagan in 1984, and while the state has one U.S. senator from each party, Sen. Ron Johnson (R., Wis.) is not favored for reelection next year.