The number one single today in 1961:
The number one British single today in 1964 was sung by a 21-year-old former hairdresser and cloak room attendant:
That day, the Rolling Stones made their second appearance on BBC-TV’s “Top of the Pops”:
Today in 1955, Billboard magazine reported that sales of 45-rpm singles …
… had exceeded sales of 78-rpm singles for the first time.
The number one single today in 1966:
The number one album today in 1966 was the Beatles’ “Rubber Soul”:
The number one country and western single today in 1956 was the singer’s number one number one:
The number one British album today in 1984 was the Thompson Twins’ “Into the Gap”:
The number one single today in 1984 was adapted by WGN-TV for its Chicago Cubs games — a good choice given that the Cubs that season decided to play like an actual baseball team:
After 20 years, the big Outdoor Retailer trade show is leaving Salt Lake City — not because it ran out of space or got a better deal elsewhere but because Utah lawmakers opposed an expansion of the industry’s biggest federal subsidy.
To most Americans, national parks and monuments are places to enjoy the outdoors while preserving natural and historical treasures. To the Outdoor Industry Association, they’re also a business necessity. It calls public lands “the backbone of the industry’s sales.”
“Utah elected officials do not support public lands conservation nor do they value the economic benefits — $12 billion in consumer spending and 122,000 jobs — that the outdoor recreation industry brings to their state,” Rose Marcario, the president and chief executive of Patagonia, declared in a statement announcing that her company would no longer attend the Salt Lake City show. Other industry leaders, including Polartec LLC and Arc’teryx Equipment Inc., quickly joined in. Then last Thursday, after a conference call with Utah Governor Gary Herbert that ended on a “curt” note, the show’s organizers said they’ll go elsewhere when their contract expires next year. Colorado is campaigning for their business.
At issue is the December designation of 1.35 million acres of federal land as Bears Ears National Monument. How to preserve the area has long been a contentious subject in Utah. President Barack Obama’s late-term action thrilled environmentalists and tribal leaders, upset ranchers and other rural residents, and thwarted oil and mineral development and the blue-collar jobs it might mean. The Republican governor, legislature, and congressional delegation all opposed the designation. In response, the legislature passed a resolution calling on the Trump administration to reverse Obama’s decision. When the governor signed it, calls for the boycott began.
Since it frankly acknowledges that its sales depend on public lands, is the outdoor industry applying a moral veneer to its quest for profits? Or is it using a corporate fig leaf to promote managers’ political views? The two motives are in fact impossible to separate.
When it comes to public lands, the outdoor industry shares the view of pioneering labor leader Samuel Gompers: “We do want more, and when it becomes more, we shall still want more.” As with the union movement, the industry’s financial interests are inextricable from its social values.
As I’ve previously written, the industry is one example of a much larger cultural and economic phenomenon: the shift from function to meaning as a source of economic value and, with it, the melding of consumption, politics and identity. What we buy increasingly expresses who we are.
Brands built on specific political or cultural values will inevitably take public stances, using their economic clout to influence public policy, whether out of genuine conviction, cold-eyed market positioning, or both. It’s not surprising that Patagonia Inc., the outdoor apparel brand most prominently built on its political stances, led the anti-Utah charge.
The bigger question is whether in the Trump era brands that aren’t traditionally political will feel forced to choose sides. Overtly political shopping is on the rise. Every week seems to bring a new boycott: against North Carolina over its bathroom bill, Nordstrom Inc. stores because they carry — or discontinue — Ivanka Trump’s merchandise, Kellogg Company for dropping ads on Breitbart News, Under Armour Inc. for its chief executive’s nice words about Donald Trump, Starbucks Corporation for pledging to hire refugees, and on and on. Wegmans Food Markets Inc. recently sold out of Trump Winery products in Virginia. The reason: calls for the chain to drop the wines, which produced a pro-Trump backlash.
Wine, breakfast cereal, workout clothes and business apparel aren’t inherently political goods. Brand choices may reflect largely unconscious tribal affinities, but they allow some play. You can eat organic food and vote Republican or drive an SUV and vote Democratic. Conservatives can enjoy Meryl Streep and liberals can esteem Clint Eastwood. “Vote right, live left,” an urbanite conservative advised me many years ago. Despite Trump’s frequent attacks on Jeff Bezos, Americans of all stripes like Amazon.com Inc.
Now, however, that pluralism is at risk. We seem headed toward an economy of red brands and blue brands, red employers and blue employers, with no common ground. In this context, the outdoor industry’s action is a disturbing bellwether, as is the increasing partisanship of once-evenhanded fashion magazines like Vogue. Outdoor activity appeals to Americans of all political persuasions, and the country’s western landscape has long helped define the national identity. People can disagree over how best to enjoy and protect that landscape, and how to weigh preservation against other values, while still sharing much in common. Enforcing the party line by declaring an entire state off limits is an extreme step.
Writing with the memory of religious wars, Voltaire in 1733 offered a peaceful alternative. “Go into the London Stock Exchange — a more respectable place than many a court — and you will see representatives from all nations gathered together for the utility of men,” he wrote:
Here Jew, Mohammedan and Christian deal with each other as though they were all of the same faith, and only apply the word infidel to people who go bankrupt. Here the Presbyterian trusts the Anabaptist and the Anglican accepts a promise from the Quaker. On leaving these peaceful and free assemblies some go to the Synagogue and others for a drink, this one goes to be baptized in a great bath in the name of Father, Son and Holy Ghost, that one has his son’s foreskin cut and has some Hebrew words he doesn’t understand mumbled over the child, others go to their church and await the inspiration of God with their hats on, and everybody is happy.
Once the great solvent of difference, commerce threatens to become its enforcer. And everyone is unhappy.
Wisconsinites know we’ve already had that here. Read the list from the Scott Walker Watch website of companies whose employees and/or management committed the unforgivable crime of giving money to Walker’s campaign, including Kwik Trip, Johnsonville Sausage and Georgia–Pacific, owned by The Evil Koch Brothers. At least two advocated boycotting the entire state.
That prompted an Isaac Newton-like “buycott,” in which Walker supporters encouraged each other to buy from companies whose employees and/or management had contributed to Walker’s campaign. Walker has been reelected twice since the boycott attempt, no company on the boycott list has gone out of business as far as I know, and Wisconsin appears to have survived. The political fortunes of those supporting boycotts have sunk like a Chicago Bears football season.
This is nearly all the fault of liberals. The phrase “the personal is political” came from neither conservatives nor libertarians; it came from a feminist, Carol Hanisch. There are a few liberals (this writer and the late Christopher Hitchens, for two) who understand how vapid that assertion is, but in our hyperpolitical times some conservatives are now touting boycotts of New Glarus Brewing, Penzeys Spices, Madison and the musical “Hamilton.”
Part of the problem is that many people don’t grasp that for nearly every business (I have yet to find one with more than one or two employees for which this is not the case), employee pay (including benefits) far exceeds that business’ profits. So if you think your not purchasing something from a company will hurt the owners, you’re wrong; it will hurt that company’s employees first and foremost. The American Enterprise Institute provides this chart …

… that shows how ignorant Americans are about business.
If you are old enough to remember the Glory Years Packers, the answer to the question of who was the Packers’ announcer those years might be Ray Scott, from CBS-TV.
Unless you missed their home games on TV because you lived near Green Bay or Milwaukee in the old NFL blackout days, in which case the answer might be radio announcer Ted Moore:
And if you’re not old enough to remember Moore, surely you remember Jim Irwin:
Before Moore, who started announcing Packers games in 1960, there was Mike Walden, who announced Badger, Packer and, on TV, Milwaukee Braves games. One of Walden’s games was the 1963 Rose Bowl, which he announced on the NBC radio broadcast with USC announcer Tom Kelly:
Apparently Walden liked southern California, because he then left Wisconsin and moved to California, replacing Kelly on radio while Kelly moved to TV.
The Los Angeles Times reports:
USC’s broadcaster Mike Walden was in enemy territory when the Trojans’ basketball team finally handed UCLA its first loss at Pauley Pavilion in 1969. When it was over, Walden climbed atop the announcer’s table and yelled, “The Trojans win! The Trojans win! The Trojans win!” much like the legendary Harry Caray.
So Walden lost a few friends several years later when he took a job across town and became the only person to serve as the broadcast voice for both USC and UCLA.
“But Mike Walden was a journalist first, and did not want to be known as a homer,” his son, Gregory Walden, reminisced in an email.
Walden, a Southern California Sports Broadcasters Hall of Fame member best known for his coverage of the Trojans and Bruins, and for his loud sport coats, died Sunday at his home in Tarzana from complications related to a stroke, his son said Thursday. He was 89.
The interesting thing about the aforementioned Walden, Kelly (who died in June), Enberg, Miller and longtime Los Angeles Lakers announcer Chick Hearn is that they all grew up in the Midwest. Kelly’s first radio job was in Janesville, and though he started broadcasting for USC in 1962, he returned to Illinois for years to broadcast the Illinois state boys basketball tournament. Miller was one of the two UW hockey radio announcers (two stations broadcast games until Clear Channel purchased both stations). Enberg is from Michigan, graduated from Central Michigan University, and earned a Ph.D. at Indiana while announcing its games before he too headed west. (Hmmm … do I know anyone who grew up in Wisconsin and then headed to California …) Hearn, who grew up in Illinois, preceded Kelly (for one season) at USC, and once worked with Kelly on the Illinois state tournament.
The number one single today in 1973:
Today in 1976, the Eagles’ “Their Greatest Hits” became the first platinum album, exceeding 1 million sales:
Today in 2000, Carlos Santana won eight Grammy Awards for “Supernatural”:
Ann Althouse poses this hypothetical:
1. Imagine a President Trump whose policies all accord with your own. Substantively, he’s like, perhaps, Barack Obama. He’ll appoint the Supreme Court Justice who will give the liberal faction a decisive 5-person majority. He’s very accepting of undocumented immigrants, committed to Obamacare, etc. etc. — whatever it is that you like. But he has all the personal characteristics of Donald Trump. He entered politics from a successful business career, funded his own campaign using his private wealth, and figured out how to do politics on the fly, making mistakes and correcting his course. He got knocked around in the press and by party insiders who wanted to stop him, but he kept going, overcoming 16 opponents. He had his own way of talking and he took it straight to the people, with hundreds of rallies, and he especially connected with working class people. They just loved him, as the elite shook their heads, because he didn’t have the diplomacy and elegance they’d come to expect from a President. Be honest now. How would you like this man? How would you speak about his personal style?
2. Imagine a President Trump with all of the substantive policies of the real Donald Trump — all of them, exactly the same. But this Donald Trump meets your stylistic ideal. He looks, acts, and speaks the way you picture a perfect President. He never seems at all rude or crude or imprecise in his words. His tone — you know the word ‘tone’? — is well-modulated. His sentences are the right length, his vocabulary large without verging into show-offiness. He seems confident, but not arrogant. He’s nice looking and the right age, perhaps 58, and his wife, who’s only exactly as good-looking as he is, is almost the same age. He’s got what everyone regards as a “good temperament.” He’s on task and organized — his administration is up and running like a fine-tuned machine — and putting through all these policies that you loathe and dread. What would you be saying about this Donald Trump?
The first would be loved by Democrats. The second would never get elected, but would still be called the next Hitler by Democrats.
A later comment makes a similar point:
Re: option #2–Mitt Romney would have been a great president, but the Trump haters didn’t like him either.
And …
Thought Experiment #1 is what people were hoping to get with Bernie (well, maybe not a successful businessman, but someone they viewed as an outsider.)
In fact, look at TE1. That’s the mythologized version people have of… Obama. He came from academia; he was a political neophyte who learned his way against an antagonistic press (Faux News) and party insiders (Clinton) who funded his campaign through small donations (though this isn’t exactly true and ignores his reneging on his promise to McCain). Obama was a “straight shooter” who “told it like it was,” known for his eloquent speaking style (“Let me be clear,” and other Obama-isms.) Of course, Obama the myth is nothing like Obama the man (a standard left partisan who excelled at retail politics and fundraising), but that myth is exactly what the left WANTS.
And with Lent coming up, I really like this one too:
Parable of the Two Sons (Matthew 21:28-32), with the People being substituted as the one in charge.
The basic story is of a man [the People] with two sons who told his sons to go work in the vineyard. The first son refused, but later obeyed and went. The second son initially expressed obedience, but actually disobeyed and refused to work in the vineyard.
The son who ultimately did the will of the People was the first son because he eventually obeyed by confining his actions (like his Article II oath) to the Constitution and the laws (like immigration) made in pursuance thereof.
“Does the government fear us? Or do we fear the government? When the people fear the government, tyranny has found victory. The federal government is our servant, not our master!”
What’s your answer to the two questions of the previous paragraph on April 15th?
One of the disturbing things about Trump is not Trump himself, but his biggest supporters, who to me are treating Trump just like Democrats and liberals treated Obama and treat (with the exception of Comrade Bernie’s supporters) Hillary Clinton. Trump is president because of how Obama and his apparatchiks demonized white males as Ku Klux Klan members over Obama’s eight years in office, and continued to do so during the 2016 presidential campaign (right, “Deplorables”?), and are still doing so today. Reacting to that kind of demonization is human nature, not racism. In the words of another commenter …
Conclusion: none of it makes any effing difference, except insofar as Trump’s boorishness further inflames progs’ smug hatred of the deplorables, and his unconventional politics threatens their political interests.
Trump at least makes you think he’s working for you (that is, his supporters), which is the way it’s supposed to be, and not the other way around, which Democrats appear to believe. In that sense his recklessness with words works in his favor, unless, I suppose, you’re one of his targets.
Speaking of targets, one reason for Trump’s most ardent supporters’ most ardent support is his campaign against the news media, which is a symbiotic–parasitic relationship given how much free publicity the media gave him and continues to give him. I don’t like to be lumped in with the MSNBC idiots, but you ignore your readers or listeners or viewers at your own peril, particularly these days when media companies’ bottom lines prompt job cuts. (I speak from experience.) It’s a corollary of today’s unfortunate reality that everyone seems to make decisions based on emotion rather than facts and logic.
Anyone who thinks all Republicans have problems forming complete sentences and all Democrats are, to borrow a phrase from Dan Patrick, cool as the other side of the pillow, has not seen several Wisconsin Democrats, including, for inexplicable reasons, the people the Democrats hire as their media spokespeople. Democrats in this state look like Lon Chaney’s Wolfman whenever Gov. Scott Walker is brought up, Walker being such a (insert your favorite pejoratives here, printable or not) that he’s been elected three times by a majority of Wisconsin voters and will win his fourth term next year. (Yes, that’s a prediction, partly due to the ineptitude of the Wisconsin Democratic Party.)
Back to Trump and his boorish style, which is unquestionably a selling point among his most ardent supporters, of which I am not one. (Worshipping politicians is evil.) Trump’s style gets in the way of whatever he’s trying to do. Trump’s administration’s inability to speak without having to, in the words of a former boss of mine, re-rack what he or they say (how many illegal immigrants do we have — 12 million? 14 million? 140 million?) makes defending him difficult for those who are not merely responding emotionally based on their hatred of the other side. Trump’s pulling the plug on Obama’s most onerous regulations also helps, though I wish Trump would simply eliminate (and therefore fire) as many federal employees as are necessary for the federal government to reach the correct size. (Because, unlike Trump, I am not a big-government conservative. In my ideal world, millions of Americans wouldn’t be able to identify elected officials not out of ignorance, but because they’re not important.)
More important than his style is the substance. I didn’t support Trump largely because I didn’t believe he would actually govern in a conservative fashion. Happily, most of his Cabinet picks have been conservatives, though I’m not a fan of all of them. (The jury is still out on Secretary of State Rex Tillerson, for one, but I’m a big fan of Environmental Protection Agency administrator Scott Pruitt, because the environmental movement has far too much power in this country.) So is his first Supreme Court pick, Neil Gorsuch. I’m also not particularly a fan of his conclusion that the majority of our problems are because of immigrants, legal or not. (The evidence does not support that conclusion.)
The other big reason I didn’t support Trump is his trade policies, which will result in higher prices for American consumers, which is a far larger group of Americans than those affected by the North American Free Trade Agreement or the Trans Pacific Partnership. Who has benefitted from NAFTA and would benefit from TPP? Wisconsin farmers, whose work totals one-third of our economy. (And actually larger than that, given how many ag-related manufacturers Wisconsin has.) So who will be hurt by Trump’s next trade war? Wisconsin farmers. The fact no one I’m aware of from the Wisconsin GOP has spoken out against Trump’s wrongheaded trade policies is disturbing.
Trump hasn’t said what kind of tax reform he favors. (The border adjustment tax apparently supported by Congressional Republicans is a tax by another name, and the last thing we need is any kind of tax increase.) He also hasn’t said what kind of ObamaCare repeal and/or replacement he favors. Unless you’re a Trump lover or hater, the jury is still out.
Trump (and his fans) may have made the divide between political left and political right worse, but he certainly didn’t create the divide. “E Pluribus Unum” is a Utopian fantasy, probably forevermore.
George Mason University Prof. Walter E. Williams:
It was Nobel laureate economist Milton Friedman who made famous the adage, “There’s no such thing as a free lunch.” Friedman could have added that there is a difference between something’s being free and something’s having a zero price.
For example, people say that there’s free public education and there are free libraries, but public education and libraries cost money.
Proof that they have costs is the fact that somebody has to have less of something by giving up tax money so that schools and libraries can be produced and operated. A much more accurate statement is that we have zero-price public education and libraries.
Costs can be concealed but not eliminated. If people ignore costs and look only to benefits, they will do darn near anything, because everything has a benefit. Politicians love the fact that costs can easily be concealed.
The call for import restrictions, in the name of saving jobs, is politically popular in some quarters. But few talk about the costs. We know there are costs because nothing is free.
Let’s start with a hypothetical example of tariff costs. Suppose a U.S. clothing manufacturer wants to sell a suit for $200. He is prevented from doing so because customers can purchase a nearly identical suit produced by a foreign manufacturer for $150.
But suppose the clothing manufacturer can get Congress to impose a $60 tariff on foreign suits in the name of leveling the playing field and fair trade.
What happens to his chances of being able to sell his suit for $200? If you answered that his chances increase, go to the head of the class.
The next question is: Who bears the burden of the tariff? If you answered that it’s customers who must pay $50 more for a suit, you’re right again.
In his 2012 State of the Union address, President Barack Obama boasted that “over 1,000 Americans are working today because we stopped a surge in Chinese tires.”
According to a study done by the Peterson Institute for International Economics, those trade restrictions forced Americans to pay $1.1 billion in higher prices for tires. So though 1,200 jobs were saved in the U.S. tire industry, the cost per job saved was at least $900,000 in that year. According to the Bureau of Labor Statistics, the average annual salary of tire builders in 2011 was $40,070.
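Just to make Williams’ arithmetic explicit, here is a minimal sketch; the Python framing is mine, and the dollar figures are only the ones cited above from the Peterson Institute and the Bureau of Labor Statistics.

```python
# Back-of-the-envelope check of the tire-tariff numbers cited above.
consumer_cost = 1_100_000_000      # extra paid for tires because of the restrictions, per the Peterson Institute
jobs_saved = 1_200                 # U.S. tire-industry jobs saved
avg_tire_builder_salary = 40_070   # BLS average annual tire-builder salary, 2011

cost_per_job = consumer_cost / jobs_saved
print(f"Cost per job saved: ${cost_per_job:,.0f}")                                             # about $917,000
print(f"Multiple of a tire builder's salary: {cost_per_job / avg_tire_builder_salary:.1f}x")   # about 23x
```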
Here’s a question for those of us who support trade restrictions in the name of saving jobs: In whose pockets did most of the $1.1 billion that Americans paid in higher prices go? It surely did not reach tire workers in the form of higher wages.
According to the Peterson Institute study, “most of the money extracted by protection from household budgets goes to corporate coffers, at home or abroad, not paychecks of American workers. In the case of tire protection, our estimates indicate that fewer than 5 percent of the consumer costs per job saved reached the pockets of American workers.”
There is another side to this. When households have to pay higher prices for tires, they have less money to spend on other items—such as food, clothing, and entertainment—thereby reducing employment in those industries.
Some people point out that other countries, such as Japan, impose heavy tariffs on American products. Indeed, Tokyo levies a 490 percent tariff on rice imports to allow Japanese rice growers to gain higher income by charging Japanese consumers four times the world price for rice.
Therefore, some suggest that Congress should even the playing field by imposing stiff tariffs on Japanese imports to the U.S. Such an argument differs little from one that says that because the Japanese government screws its citizens, the U.S. government should retaliate by screwing its own citizens.
Putting the issue in another context: If you and I are at sea in a rowboat and I commit the foolish act of shooting a hole in my end of the boat, would it be intelligent for you to retaliate by shooting a hole in your end of the boat?
The number one song today in 1991:
Today in 1998, the members of Oasis were banned for life from Cathay Pacific Airways for their “abusive and disgusting behavior.”
Apparently Cathay Pacific knew what it was doing, because one year to the day later, Oasis guitarist Paul Arthurs was arrested outside a Tommy Hilfiger store in London for drunk and disorderly conduct.
Nicholas N. Eberstadt:
Whatever else it may or may not have accomplished, the 2016 election was a sort of shock therapy for Americans living within what Charles Murray famously termed “the bubble” (the protective barrier of prosperity and self-selected associations that increasingly shield our best and brightest from contact with the rest of their society). The very fact of Trump’s election served as a truth broadcast about a reality that could no longer be denied: Things out there in America are a whole lot different from what you thought.
Yes, things are very different indeed these days in the “real America” outside the bubble. In fact, things have been going badly wrong in America since the beginning of the 21st century.
It turns out that the year 2000 marks a grim historical milestone of sorts for our nation. For whatever reasons, the Great American Escalator, which had lifted successive generations of Americans to ever higher standards of living and levels of social well-being, broke down around then—and broke down very badly.
The warning lights have been flashing, and the klaxons sounding, for more than a decade and a half. But our pundits and prognosticators and professors and policymakers, ensconced as they generally are deep within the bubble, were for the most part too distant from the distress of the general population to see or hear it. (So much for the vaunted “information era” and “big-data revolution.”) Now that those signals are no longer possible to ignore, it is high time for experts and intellectuals to reacquaint themselves with the country in which they live and to begin the task of describing what has befallen the country in which we have lived since the dawn of the new century.
Consider the condition of the American economy. In some circles people still widely believe, as one recent New York Times business-section article cluelessly insisted before the inauguration, that “Mr. Trump will inherit an economy that is fundamentally solid.” But this is patent nonsense. By now it should be painfully obvious that the U.S. economy has been in the grip of deep dysfunction since the dawn of the new century. And in retrospect, it should also be apparent that America’s strange new economic maladies were almost perfectly designed to set the stage for a populist storm.
Ever since 2000, basic indicators have offered oddly inconsistent readings on America’s economic performance and prospects. It is curious and highly uncharacteristic to find such measures so very far out of alignment with one another. We are witnessing an ominous and growing divergence between three trends that should ordinarily move in tandem: wealth, output, and employment. Depending upon which of these three indicators you choose, America looks to be heading up, down, or more or less nowhere.
From the standpoint of wealth creation, the 21st century is off to a roaring start. By this yardstick, it looks as if Americans have never had it so good and as if the future is full of promise. Between early 2000 and late 2016, the estimated net worth of American households and nonprofit institutions more than doubled, from $44 trillion to $90 trillion. (SEE FIGURE 1.)
Although that wealth is not evenly distributed, it is still a fantastic sum of money—an average of over a million dollars for every notional family of four. This upsurge of wealth took place despite the crash of 2008—indeed, private wealth holdings are over $20 trillion higher now than they were at their pre-crash apogee. The value of American real-estate assets is near or at all-time highs, and America’s businesses appear to be thriving. Even before the “Trump rally” of late 2016 and early 2017, U.S. equities markets were hitting new highs—and since stock prices are strongly shaped by expectations of future profits, investors evidently are counting on the continuation of the current happy days for U.S. asset holders for some time to come.
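As a rough sanity check on that “over a million dollars for every notional family of four” line: the $90 trillion figure comes from the article, but the population figure in the sketch below is my own round assumption (roughly 320 million Americans in 2016), so treat the result as illustrative only.

```python
# Rough check of average net worth per notional family of four.
household_net_worth = 90e12     # dollars, late 2016 estimate quoted in the text
us_population = 320e6           # assumed round figure for 2016, not from the article

families_of_four = us_population / 4
print(f"Average net worth per family of four: ${household_net_worth / families_of_four:,.0f}")
# prints roughly $1,125,000, consistent with "over a million dollars"
```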
A rather less cheering picture, though, emerges if we look instead at real trends for the macro-economy. Here, performance since the start of the century might charitably be described as mediocre, and prospects today are no better than guarded.
The recovery from the crash of 2008—which unleashed the worst recession since the Great Depression—has been singularly slow and weak. According to the Bureau of Economic Analysis (BEA), it took nearly four years for America’s gross domestic product (GDP) to re-attain its late 2007 level. As of late 2016, total value added to the U.S. economy was just 12 percent higher than in 2007. (SEE FIGURE 2.) The situation is even more sobering if we consider per capita growth. It took America six and a half years—until mid-2014—to get back to its late 2007 per capita production levels. And in late 2016, per capita output was just 4 percent higher than in late 2007—nine years earlier. By this reckoning, the American economy looks to have suffered something close to a lost decade.
But there was clearly trouble brewing in America’s macro-economy well before the 2008 crash, too. Between late 2000 and late 2007, per capita GDP growth averaged less than 1.5 percent per annum. That compares with the nation’s long-term postwar 1948–2000 per capita growth rate of almost 2.3 percent, which in turn can be compared to the “snap back” tempo of 1.1 percent per annum since per capita GDP bottomed out in 2009. Between 2000 and 2016, per capita growth in America has averaged less than 1 percent a year. To state it plainly: With postwar, pre-21st-century rates for the years 2000–2016, per capita GDP in America would be more than 20 percent higher than it is today.
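The “more than 20 percent higher” claim is straightforward compound-growth arithmetic on the two rates quoted in the paragraph above (about 2.3 percent per year for 1948–2000 versus less than 1 percent per year for 2000–2016); a minimal sketch:

```python
# Compound the two per capita growth rates over 2000-2016 and compare the endpoints.
years = 16
postwar_rate = 0.023   # ~2.3% per year, the 1948-2000 average cited above
actual_rate = 0.01     # "less than 1 percent a year," the 2000-2016 average cited above

counterfactual = (1 + postwar_rate) ** years
actual = (1 + actual_rate) ** years
print(f"Per capita GDP gap: {counterfactual / actual - 1:.1%}")   # about 23%
```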
The reasons for America’s newly fitful and halting macroeconomic performance are still a puzzlement to economists and a subject of considerable contention and debate.1 Economists are generally in consensus, however, in one area: They have begun redefining the growth potential of the U.S. economy downwards. The U.S. Congressional Budget Office (CBO), for example, suggests that the “potential growth” rate for the U.S. economy at full employment of factors of production has now dropped below 1.7 percent a year, implying a sustainable long-term annual per capita economic growth rate for America today of well under 1 percent.
Then there is the employment situation. If 21st-century America’s GDP trends have been disappointing, labor-force trends have been utterly dismal. Work rates have fallen off a cliff since the year 2000 and are at their lowest levels in decades. We can see this by looking at the estimates by the Bureau of Labor Statistics (BLS) for the civilian employment rate, the jobs-to-population ratio for adult civilian men and women. (SEE FIGURE 3.) Between early 2000 and late 2016, America’s overall work rate for Americans age 20 and older underwent a drastic decline. It plunged by almost 5 percentage points (from 64.6 to 59.7). Unless you are a labor economist, you may not appreciate just how severe a falloff in employment such numbers attest to. Postwar America never experienced anything comparable.
From peak to trough, the collapse in work rates for U.S. adults between 2008 and 2010 was roughly twice the amplitude of what had previously been the country’s worst postwar recession, back in the early 1980s. In that previous steep recession, it took America five years to re-attain the adult work rates recorded at the start of 1980. This time, the U.S. job market has as yet, in early 2017, scarcely begun to claw its way back up to the work rates of 2007—much less back to the work rates from early 2000.
As may be seen in Figure 3, U.S. adult work rates never recovered entirely from the recession of 2001—much less the crash of ’08. And the work rates being measured here include people who are engaged in any paid employment—any job, at any wage, for any number of hours of work at all.
On Wall Street and in some parts of Washington these days, one hears that America has gotten back to “near full employment.” For Americans outside the bubble, such talk must seem nonsensical. It is true that the oft-cited “civilian unemployment rate” looked pretty good by the end of the Obama era—in December 2016, it was down to 4.7 percent, about the same as it had been back in 1965, at a time of genuine full employment. The problem here is that the unemployment rate only tracks joblessness for those still in the labor force; it takes no account of workforce dropouts. Alas, the exodus out of the workforce has been the big labor-market story for America’s new century. (At this writing, for every unemployed American man between 25 and 55 years of age, there are another three who are neither working nor looking for work.) Thus the “unemployment rate” increasingly looks like an antique index devised for some earlier and increasingly distant war: the economic equivalent of a musket inventory or a cavalry count.
By the criterion of adult work rates, by contrast, employment conditions in America remain remarkably bleak. From late 2009 through early 2014, the country’s work rates more or less flatlined. So far as can be told, this is the only “recovery” in U.S. economic history in which that basic labor-market indicator almost completely failed to respond.
Since 2014, there has finally been a measure of improvement in the work rate—but it would be unwise to exaggerate the dimensions of that turnaround. As of late 2016, the adult work rate in America was still at its lowest level in more than 30 years. To put things another way: If our nation’s work rate today were back up to its start-of-the-century highs, well over 10 million more Americans would currently have paying jobs.
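That “well over 10 million” figure is consistent with the work-rate decline cited earlier (64.6 percent to 59.7 percent for adults 20 and older), though the adult-population number in the sketch below, about 245 million in 2016, is my own approximation rather than a figure from the article.

```python
# Illustrative estimate of the jobs gap implied by the work-rate decline.
work_rate_2000 = 0.646            # adult (20+) work rate, early 2000, from the article
work_rate_2016 = 0.597            # adult (20+) work rate, late 2016, from the article
adult_population_2016 = 245e6     # assumed approximate 20-and-older population, not from the article

jobs_gap = (work_rate_2000 - work_rate_2016) * adult_population_2016
print(f"Implied jobs gap: {jobs_gap / 1e6:.0f} million")   # roughly 12 million
```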
There is no way to sugarcoat these awful numbers. They are not a statistical artifact that can be explained away by population aging, or by increased educational enrollment for adult students, or by any other genuine change in contemporary American society. The plain fact is that 21st-century America has witnessed a dreadful collapse of work.
For an apples-to-apples look at America’s 21st-century jobs problem, we can focus on the 25–54 population—known to labor economists for self-evident reasons as the “prime working age” group. For this key labor-force cohort, work rates in late 2016 were down almost 4 percentage points from their year-2000 highs. That is a jobs gap approaching 5 million for this group alone.
It is not only that work rates for prime-age males have fallen since the year 2000—they have, but the collapse of work for American men is a tale that goes back at least half a century. (I wrote a short book last year about this sad saga.2) What is perhaps more startling is the unexpected and largely unnoticed fall-off in work rates for prime-age women. In the U.S. and all other Western societies, postwar labor markets underwent an epochal transformation. After World War II, work rates for prime women surged, and continued to rise—until the year 2000. Since then, they too have declined. Current work rates for prime-age women are back to where they were a generation ago, in the late 1980s. The 21st-century U.S. economy has been brutal for male and female laborers alike—and the wreckage in the labor market has been sufficiently powerful to cancel, and even reverse, one of our society’s most distinctive postwar trends: the rise of paid work for women outside the household.
In our era of no more than indifferent economic growth, 21st–century America has somehow managed to produce markedly more wealth for its wealthholders even as it provided markedly less work for its workers. And trends for paid hours of work look even worse than the work rates themselves. Between 2000 and 2015, according to the BEA, total paid hours of work in America increased by just 4 percent (as against a 35 percent increase for 1985–2000, the 15-year period immediately preceding this one). Over the 2000–2015 period, however, the adult civilian population rose by almost 18 percent—meaning that paid hours of work per adult civilian have plummeted by a shocking 12 percent thus far in our new American century.
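The “shocking 12 percent” is simply the ratio of those two growth figures; a quick check using only the percentages quoted in the paragraph above:

```python
# Paid hours of work per adult civilian, 2000-2015, from the two percentages above.
hours_growth = 0.04        # total paid hours of work, +4%
population_growth = 0.18   # adult civilian population, +18%

per_adult_change = (1 + hours_growth) / (1 + population_growth) - 1
print(f"Change in paid hours per adult: {per_adult_change:.1%}")   # about -11.9%
```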
This is the terrible contradiction of economic life in what we might call America’s Second Gilded Age (2000—). It is a paradox that may help us understand a number of overarching features of our new century. These include the consistent findings that public trust in almost all U.S. institutions has sharply declined since 2000, even as growing majorities hold that America is “heading in the wrong direction.” It provides an immediate answer to why overwhelming majorities of respondents in public-opinion surveys continue to tell pollsters, year after year, that our ever-richer America is still stuck in the middle of a recession. The mounting economic woes of the “little people” may not have been generally recognized by those inside the bubble, or even by many bubble inhabitants who claimed to be economic specialists—but they proved to be potent fuel for the populist fire that raged through American politics in 2016.
So general economic conditions for many ordinary Americans—not least of these, Americans who did not fit within the academy’s designated victim classes—have been rather more insecure than those within the comfort of the bubble understood. But the anxiety, dissatisfaction, anger, and despair that range within our borders today are not wholly a reaction to the way our economy is misfiring. On the nonmaterial front, it is likewise clear that many things in our society are going wrong and yet seem beyond our powers to correct.
Some of these gnawing problems are by no means new: A number of them (such as family breakdown) can be traced back at least to the 1960s, while others are arguably as old as modernity itself (anomie and isolation in big anonymous communities, secularization and the decline of faith). But a number have roared down upon us by surprise since the turn of the century—and others have redoubled with fearsome new intensity since roughly the year 2000.
American health conditions seem to have taken a seriously wrong turn in the new century. It is not just that overall health progress has been shockingly slow, despite the trillions we devote to medical services each year. (Which “Cold War babies” among us would have predicted we’d live to see the day when life expectancy in East Germany was higher than in the United States, as is the case today?)
Alas, the problem is not just slowdowns in health progress—there also appears to have been positive retrogression for broad and heretofore seemingly untroubled segments of the national population. A short but electrifying 2015 paper by Anne Case and Nobel Economics Laureate Angus Deaton talked about a mortality trend that had gone almost unnoticed until then: rising death rates for middle-aged U.S. whites. By Case and Deaton’s reckoning, death rates rose slightly over the 1999–2013 period for all non-Hispanic white men and women 45–54 years of age—but they rose sharply for those with high-school degrees or less, and for this less-educated grouping most of the rise in death rates was accounted for by suicides, chronic liver cirrhosis, and poisonings (including drug overdoses).
Though some researchers, for highly technical reasons, suggested that the mortality spike might not have been quite as sharp as Case and Deaton reckoned, there is little doubt that the spike itself has taken place. Health has been deteriorating for a significant swath of white America in our new century, thanks in large part to drug and alcohol abuse. All this sounds a little too close for comfort to the story of modern Russia, with its devastating vodka- and drug-binging health setbacks. Yes: It can happen here, and it has. Welcome to our new America.
In December 2016, the Centers for Disease Control and Prevention (CDC) reported that for the first time in decades, life expectancy at birth in the United States had dropped very slightly (to 78.8 years in 2015, from 78.9 years in 2014). Though the decline was small, it was statistically meaningful—rising death rates were characteristic of males and females alike; of blacks and whites and Latinos together. (Only black women avoided mortality increases—their death levels were stagnant.) A jump in “unintentional injuries” accounted for much of the overall uptick.
It would be unwarranted to place too much portent in a single year’s mortality changes; slight annual drops in U.S. life expectancy have occasionally been registered in the past, too, followed by continued improvements. But given other developments we are witnessing in our new America, we must wonder whether the 2015 decline in life expectancy is just a blip, or the start of a new trend. We will find out soon enough. It cannot be encouraging, though, that the Human Mortality Database, an international consortium of demographers who vet national data to improve comparability between countries, has suggested that health progress in America essentially ceased in 2012—that the U.S. gained on average only about a single day of life expectancy at birth between 2012 and 2014, before the 2015 turndown.
The opioid epidemic of pain pills and heroin that has been ravaging and shortening lives from coast to coast is a new plague for our new century. The terrifying novelty of this particular drug epidemic, of course, is that it has gone (so to speak) “mainstream” this time, effecting breakout from disadvantaged minority communities to Main Street White America. By 2013, according to a 2015 report by the Drug Enforcement Administration, more Americans died from drug overdoses (largely but not wholly opioid abuse) than from either traffic fatalities or guns. The dimensions of the opioid epidemic in the real America are still not fully appreciated within the bubble, where drug use tends to be more carefully limited and recreational. In Dreamland, his harrowing and magisterial account of modern America’s opioid explosion, the journalist Sam Quinones notes in passing that “in one three-month period” just a few years ago, according to the Ohio Department of Health, “fully 11 percent of all Ohioans were prescribed opiates.” And of course many Americans self-medicate with licit or illicit painkillers without doctors’ orders.
In the fall of 2016, Alan Krueger, former chairman of the President’s Council of Economic Advisers, released a study that further refined the picture of the real existing opioid epidemic in America: According to his work, nearly half of all prime working-age male labor-force dropouts—an army now totaling roughly 7 million men—currently take pain medication on a daily basis.
We already knew from other sources (such as BLS “time use” surveys) that the overwhelming majority of the prime-age men in this un-working army generally don’t “do civil society” (charitable work, religious activities, volunteering), or for that matter much in the way of child care or help for others in the home either, despite the abundance of time on their hands. Their routine, instead, typically centers on watching—watching TV, DVDs, Internet, hand-held devices, etc.—and indeed watching for an average of 2,000 hours a year, as if it were a full-time job. But Krueger’s study adds a poignant and immensely sad detail to this portrait of daily life in 21st-century America: In our mind’s eye we can now picture many millions of un-working men in the prime of life, out of work and not looking for jobs, sitting in front of screens—stoned.
But how did so many millions of un-working men, whose incomes are limited, manage en masse to afford a constant supply of pain medication? Oxycontin is not cheap. As Dreamland carefully explains, one main mechanism today has been the welfare state: more specifically, Medicaid, Uncle Sam’s means-tested health-benefits program. Here is how it works (we are with Quinones in Portsmouth, Ohio):
[The Medicaid card] pays for medicine—whatever pills a doctor deems that the insured patient needs. Among those who receive Medicaid cards are people on state welfare or on a federal disability program known as SSI. . . . If you could get a prescription from a willing doctor—and Portsmouth had plenty of them—Medicaid health-insurance cards paid for that prescription every month. For a three-dollar Medicaid co-pay, therefore, addicts got pills priced at thousands of dollars, with the difference paid for by U.S. and state taxpayers. A user could turn around and sell those pills, obtained for that three-dollar co-pay, for as much as ten thousand dollars on the street.
In 21st-century America, “dependence on government” has thus come to take on an entirely new meaning.
You may now wish to ask: What share of prime-working-age men these days are enrolled in Medicaid? According to the Census Bureau’s SIPP survey (Survey of Income and Program Participation), as of 2013, over one-fifth (21 percent) of all civilian men between 25 and 55 years of age were Medicaid beneficiaries. For prime-age people not in the labor force, the share was over half (53 percent). And for un-working Anglos (non-Hispanic white men not in the labor force) of prime working age, the share enrolled in Medicaid was 48 percent.
By the way: Of the entire un-working prime-age male Anglo population in 2013, nearly three-fifths (57 percent) were reportedly collecting disability benefits from one or more government disability programs. Disability checks and means-tested benefits cannot support a lavish lifestyle. But they can offer a permanent alternative to paid employment, and for growing numbers of American men, they do. The rise of these programs has coincided with the death of work for larger and larger numbers of American men not yet of retirement age. We cannot say that these programs caused the death of work for millions upon millions of younger men: What is incontrovertible, however, is that they have financed it—just as Medicaid inadvertently helped finance America’s immense and increasing appetite for opioids in our new century.
It is intriguing to note that America’s nationwide opioid epidemic has not been accompanied by a nationwide crime wave (excepting of course the apparent explosion of illicit heroin use). Just the opposite: As best can be told, national victimization rates for violent crimes and property crimes have both reportedly dropped by about two-thirds over the past two decades.3 The drop in crime over the past generation has done great things for the general quality of life in much of America. There is one complication from this drama, however, that inhabitants of the bubble may not be aware of, even though it is all too well known to a great many residents of the real America. This is the extraordinary expansion of what some have termed America’s “criminal class”—the population sentenced to prison or convicted of felony offenses—in recent decades. This trend did not begin in our century, but it has taken on breathtaking enormity since the year 2000.
Most well-informed readers know that the U.S. currently has a higher share of its populace in jail or prison than almost any other country on earth, that Barack Obama and others talk of our criminal-justice process as “mass incarceration,” and know that well over 2 million men were in prison or jail in recent years.4 But only a tiny fraction of all living Americans ever convicted of a felony is actually incarcerated at this very moment. Quite the contrary: Maybe 90 percent of all sentenced felons today are out of confinement and living more or less among us. The reason: the basic arithmetic of sentencing and incarceration in America today. Correctional release and sentenced community supervision (probation and parole) guarantee a steady annual “flow” of convicted felons back into society to augment the very considerable “stock” of felons and ex-felons already there. And this “stock” is by now truly enormous.
One forthcoming demographic study by Sarah Shannon and five other researchers estimates that the cohort of current and former felons in America very nearly reached 20 million by the year 2010. If its estimates are roughly accurate, and if America’s felon population has continued to grow at more or less the same tempo traced out for the years leading up to 2010, we would expect it to surpass 23 million persons by the end of 2016 at the latest. Very rough calculations might therefore suggest that at this writing, America’s population of non-institutionalized adults with a felony conviction somewhere in their past has almost certainly broken the 20 million mark by the end of 2016. A little more rough arithmetic suggests that about 17 million men in our general population have a felony conviction somewhere in their CV. That works out to one of every eight adult males in America today.
We have to use rough estimates here, rather than precise official numbers, because the government does not collect any data at all on the size or socioeconomic circumstances of this population of 20 million, and never has. Amazing as this may sound and scandalous though it may be, America has, at least to date, effectively banished this huge group—a group roughly twice the total size of our illegal-immigrant population and an adult population larger than that in any state but California—to a near-total and seemingly unending statistical invisibility. Our ex-cons are, so to speak, statistical outcasts who live in a darkness our polity does not care enough to illuminate—beyond the scope or interest of public policy, unless and until they next run afoul of the law.
Thus we cannot describe with any precision or certainty what has become of those who make up our “criminal class” after their (latest) sentencing or release. In the most stylized terms, however, we might guess that their odds in the real America are not all that favorable. And when we consider some of the other trends we have already mentioned—employment, health, addiction, welfare dependence—we can see the emergence of a malign new nationwide undertow, pulling downward against social mobility.
Social mobility has always been the jewel in the crown of the American mythos and ethos. The idea (not without a measure of truth to back it up) was that people in America are free to achieve according to their merit and their grit—unlike in other places, where they are trapped by barriers of class or the misfortune of misrule. Nearly two decades into our new century, there are unmistakable signs that America’s fabled social mobility is in trouble—perhaps even in serious trouble.
Consider the following facts. First, according to the Census Bureau, geographical mobility in America has been on the decline for three decades, and in 2016 the annual movement of households from one location to the next was reportedly at an all-time (postwar) low. Second, as a study by three Federal Reserve economists and a Notre Dame colleague demonstrated last year, “labor market fluidity”—the churning between jobs that among other things allows people to get ahead—has been on the decline in the American labor market for decades, with no sign as yet of a turnaround. Finally, and not least important, a December 2016 report by the “Equal Opportunity Project,” a team led by the formidable Stanford economist Raj Chetty, calculated that the odds of a 30-year-old’s earning more than his parents at the same age was now just 51 percent: down from 86 percent 40 years ago. Other researchers who have examined the same data argue that the odds may not be quite as low as the Chetty team concludes, but agree that the chances of surpassing one’s parents’ real income have been on the downswing and are probably lower now than ever before in postwar America.
Thus the bittersweet reality of life for real Americans in the early 21st century: Even though the American economy still remains the world’s unrivaled engine of wealth generation, those outside the bubble may have less of a shot at the American Dream than has been the case for decades, maybe generations—possibly even since the Great Depression.
The funny thing is, people inside the bubble are forever talking about “economic inequality,” that wonderful seminar construct, and forever virtue-signaling about how personally opposed they are to it. By contrast, “economic insecurity” is akin to a phrase from an unknown language. But if we were somehow to find a “Google Translate” function for communicating from real America into the bubble, an important message might be conveyed:
The abstraction of “inequality” doesn’t matter a lot to ordinary Americans. The reality of economic insecurity does. The Great American Escalator is broken—and it badly needs to be fixed.
With the election of 2016, Americans within the bubble finally learned that the 21st century has gotten off to a very bad start in America. Welcome to the reality. We have a lot of work to do together to turn this around.