East Side, West Side

Readers know I grew up on the far east side of Madison, a mile south of where Interstate 90 splits off for Chicago, Interstate 94 heads to Milwaukee, and I–90/94 goes north to the Wisconsin Dells, La Crosse and the Twin Cities.

This was (well, still is) the old neighborhood, Heritage Heights, which years earlier had been part of a large farm. (More on that presently.) My parents built their first house, a green and yellow ranch with a two-car garage on the left side behind a rather steep driveway, in 1971, the year our street and the street behind our house (to the north) were paved. (The basement for our house was poured on my sixth birthday, and the street wasn’t paved yet.) There were basically three house designs on the entire block, with a couple of exceptions — a one-story ranch (with garage to left or right), a two-story house (on either side of our house), and a split-level house.

We had moved there from another house 1.5 miles to the south, which my parents had purchased once they had two sons in the house. My future second-grade teacher lived two houses down, and across the street was a childless couple, older than my parents, who had us over on numerous occasions.

Neither of those neighborhoods was a suburb of Madison (both were inside the city limits), but they felt suburban, given the distance around either Lake Mendota or Lake Monona to downtown or the UW campus: seven miles (if you drive through downtown) and a world away. When late-1960s Vietnam War protests hit national TV, we had relatives who were concerned that marauding rioters would endanger us. They didn’t realize how far it was to campus, or the reality that any UW student who got that far east was lost.

It took until I (permanently) left Madison for me to realize what an unusual neighborhood it was. The nearest gas station and grocery store were one mile away. Want to have a drink at the neighborhood bar? There wasn’t one; the closest bar was two miles away. (Farther away yet was a combination bar and barber shop where the males of the house got haircuts.) Want to go out to dinner? The nearest nice restaurant (which I never went to) was The Pig’s Ear, 1.4 miles away. (There were bars, and bars that served non-bar food, a couple of miles away, but at the time those were in what could be called “rural Madison,” the towns of Blooming Grove and Burke.) Unless you mowed grass or babysat, any part-time job required a commute.

There was one church in the neighborhood, for what then was called the Reorganized Church of Jesus Christ of Latter Day Saints. (Not the Mormons, and now called the Community of Christ, though I think the building itself, which became our Boy Scout home base, isn’t a church anymore.) The neighborhood had houses and one park, and that was it. Our neighborhood was impossible to live in if you didn’t have a car. (Madison Metro’s J route went through, but try bringing home groceries on a bus.)

Public-school kids in my neighborhood went to John F. Kennedy Elementary School (though I went to Elvehjem Elementary School for kindergarten until we moved), which was a one-mile walk through Heritage Heights Park and its culvert that filled with fast-rushing water from spring snowmelt (and the future home of legendary 1980s Heritage Bowl touch football games, but that’s another story), until the completion of a road behind our house reduced the distance considerably, just in time for me to leave Kennedy for (the hellhole that was) Schenk (now Whitehorse) Middle School. And then, having survived Schenk, off we went to Robert M. La Follette High School (sports teams known as the Lancers, not the Fighting Bobs), 3.7 miles and 15 minutes from our house down Cottage Grove Road and U.S. 51 (Stoughton Road). (No wonder my mother was so annoyed when her sons stayed late after school and asked separately for rides. Two round trips constituted a gallon of gas in our 1975 Chevrolet Caprice, EPA-rated at 13 city and 18 highway miles per gallon. At $1 a gallon, that adds up.)
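
For the record, the math checks out. Here is a quick back-of-the-envelope check in Python (the mileage, mpg and price figures are the ones above; applying the city rating to the whole trip is my assumption):

```python
# Did two after-school round trips really burn about a gallon of gas
# in the 1975 Caprice? Figures from the text; city mpg assumed throughout.
ONE_WAY_MILES = 3.7      # house to La Follette
CITY_MPG = 13            # EPA city rating, 1975 Chevrolet Caprice
PRICE_PER_GALLON = 1.00  # dollars

miles = ONE_WAY_MILES * 2 * 2  # two round trips
gallons = miles / CITY_MPG
print(f"{miles:.1f} miles = {gallons:.2f} gallons = ${gallons * PRICE_PER_GALLON:.2f}")
# 14.8 miles = 1.14 gallons = $1.14 -- roughly a gallon, as claimed
```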

For comparison purposes: The local high school is 15 minutes from our house. By foot. The only reason it takes 10 minutes to get there by car is if you’re stuck trying to get across two state highways at non-stoplight intersections. (There is a roundabout, but four years after it opened most locals don’t seem to be able to figure out how to drive in it.) I can get to a neighboring community’s high school in 10 minutes, and two others’ high schools in 15 minutes. Measured by time, those of us who grew up in my neighborhood were as far from our own high school as kids who grow up in rural school districts. (I figured out after I moved from Madison that a 15-mile drive at 60 mph seems shorter than a 15-minute drive at 25 mph, though the former obviously covers more distance. The driver feels like he’s getting somewhere at highway speeds, as opposed to the Far East formula of drive to the end of the street, stop, drive a few blocks, stop, drive one block, stop, etc.)
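
The same kind of quick arithmetic backs up the time-versus-distance point; a minimal sketch using the speeds in the paragraph above:

```python
# A 15-mile drive at highway speed versus a 15-minute drive in stop-and-go town traffic.
highway_miles, highway_mph = 15, 60
city_minutes, city_mph = 15, 25

highway_minutes = highway_miles / highway_mph * 60  # minutes for the rural drive
city_miles = city_mph * city_minutes / 60           # miles covered in town

print(f"15 miles at 60 mph takes {highway_minutes:.0f} minutes")
print(f"15 minutes at 25 mph covers {city_miles:.2f} miles")
# Same 15 minutes behind the wheel; the highway drive covers 2.4 times the distance.
```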

This long preamble has now reached the point of this blog: It could have been different. Stu Levitan takes us back to 1967, four years after La Follette opened its doors:

The new high school—or not

In 1966, voters had approved by a 2-1 margin a $26.5 million bond issue which included funds to open a new east side high school in 1969. Things didn’t quite work out as planned—especially for a powerful board member and the lame-duck superintendent.

Atty. Albert J. McGinnis, former chair of the Madison Redevelopment Authority, who lost to mayor Henry Reynolds in 1963, chaired the board’s site selection committee for the new school. He picks a site on the Sprecher farm on Milwaukee St., adjacent to Kennedy elementary school—which just happened to be within the Heritage Heights plat that he had developed before his election to the board in 1965, and still owned. North side Alds. Kopp and Smith, who want the school in Warner Park, howl, accusing McGinnis of an obvious conflict of interest. Later that month, more than 350 people pack a school board public hearing, calling for a Warner Park site.

On April 28, his last day before resigning to assume his duties in Denver, [school] superintendent [Robert] Gilberts recommends to the board that it buy the parcel McGinnis has identified on Milwaukee St. But three days later, in a stunning and costly rebuke of its administration, the board votes 4-3 against building any new far East Side high school at all, endorsing instead a new junior high at La Follette High School, and a similar one at Kennedy “as needed.” Among the likely repercussions: when Central HS closes in 1969, all south side students now at Central will go to West—which cannot accommodate them.

Levitan adds the numbers for the four public high schools’ Classes of 1967:

West: 677
East: 512
La Follette: 339
Central: 271

There was a high school about half the distance to La Follette in a different direction: Queen of Apostles High School, just on the opposite side of I–90, across Cottage Grove Road from a branch of my father’s bank. (Where I met former Packer Ray Nitschke, but that’s a different story.) QAS, as it was locally known, apparently started as a seminary back in 1948, 20 years before the Interstate bypassed Madison. It was the first home of my Boy Scout troop, which moved to the RLDS church after QAS closed. (Its last graduation was on my 14th birthday.) QAS was already on the way to closing by the time I neared high school age, and I never considered going there or to Edgewood, the remaining Catholic high school in Madison.

(The area between the Interstate and Cottage Grove is unrecognizable now compared to when I lived there. When I was driving from Madison to Cottage Grove to cover government meetings in my first journalism job, there was only one place on those seven miles where you had to slow down, at Vilas, about halfway there. Now it is wall-to-wall houses and businesses, and the speed limit is 35 mph.)

Another high school is even closer to La Follette, but it’s in a different school district — Monona Grove, on the opposite side of the Monona Golf Course. Monona Grove, for non-Madisonians, is the school district that combines Monona (which is on Lake Monona and surrounded by Madison) and Cottage Grove, which is about eight miles east. (MGHS students who live in Cottage Grove have to go through Madison to get to school. When the school district built a new high school in 1999, it was built in Monona, which has shrunk by a quarter in population over the past 40 or so years, and not in Cottage Grove, which is now only slightly smaller than Monona in population.)

Levitan’s piece, part of a larger work chronicling what was, to say the least, a rather turbulent year in Madison (including, one assumes though I don’t remember, my own Terrible Twos), was the first time I learned there had been a proposal to build an east-side high school, or a middle school, farther east than the Far East Side high school, La Follette. Really Far East Side High School (perhaps it would have had some sort of Asiatic nickname in those pre-politically-correct days) would have been no more than a mile from Kennedy. Kennedy and Don’t-Call-It-Schenk-Anymore (which had an attached elementary school) were just two miles apart by car, and Really Far East Side Middle School would have been even closer than that. (As it was, despite being just two miles away, going from Kennedy to Schenk was like entering a different world; Schenk was in an older neighborhood, and, well, it was a middle school, a toxic combination of burgeoning hormones and tween Social Darwinism.)

To say the least, this would have changed things. I’m not sure where the high school attendance boundaries were in the pre-open-enrollment days, but one oddity was that students who lived in Maple Bluff, the richest part of greater Madison, went to East, the most blue-collar high school. That probably would have changed with RFESHS; indeed, all the high schools’ attendance boundaries would have shifted eastward. (Students who lived downtown, who went to Central before it closed and, I believe, went to West thereafter, probably would have gone to East.)

In those days (and probably now) the four high schools were easy to stereotype. La Follette had white-collar families — bankers, insurance agents, small business owners, salespeople, etc. East had blue-collar families. West was where UW-employed families lived. Memorial families had money, though we didn’t know from where. Then as now, the biggest high school rivalry in Madison was East vs. West, followed by West vs. Memorial and East vs. La Follette. (The latter rivalry introduced police to hockey games after East fans threw rocks at our band bus.) James Madison Memorial (which could have been the name of RFESHS) was built instead of RFESHS (or the sought-after Warner Park-area high school), and given the anticipated growth of the Far West Side (three words: “West Towne Mall”), a high school was likely to be built there anyway. (La Follette Junior High became Sennett Middle School, connected to La Follette by a concrete supposed-to-be-no-man’s-land under the La Follette library known as The Pit, a favorite stop of those who related to the Brownsville Station song and Mötley Crüe cover “Smokin’ in the Boys Room.”)

Then there’s this:

I went to grade school with two players on the varsity roster and another player who wasn’t on the varsity roster for state. Two other players went, I think, to the local Catholic school instead of Kennedy or Schenk Middle. I do not intend to denigrate their athletic abilities by pointing out that none of them was named “Rick Olson,” who went on to play at Wisconsin, or “Steve Amundson,” who went on to play at Western Michigan. La Follette may still have won the 1982 state championship, but none of us at RFESHS would have been part of that.

There has always been a rivalry between Madison’s East and West sides, and those of us who lived on the East Side (however you define that) felt some sense that we were getting ripped off. Madison’s two newest high schools were an example — Memorial got a football field and track (which hosted the state track meet until 1990), but La Follette did not. (Of course, neither did Central, East or West; they shared Breese Stevens Field until East and West moved to Warner Park, and West now plays home games at Memorial’s stadium. La Follette did not play at Monona Grove’s stadium even though it would have been more convenient and nicer than Warner Park, which was worse than some smallest-division schools’ fields. La Follette now does have a football field and track, and East plays there too.) Memorial also got a planetarium, as a reader reminded me.

East Towne was bigger than West Towne (an important point), but while there were several Catholic churches on the West Side, there was one near-East Side Catholic church (St. Bernard’s, on Atwood Avenue not far from my father’s bank), and one closer to us, St. Dennis, two miles away. St. Dennis held church services in its school gym from the beginning of my memory, and we parishioners helped out at Friday fish fries in the same gym to raise money for the new church, which was finally completed my senior year of high school. (The new church was immediately packed nearly every Sunday, which suggests the diocese should have located more churches closer to the Far East Side than Monona and Cottage Grove.) As far as I can remember, the annual Madison Parade of Homes was always on the West Side. (Including the house with the two-level garage.)

We also felt we were getting ripped off in such city services as police response time, though there was little reason for the police to show up in our neighborhood. (Other than a rock-throwing incident next door, we may literally have gone years without having a police car on our street.) The nearest fire station was across the street from my first employer, Bridgeman’s Ice Cream Restaurant and Parlour, about 2.5 miles away. The nearest fire station now is on the other side of the Interstate. Our streets were always the absolute last in Madison to get plowed after snowfalls (assuming they were, and often they weren’t), always timed for when we had just finished shoveling. The Far West Side (where four of my cousins grew up and, sad to say, attended Memorial) seemed to have nicer houses and therefore more money, though young minds don’t necessarily know much about how much it costs to buy 4,000-square-foot houses with two-level garages.

One thing that’s changed in Madison is high school enrollments. The Madison high schools when I was growing up had around 2,000 students each, I believe. East and La Follette are now 75 to 80 percent of their former size, while Memorial and West are still around 2,000. However, Sun Prairie, one of the smallest schools in the Big Eight in the ’80s, is now bigger than any Madison high school (Sun Prairie just built a new high school but is considering another), as is Middleton, which was once too small to be in the Big Eight. Verona, which was Monona Grove’s size, is now La Follette’s size. Part of that is that nearly every Madison-area school district has alternative high schools, and part is smaller families, though the latter has hit rural school districts harder than Madison-area schools.

I’ve written before that I had a pretty drama-free childhood. I don’t know what went on in other houses, but Heritage Heights felt so far away from downtown Madison that we might as well have been living out in the ’burbs. (There were people who lived in the school district, with Madison addresses, but didn’t live in the city; they were east of the Interstate. I assume most of those houses have since been annexed into the city.) It certainly would have been different had many of my classmates not been my classmates, although with 500 classmates no one could know where everyone lived.

 

The TV reverse Midas touch

The late Trio cable channel had a series called “Brilliant But Canceled,” about shows that were on TV all too briefly:

I wouldn’t call what follows “brilliant,” but they were definitely canceled, and shortly after I started watching them at a very young age.

“The Interns,” which was on briefly in 1970, includes an actor from “Star Trek” and “The FBI,” another “Star Trek” actor, B.J. from “M*A*S*H,” one of those actors whose face you recognized (before his untimely death at 49), and the star of “Highway Patrol” in a series that lasted one season:

Before I knew Glenn Ford as a movie actor of long standing, I saw him in this one-season series:

My viewing preferences of TV series with cars probably started with the 13-episode “Bearcats!”

Perhaps because of his recently canceled “Get Smart,” I watched the next sitcom of Don Adams, “The Partners.” The only episode I recall was when their car’s driver’s side door was sheared off by a passing car, creating a three-door detective car.

“Partners” was moved halfway through its only season. Its time-slot replacement was “Emergency!”

You’ve already read here about “Chase,” produced by Jack Webb:

Another Webb series not long for the screen was “Project UFO”:

Movie fans may remember an actor named Khigh Dhiegh, the Chinese bad guy in “The Manchurian Candidate.” TV fans remember him (name at birth, believe it or don’t: Kenneth Dickerson) as Wo Fat in the original “Hawaii Five-O.” While playing Wo Fat, Dhiegh briefly was the lead of “Khan,” about a San Francisco private detective. “Khan” is so rare that you can’t even find a snippet of it on YouTube, perhaps because it was canceled two episodes into its four-episode run.

As you know, quality and popularity are not synonyms. I don’t remember much about any of these series, but since none lasted very long, neither does anyone else.

The irreligious Trump and his religious fans

Michael Gerson on thrice-married Donald Trump and some of his biggest supporters, who you’d think wouldn’t approve of three marriages, two of which ended in divorce:

In the compulsively transgressive, foul-mouthed, loser-disdaining, mammon-worshiping billionaire, conservative Christians “have found their dream president,” according to Jerry Falwell Jr.

It is a miracle, of sorts.

In a recent analysis, the Pew Research Center found that more than three-fourths of white evangelical Christians approve of Trump’s job performance, most of them “strongly.” With these evangelicals comprising about a quarter of the electorate, their support is the life jacket preventing Trump from slipping into unrecoverable political depths.

The essence of Trump’s appeal to conservative Christians can be found in his otherwise anodyne commencement speech at Liberty University. “Being an outsider is fine,” Trump said. “Embrace the label.” And then he promised: “As long as I am your president, no one is ever going to stop you from practicing your faith.” Trump presented evangelicals as a group of besieged outsiders, in need of a defender.

This sense of grievance and cultural dispossession — the common ground between The Donald and the faithful — runs deep in evangelical Christian history. Evangelicalism emerged from the periodic mass revivals that have burned across America for 300 years. While defining this version of Christianity is notoriously difficult, it involves (at least) a personal decision to accept God’s grace through faith in Christ and a commitment to live — haltingly, imperfectly — according to his example.

In the 19th century, evangelicals (particularly of the Northern variety) took leadership in abolitionism and other movements of social reform. But as a modernism based on secular scientific and cultural assumptions took control of institution after institution, evangelicals often found themselves dismissed as anti-intellectual rubes.

The trend culminated at the 1925 Scopes Monkey Trial, in which evolution and H.L. Mencken were pitted against creation and William Jennings Bryan (whom Mencken called “a tin pot pope in the Coca-Cola belt and a brother to the forlorn pastors who belabor half-wits in galvanized iron tabernacles behind the railroad yards”). Never mind that Mencken was racist, anti-Semitic and an advocate of eugenics and that Bryan was the compassionate progenitor of the New Deal. Fundamentalists (a designation adopted by many evangelicals) lost the fundamentalist-modernist controversy, even in their own minds.

After a period of political dormancy — which included discrediting slumber during the civil rights movement — evangelicals returned to defend Christian schools against regulation during the Carter administration. To defend against Supreme Court decisions that put tight limits on school prayer and removed state limits on abortion. To defend against regulatory assaults on religious institutions. Nathan Glazer once termed this a “defensive offensive” — a kind of aggrieved reaction to the perceived aggressions of modernity.

Those who might be understandably confused by the current state of evangelicalism should understand a few things:

First, evangelicals don’t have a body of social teaching equivalent, say, to Catholic social doctrine. Catholics are taught, in essence, that if you want to call yourself pro-life on abortion, you also have to support greater access to health care and oppose the dehumanization of migrants. And vice versa. There is a doctrinal whole that requires a broad and consistent view of social justice. Evangelicals have nothing of the sort. Their agenda often seems indistinguishable from the political movement that currently defends and deploys them, be it Reaganism or Trumpism.

Second, evangelicalism is racially and ethnically homogeneous, which leaves certain views and assumptions unchallenged. The American Catholic Church, in contrast, is one-third Hispanic, which changes the church’s perception of immigrants and their struggles. (Successful evangelical churches in urban areas are now experiencing the same diversity and broadening their social concern.)

Third, without really knowing it, Trump has presented a secular version of evangelical eschatology. When the candidate talked of an America on the brink of destruction, which could be saved only by returning to the certainties of the past, it perfectly fit the evangelical narrative of moral and national decline. Trump speaks the language of decadence and renewal (while exemplifying just one of them).

In the Trump era, evangelicals have gotten a conservative Supreme Court justice for their pains — which is significant. And they have gotten a leader who shows contempt for those who hold them in contempt — which is emotionally satisfying.

The cost? Evangelicals have become loyal to a leader of shockingly low character. They have associated their faith with exclusion and bias. They have become another Washington interest group, striving for advantage rather than seeking the common good. And a movement that should be known for grace is now known for its seething resentments.

Whether you approve or not, the cause of this is obvious — Trump’s predecessor in the White House. Barack Obama was as big a fan of abortion as Bill Clinton, and opposed religious liberty for conservative Christians. (See “wedding cakes.”) In these divisive days, you’re either for something or against something, and apparently doing what you said you’d do matters more than how you live your private life.

 

The vassal decided to vacillate on deciding when to put up the trellis, which the bailiff in official raiment found to be a bagatelle, but was sheer drudgery

A high school classmate of mine who works in history found this this week:

This is from the Wisconsin State Journal 40 years ago Wednesday. (Pause while I wipe the tears from my eyes and loudly blow my nose over The March Of Time!)

The previous Saturday, April 30, 1977, I won the Madison City Spelling Bee in my third attempt. In those days, at least in Madison, if you won your elementary- or middle-school spelling bee you advanced to the city spelling bee. I won my spelling bees at Kennedy Elementary School in fourth and fifth grade, but obviously failed to win at the city level until I won my first Schenk (now Whitehorse) Middle School spelling bee in 1977.

Spelling bees were the first kind of competition in which I did reasonably well. Spelling bees are analogous to competing in a non-relay track event or swimming race, in that your success is based on what you do in comparison to what others do. Since I sucked (and still do) at athletics, and never was a very good musician, this was my thing for five years growing up.

Spelling today is, if not a dying art, then a seriously ill one, given creative spellings in the business world (“Kwik Trip”), spellcheck in word processing applications (though spellcheck doesn’t pick up a homophone, a correctly spelled but incorrect word), and abbreviations in social media. (IMHO. SMH. BTT.)
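
To see why spellcheck is blind to homophones, here is a minimal sketch of a dictionary-based checker (the tiny word list is a stand-in of mine; real spellcheckers are more sophisticated, but they share the same blind spot):

```python
# A naive spellchecker: flag any word that isn't in the dictionary.
# It catches a misspelling, but it passes "their" where "there" belongs,
# because a homophone error is still a correctly spelled word.
DICTIONARY = {"the", "spelling", "bee", "is", "over", "there", "their", "drudgery"}

def check(text):
    return [word for word in text.lower().split() if word not in DICTIONARY]

print(check("the spelling bee is over their"))  # [] -- homophone error missed
print(check("the spelling bee is drudgerry"))   # ['drudgerry'] -- typo caught
```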

The 1977 bee was, for me, somewhat of a white-knuckle experience toward the end, even though I was a grizzled veteran of spelling bees by then. Similar to team sports, winning a spelling bee requires some luck in getting words you can spell (or at least correctly guess) and your competitors getting words they could not spell. My first city spelling bee in fourth grade was a two-and-done; the second word I got was “trellis” (a frame or structure of latticework used as a support for growing trees or plants), which I had never heard of, and that ended that. One year later, I got up near the top 10, but lost on “raiment” (clothing). In my fourth city spelling bee, in seventh grade, I finished 13th on “vacillate” (to waver in mind or opinion), another word I had never heard of. (I studied, but I guess I didn’t get down toward the end of the dictionary. As it was, I think I learned more words by reading them than being told to spell words in a dictionary.)

The word “thespian” probably doesn’t show up often in a sixth-grader’s vocabulary, so when I heard it, it sounded to me like, well, a gay actress, but I figured out it ended with “-pian” and not “-bian” and got it right. Shortly thereafter the runner-up missed her word, I spelled it correctly, and then I got “vassal” (according to Dictionary.com, “a person granted the use of land, in return for rendering homage, fealty, and usually military service or its equivalent to a lord or other superior”), another word not found in a sixth-grade vocabulary. (In those days, again with only one person advancing to the next level, if speller A missed his or her word, speller B had to spell that word correctly and then another word.) The first four letters were easy enough, but not the last two — “-al,” “-il” or “-le”?

To quote the ancient knight in “Indiana Jones and the Last Crusade,” I chose … wisely. I got to bring home a traveling trophy that was half as tall as I was, and got on the front page of the Wisconsin State Journal, along with a story in the Monona Community Herald (which I would end up working for, but you knew that).

A week later, I departed the state spelling bee after spelling “bagatelle” (something of little value or importance), again not a word in a sixth-grader’s vocabulary, correctly. The first eight letters seemed obvious, but was there a silent E at the end? I got to the last L, stopped, heard no reaction, and added the E. A woman sitting in the front row protested, and the judges decided that I meant “bagatell” instead of “bagatelle.”

Two years later, as longtime readers know

… I returned to state as the city bee winner.

(Note, by the way, how fashions changed between 1977 and 1979. Plaid pants gave way to polyester disco shirts and pants, and though I hadn’t ditched the glasses yet, there was a lot more hair, though not as much as in the previous year.)

The story quotes me as saying that “declaration” and “drudgery” were two words I had a little difficulty spelling. Truth be told, I really didn’t, other than making sure I had the letters in the correct order. Winning the previous school bee made me 5-for-5 in winning school spelling bees, and apparently I was the first person to win two Madison city spelling bees. (A girl in that 1979 bee became the first to win two consecutive city spelling bees. Interestingly, she was hearing-impaired.)

The state spelling bee the week afterward, at my future high school, went sort of like my first city spelling bee, a two-and-done, though on a word that, had I thought about it for a couple of seconds, I would have spelled correctly — “bailiff.” The night before, I had attended my middle school’s eighth-grade dessert dance and slow-danced with five girls, three of whom I’d had crushes on at some point (along with perhaps half of the other girls in what would become the Class of 1983). So on Saturday afternoon my head was still back in Friday night. And that ended that, because there are no spelling bees in high school.

The bee format appears to have changed so that more than just the winner moves on: Wisconsin’s top three state-bee spellers advanced to the national bee. Advancing to the national bee now appears to be based on some sort of population formula, or perhaps the number of organizations willing to sponsor a regional bee, given that Wisconsin had three national contestants, Iowa had two, and Illinois had 18. It’s kind of like baseball expanding its playoffs from division winners only to adding wild cards, where, like the 1997 and 2003 Marlins, 2002 Angels, 2004 Red Sox, 2011 Cardinals and 2014 Giants, teams can win the World Series without winning their division. (Or the 2010 Packers, winners of Super Bowl XLV as the last NFC playoff team.) The state bee also had a rule, since rescinded, that if you won the state bee you couldn’t compete anymore. (From what I’ve read, most national bee winners are not first-time national competitors, which makes sense.)

For those who haven’t moved on to something else by this point, you might be asking (other than why the hell did I write this) what I got out of the whole experience. (Other than embarrassment every time I misspell a word, though mistyping a word isn’t exactly the same thing as spelling it wrong.) It was the first experience in my life of being sort of locally famous, which as those who have achieved some fame know is a mixed blessing. With the exception of friends of mine (and you know who you are, Ruste), few of my classmates appeared to care much, though adults in my two schools did. (On my last day at Schenk, a teacher — not one I’d had for a class — actually thanked me for bringing positive recognition to the school. I didn’t know what to say.) Something similar happens now because my photo has been in publications I work for (and my face has shown up on TV), and it’s one of those strange experiences where more people know me than I know them.

Was it fun? Well … it was not like my UW Band experience or a successful athletic accomplishment (the latter certainly not based on personal experience, of course), in which your first thrill is the accomplishment and then you realize what a great experience you had along the way to it. I certainly didn’t hate it (some people can spell well but don’t do well in bees because they get excessively nervous in competition), but I can’t say I have fond memories of, say, school spelling bees or traveling somewhere with my mother giving me words out of the dictionary to spell. The experience for me was just what it looked like: (1) walk up to the microphone, (2) spell the word, (3) go back to your seat, (4) watch others spell correctly or not; lather, rinse, repeat. I remember the spelling bees I won fondly (not that I think about them often), and the ones I didn’t win I don’t remember fondly. (Athletic competition is said to provide more lessons in losses than in wins, in the sense that failures can teach more than successes. I’m not sure that translates to spelling bees — if you don’t know a word, you don’t know it — except that losing could mean you needed to study more, or, in the case of the 1979 state bee, to focus on what you’re supposed to be doing instead of something else.) Spelling wasn’t fun; winning was fun, and when I won, I was the only winner.

The experience also taught me that being smart (or, as the British put it, “clever”) isn’t really valued in our society. (Anyone who thinks this is a recent phenomenon due to the current president hasn’t been paying attention; it predates him by a long time.) Part of it probably is that a lot of people are threatened by someone who can do something better than they can. Americans like to think of ourselves as a society striving for equality, but we also like to think of ourselves as a meritocracy, and the two concepts are somewhat at cross-purposes. Part of it as well is probably that some smart people unintentionally make others feel inferior, or (particularly if they’re young) haven’t figured out the interpersonal skills to avoid doing that. (Some people, such as myself, like to claim that they don’t care what people think about them, but that’s not likely to be true.) Harry Truman famously said the world is run by C students, so intelligence does not necessarily translate to leadership skills; nor does it necessarily lead to sense or wisdom, as, well, whatever political debate you choose to cite proves.

Last year, for the Scripps National Spelling Bee, Time magazine did an online where-are-they-now feature. One spelling bee champion became a spelling bee coach. None of those profiled did what I did: go into journalism, one of the few lines of work that actually appreciates spelling ability. (Or where you get nailed for a lack of it. However, the standard in print journalism is to write at an eighth-grade level, and words in the more advanced spelling bees are well past eighth grade.)

Not in the Time story was Joanne Lagatta of Clintonville, the first and only Wisconsinite to win the national bee. (On “inappetence,” lack of appetite, and “antipyretic,” a drug used to combat fever.) She is now a pediatric physician, certainly a better profession for the world than journalism, though I wonder if she can use both of those words in the same sentence, such as “She gave her patient an antipyretic in part because the fever had caused her patient inappetence for several days.” Of course, given the rare use of those words, most people probably wouldn’t know what she was talking about, similar to the sentence in the title of this blog.

Mark of eccentricity, or when did this seem like a good idea?

I am a member of several automobile-oriented groups on Facebook, including Cars Modified in Ways That Dumbfound.

While that group has members whose vehicles were modified in ways that defy reason …

… this country’s automakers have occasionally sent out into the world vehicles with features, or sometimes lacking features, that make one wonder about the decision process that led up to that dubious judgment.

It seems as though most of those came from General Motors, which by the 1950s was the nation’s largest company in terms of revenues as a percentage of Gross Domestic Product, the largest private-sector employer in the world, and the first U.S. company to pay more than $1 billion in taxes.

 

For all of GM’s innovations — the first self-starter, the Cadillac V-8s, the Chevy small-block V-8, the automatic transmission design it uses to this day — GM’s major sin has been sending vehicles and technology into the marketplace before they were really ready. One example was the 1980s Pontiac Fiero, a mid-engine two-seater with not much engine until a V-6 was installed. The much more fun driving experience of a V-6 in a light car lasted until the Fiero was canceled, three years after the V-6 was introduced. (Apparently publicity about engine fires, a hallmark of the original engine, has a negative effect on car sales.) A much broader example was GM’s 1981 Computer Command Control, which was supposed to improve performance and fuel economy. Instead, it introduced the car-buying world to the yellow Check Engine light.
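
The general idea behind a Check Engine light is simple, even if GM’s actual Computer Command Control logic was not; this toy sketch (sensor names and limits are invented for illustration, not GM’s values) just latches a warning when a reading drifts out of range:

```python
# Toy check-engine logic: compare each sensor reading against an allowed
# range and latch the warning light if anything is out of bounds.
# Sensor names and limits are invented for illustration.
SENSOR_LIMITS = {
    "coolant_temp_f": (100, 230),  # degrees Fahrenheit
    "o2_voltage": (0.1, 0.9),      # oxygen-sensor feedback for the fuel mix
}

def check_engine(readings):
    for name, value in readings.items():
        low, high = SENSOR_LIMITS[name]
        if not low <= value <= high:
            return True  # latch the yellow light
    return False

print(check_engine({"coolant_temp_f": 195, "o2_voltage": 0.45}))  # False: normal
print(check_engine({"coolant_temp_f": 245, "o2_voltage": 0.45}))  # True: overheating
```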

In 1960, Chevrolet began selling the Corvair, its first rear-engine rear-wheel-drive vehicle after decades of selling cars with the engine in the front and the drive wheels in back. Yes, Volkswagen sold air-cooled rear-mounted flat-engine rear-drive Beetles. Yes, Nash, which became AMC, entered the compact market 10 years earlier with the Rambler. But, for instance, one has to question the utility of a station wagon …

… or a pickup truck …

… or a van …

… where loads have to be placed over the engine. (Each was gone by the Corvair’s redesign in 1965.)

The bigger issue was the Corvair’s handling; engine weight over the rear wheels made the car oversteer, as opposed to the traditional understeer of front-engine rear-drive vehicles. The suspension design not only made the oversteer worse, it helped make the rear wheels bounce off the road surface, which doesn’t help, you know, controlling the vehicle. A change in suspension design for the second-generation Corvair made the cars handle much better, but why didn’t GM introduce that design for the first generation? Did GM test the cars in (imitation) real-world use enough to figure out they had a suspension problem?
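
The textbook way to see the oversteer point is the understeer gradient, K = Wf/Cf - Wr/Cr, the front and rear axle loads each divided by their tires’ cornering stiffness: positive K understeers, negative K oversteers. A sketch with invented but representative numbers (not measured Corvair data):

```python
# Understeer gradient K = Wf/Cf - Wr/Cr (axle load / tire cornering stiffness).
# K > 0: understeer (front plows wide); K < 0: oversteer (rear swings out).
# All numbers below are invented for illustration.
def understeer_gradient(w_front, w_rear, c_front, c_rear):
    return w_front / c_front - w_rear / c_rear

STIFFNESS = 190.0  # lb per degree of slip per axle, same tires front and rear

# Front-engine car: roughly 55/45 weight split on a 2,700 lb car
print(understeer_gradient(1485, 1215, STIFFNESS, STIFFNESS))  # positive: understeer

# Rear-engine car: roughly 38/62 split, Corvair-like
print(understeer_gradient(1026, 1674, STIFFNESS, STIFFNESS))  # negative: oversteer
```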

Shortly after the Corvair debuted, GM’s other divisions (except for Cadillac, which was 15 years away from introducing something that wasn’t the size of an aircraft carrier) introduced their own compact cars on the same unibody chassis — the Pontiac Tempest, whose base engine was half (literally) of a V-8, with a rear-mounted transaxle (later to be seen on the Corvette); the Oldsmobile F-85; and the Buick Special, each of which had as standard or optional an aluminum-block 215 V-8, the basic design of which can be found in, of all vehicles, the Land Rover Range Rover. (The Buick also offered a 198 V-6, the design of which was sold to Jeep a few years later, only to have GM get it back in the early 1970s and use it into the late 1980s.) Chevy, meanwhile, came up with its own compact, the Chevy II, though on a different platform after the Corvair was badly outsold by the Ford Falcon.

The Tempest, F-85 and Special at least were attempts at innovation, which went away by 1964, when the redesigned intermediates were all variations on the same basic design, in part because of a steady increase of buyer interest in horsepower. But GM, and other carmakers, noticed an interest in smaller import cars, smaller even than the Corvair. And so with uncommon (for GM) speed, the company introduced …

The car was styled by Bill Mitchell, so of course it looks good. Ed Cole, who brought the world the modern Chevy small-block V-8, was its designer; he went on to become GM’s president. The lead designer had worked on the Nova and Camaro, on the small-block V-8 with Cole, and on the Turbo Hydramatic automatic transmission, which GM has used for more than 50 years.

And that’s where things started to go horribly wrong. CheatSheet Autos explains:

Nothing had ever been done like this before, and from the get-go, it was apparent that GM simply wasn’t up to the task. Dubbed project XP-887, Cole tapped recently-appointed Chevy brand chief John DeLorean to be cheerleader for the new car in the press. Almost immediately, he didn’t like what he saw. In his book On A Clear Day You Can See General Motors, DeLorean recalled:

From the first day I stepped into Chevrolet, the Vega was in trouble. General Motors was basing its image and reputation on the car, and there was practically no interest in it in the division. We were to start building the car in about a year, and nobody wanted anything to do with it. Chevy’s engineering staff was only going through the motions of preparing the car for production, but nothing more. Engineers are a very proud group. They take interest and pride in their designs, but this was not their car and they did not want to work on it.

By late 1968, Chevy had a running prototype, but in a sign of things to come, the front end sheared off after just eight miles of testing.

Fixing the front end meant adding weight, so engineers looked to shed pounds elsewhere. Inner fenders were deleted, as were plastic fender liners to combat rust, saving a whopping $2.28 per car. And instead of an iron block, they used a new 2.3 liter die-cast aluminum block inline-four. GM had built aluminum engine blocks with no major issues before, but this mill was different. Its heavy cast-iron head outweighed the block, and on top of vibration issues that caused the carburetors to rattle themselves apart, high compression caused engine blocks to warp and fail should temperatures climb over 230 degrees.

Strike three for the Vega was the car’s construction. Contrary to popular belief, GM did rust proof the cars, but its design allowed for air pockets to develop between the front fenders, cowl, and firewall during the rustproofing process, leaving the steel in those areas dangerously unprotected. Yet while all these defects were known to company brass, the Vega debuted on September 10, 1970, and just like GM hoped, it was a huge success.

1970 was when Detroit got serious about import fighters. American Motors snagged a Newsweek cover in April with the release of its subcompact Gremlin, and Ford’s Pinto became ubiquitous overnight, famously being described as “the car nobody loved but everybody bought.” Chevy was late to the party, but it hardly made a difference. Two years overdue and well over budget (GM spent $200 million on the XP-887 project, or around $1.2 billion today), the Vega instantly became the star of the American subcompact segment, with its good-looking Kamm-tail body and mini-Camaro front end. It was slightly more expensive than the Pinto and Gremlin, but looked better, handled better – and at first, was almost completely unavailable.

General Motors spent $75 million retooling its production facility in Lordstown, Ohio specifically for the Vega, going full speed ahead while labor relations hit an all-time low. The company boasted that the semi-automated assembly line could build 100 Vegas an hour, and fit 30 of them into specially-designed freight cars, a drastic improvement over the standard 18 vehicles per boxcar.

But these advances in automation came with waves of pay cuts and layoffs. Workers at Lordstown briefly went on strike in late 1970, and again for 22 days in 1972, with the plant becoming the focal point of national labor relations as workers intentionally slowed down production and sabotaged cars to retaliate against company policies. As Vega supplies ebbed and flowed, “Lordstown Syndrome” became shorthand for the troubling times in the automotive world. For GM, it was only going to get worse.

The Vega won Motor Trend’s Car of the Year award for 1971, with the magazine concluding, “…the Chevrolet Vega 2300 is Motor Trend’s 1971 Car of the Year by way of engineering excellence, packaging, styling, and timeliness. As such, we are saying that, for the money, no other American car can deliver more.”

But within a year, the car was already starting to lose its shine. By the end of 1972, GM would issue recalls for over 500,000 Vegas to cover defective axles, sticky throttles, and electrical issues that could cause a fire. And once that first winter hit, owners in the Northeast began complaining about their fenders rusting through; by ’73, Vegas in the arid Southwest began to rust too. Engines began to seize, and soon Chevy was swamped with angry customers. The American public had been conned; the Vega was a lemon, and it quickly became the poster child for everything that was wrong in Detroit. GM addressed the corrosion issues with galvanized bodies in ’73 and ’74, and completed an emergency redesign for the car and engine for ’75, but it was too little, too late. …

From 1971 to 1980, Chevy sold over 3.5 million Vegas and other H-body models. Despite their fantastic potential, a litany of recalls and atrocious build quality means that they’ve all but disappeared from American roads. The few that survive have become a symbol for all that went wrong in the American auto industry after 1970. There’s a direct line from the Vega to the Monza to the Cavalier to the Cobalt, and while each of them could be considered a fantastic sales success for the company, their reputations for being unreliable, unsafe, and embarrassing to be seen in ensure that none of them are remembered very fondly.

I have written about station wagons occasionally on this blog. This was another area for strange GM decisions. Before the early 1970s GM had a normal tailgate for its big …

… midsize …

… and compact wagons.

But someone thought it would be fun, or something, to mess with a successful, though conventional, design. And so GM introduced …

… the clamshell tailgate for its full-sized wagons, with the metal half folding into the floor and the glass half moving upward into the ceiling. (Except in cases of mechanical failures.)

For its midsize wagons …

… instead of the aforementioned conventional tailgate (still being used by its competitors), GM thought a giant hatchback would be a good idea. (That was the completely opposite direction from the clamshell.)

Neither Ford nor Chrysler jumped on the clamshell bandwagon, nor did either jump on the monster hatchback trend. When GM redesigned its big cars, the clamshell did not survive:

The aforementioned wagon represented the first phase in GM’s three-phase downsizing of most of its cars in the late ’70s.

The full-size B-bodies and C-bodies were a home run in terms of design.

The intermediate A-bodies, well, less so.

Notice the rear door. The window did not roll down. (Nor did it on the wagon.) The reason was that the engineers removed the window mechanism in order to add elbow room by hollowing out the door. (I can say from experience that the extra elbow room was useless where it was located.)

The A-bodies were the second round of GM downsizing. The third round was the most radical, replacing the Chevy Nova …

… with the revolutionary (for GM, though Japanese manufacturers had them for years before) Chevy Citation, Pontiac Phoenix, Olds Omega and Buick Skylark:

(Side note: In 1977, the downsized Caprice was the same size as the last-year Malibu. In 1978, the downsized Malibu was the same size as the next-to-last-year Nova. One might think GM could have just called the 1977 Malibu a Caprice and the 1978 Nova a Malibu. GM did that with the Pontiac Bonneville a few years later. Bad, bad idea.)

Given that GM had started working on the Citation five years before its introduction (in 1979 as a 1980 model), you’d think GM would have fixed its issues before Citations went to dealerships. Instead, the more than 800,000 first-model-year Citation buyers discovered an old-design and crude four-cylinder engine (previously seen in the Vega), a new concept called “torque steer” (the front-wheel-drive car tugging its steering to one side upon applying foot to gas pedal), rear brakes that locked alarmingly often (antilock brakes were a few years away), and, by the way, poor build quality and worse reliability. The most radical design GM had attempted to date lasted six model years, though its inclusion on numerous Worst Car Ever Sold lists has lasted far longer.

On to pickup trucks. For the first half-century or so of their existence, pickup trucks had one seat. International offered a crew cab pickup in 1957 …

… followed by Dodge in 1963 …

… Ford in 1965 …

… and Chevrolet in 1973:

The crew cabs had fairly small customer bases — the military, government and railroads. You’ll note that the crew cabs are very long, longer than even the huge sedans of the day.

The same year Chevy and GMC got around to introducing their crew cabs, Dodge introduced its Club Cab pickup, which was the first to have a (very small) back seat in a two-door pickup truck:

Two years later, in 1975, Ford introduced its SuperCab:

It only took Chevy and GMC 13 more years to build an extended-cab pickup:

One assumes, based on the introduction dates — 1973 for the crew cab and 1988 for the extended cab, the first years of new-design pickups — that GM lazily waited for a new pickup design before getting around to a model its competitors already offered. (And when the extended-cab pickup came out, Chevy and GMC kept selling their old-design crew cab pickup; GM didn’t get around to a new-design crew cab until four years later.) No wonder Ford has outsold Chevy in pickups for years.

The same cannot be said about minivans, introduced by Chrysler in 1984, GM in 1985 and Ford in 1986. The difference between Chrysler’s minivans and GM and Ford’s is that the latter were based on their compact pickups, the S-10 and Ranger, respectively. (Each also developed a small SUV, the Blazer and Bronco II, respectively, based on the same small pickups.) Lee Iacocca came up with the idea for the minivan at Ford …

… but Henry Ford II wasn’t interested. Upon being fired from Ford, Iacocca took his idea with him to Chrysler, whose minivan was based on the K-car platform. The Chrysler minivans sold much better than their competition, and probably paved the way for more car-like vehicles such as the Honda Odyssey van and Pilot SUV, both based on the Accord sedan.

GM had multiple responses when it became apparent the Astro (and GMC Safari) were losing in the minivan sales race. Its first was the Lumina APV, based on the (by now midsize) Lumina sedan …

… immediately dubbed the “Dustbuster” (as were the Pontiac Trans Sport and Olds Silhouette) for their resemblance to the Black & Decker portable vacuum cleaner.

A few years after the Dustbuster went away, Pontiac came out with a concept vehicle, the Aztek …

… which proved popular enough on the car show circuit to make Pontiac decide to build them. But to prove the old saw about a camel being a horse designed by committee, and repeating its sin with the Trans Sport, which unlike the concept …

… was a warmed-over Dustbuster …

… Pontiac didn’t send that Aztek to market:

So what went wrong? Popular Mechanics explains:

Back in the bad old days at GM, the people in charge of vehicle manufacturing had huge control over how vehicles looked. Designers knew what they wanted GM’s first crossover to look like, but in the convoluted corporate world of the 1990s, GM’s own manufacturing team wouldn’t give it to them. The excuse? It would have cost too much.

That decision cost GM dearly … and not just in dollars.

The hideous slab-sided production horror that debuted in 2001 shares little with the 1999 concept. … Their proportions are completely different. The most visible alteration was to the angular roof of the concept that looked much like the production Chevrolet Equinox.

If the concept had made it to production, the fate of the Aztek would likely have been much different. Instead, the Aztek earned its title as the ugliest car in the world, and helped kill off the Pontiac brand.

Road & Track columnist Bob Lutz was hired by GM from Chrysler right as the production Aztek was introduced:

A bad car happens in stages. The Aztek concept car was a much leaner vehicle. Decent proportions. It got everybody excited. At the time, GM was criticized for never doing anything new, never taking a chance. So Wagoner and the automotive strategy board decreed that henceforth, 40 percent of all new GM products would be “innovative.” That started a trend toward setting internal goals that meant nothing to the customer. Everything that looked reasonably radical got green-lit.

These things require a culture of complete acquiescence and intimidation, led by a strong dictatorial individual who wants it that way.

The guy in charge of product development was Don Hackworth, an old-school guy from the tradition of shouts, browbeating, and by-God-I-want-it-done. He said, “Look. We’ve all made up our minds that the Aztek is gonna be a winner. It’s gonna astound the world. I don’t want any negative comments about this vehicle. None. Anybody who has bad opinions about it, I want them off the team.” As if the public is gonna give a sh** about team spirit. Obviously, the industry is trying to get away from that approach.

Early on, the Aztek obviously failed the market research. But in those days, GM went ahead with quite a few vehicles that failed product clinics. The Aztek didn’t just fail—it scored dead last. Rock bottom. Respondents said, “Can they possibly be serious with this thing? I wouldn’t take it as a gift.” And the GM machine was in such denial that it rejected the research and just said, “What do those a**holes know?”

The danger with the totalitarian management style is that people won’t speak up when there’s a problem. They’ll get their heads cut off or the messenger gets shot. …

One guy I informally interviewed about how the Aztek happened was one of the top guys on the project. And this guy, he looks at me and he says, “I’m proud of it.” Proud of the Aztek? “Yup. That was the best program we ever did at GM. We made all our internal goals, we made the timing, and I’m really proud of the part I played in it.” He had tears in his eyes. It was almost tragic. Everybody wanted to will this thing to succeed, and it didn’t work. All the emotional commitment and pride in the program was that it achieved all its internal objectives. And it was probably one of the great defeats in his life, or in his career.

For a company known for excess bureaucracy — which might explain the tardy introduction of extended-cab and crew-cab pickups, which cost GM lots of money given that pickups then and now are hugely profitable — one wonders how the Corvair and the other design oddities got to market. (“Totalitarian management” is probably right on.) Every time a business debuts a poorly-thought-out product or service, or rebrands itself in a curious way (for instance, Wisconsin Electric renaming itself “We Energies”), one suspects there was a guy in a room who did not speak up when he (or she) should have about how dumb the idea was. Call him or her Mr. or Ms. Non-Groupthink.

 

When color was invented

Todd Radom admits to not being a great baseball player (join the crowd), but watched baseball in the 1970s because …

I was focused on Reggie Jackson’s titanic home runs, but I was also mesmerized by the green and gold Oakland A’s uniforms.

I doodled sports logos on school notebooks and conjured my own teams — not so much for games as for creating logos and uniforms for them. I studied the cap marks of Major League Baseball teams and rendered them in painstaking detail with felt-tipped markers and cheap ballpoint pens.

I was fascinated by the visual culture of sports, and I still am, having devoted my life to sports design. Lucky for me, as a young baseball fan, I hit the lottery: My formative sports-aesthetics years came in the 1970s, the game’s most vibrant, colorful decade, with its smorgasbord of audacious and often garish uniforms. Bold graphics and sensationally showy colors were synthesized into some of sports history’s most memorable uniforms — a golden age of sports identity.

Sometimes, the results were mixed — not unexpected, coming off baseball’s longstanding adherence to traditional aesthetics — but that was just fine by me. My formative years coincided with the opening of modern, multipurpose stadiums, color TV, and a new approach to what sports could look like, played by athletes with long hair and flamboyant mustaches. While any number of the uniforms were considered ugly by contemporary standards, they also projected a sense of optimism and a fresh take on a very visible and vital aspect of American popular culture.

Whiskies in the jars

The AV Club discusses one of Irish music’s most popular songs:

“Whiskey In The Jar” has had one of the longest, most colorful histories of any Irish song. The thousands of versions of the tune include not only the rock ’n’ roll ones everyone knows—mainly by Thin Lizzy and Metallica—but they also include The Dubliners’ revered folk take, The Grateful Dead’s rehearsal version, bedroom covers, raucous bar-band versions, spritely Irish-punk covers, and live acoustic renditions. The song’s wide-ranging surface appeal is obvious: It’s a rollicking tune that’s fun to sing, especially while hoisting a pint or two. But that “Whiskey In The Jar” has become so revered is also somewhat mystifying: How did a centuries-old folk tale about an Irish criminal who plunders and robs people he encounters—and then gets shipped off to jail after his woman betrays him—endure and become a cover staple?

Certainly its simple foundation and nod to tradition has something to do with it. The song was particularly popular in American folk circles in the ’50s and ’60s, when Burl Ives, The Brothers Four, and The Limeliters covered it as “Kilgary Mountain,” and Peter, Paul, And Mary recorded it as “Gilgarra Mountain.” Yet “Whiskey In The Jar” is also quite malleable, which has allowed it to transcend genres and eras. The Pogues teamed up with The Dubliners for a slightly disheveled, folky version that hit No. 4 in Ireland in 1990, while bluegrass icon David Grisman and Jerry Garcia collaborated on a light-footed take in the mid-’90s, around the same time Pulp did a predictably droll version of the song. A mid-’00s cover by Belle And Sebastian was sighing and slightly desperate, while ’80s new-wavers Simple Minds amped up the urgency for a U2-esque, spacey version in 2009. Even Kings Of Leon’s 2003 single “Molly’s Chambers” has ties to the song; the title is a reference to a phrase from Thin Lizzy’s version, and zeroes in on the temptation aspect of the tune.

Naturally, the evolution of “Whiskey In The Jar” itself is also complicated. Folklorists point out that the rough outline of the “Whiskey In The Jar” story dates back to 1650 and the exploits of a vile criminal named Patrick Flemming, an Irish highwayman who maimed and killed civilians galore before being executed—caught only because his weapon was intentionally dampened so it would malfunction. A tune called “Patrick Flemmen He Was A Valiant Souldier” appears in the early 1680s in conjunction with an English broadside ballad, “The Downfal Of The Whiggs, Or, Their Lamentation For Fear Of A Loyal Parliament”—but the actual text of the “Patrick Flemming” tune surfaced in a later collection of ballads kept by noted curator Sir Frederick Madden, and adds the detail of the betrayal by a woman. This woman had a name (Molly) by a circa-1850s broadside ballad called “Sporting Hero, Or Whiskey In The Bar”; in other variations, she came to be known as “sportin’ Jenny” or just “Jenny.” The author of the 1960 book Irish Street Ballads includes the tune “There’s Whiskey In The Jar,” and notes his Limerick-based mother learned the song in 1870 from a native of Cork. Over time, the villainous plundering became a simpler, man-on-man crime—and the person being robbed generally tended to be English, frequently a higher-up in the army (e.g., “Captain Farrell,” “Colonel Pepper”).

But because “Whiskey In The Jar” is considered to be a traditional, there’s no definitive version of the song or its lyrics. In truth, chronicling the variations of the song in popular music just during the last half-century or so is mind-boggling. Sometimes when the protagonist’s lady rats him out for his plundering, his weapon does work, and he kills the person who confronts him. In some cases, he languishes in prison for his crimes; in other cases, he manages to escape with his AWOL-from-the-army brother and they both hide in the mountains. And depending on the version of the song, either the main character would rather be dabbling in sex and drinking above all, or else he’s just a hooligan who’s unruly on whiskey.

Despite this bawdy and violent origin, the song tends to end up a lighthearted celebration of debauchery, a communal sing-along that’s like a drunk Grimm’s fairy tale. In a recent interview with The A.V. Club, Thin Lizzy founding member/original guitarist Eric Bell underscored this point by noting the song’s importance to the band’s native Ireland. “There’s lots of Irish folk songs, like drinking songs,” he said. “Everybody has a few drinks and they go down to the pub. It’s just part of the Irish tradition. It’s the same with America—you’ve got your bluegrass music, your country music. It’s part of America.”

Thin Lizzy’s 1972 take on “Whiskey In The Jar”—which hit No. 1 in Ireland and went top 10 in the U.K.—is widely considered to be the definitive rock ’n’ roll version of the song, and for good reason: At the time, its combination of old and new sounds was revolutionary. “For a folk song to become a hit in the ’70s—but played on electrical instruments, not traditional instruments, like bodhráns and Irish pipes and violins and fiddles—our version was extremely modern,” Bell described. “Still, it somehow kept that Irish feel.”

As the guitarist tells it, his band covering “Whiskey In The Jar” happened “purely by accident,” during an otherwise uneventful rehearsal at a London pub. “We used to work original stuff, [but] on this particular day, it just wasn’t happening. We were going to pack up, and Philip [Lynott, vocalist] put down the bass and picked up the other six-string guitar, and he just started messing about with various stupid songs. About 20 minutes later, he started singing ‘Whiskey In The Jar’ as one of those stupid songs. Me and [drummer] Brian Downey, at this point we were extremely bored, and we started playing along with him a little bit.”

In a fateful twist, then-Thin Lizzy manager Ted Carroll happened to be coming up the stairs at the time with a new amplifier for Bell. Overhearing the jam session, he pressed the group on what they were playing, Bell recalls. “We said, ‘Whiskey In The Jar.’ He said, ‘You’ve got your first single to record for Decca in about six weeks. Have you got an A-side?’ and we said, ‘Yeah, “Black Boys In The Corner.”’ He said, ‘Have you got a B-side?’ We said, ‘Not at the moment.’ He said, ‘Start thinking about rearranging “Whiskey In The Jar.”’ We couldn’t believe that he wanted us to record that song.”

Six weeks later, when Thin Lizzy went to record “Black Boys In The Corner,” the band still didn’t have a B-side, so “Whiskey In The Jar” it was. Unlike the popular ’60s version by The Dubliners, however—a twee, brisk take on the song that was relatively unconcerned with prison time—the band’s approach was from a much different, moodier place. Lynott’s vocals are soulful and impassioned, and deeply invested in the tragic storyline. His delivery humanizes the narrator and sympathizes with his anguish over being double-crossed by his lady: “And I got drunk on whiskey-oh / And I loved, I loved, I loved, I loved, I loved, I loved my Molly-o.” At the very end, the band throws in a reference to another standard trope well-known in Irish folk circles, the “dirty old town.” The lyric—“And she wheels a wheelbarrow through that old dirty town / Oh, it’s a dirty old town”—can be interpreted as longing for freedom, or a dig on Molly that she too is stuck in a hellhole of her own doing.

Thin Lizzy’s version remains distinctive as well due to the guitar parts Bell added atop the basic melody: a keening, mournful wail as an intro; a lively, rippling guitar line cascading throughout the song; and an on-the-edge-of-a-squall bridge with jammy, bluesy roots. Guitar-wise, Bell called it “one of the most difficult songs I’ve ever worked on in my life, to try and come up with original ideas for.” In fact, in order to hit on the right formula, he had to avoid approaching the song like a guitarist would.

“We were gigging one night, Thin Lizzy in England,” he recalls. “And on the way home, Philip used to play cassettes in the car when we were traveling. He had different people that he was into: James Brown, The Rolling Stones, [Jimi] Hendrix, Bob Marley. And he was also into Irish songs, [like] the Chieftains. As we were traveling home that night, he put the Chieftains cassette on. I got this idea to approach the intro as an uilleann pipe—you know, Irish pipes—rather than thinking as a guitar player.”

In an interview with The A.V. Club, guitarist Richard Fortus, who played with Thin Lizzy in 2011 and currently performs in Guns N’ Roses, noted the significance of these varied influences coming together. “That whole Irish rock band thing—[Thin Lizzy] were the first ones to really do it,” he says. “At that time, artists like Van Morrison—he was trying to sound American. [Thin Lizzy] were the first ones to break through with that Celtic vibe. Their version of it is just so great.”

Despite the band’s respect for the song’s origins and its fresh take, not everyone was thrilled with Thin Lizzy’s version, especially the old guard. “Everybody that’s heard ‘Whiskey In The Jar,’ heard The Dubliners’ version: banjos, tin whistles, and so on and so on,” Bell said. “We came along and completely and totally rearranged that song. A lot of Irish people didn’t really like it, you know?… We were told we bastardized it. An awful lot of Irish people said that to us, actually used that word. [Assumes a stern, grizzled Irish accent.] ‘Jesus, lads, you bastardized that song.’” …

“Whiskey In The Jar” becoming a hit was also polarizing internally for Thin Lizzy, both a blessing and a curse. Bell said the song helped bolster his reputation as a musician and keep him financially solvent. (“It’s sort of helped me pay the rent the last 20 years. Before ‘Whiskey In The Jar,’ I hadn’t a pot to piss in, really.”) But Thin Lizzy failed to immediately follow up “Whiskey In The Jar” with another huge U.K. hit (although a subsequent pair of singles, including the now-classic “The Rocker,” hit the Top 15 of the Irish charts).

The band was saddled with a one-hit-wonder reputation perpetuated by the press, as Lynott noted in a 1976 interview. “I was conscious that the media saw that we didn’t follow up ‘Whiskey In The Jar,’” he said. “And we didn’t in terms of record sales. The only place we seemed to be happening was on the street. But, you know, that’s Thin Lizzy summed up for you. Like an album and three singles after ‘Whiskey In The Jar,’ man, you’d get people mentioning ‘Whiskey Jar’ in interviews—and I’d go ‘Oh Jeezuz.’ That was how far behind the press got on the band. They really lost contact.”

Moving immediately into performing at larger venues also did a number on the band. “There would be about 800 people there to see us, and they didn’t know what to expect,” Bell recalled. “We just walked out and we did our set that we always played in the pubs and clubs: rock music, blues, some original stuff. Nobody took a blind bit of notice of us—maybe 30 people standing watching us playing. Then at some part of the night, I went [sings the start of “Whiskey In The Jar”] and 1,000 people turned up, appeared right in front of us, and stood and went crazy until the song ended. Then we started playing our own blues and stuff again—and they all disappeared again. That’s what it was like. We went through this major change, of being a rock-blues band to a band that had their first hit record. It really, really throws you.”

Still, “Whiskey In The Jar” was perhaps the first chance many had to discover Thin Lizzy. Witnessing the band perform on Top Of The Pops in 1972 became a life-changing experience for Northern Ireland native Ricky Warwick. He is now the frontman of Black Star Riders, the moniker under which the current Thin Lizzy lineup—including guitarist Scott Gorham, who replaced Bell when he left the group in 1973—releases new music and plays shows.

“The first time I saw Thin Lizzy in black and white on TV was playing ‘Whiskey In The Jar,’” Warwick told The A.V. Club. “I was just captivated by the sound and also by the way Phil looked, because he was so different-looking to any sort of rock & roller at that time. The whole thing just captured my imagination. That was the first time I heard the song, and I fell in love with it straightaway. It was the sound, it was that guitar hook, it was the whole vibe of it, it was Phil’s voice. Everything was captivating to me.”

Warwick noted that Black Star Riders currently close their set with “Whiskey In The Jar” (which, of course, stays faithful to the Thin Lizzy version). He witnesses the song’s enduring popularity every night—and has his own theory as to why. “It’s that Irish drinking thing, it absolutely is,” Warwick said. “You have the nonsense lyrics in the chorus, which is very much Irish folklore—diddly-um, diddly-i [and] musha-ring-dam-a-do-dam-a-die. It’s almost like rhyming with the music, and it really doesn’t mean anything, it’s just a drinking song.

“But also, the verses have a lot of meaning,” he continued. “It’s the Irish villain robbing the English general and getting one over on the English, which the Irish always love to do. It’s a magical story—it’s timeless. That song comes on, no matter where you are, and especially if it’s cranked up loud, people just want to drink and have a good time, and raise their fists in the air.”

Although “Whiskey In The Jar” was always huge in Ireland, the U.K., and Europe, the song surged worldwide in the late ’90s thanks to Metallica’s slash-and-burn take on Thin Lizzy’s version, from the 1998 B-sides/covers album Garage Inc. “I can’t speak for them, but I know Thin Lizzy’s always been a big band for Metallica,” Bob Rock, who produced the first disc of Garage Inc. with frontman James Hetfield and drummer Lars Ulrich, told The A.V. Club. “That particular song, they really liked the fact it was Eric Bell [on the track]—kind of an earlier song of [Thin Lizzy’s]. We just tried to do it justice. It was one of the most simple ones on the album, because their heart was in it.

“All the lyrics and the imagery, with the farm and the field, really was what got James [Hetfield] into it as well,” Rock continued. “It was really a live performance of [Metallica] playing it, which you hear, the enthusiasm and the excitement.”

Metallica’s version of the song honors the spirit of Thin Lizzy’s deliberate approach, whether it’s Ulrich’s fat drum splashes or the precision with which the band emulates and amplifies Bell’s original guitar parts. Metallica’s cover has a looser, elastic feel, however—matching the debauched party scenes depicted in the song’s video—and revels in its villainous ways. Even when reaching the song’s denouement, when the narrator is in jail, the band takes a defiant stance. Of course, this again has much to do with the vocals: “Whiskey In The Jar” features peak Hetfield vocal enunciation, from his sharp-cornered delivery of the “musha-ring-dam-a-do-dam-a-die” lyric—a part Rock stressed they “had to make sure James could own that… We wanted to make sure we got that right”—and the syllabic uprising he employs on words such as “jar-uh.”

“We treated it like it was a Metallica song, in a way,” Rock said about the band’s approach. “Sometimes [when bands are covering other] records, maybe [they] do it quickly, because it’s not theirs. We actually made sure that we took time to make sure everything was right, and it was a record everybody could be proud of. That’s the difference. It probably shows in what comes across—we tried to make a great record.”

“Whiskey In The Jar” was the second song from Garage Inc. to hit No. 1 on the Billboard Mainstream Rock Tracks chart, and nabbed Metallica the 2000 Grammy Award for Best Hard Rock Performance. Certainly Metallica’s status as one of the biggest bands in the world helped propel the song to such great heights. But why did this version resonate so widely?

“It’s kind of folky,” Rock described. “And it gets corny, but folk music and that kind of traditional song makes you feel good. It’s very powerful and very happy, what we did, but we didn’t take away from the song. Traditional songs like that resonate through generations. It resonated with the Pogues; they did it with The Dubliners, a different generation. Metallica did theirs. It’s kind of the great thing about music—and particularly traditional music—is to carry it through generations, so other people actually get a chance to hear that stuff.”

But Metallica’s version of the traditional standard offered Bell a bit of a surprise: “Years later, after I left Thin Lizzy, I was doing a tour with my own band in Sweden,” he recalled. “People came to the changing room after the gig to talk and have autographs and so on. Everyone that came into the changing room said, [assumes Swedish accent] ‘Eric, have you heard Metallica’s version of ‘Whiskey In The Jar’? And I said, ‘Who? Metallica?’ I had never heard of Metallica in my life, because I’m not into that type of music. So when I got back to England I thought, ‘Wow, I must check this band out.’”

Once he checked out the album, he was in for another surprise. “Thing was, on the sleeve notes of the album, it said, ‘Whiskey In The Jar’ and then in brackets, [Traditional Arrangement, Metallica.] So I put it on and—gotcha. That’s my riff; I made that up. I phoned up Thin Lizzy’s management and I said, ‘Listen, I was in Sweden and there’s a band called Metallica…’ and they said, ‘Yes, we know, our lawyers are talking to their lawyers at the moment.’” Legal issues aside, Metallica performed the song live in Dublin in 1999 with Bell on guitar.

It’s understandable why Bell and other past and present members of Thin Lizzy are so protective of “Whiskey In The Jar,” and not just for financial reasons. “There’s [been] many, many different versions of it through the years,” Black Star Riders’ Warwick said. “It’s just part of our culture. Music’s so ingrained in our society—in every street, every bar, every house, there’s a musical instrument, or there’s music going on. You just grow up with it—it’s part of who you are, what you are. People think of us as a nation of fighters, [but] we’re a nation of dreamers as well.” “Whiskey In The Jar”’s longstanding value to Irish culture remains immeasurable; it represents what makes the country and its artistic output influential and meaningful, in any rendition.

Members of the Incumbent Party

A recent comment from a reader asked …

Oh, and sometime would you write about the change in the republican (note, little r) party, and its fall from grace when it was still the Grand Old Party!

Upon thinking about it, I figured out the answer, which requires a bit of history. In 1954, voters gave control of both houses of Congress to the Democrats, who then held both houses for decades. That meant that Republican presidents Dwight Eisenhower, Richard Nixon and Gerald Ford had to deal with Democratic congressional leaders, which limited how much of the GOP’s agenda they could actually enact.

Small government is always the correct answer, but it’s easier to tout small government when you’re not in charge. Whatever Richard Nixon was, he was no conservative. What conservative would enact wage and price controls to stop inflation (which merely pushed the problem down the road), and create the Environmental Protection Agency and the Occupational Safety and Health Administration (both of which metastasized into the most anti-business, job-killing agencies of the federal government)? What conservative raises taxes? (See the Tax Reform Act of 1969.)

Then in 1980, Ronald Reagan was elected president and the Republicans took control of the Senate. My theory is that Republicans enjoyed being in charge in Washington and decided they liked government at its existing size, as long as they were the ones running it. Government did not shrink under Reagan’s presidency, nor did it under George H.W. Bush, nor did it under George W. Bush. (While 9/11 was a reason in the latter case, it should instead have been a reason to radically reduce the size of the rest of government, freeing resources for the war on terror without increasing taxes or the deficit.)

Wisconsin Republicans have really never been small-government conservatives. State and local government is twice the size it should be given inflation and population growth since the late 1970s; put another way, had spending grown only as fast as inflation plus population over that span, government would be roughly half its current size. A constitutional mechanism like Colorado’s Taxpayer Bill of Rights would stop unjustified growth in government, but even though the GOP has held total power in Madison since 2011 (and on and off for 25 years before that), there are no constitutional controls on growth in government in Wisconsin, only (weak) legislative controls, which obviously can be overturned by a future Legislature.

My theory suggests that there is less difference between the parties than one might think — not in ideology, but in the desire to get into office, get more Ds or Rs into office, and keep them there. The more power government has, the higher the stakes in every election, and the more expensive and nastier politics gets. State Republicans obviously believe that the current size and scope of government is just fine as long as they’re in charge of it. Constitutional controls on what government can do would give people one less reason to vote for Republicans, since with the right limits on taxation and spending, Democrats couldn’t raise spending and taxes whenever they felt like it.