Today is election day. In less than a day, to paraphrase Gerald Ford, our long national nightmare will be over.
Election Day is a long, long day for three groups — candidates and their supporters, poll workers and municipal and county clerk offices, and the media.
(Before I resume: Kudos to Ted Ehlen for finding these YouTube clips.)
The first election radio coverage took place in 1920. The first election TV coverage took place in 1948:
The first presidential election I was alive for was 1968, but given that I was 3 and my parents strictly enforced bedtimes, I’m guessing I didn’t watch:
The first presidential election I remember was 1972. The rumor around my elementary school on the far East Side of Madison was that George McGovern was going to make us go to school on Saturdays. At least in some parts of the People’s Republic of Madison, Richard Nixon was the one that election.
The presidential election I first paid real attention to, though, was 1980. I was getting ready to stay up late, possibly all night, for the too-close-to-call election. And then, at 7:15 p.m., NBC-TV called the presidential election for Ronald Reagan. (While Reagan was taking a shower, by the way.)
Calls that early don’t happen anymore because the networks now are loath to call an election before the polls close on the West Coast, lest a particular projection result in people not bothering to vote. That famously occurred in 2000, when CBS called Florida for Al Gore while people were still voting in the Panhandle, which is one time zone behind the rest of the state. (More about that election in a few paragraphs.)
The only election I’ve been involved in as a participant, sort of, was 1984, when I worked on the successful campaign of a state representative trying to advance to the state Senate. (You’ll never guess who the candidate was.) I may have been the only person at the election party that night who was totally satisfied with the results, because my choices for state Senate and Assembly and president (the latter of whom was from a different party from the legislative candidates) all won.
That was while I was in college. College is a great time to be involved in politics because, even though you think otherwise, you don’t have much invested in the outcome. When you become an adult and have things like homes, retirement savings, etc., suddenly watching becomes less fun because you have more (literally) invested in the outcome.
The first election I ever worked in the media was 1988, when I was calling in results to the Associated Press while compiling results from the Grant County Courthouse. That night I devised the Last Precinct Game, trying to figure out which precinct would be the final precinct to report its results. The chances of being the Last Precinct increase with distance, and nearly every election afterward I called some town clerk in the middle of the night to get their election results.
My least favorite election night was 1992. Not because of the results, but because my wife and I returned from our honeymoon in Mexico very, very early on Election Day. When we left, it was 85 and partly cloudy; when we arrived at O’Hare International Airport in Chicago, it was 37 and sleeting. At 3:30 the morning after, I was sitting in the old Tri-County Press office in Cuba City, barely able to type, trying to get the newspaper done.
My highlight from 1996’s election night was going to two parties for Congressional candidates. The first party was for the winning candidate; the second was for the losing candidate, who I knew and for whom I had voted. Winning-candidate parties are more fun (particularly if the winner’s dog is allowed to drink champagne at the party).
The most memorable election night, which was much longer than one night, took place four years later. The month-long election night began with a radio appearance that had to end before midnight, because my wife went on call with the local ambulance service at midnight. Even though journalism is the opposite of math, we were able to figure out that the one state that would decide the election was Florida, or, as NBC’s Tim Russert wrote on his famous whiteboard, “FLORIDA FLORIDA FLORIDA.”
Because our oldest son was sick, I held him and paced in front of our TV and watched the results until CNN projected George W. Bush the winner of Florida, and thus the election, around 1:15 a.m.
I put our son to bed, watched about an hour longer, and then went to the kitchen to clean up, and for some reason turned the TV back on to hear NBC report that the margin was strangely tightening despite the networks’ projection. That seemed too bizarre to be true, so I turned off the TV and went to bed.
An hour later, we were in the hospital emergency room because our son started coughing. The doctors gave us the news he … had a cold. I got 90 minutes of sleep that night, which was 90 more minutes of sleep than anyone working in daily media got.
Wisconsin Public Television had a Friday-night public affairs show, “WeekEnd,” where I occasionally appeared as a pundit. The Friday after elections “WeekEnd” had what it called the Election Hangover Show. I ran into a fellow panelist who had announced he was leaving the show after the election; he pointed out that he couldn’t retire if the election wasn’t over.
The election was certainly not over. Those who follow politics learned more than they ever wanted to about Florida election law. (You age yourself if you know the meaning of the term “hanging chads.”) I was sending daily emails to readers of my business magazine making observations and predictions based on logic … predictions that were almost all wrong.
That election finally ended one month later, when our viewing of NBC’s “Law & Order” was interrupted by Tom Brokaw’s announcement of a “split decision,” which, as NBC’s Carl Stern and Dan Abrams reported, was not a split decision at all.
In my lifetime, the 1968, 1976, 1992 and 2004 elections weren’t called until Wednesday. In case you think that’s late, 2000 proves there is such a thing as later. Much later.
I have lots of choices for readers to click upon, because it’s a busy weekend.
Want my nonpartisan view of the election? Click here.
Yesterday started Movember, a month in which men should grow mustaches to increase awareness of prostate cancer (complications of which killed my grandfather). I can’t grow what I already have, but you can read my dissertation on facial hair.
Yesterday also started National Novel Writing Month, or NaNoWriMo. I wish I could find the “Novel ideas” piece I wrote on the late (and apparently wiped-from-the-Internet) Marketplace of Ideas blog. This post is about the broadcast version of fiction, specifically cop TV, and this post is about reporters in movies and TV. Keep this in mind: Fiction has to make sense.
To hear me announce a Level 3 high school football playoff game between Lancaster (wishbone) and Durand (single-wing), click here Saturday before 2 p.m.
Daylight saving time ends Sunday at 2 a.m. For my view on DST, click here.
And for those interested in how their votes may be predicted by their tastes in entertainment, peruse on:
What became known as the Cuban Missile Crisis started well before Oct. 22, 1962. Fidel Castro overthrew Fulgencio Batista at the beginning of 1959. By the end of 1960, Castro had aligned Cuba with the Soviet Union, which in essence put the Soviets on the doorstep of the southeastern U.S. (Cuba is between Florida and Puerto Rico.)
Recall last week that John F. Kennedy was a fan of Ian Fleming’s Bond. James Bond. Kennedy apparently inherited more than one plot to reinvoke the Platt Amendment, which gave the U.S. control of Cuba from 1901 until 1934. Kennedy pledged on April 12, 1961, that the U.S. would not invade Cuba, five days before the Bay of Pigs invasion, a covert attempt to, yes, invade Cuba. Seven months later, the Kennedy administration hatched Operation Mongoose, the next attempt to overthrow Castro through sabotage.
Six months after that, in late May 1962, a Soviet delegation to Cuba suggested installing nuclear missiles in Cuba. Castro announced in late July that Cuba was taking measures that would make any direct U.S. attack on Cuba the equivalent of a world war, with help, he said, from the U.S.S.R. U.S. Sen. Kenneth Keating (R–New York) announced in the Senate Aug. 31 that evidence existed of missile installations in Cuba. (Keating’s reward was to be defeated in 1964 by Robert F. Kennedy.)
The missile and launcher parts began arriving in September by freighter, but were not discovered by the Americans until a U-2 flight Oct. 14, 1962, four days after Keating announced Cuba had six missile sites. After the missiles were identified as medium-range surface-to-surface missiles, Kennedy’s morning Oct. 16 began with news of the missiles being installed in Cuba. (Interestingly, Defense Secretary Robert McNamara and National Security Advisor McGeorge Bundy knew about the missiles one day before Kennedy did.)
Kennedy formed an executive committee of some of his cabinet and others, none of whom were named Lyndon Johnson. The committee met for a week while Kennedy kept up appearances, making campaign stops outside Washington and meeting with Soviet Foreign Minister Andrei Gromyko without telling Gromyko what the U.S. knew. Kennedy then broke off his trip and returned to Washington, according to Press Secretary Pierre Salinger, because of a cold.
Truth be told, the Cold War was in danger of getting radioactively hot. Kennedy decided on a quarantine of Cuba because the chairman of the Joint Chiefs of Staff said he could not guarantee that an air strike would destroy all the missile sites. He spoke to the country a day after asking the press (and you’d never see this today) to report nothing until the speech, and after meeting with congressional leaders. (Another thing you’d never see today: No leaks.)
If Kennedy’s speech alarmed the public, their panic didn’t compare to how the public should have felt. The day of Kennedy’s speech, Castro announced a war alert. One day after the speech, the U.S. proceeded with an atomic bomb test in the South Pacific. (Perhaps in retrospect that should have been rescheduled, along with the Atlas missile test at Vandenberg Air Force Base in California.) One day later, on Oct. 24, U.S. forces were upgraded from Defense Condition 3 to DEFCON 2, their highest alert level in history. (DEFCON refers to how imminent a nuclear attack is; even though U.S. forces are in harm’s way overseas, we’re at DEFCON 5 now. The U.S. was at DEFCON 3 during the 1973 Yom Kippur war and on Sept. 11, 2001. If we ever reach DEFCON 1, you’ll probably be hearing air raid sirens, and that will not be a test.)
On Oct. 26, Khrushchev wrote to Kennedy:

I think you will understand me correctly if you are really concerned for the welfare of the world. Everyone needs peace: both capitalists, if they have not lost their reason, and all the more, communists — people who know how to value not only their own lives but, above all else, the life of nations. We communists are against any wars between states at all, and have been defending the cause of peace ever since we came into the world. We have always regarded war as a calamity, not as a game or a means for achieving particular purposes, much less as a goal in itself. Our goals are clear, and the means of achieving them is work. War is our enemy and a calamity for all nations.
In contrast to the tone of the letter, one day later, more aerial photos showed the Soviets had not slowed down on their missile site work, and were beginning to camouflage the sites. Meanwhile, Castro wrote a letter to Khrushchev:
From an analysis of the situation and the reports in our possession, I consider that the aggression is almost imminent within the next 24 or 72 hours.
There are two possible variants: the first and likeliest one is an air attack against certain targets with the limited objective of destroying them; the second, less probable although possible, is invasion. I understand that this variant would call for a large number of forces and it is, in addition, the most repulsive form of aggression, which might inhibit them. …
At this time I want to convey to you briefly my personal opinion. If the second variant is implemented and the imperialists invade Cuba with the goal of occupying it, the danger that that aggressive policy poses for humanity is so great that following that event the Soviet Union must never allow the circumstances in which the imperialists could launch the first nuclear strike against it.
I tell you this because I believe that the imperialists’ aggressiveness is extremely dangerous and if they actually carry out the brutal act of invading Cuba in violation of international law and morality, that would be the moment to eliminate such danger forever through an act of clear legitimate defense, however harsh and terrible the solution would be, for there is no other.
Khrushchev reportedly read the third paragraph as Castro’s suggesting that the Soviet Union launch a first nuclear strike.
That same day, United Nations ambassador Adlai Stevenson spoke to the UN:
That same day, ABC-TV reporter John Scali got a request to meet with Aleksandr Fomin, a top Soviet intelligence officer in Washington. (I guarantee you that doesn’t happen at weekly newspapers.) Fomin proposed to Scali that the Soviet Union would dismantle the Cuban missile bases if the U.S. publicly pledged not to invade Cuba.
A day later, on Oct. 27, a U-2 was shot down over Cuba on Soviet orders, and its pilot was killed. Another U-2, flying from Alaska, strayed too close to Soviet airspace and was nearly intercepted by Soviet fighters. Kennedy reportedly ordered an attack on Cuba to begin Monday morning, two days from then.
That same day, though, Kennedy got a letter from Khrushchev repeating Fomin’s offer. Kennedy then got another, more bellicose, letter from Khrushchev, demanding the removal of U.S. missiles from Turkey, and hinting at a possible coup attempt in the Kremlin. It must have taken some nerve to decide to respond to the first, more conciliatory letter, and pretend the second letter didn’t exist. (Kind of like pretending you didn’t read that email.) Kennedy wrote a letter pledging that the U.S. wouldn’t invade Cuba, and a day later Khrushchev announced on Moscow Radio that the Soviet Union would pull the missiles out of Cuba, without, incidentally, consulting Castro on the issue.
Khrushchev then wrote to Castro:

Our October 27 message to President Kennedy allows for the question to be settled in your favor, to defend Cuba from an invasion and prevent war from breaking out. Kennedy’s reply, which you apparently also know, offers assurances that the United States will not invade Cuba with its own forces, nor will it permit its allies to carry out an invasion. In this way the president of the United States has positively answered my messages of October 26 and 27, 1962. …
With this motive I would like to recommend to you now, at this moment of change in the crisis, not to be carried away by sentiment and to show firmness. I must say that I understand your feelings of indignation toward the aggressive actions and violations of elementary norms of international law on the part of the United States. …
Therefore, I would like to advise you in a friendly manner to show patience, firmness and even more firmness. Naturally, if there’s an invasion it will be necessary to repulse it by every means. But we mustn’t allow ourselves to be carried away by provocations, because the Pentagon’s unbridled militarists, now that the solution to the conflict is in sight and apparently in your favor, creating a guarantee against the invasion of Cuba, are trying to frustrate the agreement and provoke you into actions that could be used against you. I ask you not to give them the pretext for doing that.
Castro’s response was to issue his own conditions for resolving the crisis, including ending the U.S. trade embargo, ending U.S. support for attempts to overthrow Castro, the end of violations of Cuban naval and air space, and the return of the U.S. naval base at Guantanamo Bay to Cuba. The U.S. told Castro to go jump in Guantanamo Bay, so to speak.
The conventional wisdom, promoted heavily by the JFK propaganda machine, was that the crisis was a U.S. victory. The Associated Press pokes holes in that conventional wisdom:
CONVENTIONAL WISDOM: The crisis was a triumph of U.S. brinkmanship.
REALITY: Historians say the resolution of the standoff was really a triumph of backdoor diplomacy.
Kennedy resisted pressure from aides advising that he cede nothing to Moscow and even consider a preemptive strike. He instead engaged in intense behind-the-scenes diplomacy with the Soviets, other countries and the U.N. secretary-general.
Attorney General Robert F. Kennedy met secretly with the Soviet ambassador on Oct. 27 and conveyed an olive branch from his brother: Washington would publicly reject any invasion of Cuba, and Khrushchev would withdraw the missiles from the island. The real sweetener was that Kennedy would withdraw Jupiter nuclear missiles from U.S. installations in Turkey, near the Soviet border. It was a secret pledge known only to a handful of presidential advisers that did not emerge until years later. …
CONVENTIONAL WISDOM: Washington won, and Moscow lost.
REALITY: The United States came out a winner, but so did the Soviet Union.
The Jupiter missiles are sometimes described as nearly obsolete, but they had come online just months earlier and were fully capable of striking into the Soviet Union. Their withdrawal, along with Kennedy’s assurance he would not invade Cuba, gave Khrushchev enough to feel he had saved face and the following day he announced the imminent dismantling of offensive weapons in Cuba.
Soon after, a U.S.-Soviet presidential hotline was established and the two nations initiated discussions that led to the Limited Test Ban treaty and ultimately the nuclear Non-Proliferation Treaty.
“The major lesson is the necessity of compromise even when faced with a crisis like that,” said Robert Pastor, an international relations professor at American University and former national security adviser for Latin America under President Jimmy Carter. …
CONVENTIONAL WISDOM: It was an intelligence coup for the CIA.
REALITY: Along with being a day late on the turnaround by Soviet ships, the CIA missed several key developments that would have helped Kennedy and his advisers navigate the crisis.
The CIA learned late in the game about the ballistic missiles’ presence in Cuba, and they were already operational by the time Kennedy was informed of their existence.
The agency was also unaware of other, tactical nuclear missiles in Cuba that could have been deployed against a U.S. attack. The Soviets had even positioned nuclear-tipped missiles on a ridge above the U.S. naval base at Guantanamo Bay in preparation for an invasion. …
CONVENTIONAL WISDOM: The crisis lasted just 13 days.
REALITY: This myth has been perpetuated in part by the title of Robert F. Kennedy’s posthumous memoir, “Thirteen Days,” as well as the 2000 movie of the same name starring Kevin Costner.
Indeed it was 13 days from Oct. 16, when Kennedy was first told about the missiles, to Oct. 28, when the Soviets announced their withdrawal.
But the “October Crisis,” as it is known in Cuba, dragged on for another tense month or so in what [Cuba analyst Peter] Kornbluh dubs the “November Extension,” as Washington and Moscow haggled over details of exactly what weapons would be removed.
The Soviet Union also had problems dealing with Fidel Castro, according to a Soviet document made public this month by Svetlana Savranskaya, a Russia analyst for the National Security Archive.
Deputy Premier Anastas Mikoyan traveled to Cuba that Nov. 2 and spent 20 days in tense talks with the Cuban leader, who was angry the Soviets had reached a deal without consulting him. Castro lobbied hard but unsuccessfully to keep the tactical nuclear weapons that the Americans had not learned about.
An interesting, though not necessarily accurate, perspective about how the Soviets and Cubans saw the crisis comes from Sad and Luminous Days: Cuba’s Secret Struggles with the Superpowers after the Cuban Missile Crisis, excerpted on HistoryofCuba.com:
In reality, Kennedy was both more flexible than the early postmortems suggested and more sensitive to the Soviet need to salvage something positive from the crisis. In order to buy some time and avoid a direct confrontation with the Soviets, on October 25 he permitted a Soviet tanker (the Bucharest) to proceed through the quarantine. On October 28 the president instructed the ExComm members, as Robert Kennedy recalled, “that no interview should be given, no statement made which would claim any kind of victory. [President Kennedy] respected Khrushchev for properly determining what was in his own country’s interest and what was in the interest of mankind.” Perhaps most importantly, he offered up removal of the U.S. missiles in Turkey and was prepared to accept a public trade of the missiles if that was necessary to prevent a conflagration. The appropriate lesson that should have been drawn from this behavior, then, is that flexibility, compromise, and respect for an adversary’s calculus of its vulnerability is essential for the peaceful outcome of a crisis. Instead, the traditional view of what is needed in a crisis — toughness and inflexibility — seemingly has guided U.S. officials for decades, in confrontations from Vietnam to Iraq.
A second lesson of the crisis emerged from the plaudits given to Kennedy for the way he handled the crisis. Arthur Schlesinger captured this lesson — that crises can be managed — in his effusive observation that the world escaped a nuclear war and the United States achieved its aims because of the president’s “combination of toughness and restraint, of will, nerve, and wisdom, so brilliantly controlled, so matchlessly calibrated.” A clearer way of stating this lesson, though, might be that nuclear crises can be managed only when several unlikely conditions are present: leaders have sufficient time away from the glare of the media to learn about each other’s positions and interests; good fortune at that moment provides each of the adversaries with leaders who have adroit political skills, the political will to limit their objectives, and sufficient self-confidence to reject advice from forceful advisers; and unforeseen events and unanticipated behavior by any of the thousands of people involved does not set off an uncontrollable chain reaction.
Since then, there have been many critiques of the view that the United States can act with blithe confidence that nuclear crises can be managed, though none is more poignant than the one articulated by Robert McNamara, who originally had embraced the traditional view. He noted that, had the Soviets launched any of their nuclear weapons in 1962, “the damage to our own [country] would have been disastrous.” Then he added,
“But human beings are fallible. We know we all make mistakes. In our daily lives, mistakes are costly, but we try to learn from them. In conventional war, they cost lives, sometimes thousands of lives. But if mistakes were to affect decisions related to the use of nuclear forces, there would be no learning period. They would result in the destruction of entire nations. Therefore, I strongly believe that the indefinite combination of human fallibility and nuclear weapons carries a very high risk of a potential nuclear catastrophe.”
Notably, this was the lesson the Soviets took away from the crisis. For them, it was not the threat of force that ended the crisis. They saw U.S. threats — in the form of Gilpatric’s speech and seeming plans to invade Cuba — as the cause of the crisis. Though some in the Kremlin may have derived a lesson similar to U.S. policymakers — that superior U.S. force led to a humiliating withdrawal that they would avoid in the future by building up their military forces — the Soviet leadership believed the crisis ended because both Soviet and U.S. officials realized they were at the brink and that the crisis was threatening to destroy humankind. They did not fear only for their immediate safety and were not worried merely about losing a battle in Cuba. That kind of fear is of a personal nature, where one’s own safety is at risk. That is the kind of fear evoked by the image of leaders going eyeball to eyeball. But a leader whose decisions may result in the deaths of thousands of others may experience a second kind of fear that is not common, the fear of deciding the fate of so many others, even civilization itself. Leaders in the United States and the Soviet Union experienced the second kind of fear during the missile crisis, which in fact was what enabled them to reach a peaceful solution. …
Cuba viewed the crisis from the vantage point of a small power, for whom an invasion by conventional means would be as threatening as a nuclear confrontation would be to a superpower. The Kennedy-Khrushchev agreement seemed to place Cuba in a perilous situation. It had been transformed into a strategic U.S. target when the Soviet Union placed missiles there. But then Soviet withdrawal of the missiles in the face of U.S. pressure made Cuba even more vulnerable. The Soviet Union’s acquiescence suggested that it would not come to Cuba’s assistance were the United States to attack the island. The Soviet posture, in Cuba’s view, had created a new set of conditions that would encourage hard-liners in the United States to press for an invasion.
The Soviets did not seem to comprehend this perspective, and so they did not appreciate fully why the removal of the IL-28 bombers was so significant to the Cubans. Mikoyan tried to explain to Castro that the Soviets were leaving other weapons in Cuba that were superior to the IL-28s. But the Cuban leader saw the withdrawal of the bombers as tantamount to inviting a U.S. invasion, because it demonstrated to the United States that the Soviet Union would not stand with Cuba in the face of U.S. threats. “We realized,” Castro explained to the 1968 Central Committee, “how alone we would be in the event of a war.” In the same mode, he described the Soviet decision to remove all but 3,000 of its 42,000 military personnel from Cuba as “a freely granted concession to top off the concession of the withdrawal of the strategic missiles.”
The primary lesson Cuba drew, then, was that neither superpower could be trusted. It viewed U.S. guarantees as ploys and Soviet promises as hollow. Both countries ignored Cuba during the crisis, and Castro’s suspicion that the Soviets were treating Cuba as a bargaining chip were confirmed early in 1963 during his trip to the Soviet Union. He learned inadvertently then about the secret agreement between Kennedy and Khrushchev to exchange U.S. missiles in Turkey for Soviet ones in Cuba.
Though the United States posed the immediate menace to Cuba in 1962, Castro was concerned about Cuba’s relationship with the other superpower. Given the Soviet arrogance and lack of concern about Cuba’s fundamental rights, joining the Soviet camp as a subservient member posed a potential long-term threat to Cuban sovereignty and independence.
If that does indeed show the Soviet perspective, then this is one of those rare instances of history being written by the losers. Kennedy was assassinated a little more than a year later, Khrushchev was deposed in 1964, and the Soviet Union died a completely unmourned death in 1991. Castro has managed to delay his permanent residence in Hell by a half-century.
So was the Cuban Missile Crisis a triumph for the U.S.? Obviously it succeeded in that no nuclear war occurred and the Soviets removed the nukes from Cuba.
On the other hand, the Cold War went on for nearly three more decades before the Soviet Union and the Communist governments in the Warsaw Pact collapsed. The Soviet Union is responsible for the deaths of upwards of 70 million people, nearly 7 million of them after Joseph Stalin’s death. The Soviet Union helped North Vietnam in the Vietnam War, Nicaragua’s Sandinista government, various Marxist governments in Africa, and opponents of Israel in the Middle East. Perhaps having survived the Cuban Missile Crisis without losing their Cuban allies emboldened Soviet leadership to engage in other international adventures. And that’s all in addition to the massive Soviet arms buildup that was not even slowed down by the 1963 test-ban treaty, the Nuclear Non-Proliferation Treaty, the Strategic Arms Limitation Talks, and SALT II (which the Senate rejected).
Relativists who believe there is no moral difference between the U.S. and any other country might not care. Those who believe that, flawed as we are, the United States remains the last, best hope of democracy on this planet, might think an American response short of taking out Castro and his government (which also funded the Sandinistas and the aforementioned African Marxists) was insufficient. An invasion of Cuba eventually would have eliminated the Castros; the question is at what cost, and not just for Cubans.
The crisis was made into an ABC-TV movie, “The Missiles of October,” which I recall my parents allowed me (a student at, yes, John F. Kennedy Elementary School) to watch despite its late hour:
One interesting thing to me as a media history geek is pondering how the broadcast networks would have covered this had more of what was going on been publicly known. Kennedy’s assassination 13 months later ushered in the era of breaking TV news, but the coverage of Kennedy’s assassination (as you have seen on this very blog) was definitely learning on the fly, with multiple technical snafus, inaccurate information being reported (you didn’t know Johnson had been shot and had had a heart attack? That was news to him too), and other things that happen in writing the first draft of history live, in color and in HD.
The Cuban Missile Crisis occurred a year before Kennedy’s death; it also occurred nearly a dozen years before Watergate. It’s not that reporters weren’t professionally cynical before Watergate, but since then the guiding principle that politicians lie has been embedded into reporters’ DNA. (The embedding process gave me an ear infection, weirdly enough.) I’ve read for the past few weeks speculation about the October Surprise, or even November Surprise, of a military nature that the Obama administration is supposedly going to spring upon voters just in time to win reelection Nov. 6. Whether or not you buy that, I think it highly likely that a Cuban Missile Crisis-style crisis, sprung, as is the case right now, two weeks before national elections, would be met with a great deal of cynicism, even if the crisis were real.
A press blackout of the kind Kennedy requested before his speech 50 years ago tonight is impossible today. Moreover, you’d get claims that the president was lying, or exaggerating, or that the president was springing a “Wag the Dog” incident to take attention away from something else.
For fun reading (if you enjoy reading about nuclear war, that is), read this version, or this version, of what could have happened.
This year is the 50th anniversary of the first James Bond movie, “Dr. No.”
It’s also time for another Bond movie, “Skyfall.”
The Wall Street Journal takes an exhaustive look at the Bond half-century, including all 22 Bond movies, the villains …
Bond: “You expect me to talk?” Goldfinger: “No, Mr. Bond, I expect you to DIE.”

The third Blofeld survived to play the role of the professor/narrator in “The Rocky Horror Picture Show.” Another example of doing the Time Warp again …

Wouldn’t it have been ironic if Scaramanga had also been a vampire?
… the weapons …
… the vehicles (those last two are sometimes the same) …
… and, duh, all the Bond girls:
London’s Telegraph reports the results of a survey of the 22 Bond movie themes.
Their number one (based on measurements of radio, TV, live and online performances) matches my number one:
The author of most of the Bond novels, Ian Fleming, got a presidential boost when President John F. Kennedy told reporters he read the Bond novels. And then Dr. No hit the silver screen, and 007 has been an icon ever since. (Bond far outlived Fleming, who died in 1964, the year the second Bond movie, “From Russia with Love,” came out.)
The secret-agent genre has been popular since approximately 1907, the year Joseph Conrad published his novel The Secret Agent. The John Le Carre novels featuring George Smiley made apparent that the glamour of the secret agent’s life was vastly exaggerated, but realism was never the point.
The formula — good guy, bad guy, girl, exotic setting, gadgets — well, how could you go wrong with that? It’s interesting that neither the actors who played the villains, nor the actresses who played the babes, were usually name actors at the time. (The few instances where that wasn’t the case were probably Christopher Lee in “The Man with the Golden Gun” and Christopher Walken in “A View to a Kill,” along with Diana Rigg in “On Her Majesty’s Secret Service” and Halle Berry in “Die Another Day.”) Of those four, “Golden Gun” probably gets the best ranking, which says something about the importance of story over casting.
Other than being a sports hero or a superhero, Bond might be the most popular male fantasy figure out there. Everyone with XY chromosomes would like to be able to face a deadly situation with …
There are some great offscreen ironies in the movies, beginning with the actors who were preferred over the Bonds, or turned down the Bond role. Richard Burton rejected the role three times. Cary Grant wanted to do only one film, and James Mason wanted to do only two. Patrick McGoohan played “Danger Man,” “Secret Agent” and “The Prisoner,” but refused to play Bond because Bond was too promiscuous. Michael Caine could have been Bond for “On Her Majesty’s Secret Service,” but he didn’t want to be typecast after having played anti-Bond Harry Palmer. Mel Gibson and Christopher Lambert weren’t British. Liam Neeson didn’t want to do action movies. (So what was “Taken”?)
Sean Connery won out over Rex Harrison and David Niven (who was Fleming’s personal choice). Timothy Dalton turned the role down twice before taking it for “The Living Daylights.” Ian Ogilvy, who played Simon Templar in a TV adaptation of “The Saint,” as Roger Moore had in the 1960s, was being considered until Moore returned. Pierce Brosnan was to replace Moore in 1986, but he couldn’t get out of “Remington Steele.” Alex O’Loughlin, now playing Steve McGarrett in “Hawaii Five-O,” was considered but lost out to Daniel Craig.
The general consensus is that Connery was the best Bond. He is certainly the Bond to whom the others are compared. The additional irony is that Connery left after the first five movies, then came back for “Diamonds Are Forever,” in which he looked old. Connery was replaced by Roger Moore, who was three years … older. Moore had auditioned for Bond by playing “The Saint.”
Even though Moore had aged out of the role by “View to a Kill,” I identify more with Moore as Bond than Connery. Connery’s Bond was on ABC-TV Sunday nights. Moore’s Bond was in theaters. Two of the best soundtracks, “Live and Let Die” and “The Spy Who Loved Me,” came from Moore films.
Dalton appeared to be the Bond producers’ attempt to redo Connery’s Bond. Brosnan appeared to be the Bond producers’ attempt to redo Moore’s Bond. Craig’s Bond might be more like Fleming intended, but I’m not a fan because he lacks the urbane smoothness of the other Bonds.
“Live and Let Die” is my favorite, followed by “The Spy Who Loved Me.” The latter was the first Bond movie I saw in a theater. The former has the best combination of soundtrack …
… Bond Girl (Jane Seymour) …
… vehicle (note I didn’t write “car”) chase …
… and villain’s demise (the villain, played by Yaphet Kotto, blows up, you might say, in the end):
Today is Constitution Day, a day that should be a bigger holiday in the U.S. than it is.
Back in 1987, on the 200th Constitution Day, the publisher of the first newspaper I worked for called on us to “celebrate … cerebrate … the Constitution.” (He was … fond … of … ellipses.)
“Cerebrate” apparently is a word, given that it shows up in a web search. So, consider this some cerebration on this Constitution Day.
For those who consider the Constitution to be important (which is a distressingly small group), it is fashionable to complain that the Constitution is being shredded more than ever by whoever happens to be in power at that particular time. Which doesn’t mean that’s not an accurate statement.
Consider Article I, section 8:
The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States; but all Duties, Imposts and Excises shall be uniform throughout the United States;
To borrow Money on the credit of the United States;
To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes;
To establish an uniform Rule of Naturalization, and uniform Laws on the subject of Bankruptcies throughout the United States;
To coin Money, regulate the Value thereof, and of foreign Coin, and fix the Standard of Weights and Measures;
To provide for the Punishment of counterfeiting the Securities and current Coin of the United States;
To establish Post Offices and post Roads;
To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;
To constitute Tribunals inferior to the supreme Court;
To define and punish Piracies and Felonies committed on the high Seas, and Offences against the Law of Nations;
To declare War, grant Letters of Marque and Reprisal, and make Rules concerning Captures on Land and Water;
To raise and support Armies, but no Appropriation of Money to that Use shall be for a longer Term than two Years;
To provide and maintain a Navy;
To make Rules for the Government and Regulation of the land and naval Forces;
To provide for calling forth the Militia to execute the Laws of the Union, suppress Insurrections and repel Invasions;
To provide for organizing, arming, and disciplining, the Militia, and for governing such Part of them as may be employed in the Service of the United States, reserving to the States respectively, the Appointment of the Officers, and the Authority of training the Militia according to the discipline prescribed by Congress;
To exercise exclusive Legislation in all Cases whatsoever, over such District (not exceeding ten Miles square) as may, by Cession of particular States, and the Acceptance of Congress, become the Seat of the Government of the United States, and to exercise like Authority over all Places purchased by the Consent of the Legislature of the State in which the Same shall be, for the Erection of Forts, Magazines, Arsenals, dock-Yards, and other needful Buildings;—And
To make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States, or in any Department or Officer thereof.
The U.S. Archives says of the Constitution: “The work of many minds, the U. S. Constitution stands as a model of cooperative statesmanship and the art of compromise.”
One main reason our politics are as disgusting as they are today is the lack of “cooperative statesmanship and the art of compromise.” That is because whichever party is in power seeks to grow its power over the government. Everything detestable about politics today — lack of civility, excessive campaign spending, the commercials, etc. — exists because the stakes in elections are too high; the power of government is too great.
It’s also fashionable to say that there is less respect now than ever before for the First Amendment. Which, again, does not make that an inaccurate statement. That politics was nastier in centuries past than now (two words: Civil War) is somewhat irrelevant, given that history isn’t that important to most people, and that most people’s frame of reference is their own lifetime.
So: There is less respect now than ever before for the First Amendment specifically and the Constitution generally. President Obama has kept the Department of Homeland Security and the Transportation Security Administration essentially as the Bush administration left them. Neither has contributed to domestic security. (The term “homeland security” sounds vaguely fascist to me.) The disaster-in-progress known as ObamaCare is supposedly constitutional under the general welfare clause of the Constitution, which is a laughable premise. Obama and the Democrats favor shredding the First Amendment because they don’t like the results of recent elections (which come, they think, from campaign spending from the wrong places, instead of from voters rejecting their bilge). Closer to home, anyone who supports stoplight cameras to catch those driving through red lights ignores the fundamental constitutional right to confront your accuser in court.
I doubt you could find 1 in 10 people who understand which branch of government the Constitution makes responsible for what. Or the concept of small-r republican government, as opposed to small-d democracy. (In the latter case, 51 percent of the population could vote to imprison 49 percent of the population.) Or that the purpose of the Bill of Rights is to protect the rights of political minorities. Or that — news flash — the U.S. Supreme Court gets things wrong from time to time. (Two words: Dred Scott.)
But we have freedom of expression. Or, more accurately stated, freedom of expression (as long as you agree with my point of view). Express a contrary point of view, and you get threats of (or actual) canceled subscriptions, or the off switch, or fingers stuck in ears accompanied by enough noise to drown out that contrary point of view. Conservatives more often than not do not listen to Wisconsin Public Radio. I know this based on its online poll results, which suggest either that they don’t, or that 85 percent of Wisconsinites are left-wingers. Liberals watch MSNBC; conservatives watch Fox News.
None of this should mean to you that the Constitution is perfect. The Founding Fathers lacked the foresight to predict the latter-day followers of Karl Marx, so it doesn’t include an Economic Bill of Rights (as devised by Milton Friedman, someone who actually earned his Nobel Prize) to require balanced federal budgets, sound money, free trade and controls on government waste — I mean, spending.
Constitution Day 2012 comes just days after American soldiers and an ambassador were killed in the Middle East by adherents of a radical form of a religion that supports neither free expression nor freedom of religion. It is ironic beyond words that the Democratic Party, by refusing to consider radical Islam a threat to this nation, is siding with a radicalized religion that supports nothing the Democratic Party does — gay rights and women’s rights, for instance.
The U.S. Constitution is 225 years old today. At this rate, I doubt it, or we, will last four more years.
Today’s blog for Collector Car Appreciation Day is about cars that are not likely to become collectibles, combining two themes from my past — the 1980s, and the cars therein:
I graduated from high school and college in the ’80s. I’m a fan of ’80s music. I think ’80s fashion is generally unremarkable, except for leg warmers and really big hair. I am not a fan of ’80s cars, nor should you be.
I’ve written here before that today’s cars are unquestionably more capable than collector cars, though they lack the soul, for lack of a better term, of collector cars. Cars of the ’80s generally lacked both qualities.
Collectible Automobile found two newspaper ads of the day for those wondering what was available:
The reason I have pleasant memories of my own ’80s transportation will be revealed at the end.
Imagine being a Playboy Magazine writer and having to write this for the October 1983 issue:
OK, the thrill is back. The decade of dullness has come and gone. Cars are exciting and driving is fun again. …
There’s a new breed of machine in the land: the pocket rocket – your basic economy sedan or coupe with a massive horsepower and handling transfusion.
What cars was the writer referring to? The 90-horsepower Volkswagen Rabbit GTI, the 110-horsepower Dodge Shelby Charger, the 100-horsepower turbocharged (!) Nissan Pulsar NX, the 116-horsepower turbocharged Ford EXP Turbo, and the 150-horsepower turbocharged Pontiac Sunbird S/E.
Most of the worst examples of ’80s cars were the result of their design during or immediately after the second energy crisis in the late ’70s. Consumers wanted smaller cars, but Detroit didn’t have much ability to design small cars beyond just making them smaller than the cars they were replacing and putting weaker engines in them. Front-wheel drive was starting to appear in showrooms, but those cars mostly demonstrated that you don’t want to purchase the first iteration of a car.
Experience number one was a 1981 Chevrolet Malibu purchased as an upgrade from the car my father had been driving. The Malibu was part of the second wave of GM’s downsizing, which started in 1977 with the full-size Chevy Impala/Caprice, Pontiac Catalina/Bonneville, Oldsmobile Delta 88/98, Buick LeSabre/Electra and Cadillac de Villes. With rebodying in 1991, those cars lasted until GM (stupidly) killed its full-size rear-drive cars in 1996.
The downsizing of GM’s mid-sized cars — the Malibu, Pontiac LeMans, Olds Cutlass and Buick Century, and their personal-luxury companions, the Monte Carlo, Grand Prix, Cutlass Supreme and Regal — didn’t go as well. The genius of the 1977 B- and C-body redesign was that, even though the cars were smaller and lighter, buyers didn’t feel as though they were buying less car.
1978 Buick Century sedan. In 1978, fastback sedans were not popular.
That was not the case with GM’s A-bodies. If you bought a sedan or station wagon, the rear-seat passengers were unable to roll down their windows. GM’s designers (and I use the term loosely for this decade) removed the rear window mechanisms in order to improve rear-seat elbow room. (That is, for those whose arms fit into the indentation in the rear doors. No, mine didn’t.)
Ford also downsized, with equally bad results, such as the early ’80s Thunderbird, or, as an owner called it, “Thunderchicken”:
Chrysler was trying to bore everyone to death with the Plymouth Horizon/Dodge Omni …
… and the Plymouth Reliant/Dodge Aries:
AMC did so poorly that it imported Renaults, such as the Alliance …
… and Fuego:
Chrysler’s purchase of AMC in 1987 could almost be called a mercy killing.
Engines were dropping in power in those days largely because of air pollution regulations. One solution was to equip them with computers to control spark and various other engine functions. Unfortunately, GM’s Computer Command Control was sent into the world with insufficient refinement, resulting in the Check Engine light going on and off for no apparent reason. (Some things never change.) The automakers also hadn’t figured out that electronic fuel injection was a more precise way to send fuel into the engine than carburetors.
The automakers were starting to figure out that one way to improve fuel economy was to reduce highway RPMs through transmission and rear-end gearing. That’s why cars of today are equipped with overdrive top gear(s). Unfortunately, no one had figured out how to put overdrive gears into automatic transmissions, with the result that rear ends were equipped with tall (that is, numerically low) gear ratios. This was good for highway cruising, but not so good for acceleration from a stop with the three-speed non-overdrive automatics of the day.
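To put rough numbers on that tradeoff, here is a minimal sketch; the gear ratios and tire size are illustrative assumptions, not specs from any particular car:

import math

def engine_rpm(mph, trans_ratio, axle_ratio, tire_diameter_in):
    """Engine RPM at a given road speed for a given drivetrain."""
    inches_per_minute = mph * 5280 * 12 / 60            # road speed
    wheel_rpm = inches_per_minute / (math.pi * tire_diameter_in)
    return wheel_rpm * trans_ratio * axle_ratio

# Three-speed automatic, 1.00:1 top gear (no overdrive), 26-inch tires, 55 mph:
print(round(engine_rpm(55, 1.00, 2.41, 26)))  # "tall" 2.41 axle: ~1,714 RPM
print(round(engine_rpm(55, 1.00, 3.42, 26)))  # "short" 3.42 axle: ~2,432 RPM

# The cost shows up leaving a stoplight: overall torque multiplication in
# first gear (assumed 2.74:1) is much weaker with the tall axle.
print(round(2.74 * 2.41, 2))  # 6.6:1 overall
print(round(2.74 * 3.42, 2))  # 9.37:1 overall

An overdrive top gear (roughly 0.7:1) later let automakers pair a shorter axle for acceleration with the same low cruising RPM, which is why overdrive transmissions ended the tall-axle compromise.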
Cars of most of the decade also featured a prominent reminder of the stupidity of the Carter administration — speedometers with the top listed speed of 85 mph, and 55 mph highlighted to remind drivers that they dare not drive faster than that.
(I believe this was about the time I started to hate government, come to think of it.)
The first tip that our experience with the Malibu would not be positive came two days into our ownership, when the bratty kid up the street started throwing rocks at it, chipping the black paint. (On the other hand, maybe he knew more than we did about the car.)
Within days of the car’s arrival, we took it on what I dubbed the Rust Belt Vacation, a route that included Chicago, Gary, Ind., Detroit, Toronto, Buffalo, Cleveland and Toledo. (Not that I didn’t enjoy the vacation, because I did, but one could not plan a better tour of industrial blight than Interstate 94 east from Madison and Interstate 90 coming back.) Everything that happened with the car neatly, if accidentally, symbolized what was happening to the Rust Belt in those days.
On day 2 in Toronto, our car got rear-ended by a driver who gave a false name and address to the Toronto police. (Perhaps the hit-and-runner knew something about our car too.) The Check Engine light went on and off for no apparent reason. The air conditioner started making odd noises, which you don’t want to hear in the summertime. Enough things went wrong that I was commissioned to make a page-long list for the dealer upon our return to Madison. The last thing on that list was the front seat, which broke on the New York State Thruway, leading to the seat’s sliding back upon acceleration (which did not amuse the back-seat passengers) and sliding forward upon braking (which did not amuse the driver).
Similar experiences followed for nearly six years. (On the next vacation, to Florida and Louisiana, the car ran right every other day.) The end came right after my father woke me up one morning to have me take him to work because the Malibu had died one house down the street. I interrupted my father’s streak of, uh, colorful metaphors about the car by asking why he didn’t just get rid of the damn car. A couple months later, he did, succeeding in getting someone else to buy the piece of crap so he could buy a new Honda Accord sedan.
As bad as the GM A-body cars were, they paled in comparison to GM’s next brilliant idea, its first front-drive cars, the X-body Chevy Citation, Pontiac Phoenix, Olds Omega and Buick Skylark. Popular Mechanics explains why “X” stood for “execrable”:
These four awkwardly proportioned “X-Body” front-drivers directly replaced GM’s rear-drive compacts (of which the Chevy Nova was the most prominent) and promised a revolution in how the corporation designed and built cars. Chevy alone sold an incredible 811,540 Citations during that prolonged 1980 model year based on that promise. Unfortunately, the reality was that these four- and six-cylinder cars probably suffered more recalls and endemic problems than any other GM vehicle program.
The problem wasn’t so much the basic engineering of the X-Body cars as it was that no one apparently spent any time doing the detailed engineering that determines a car’s success. So customers complained of disintegrating transmissions, suspension systems that seemed to wobble on their own mounts, and brakes that would make the whole car shudder every time they were applied. There were so many niggling faults and a seemingly endless series of recalls that sales of the car almost tanked by its third year. Still, through 1985, a few million escaped to the public, souring hundreds of thousands on GM.
The father of a girlfriend had one; so did our next-door neighbor. My father’s bank did too, and he occasionally drove it, and that one apparently wasn’t much of a problem. My experience, though, came as a passenger when the next-door neighbor’s daughter from his first marriage briefly lived with them, resulting in a carpooling arrangement to our high school. I got in the back seat and put my hand on the B-pillar, just in time to have her slam the front door on my hand. The irony was that she had slammed it on three knuckles, and it didn’t even hurt after a couple minutes.
GM had some engine issues during the 1980s, to say the least. One was the infamous Olds diesel V-8, a modified 350 V-8 that wasn’t sufficiently redesigned for diesel fuel’s requirements. Car industry observers claim that Americans 30 years later won’t buy diesel-powered cars because of the Olds diesel. (You can buy diesel full-size pickups from Chevy, GMC, Ford and Dodge — I mean Ram — but you can buy neither a diesel compact pickup nor a diesel car of any kind from them.)
Not to be outdone, Cadillac tried to improve fuel economy with its V-8-6-4, an attempt to disable two or four cylinders of its V-8s depending on how the car was being driven. It is nearly impossible to find a working V-8-6-4 because nearly every owner had their favorite mechanic disable the controls. Popular Mechanics calls it “one more half-developed, cynically marketed technology that GM just couldn’t make work.”
Cadillac also foisted on the buying public a luxury small car, or so it thought, the Cimarron, which earned the honor, if you want to call it that, of making Time Magazine’s 50 Worst Cars list:
Everything that was wrong, venal, lazy and mendacious about GM in the 1980s was crystallized in this flagrant insult to the good name and fine customers of Cadillac. Spooked by the success of premium small cars from Mercedes-Benz, GM elected to rebadge its awful mass-market J-platform sedans, load them up with chintzy fabrics and accessories and call them “Cimarron, by Cadillac.” Wha…? Who? Seeking an even hotter circle of hell, GM priced these pseudo-caddies (with four-speed manual transmissions, no less) thousands more than their Chevy Cavalier siblings. This bit of temporizing nearly killed Cadillac and remains its biggest shame.
Time’s list also included the DeLorean DMC-12:

By the time Johnny Z. got the factory in Northern Ireland up and running — and what could possibly go wrong there? — the losses were piling up fast. The car was heavy, underpowered (the 2.8-liter Peugeot V6 never had a chance) and overpriced. And De Lorean was having a few dramas of his own, resulting in one of law enforcement’s more memorable hidden-camera tableaux: the former GM executive sitting in a hotel room with suitcases of money, discussing the supply-and-demand of nose candy. The Giugiaro-designed DMC-12 sure was cool looking, though. In August of this year, the Texas company that controls the rights to the name announced it will build a small number of new DMC-12’s. How’s that for time travel?
The ’80s were also a demonstration of the maxim that just because you can doesn’t mean you should. (See “DeLorean.”) If you purchased a 1984–1989 Corvette, this is what stared back at the driver:
The switches to the right of this cluster (I can add four letters to that term to more accurately describe it) allowed the driver to select between oil pressure and oil temperature, and between engine temperature and volts, when most drivers would prefer to be able to see all that information at once. GM instead felt drivers would be more interested in the Corvette’s fuel economy. And while this Vette had a digital trip odometer, it had a conventional regular odometer. (And a butt-ugly steering wheel.)
You’ll be shocked — shocked! — to know that fixing these instrument clusters drains your wallet quickly, because the circuit boards die and the lights fade. C4 owners can replace the digital gauges with analog gauges, which leaves the owner with the choice of originality or function. (Choose the latter.)
The Corvette was not the only car with an instrument panel of regrettable design. Late ’80s buyers of Chevy S-10s and GMC S-15s had to choose between, as Car & Driver put it, something designed by Playskool …
… or this:
How about some Fun with Fonts:
Detroit wasn’t the only creator of automotive dreck in the ’80s, as MArooned lists:
1. 1988 Suzuki Samurai – I had a friend growing up who traded in a 1983 Pontiac Trans Am, Daytona 500 25th Anniversary edition on a Suz. Worst. Trade. Ever. Not only was the Samurai notoriously underpowered, poorly engineered, and slow; it was also prone to rollover crashes at moderate speeds. Bad, bad, bad.
2. 1985 Yugo GV – this one’s masquerading as a GTI, but failing. What can you say about the Yugo other than, well, you get what you pay for? Manufactured by Soviet bloc comrades, this car was as ugly as a CZ-52 but nowhere near as reliable or durable. …
6. Volkswagen pick-up. Whoever thought of this concept should be dragged off and shot. A front-wheel drive pickup truck? WTF? Uh, guys, the idea of a pickup truck is that you put extra weight in the back. When the drive wheels are in the front, extra weight in the back means that it’s a LOT harder to move… Duh! …
10. Nissan Pulsar. All the aerodynamics of a door wedge. All the frightening raw power of a weedwhacker. Pop-up headlights that broke within weeks. The only way this car could possibly have gotten worse would have been to give it a restyle with a modular ass end. Oh, wait, that’s what they did…
Two facts about the Yugo: The engines on the first Yugos failed shortly after purchase, requiring a replacement engine that cost a few hundred dollars less than the car. A couple years ago, a caller to WTMJ radio’s Charlie Sykes reported that his parents had purchased a Yugo in the ’80s. When they contacted their bank to get a car loan, the bank told them it would make the loan, but would not accept the Yugo as collateral.
The first car I purchased was a 1988 Chevy Beretta GT.
It was a manufacturer buyback, which should have been my first warning; I assumed GM had fixed the problem in question. The problem was mysterious indications of overheating: the car’s temperature gauge would peg at H and the Low Coolant light would come on, even though the car didn’t act as if it was overheating and it wasn’t low on coolant. Two car dealers and one repair shop could not determine whether the car was in fact overheating, let alone repair the problem.
Beyond its generally cheap design, the Beretta (which I think is Italian for “lemon”) developed other problems. The car went in and out of tune to the point where, when I pushed in the clutch, the engine would quit, necessitating either restarting the car or letting the clutch back out so the car’s momentum reengaged the engine. There were also mysterious electrical gremlins. I didn’t impress my then-girlfriend (now wife) when the turn signals stopped working in the Quad Cities of Illinois and Iowa and I had to purchase a fuse, only to have it immediately pop. I wondered whether I had brought enough money for replacement fuses, but the second one didn’t pop. I also had to replace the exhaust system, the only exhaust system I’ve ever had to replace before or since. The last straw (because I concluded that making simultaneous car and car-repair payments sucked) came when, four years into its life, I had to have the front disc brakes completely replaced because they were rusting from the inside.
It’s amusing to me that any car built in the 1980s can now be licensed in Wisconsin as a collector car … not that you’d want to. Yet I still have fond vehicular memories of the ’80s. That’s because I didn’t own an ’80s car for more than a year of the ’80s. (In fact, I should have kept the car I had, 11 mpg or not.)
Almost every American knows the traditional story of July Fourth—the soaring idealism of the Declaration of Independence, the Continental Congress’s grim pledge to defy the world’s most powerful nation with their lives, their fortunes and their sacred honor. But what else about revolutionary America might help us feel closer to those founders in their tricornered hats, fancy waistcoats and tight knee-breeches?
Those Americans, it turns out, had the highest per capita income in the civilized world of their time. They also paid the lowest taxes—and they were determined to keep it that way.
By 1776, the 13 American colonies had been in existence for over 150 years—more than enough time for the talented and ambitious to acquire money and land. At the top of the South’s earners were large planters such as George Washington. In the North their incomes were more than matched by merchants such as John Hancock and Robert Morris. Next came lawyers such as John Adams, followed by tavern keepers, who often cleared 1,000 pounds a year, or about $100,000 in modern money. Doctors were paid comparatively little. Ditto for dentists, who were almost nonexistent.
In the northern colonies, according to historical research, the top 10% of the population owned about 45% of the wealth. In some parts of the South, 10% owned 75% of the wealth. But unlike most other countries, America in 1776 had a thriving middle class. Well-to-do farmers shipped tons of corn and wheat and rice to the West Indies and Europe, using the profits to send their children to private schools and buy their wives expensive gowns and carriages. Artisans—tailors, carpenters and other skilled workmen—also prospered, as did shop owners who dealt in a variety of goods. Benjamin Franklin credited his shrewd wife, Deborah, with laying the foundation of their wealth with her tradeswoman’s skills. …
America in 1776 was also a diverse nation. The first census, taken in 1790, revealed that only about 60% of the people came from England. The rest were German, Irish, Dutch, Scottish, Swedish and African. …
Another American tradition beginning to take root was female independence. The wife of Sueton Grant ran her husband’s shipping business in Newport, R.I., for more than 30 years after his death in 1744. As a teenager, Eliza Lucas began experimenting with various plants on her father’s Wappoo Creek Plantation, near Charleston, S.C. Soon she was raising indigo, which became one of the most profitable crops in the South.
Philadelphia’s Lydia Darragh, America’s first female undertaker, operated her business for almost a decade before the Revolutionary War began. During the war she was one of George Washington’s most successful spies.
“Domestic felicity” was considered vital to everyone’s peace of mind, and although divorce was legal, it was also rare. Although money played a part in marriages among the more affluent, family life was often full of affection. The love letters Col. Thomas Jones of Virginia wrote to his wife began “My Dearest Life.” …
By 1776, the Atlantic Ocean had become what one historian has called “an information highway” across which poured books, magazines, newspapers and copies of the debates in Parliament. The latter were read by John Adams, George Washington, Robert Morris and other politically minded men. They concluded that the British were planning to tax the Americans into the kind of humiliation that Great Britain had inflicted on Ireland.
As eight years of war engulfed the continent, not a few of the rebels saw that the Revolution was a spiritual enterprise that would never really end. Dr. Benjamin Rush, a Pennsylvanian who signed the Declaration of Independence, wrote that the war was only the first step in the Revolution’s destiny to transform America and the world.
History confirmed his intuition. In the next hundred years, other nations and peoples would issue 200 similar declarations.
Paul Brandus, author of the West Wing Report, asks what the Founding Fathers — specifically Thomas Jefferson — would think of what they wrought:
Thomas Jefferson’s glorious sentence from his Declaration of Independence — arguably the most influential sentence in the history of the English language — holds true to this day, and remains a beacon to all who cherish or yearn for the human rights he espoused. Abraham Lincoln considered that specific passage one of the most important things he ever read, and regarded it as the bedrock of his political philosophy. …
The Economist Intelligence Unit’s Democracy Index — which bases its ratings on civil liberties, conduct of elections, media freedom, public opinion, functioning government, corruption, and stability — ranks the United States the world’s 19th best democracy, down from 17th in 2010. It says:
“U.S. democracy has been adversely affected by a deepening of the polarization of the political scene and political brinkmanship and paralysis. …
“The U.S. … remain(s) at the bottom end of the full democracy category. There has been a rise in protest movements. Problems in the functioning of government are more prominent.”
Specifically, on a scale of 0–10, we get a 9.17 for our electoral process and pluralism, 8.53 for civil liberties, 8.13 for political culture, 7.50 for functioning government, and 7.22 for political participation. Room for improvement, indeed.
Brandus suggests Scandinavia as more democratic than the U.S. Which may be politically true, but ignores the importance of economic freedom. The Scandinavian countries’ tax rates meet no legitimate economist’s definition of economic freedom.
About which …
The Index of Economic Freedom, published annually by The Wall Street Journal and the conservative Heritage Foundation, also shows some erosion. On a scale of 0–100 (100 is most free), the United States gets a 76.3. That’s down from 77.8 in 2011 and 81 in 2008, but it still puts the U.S. in the top 10 most economically free countries. Here’s how Heritage and the Journal break the data down, and how it compares with 2011:
Rule of law
· Property rights: 85.0 (no change)
· Freedom from corruption: 71.0 (worsened)
Limited government
· Government spending: 46.7 (worsened)
· Fiscal freedom: 69.8 (improved)
Regulatory efficiency
· Business freedom: 91.1 (improved)
· Labor freedom: 95.8 (improved)
· Monetary freedom: 77.2 (worsened)
Open markets
· Trade freedom: 86.4 (no change)
· Investment freedom: 70.0 (worsened)
· Financial freedom: 70.0 (no change)
Heritage and the Journal blast what they call “government intervention,” regulations, growing spending at all levels of government, and growing uncertainty in the private sector, and also warn of “fading confidence in the government’s determination to promote or even sustain open markets.”
Nice description of the Obama administration in that last paragraph.
Brandus wraps up with a subject of occupational interest:
Meantime, what of one of Jefferson’s most cherished freedoms: that of the press? “Were it left to me to decide whether we should have a government without newspapers, or newspapers without a government, I should not hesitate a moment to prefer the latter,” he famously said.
Alas, on this point, the U.S. has fallen sharply. Reporters Without Borders, in its annual Press Freedom Index, says America has plunged to 47th in the world, down from 20th a year ago. It blames the crackdown on journalists covering the ongoing Occupy movement around the country. Where are press freedoms greatest? Again, try Scandinavia: Finland and Norway top the list. …
As we pursue our own happiness today, it’s important to remember how perishable the freedoms we often seem to take for granted really are. “The natural progress of things,” Jefferson observed, “is for liberty to yield and government to gain ground.”
What’s ironic about that last paragraph is that many local-level Democratic parties call their annual dinners/fundraisers Jefferson–Jackson Day. Neither Jefferson nor the personally violent Andrew Jackson seems an appropriate symbol for the Democratic Party of today, let alone for the freedom-squashing Obama administration.
M.D. Kittle of the Wisconsin Reporter channels another Founding Father:
I imagine John Adams in that stuffy room, amid the hot summer stink of Philadelphia, far away from his one true love — writing to her as he could find the time.
As this Revolutionary War slogged on, as one last chance at peace with the mother country died, Adams wrote these words to his Abigail.
“Your Description of the Distresses of the worthy Inhabitants of Boston, and the other Sea Port Towns, is enough to melt a Heart of Stone,” Adams penned on July 7, 1775, as he served in a Second Continental Congress still not entirely sold on the idea of American independence from the empire. …
“Our consolation must be this, my dear, that Cities may be rebuilt, and a People reduced to Poverty, may acquire fresh Property,” he continued, adding a now-famous line that stands like a beacon for any and all liberty-loving people.
“But a Constitution of Government once changed from Freedom, can never be restored. Liberty once lost is lost forever.
“When the People once surrender their share in the Legislature, and their Right of defending the Limitations upon the Government, and of resisting every Encroachment upon them, they cannot regain it …” …
The signers of that bold declaration, who pledged to each other their lives, fortunes and sacred honor, some giving every measure of that sacred vow of independence, must be appalled by the level of dependence their progeny have on their government.
From nearly $80 billion in food stamps distributed each year to billions of dollars more handed out in corporate welfare to an estimated $1 trillion-plus marked for the U.S. Supreme Court-blessed Patient Protection and Affordable Care Act, commonly known as Obamacare, the U.S. citizen has become more dependent on government than any Tory ever was. The better comparison may be government as dealer, citizen as dope addict.
While caring for its weakest – its poor, destitute, sick and aged – is the mark of a good and gracious society, taxing citizens to pay for every ill borne by society is the stain of a foolish and failing government.
Hence, a federal debt rapidly approaching $16 trillion.
There is an arrogance that is often mistaken for generosity, I think, at the core of the government that attempts to be all things to all bodies, which, in return, raises generations of government dependents. …
But how does a nation borne on the principles and statutes of liberty save itself from government dependence?
If only John Adams and his gang of revolutionaries were here to answer that big question.
Jefferson, the first Democratic (then Democratic–Republican) president, also said, “I hold it that a little rebellion now and then is a good thing, and as necessary in the political world as storms in the physical. … It is a medicine necessary for the sound health of government.” We have four months and one day to create our own little rebellion, lest the Obama administration, unfettered by the necessities of reelection, get the chance to do what it has really wanted to do ever since 2009.