The rationalization starts as soon as the act is committed: we can’t ascribe this to any particular religion; the person is just insane.
Well, not all evil is madness.
It is my belief that a rational human being does feel the need to do violence, to meet force with force to end the threat. What we practice instead is the willful suppression of the “fight” part of the “fight or flight” natural human instinct, because political correctness tells us that aggression is bad and violence is never the answer.
Well, history has proven time and time again that violence is often the answer.
In the intro to the movie “Lone Survivor,” clips from the training of baby SEAL candidates are shown and something that one instructor said has stuck in my mind. Paraphrased, he said:
“Take all that pain, that cold, that fatigue you feel and turn it into righteous aggression! SEALs don’t quit!”
General George S. Patton is reported to have said to his men that they must forget all the high minded rules of war that were written in the parlors of the moralizing politicians. He said that the enemy doesn’t care about our rules and he will fight according to his own rules (or absence of them). He said that the only way to win was to fight according to the enemy’s rules — or even dirtier.
In our own minds, we know this to be true. On the battlefield, holding true to any rule that renders you defenseless or restricts your ability to mount an offensive will get you killed.
Somehow I doubt that an ISIS jihadi will praise you for your adherence to high moral standards after he cuts your head off.
Patton was right. War is about victory on the battlefield.
So-called “evolved” Western societies seem to be in a race to their own doom. The Democrats’ so-called “terror report” is a perfect example. Combine that with putative presidential candidate Hillary Clinton saying that we must “empathize” with our enemy, and an idiotic Democratic representative saying that we owed al Qaeda an apology for waterboarding their people. And here in America, every time a criminal uses a gun, the cries go out that law-abiding citizens must give up their Second Amendment rights as a result.
The radical Islamist terrorists and their wannabe proxies have defined that battlefield as our skies (9/11), our streets (Boston Marathon), our workplaces (Moore, Oklahoma), our shops (Lindt in Sydney) and our schools (the Taliban just attacked a school in Pakistan).
It is they who have brought this war to our living rooms.
Patton’s admonition to his men that “No bastard ever won a war by dying for his country. He won it by making the other poor dumb bastard die for his country.” is clear — to protect civilization, there are times when men must act in uncivilized ways.
One must wonder when (or if) the civilized man will understand that the time has come to put down the flowers, stop building emotional memorials and pick up a weapon and start building stone bulwarks and set to defeating the uncivilized man.
-
Daylight Saving Time ends Sunday at 2 a.m.
Many people see DST as a pain. Some people apparently even see time zones as too much work. Recall the advocacy of “universal time,” which would base all clocks around the world on Greenwich Mean Time, regardless of what the sun tells you. We Central Time Zone residents would work not from 8 to 5, but from 14:00 to 23:00 (because eliminating time zones would also eliminate DST, leaving us a flat six hours behind Greenwich year-round).
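The offset arithmetic is easy to sketch. A minimal Python example (the date is arbitrary, chosen only for illustration) converting an 8-to-5 workday on a Central Standard Time clock to its Greenwich equivalents:

```python
from datetime import datetime, timezone, timedelta

# Central Standard Time is UTC-6; under a single "universal time"
# there is no DST, so the offset never changes.
CST = timezone(timedelta(hours=-6))

def to_universal(hour_local: int) -> int:
    """Return the GMT/UTC hour for a given hour on a CST clock."""
    local = datetime(2014, 11, 3, hour_local, 0, tzinfo=CST)
    return local.astimezone(timezone.utc).hour

# An 8-to-5 workday on the old clock...
print(to_universal(8), to_universal(17))  # → 14 23
```

Adding the six-hour offset is all that happens, which is exactly why universal-time advocates call time zones needless bookkeeping, and why everyone else calls 23:00 quitting time absurd.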
If you prefer the old days, Slate should disabuse you of that notion:

Between the advent of railway travel and telegraphic communication in the middle of the 19th century and the establishment of standardized time zones in the 1880s, newly mobile people trying to do business from a distance were confounded by a wide array of local times. Charts like this one, which were intended to help, now show us how utterly confusing everything was.
The chart, which is from the 1874 version of Johnson’s New Illustrated Family Atlas (published in New York), uses Washington, D.C., as its standard. The clocks radiating outward are arranged in concentric circles, with Latin American cities on the interior, large European and Asian cities next, and American and Canadian cities and towns occupying the outer rings.
Because locations established time based on local readings of the sun and the phases of the moon, even towns as close together as Galveston and Austin, Texas, (located at about the 8 o’clock location on this chart) ran an annoying 11 minutes apart.
Look on the left of this calendar, and you’ll notice that if it’s noon in Washington, it’s 10:53 a.m. in Des Moines, 10:55 a.m. in St. Paul, 11:02 a.m. in Iowa City and Quincy, Ill., 11:08 a.m. in St. Louis, 11:09 a.m. in Springfield, Ill., 11:10 a.m. in Madison, 11:12 a.m. in Janesville, 11:16 a.m. in Milwaukee, and 11:17 a.m. in Chicago.
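Those odd 11-minute gaps fall straight out of the geometry: the sun crosses 360 degrees of longitude in 24 hours, so local sun time shifts by four minutes per degree. A small sketch (the city longitudes are approximate, assumed here for illustration):

```python
# Local mean (sun) time differs by 4 minutes per degree of longitude:
# 1440 minutes in a day / 360 degrees = 4 minutes per degree.
MINUTES_PER_DEGREE = 1440 / 360

def solar_offset_minutes(lon_west_a: float, lon_west_b: float) -> float:
    """Minutes by which town A's sun clock trails town B's
    (both longitudes given in degrees west; A is farther west)."""
    return (lon_west_a - lon_west_b) * MINUTES_PER_DEGREE

# Approximate longitudes: Austin ~97.74°W, Galveston ~94.80°W
print(round(solar_offset_minutes(97.74, 94.80), 1))  # → 11.8
```

That lands right around the 11-minute gap the 1874 chart shows for Galveston and Austin, and it is why every railroad town needed its own column on the chart until standard time zones arrived.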
-

Outside the Oakwood Lounge (R.I.P.), Lancaster, Wis., Oct. 24, 1992. Yes, there are two perms in this photo. Three years ago I wrote this on the occasion of our 19th wedding anniversary.
A few things have changed, like jobs and address. We also have three teenagers in the house, although only one is a chronological teenager. Other than that, you can probably add three years to everything listed in there.
Here’s what has not changed: I still love my wife.
-
As a kindergarten student a half century ago, I read the same book over and over again, nearly every day. My favorite in our classroom’s little library, it had a title imbued with confidence and promise: You Will Go to the Moon.
Needless to say, I have not done so.
So I sympathize with science-fiction writer Neal Stephenson and venture-capitalist Peter Thiel, whose new books lament the demise of grand 20th-century dreams and the optimistic culture they expressed. “I worry that our inability to match the achievements of the 1960s space program might be symptomatic of a general failure of our society to get big things done,” writes Stephenson in the preface to “Hieroglyph,” a science-fiction anthology hoping “to rekindle grand technological ambitions through the power of storytelling.” In “Zero to One,” a book mostly about startups, Thiel makes the argument that “we have to find our way back to a definite future, and the Western world needs nothing short of a cultural revolution to do it.”
Their concerns about technological malaise are reasonable. As I’ve written here before, “political barriers have in fact made it harder to innovate with atoms than with bits.” It’s depressing to see just about any positive development — a dramatic decline in the need for blood transfusions, for instance — greeted with gloom. (“The trend is wreaking havoc in the blood bank business, forcing a wave of mergers and job cutbacks.”)
When a report about how ground-penetrating radar has mapped huge undiscovered areas of Stonehenge immediately provokes a comment wondering whether the radar endangers the landscape, something has gone seriously wrong with our sense of wonder. “There’s an automatic perception … that everything’s dangerous,” Stephenson mused at a recent event in Los Angeles, citing the Stonehenge example, “and that there’s some cosmic balance at work–that if there’s an advance somewhere it must have a terrible cost. That’s a hard thing to fix, but I think that if we had some more interesting Apollo-like projects or big successes we could point to it might lift that burden that is on people’s minds.”
He’s identified a real problem, but his remedy — “more interesting Apollo-like projects” — won’t work. If it did, the baby boomers who grew up with Apollo wouldn’t be so down on progress.
Besides, we have plenty of big projects. The human genome has been sequenced. Enormous libraries of books and collections of paintings and drawings have been scanned and made searchable online. James Turrell is making great monumental art in the Arizona desert. Three — three! — billionaires are running their own space programs. Space is so popular among his peers that Bill Gates, whose own modest goals run to conquering malaria and other tropical scourges, finds himself telling interviewers that “it’s not an area that I’ll be putting money into.” If there’s public malaise about progress, it isn’t because nobody is doing anything bold.
The dystopian science fiction Stephenson’s Project Hieroglyph aims to counter isn’t the cause of our cultural malaise. It’s a symptom. The obstacle to more technological ambitions isn’t our idea of the future. It’s how we think about the present and the past.
Americans in the mid-20th century were not in fact sanguine about the future. Anxieties about the march of technology were common. In February 1961, a statistics-filled Time magazine feature warned that automation was wiping out jobs and, worse, “What worries many job experts more is that automation may prevent the economy from creating enough new jobs.” At least nine episodes of the original “Star Trek” series were about threatening or out-of-control computers. (Still others involved menacing androids or ominous artificial intelligences whose exact nature was vaguely defined.) Movies such as “Colossus: The Forbin Project” (1970) and, of course, “2001: A Space Odyssey” (1968) picked up the scary-computer theme. Nor was the space program as universally popular as we nostalgically imagine. Americans liked the moon race, but only in July 1969 — the month of the moon landing — did a majority deem the Apollo program “worth the cost.”
Meanwhile, back in those good old days people were already voicing worries about technological stagnation that sound a lot like Stephenson’s and Thiel’s. “Before 1913,” Peter Drucker wrote in 1967, economic development “was taken for granted, but since then we’ve apparently gone sterile. And we don’t know how to start it up.” He noted that “with the exception of the plastics industry, the main engines of growth in the past 50 years were already mature or rapidly maturing industries, based on well-known technologies, back in 1913.”
Only information technology, Drucker suggested, might reverse the prospects of stagnation. “One cannot predict what it will lead to, and where and when and how,” he wrote of the computer. “A change as tremendous as this doesn’t just satisfy existing wants, or replace things we are now doing. It creates new wants and makes new things possible.”
We’ve lived to enjoy those unpredictable new wants and possibilities. But hacker hero though he is, Stephenson has begun treating them as incredibly dull and inconsequential compared to the Hoover Dam. The most visible technological progress of our times is, he now seems to think, #borrring. Worse, the Internet and its consequences are supposedly distractions from the important work of building great physical objects.
“What I’m sort of hoping,” Stephenson said at a Technology Review forum in 2012, “is that if we look back on this era 100 years from now, we’ll say, well, it was a very actively inventing and creating society and then the Internet happened and everything got put on hold for a generation while the Internet was kind of absorbed and we figured out what to do with it–and then we got going again and got caught up on the things we failed to do while we were Facebooking.”
This attitude is self-defeating. We already have plenty of critics telling us that our creativity and effort are for naught, our pleasures and desires absurd, our civilization wicked and destructive. We live in a culture where condemnatory phrases like “the ecosystems we’ve broken” are throwaway lines, and the top-grossing movie of all time is a heavy-handed science-fiction parable about the evils of technology and exploration. We don’t need Neal Stephenson piling on.
The reason mid-20th-century Americans were optimistic about the future wasn’t that science-fiction writers told cool stories about space travel. Science-fiction glamour in fact worked on only a small slice of the public. (Nobody else in my kindergarten was grabbing for “You Will Go to the Moon.”) People believed the future would be better than the present because they believed the present was better than the past. They constantly heard stories — not speculative, futuristic stories but news stories, fashion stories, real-estate stories, medical stories — that reinforced this belief. They remembered epidemics and rejoiced in vaccines and wonder drugs. They looked back on crowded urban walk-ups and appreciated neat suburban homes. They recalled ironing on sweaty summer days and celebrated air conditioning and wash-and-wear fabrics. They marveled at tiny transistor radios and dreamed of going on airplane trips.
Then the stories changed. For good reasons and bad, more and more Americans stopped believing in what they had once viewed as progress. Plastics became a punch line, convenience foods ridiculous, nature the standard of all things right and good. Freeways destroyed neighborhoods. Urban renewal replaced them with forbidding Brutalist plazas. New subdivisions represented a threat to the landscape rather than the promise of the good life. Too-fast airplanes produced window-rattling sonic booms. Insecticides harmed eagles’ eggs. Exploration meant conquest and brutal exploitation. Little by little, the number of modern offenses grew until we found ourselves in a 21st century where some of the most educated, affluent and culturally influential people in the country are terrified of vaccinating their children. Nothing good, they’ve come to think, comes from disturbing nature.
Optimistic science fiction does not create a belief in technological progress. It reflects it. Stephenson and Thiel are making a big mistake when they propose a vision of the good future that dismisses the everyday pleasures of ordinary people — that, in short, leaves out consumers. This perspective is particularly odd coming from a fiction writer and a businessman whose professional work demonstrates a keen sense of what people will buy. People are justifiably wary of grandiose plans that impose major costs on those who won’t directly reap their benefits. They’re even more wary if they believe that the changes of the past have brought only hardship and destruction. If Stephenson wants to make people more optimistic about the future and more likely to undertake difficult technological challenges, he shouldn’t waste his time writing short stories about two-kilometer-high towers. He should find a way to tell tales about past transformations that don’t require 2,000-plus pages. (I say this as someone who has enjoyed his massive Baroque Cycle of 17th-century historical fiction.)
Storytelling does have the potential to rekindle an ideal of progress. The trick is not to confuse pessimism with sophistication or, conversely, to demand that optimism be naive. The past, like the present and the future, was made by complicated and imperfect people. Recapturing a sense of optimism requires stories that accept the ambiguities of history — and of life — while recognizing genuine improvements.
-
With the Packers hosting their neighbors to the immediate west tonight in a rivalry that has featured these excellent moments …
… two news items seem appropriate, the first particularly noteworthy for those of us of Nordic descent …
… from National Geographic:
The Vikings gave no quarter when they stormed the city of Nantes, in what is now western France, in June 843—not even to the monks barricaded in the city’s cathedral. “The heathens mowed down the entire multitude of priests, clerics, and laity,” according to one witness account. Among the slain, allegedly killed while celebrating the Mass, was a bishop who later was granted sainthood.
To modern readers the attack seems monstrous, even by the standards of medieval warfare. But the witness account contains more than a touch of hyperbole, writes Anders Winroth, a Yale history professor and author of the book The Age of the Vikings, a sweeping new survey. What’s more, he says, such exaggeration was often a feature of European writings about the Vikings.
When the account of the Nantes attack is scrutinized, “a more reasonable image emerges,” he writes. After stating that the Vikings had killed the “entire multitude,” for instance, the witness contradicts himself by noting that some of the clerics were taken into captivity. And there were enough people left—among the “many who survived the massacre”—to pay ransom to get prisoners back.
In short, aside from ignoring the taboo against harming monks and priests, the Vikings acted not much differently from other European warriors of the period, Winroth argues.
In 782, for instance, Charlemagne, now heralded as the original unifier of Europe, beheaded 4,500 Saxon captives on a single day. “The Vikings never got close to that level of efficiency,” Winroth says, drily.
Just how bad were the Vikings?
Winroth is among the scholars who believe the Vikings were no more bloodthirsty than other warriors of the period. But they suffered from bad public relations—in part because they attacked a society more literate than their own, and therefore most accounts of them come from their victims. Moreover, because the Vikings were pagan, they played into a Christian story line that cast them as a devilish, malign, outside force.
“There is this general idea of the Vikings as being exciting and other, as something that we can’t understand from our point of view—which is simply continuing the story line of the victims in their own time,” Winroth says. “One starts to think of them in storybook terms, which is deeply unfair.”
In reality, he proposes, “the Vikings were sort of free-market entrepreneurs.”
To be sure, scholars have for decades been stressing aspects of Viking life beyond the warlike, pointing to the craftsmanship of the Norse (to use the term that refers more generally to Scandinavians), their trade with the Arab world, their settlements in Greenland and Newfoundland, the ingenuity of their ships, and the fact that the majority of them stayed behind during raids.
But Winroth wants to put the final nail in the coffin of the notion that the Vikings were the “Nazis of the North,” as an article by British journalist Patrick Cockburn argued last April. Viking atrocities were “the equivalent of those carried out by SS divisions invading Poland 75 years ago,” Cockburn wrote. …
Rather than being primed for battle by an irrational love of mayhem, Vikings went raiding mainly for pragmatic reasons, Winroth contends—namely, to build personal fortunes and enhance the power of their chieftains. As evidence Winroth enumerates cases in which Viking leaders negotiated for payment, or tried to.
For example, before the Battle of Maldon in England, a Viking messenger landed and cried out to 3,000 or more assembled Saxon soldiers: “It is better for you that you pay off this spear-fight with tribute … Nor have we any need to kill each other.” The English chose to fight, and were defeated. Like anyone else, the Vikings would rather win by negotiation than risk a loss, Winroth says. …
The Norse were prodigious traders, selling furs, walrus tusks, and slaves to Arabs in the East. Winroth goes so far as to argue that Vikings provided much needed monetary stimulus to western Europe at a crucial time. Norse trade led to an influx of Arabic dirhams, or coins, which helped smooth the transition to an economy of exchange instead of barter.
Yet even among scholars who attempt to see things from the Vikings’ perspective, disagreements persist about the nature of Viking violence. Robert Ferguson, for example, doesn’t downplay its ferocity, but he characterizes it as symbolic and defensive, a form of “asymmetric warfare.”
In the year 806, for example, the slaughter of 68 monks on the Isle of Iona, off the coast of Scotland, sowed terror in Europe. Ferguson suggests that the move was designed to convince Charlemagne and others that it would be very costly to expand Christianity into Scandinavia by force. The Vikings “were fighting to defend their way of life,” Ferguson says.
Tonight’s game is, of course, a sellout, which means it will be on TV in the Packers’ home markets, Green Bay and Milwaukee. The NFL prohibits home-market telecasts if games aren’t sold out 48 hours before kickoff. There were two games blacked out in 2013 — almost including the Packers’ playoff game against San Francisco, though the deadline was barely met — and 15 in 2012.
About which, the Washington Post reports:
Federal regulators on Thursday sacked the longstanding sports “blackout” rule that prevents certain games from being shown on TV if attendance to the live event is poor.
In a bipartisan vote, the Federal Communications Commission unanimously agreed to strike down the much-criticized 40-year-old policy. Under the blackout rule, games that failed to sell enough tickets could not be shown on free, over-the-air television in the home team’s own local market.
The FCC said the rule mainly benefits team owners and sports leagues, such as the NFL, by driving ticket sales but it does not serve consumers.
“For 40 years, these teams have hidden behind a rule of the FCC,” said FCC Chairman Tom Wheeler. “No more. Everyone needs to be aware who allows blackouts to exist, and it is not the Federal Communications Commission.”
The rule was initially put in place in 1975 amid concerns of flagging attendance at live sports games. At the time, almost 60 percent of NFL games were blacked out on broadcast TV because not enough fans were showing up at stadiums. Today, that figure stands at less than one percent, and professional football is so popular on TV that programming contracts contribute “a substantial majority of the NFL’s revenues,” said FCC Commissioner Ajit Pai.
The NFL has warned that ending the blackout rule would hurt consumers by encouraging leagues to move their programming exclusively to pay TV. But Pai pushed back against those claims Tuesday, saying teams can’t afford not to air their games on broadcast TV.
“By moving games to pay TV,” said Pai, “the NFL would be cutting off its nose to spite its face.”
The vote doesn’t mean that blackouts are going away immediately. The NFL still has blackout rules written into individual contracts with regional sports broadcasters. In general, these deals last until the beginning of the next decade. The FCC’s rule, which was struck down, essentially served as a stamp of approval for the NFL’s policy.
In new contracts, the NFL would have to renew those blackout provisions over the objections of the federal government. On Tuesday, the FCC’s message was clear: If the NFL chooses that path, it will be the only one bearing the brunt of consumer ire, particularly from low-income Americans and the disabled who can’t make it or have a harder time getting to the games.
The cable industry welcomed the 5-0 vote.
“We commend the commission’s unanimous decision to eliminate the antiquated sports blackout rule,” said the National Cable and Telecommunications Association. “As the video marketplace continues to evolve and offers consumers more competition and a growing variety of new services, we encourage the FCC to continue its examination of outdated rules that no longer make sense.” …
The NFL indicated Tuesday that it had no immediate plans to change how it broadcasts games.
“NFL teams have made significant efforts in recent years to minimize blackouts,” the NFL said in a statement. “The NFL is the only sports league that televises every one of its games on free, over-the-air television. The FCC’s decision will not change that commitment for the foreseeable future.”
The next to last sentence is not correct. The Packers host Atlanta Dec. 8, in a game that will be televised on ESPN. It will be on “free, over-the-air television” in Green Bay and Milwaukee, but not anywhere else in Wisconsin. If you don’t get ESPN on cable or satellite, and you can’t get channel 2 in Green Bay or channel 12 in Milwaukee, you won’t be watching.
The point here, which the Post finally got to, is that the FCC’s ruling has no weight given that the networks that carry the NFL have agreed to the blackout provision as part of their contracts.
What makes the blackout issue different now from the past is that the NFL has been taking a public relations beating (pun not intended) recently, thanks to the misbehavior of some of its players (which is not a new thing) and the NFL’s perceived mishandling of the issue. It’s impossible to say what the NFL’s public image will be in the early 2020s, when the NFL will be negotiating its next TV contracts after the current contracts expire after the 2021 season. That fact and the networks’ agreeing with the NFL to the blackout rules make the FCC’s decision less news than it may appear.
-
This year is the 25th anniversary of Hurricane Hugo, which arrived in Georgia the same night as a nationally televised football game at Georgia Southern University:
I’ve announced games during rain (while we announcers were outside), snow, heat, cold and wind. Two years ago, I announced a three-day-long baseball game that started on Wednesday, included a tornado warning, and then was postponed due to lightning. Two days later, after the makeup date itself had to be rescheduled due to pools of water on the field, the game ended during, of course, a severe thunderstorm watch.
Last year, our second game of the season ended up taking four hours because of a 45-minute halftime lightning delay. We arrived at the stadium around 6 p.m., and left at 11:10 p.m., having announced a game that was literally the length of a Super Bowl.
A hurricane would be a first, though. Hurricanes don’t get up this far north, of course, though the remnants of hurricanes can, as low-pressure areas with geographically appropriate inclement weather.
-
This week apparently includes two anniversaries, according to the American Association of State Highway and Transportation Officials:
In 1924, AASHO recommended the adoption of uniform sign practices based to a large extent upon the action of the Mississippi Valley Conference, but distinguished colors for “luminous signs,” such as yellow for caution, red for “stop,” and green for safety. It was, however, some time before the “reflectorized” sign came into extensive usage, awaiting the development of economic and effective materials.
And, then…
In 1957, the Chief Administrative Officers of the several Member Departments were meeting in LaSalle, Illinois, on August 14, to attend a policy meeting dealing with the AASHO Road Test Project (more about that in a later post!) The occasion was used for these Administrators to view suggested route marker designs on a section of country road near the Road Test Project. They were viewed under night and day-time conditions, and after some discussion it was decided to adopt a marker that combined certain features of designs submitted by the States of Texas and Missouri. The Committee on Administration thereupon by unanimous vote adopted the official marker which is used on the routes of the National System of Interstate and Defense Highways to this day.
Road signs have been another of my odd interests over time. (Which means I must have an interest in design, though there’s nothing I really can design other than meals.) Starting with a trip to Detroit (the Ford plant where the Mustang II was being made, the Kellogg’s cereal plant in Battle Creek — which you can’t tour anymore — and Greenfield Village, a must-see for gearheads), I would draw road signs as Dad drove the Caprice along whatever Interstate highway we were on.
The graphic is cool because it depicts the original design of state highway signs in all their variety. Wisconsin, as you know, was the first state to number its state highways (though county highways have letters, as you also know), though the first design wasn’t the triangle behind a square; it was …

… a triangle. Whoever decided on that made a distinctive choice, though not a very usable design, which is how we got the square-over-triangle sign, which is at least original compared with circles (Iowa) or squares (other states).
The graphic shows that many states have, or had, state highway signs that either followed, or at least included, their state’s shape. (Including Minnesota.) I’ve always preferred that, though Wisconsin’s shape doesn’t exactly lend itself to such a design, except possibly in outline. (And why the Division of Motor Vehicles doesn’t use a Wisconsin shape in place of the dash on non-personalized license plates is something I don’t understand either.)
Speaking of Interstate highways …

Last week was the 56th anniversary of the opening of the first segment of Wisconsin’s first Interstate highway, I–94 between what now is Wisconsin 164/Waukesha County Y/Waukesha County JJ and Waukesha County SS.

Seven years later, on Oct. 27, 1965, Gov. Warren Knowles celebrated my impending five-month birthday by opening the last segment of I–94 between Madison and Milwaukee. In the pre-Interstate days, getting from Madison to Milwaukee required going on either Wisconsin 30 (pretty much the current I–94 route), or U.S. 18, which meant going through Cambridge, Jefferson, Oconomowoc and Waukesha to get to Milwaukee.
One year after the first part of I–94 opened, the first part of Interstate 90 opened, from the Illinois Tollway just south of the Wisconsin–Illinois state line to Janesville. The Interstate east of Madison (I–90 from U.S. 12/18 to I–94, and I–90/94 northward to the Dells and, eventually, Tomah) opened in 1961.
From the 1940s, when what became the Interstate Highway System began to be mapped out, I–94 was always intended to be a Twin Cities-to-Eau Claire-to-Madison-to-Milwaukee-to-Chicago route. I–90 was intended to be a Madison-to-Beloit route, but west from Madison things changed.

Notice that the freeway west from Milwaukee goes straight west. What became I–90 was originally supposed to follow U.S. 18’s approximate route into Iowa. Instead …

… I–90 went north to link to La Crosse and Rochester, Minn., saving money as well because of using the I–94 routing to Tomah. The original I–90 routing, or a proposal to have I–90 follow U.S. 14 from La Crosse to Madison via what now is the South Beltline, could have changed western and southwestern Wisconsin development substantially.
Speaking of the Beltline: according to the state Department of Transportation, its history dates back to 1949, when construction began on the “South Beltline” and the “East Beltline,” which is U.S. 51, more commonly known as Stoughton Road. I had no idea the Beltline was that old. Obviously it was designed in a day before Madison took an official position against the automobile.

The red shows the Beltline and Madison in 1956. According to maps I’ve seen, by 1956 the Beltline was four lanes from Park Street (in the middle-lower right) west to about the curve west of Verona Road, where it didn’t get upgraded to four lanes until the late 1960s. (I always remember the West Beltline, which is technically from Park Street westward, as four lanes, though it was two lanes north of Mineral Point Road until the mid-2000s.)
The Beltline comes to mind because a massive reconstruction project is under way at the Beltline–Verona Road interchange. The portion of U.S. 151 from east of Verona to the Beltline slows traffic down with stoplights. It is a huge bottleneck, and as usual the state is about 30 years behind in upgrading that portion. Worse, in this case, because there is no good way around Verona Road, the project is taking place while traffic goes through it, both delaying construction and making the bottleneck even worse.
-
Today is Constitution Day, a day that should be a bigger holiday in the U.S. than it is.
Since there are no Constitution Day parades, festivals or fireworks, I suggest you read the Constitution. The whole thing, amendments and all.
-
Yesterday, as you know, was the 13th anniversary of 9/11.
An event as cataclysmic as 9/11 understandably could lead to some inappropriate reactions, the result of a trauma with no frame of reference to guide your thoughts or actions.
That bit of psychobabble is as charitable as I can be to describe this bit of ignominious history, from the Examiner:
Not only were the events of that day a change of course for history, but they were also a psychological attack on the hearts and minds of every American; and the effects still resonate to this day. It was scary. People didn’t know how to react or wrap their heads around what had just happened. The government reacted with immediate military action. Normal citizens reacted by donating any money they could to aid the relief effort. Some even joined the military. Some corporate entities responded with censorship.
In the days following 9/11, Clear Channel Communications, the owner of more than 1,200 radio stations covering every market demographic in the United States at the time, issued a ban of around 165 songs from being played on any of their radio stations. I truly think that the Clear Channel memorandum had good intentions by trying to suppress any song about death, fire, tall buildings, or bombs, because music moves people emotionally and Clear Channel must have been of the opinion that American minds had been put through enough horror. Clear Channel felt the need to spread only positivity through the power of the airwaves, therefore attempting to keep the public’s heads up by pure psychology. (You see this in bars and restaurants all the time. When’s the last time you bought a cheap drink at a bar that was playing classical music? You haven’t.)
The reasoning behind the decision was one thing. The list itself is crazy. Songs that personally make me feel inspired and empowered such as “New York, New York” by Frank Sinatra, “What a Wonderful World” by Louis Armstrong, and “Imagine” by John Lennon were on this list. Any time I hear any of those three songs, I receive a feeling inside that not many songs can dish out. I feel hope in the face of adversity. I see a light at the end of the tunnel. I feel an ability to overcome. And most importantly, I feel inspired to try to make the world a better place with my own two hands.
So we as Americans were considered too fragile to hear even a song about New York City, or a song inspired by a war 30 years prior. If history has taught us anything, it has taught us Americans are resilient and hold a rock-solid resolve. History has also taught us that you simply can’t stop New York City.
Before 9/11 was the 1989 World Series earthquake in San Francisco, which WOLX radio in Madison announced, followed by, I kid you not, Carole King’s “I Feel the Earth Move.” This is what happens when a radio station creates a playlist who knows how long in advance, and doesn’t have DJs in the building who can make the decision to pull a temporarily inappropriate song. (Or doesn’t give the DJ the ability to do so.)
Clear Channel’s list is indeed crazy. It seems that nearly every grunge rock band from the ’90s was blacked out.
The list includes at least one ironic choice: Paul Simon sang “Bridge Over Troubled Water” at a 9/11 benefit concert. And not playing “New York, New York” defies explanation. Yes, 9/11 took place on a Tuesday, but what reason other than that is there to not play Lynyrd Skynyrd’s “Tuesday’s Gone”?



