Islam vs. Islamic terrorism

Melanie Phillips after yet another terrorist attack in Britain:

Even now, with Theresa May saying “enough is enough” after the London Bridge atrocities, we are still refusing to identify correctly the threat that has already claimed so many lives.

These attackers are not “evil losers”. They are not “sick cowards”. They are not nihilists or psychiatric cases or lone wolves. They are devout and ecstatic Muslim fanatics who are waging a war of religion against us.

Mrs May correctly referred to “Islamist” terrorism. Yet she also said this was a “perversion of Islam”. How can it be a “perversion” when it is solidly rooted in religious texts and theological doctrine validated and endorsed by the world’s most powerful Islamic authorities?

In his article in The Times yesterday, the communities secretary Sajid Javid tied himself up in knots. He rightly said it wasn’t enough for Muslims merely to condemn terror attacks; they must ask themselves “searching questions”, and issue challenges.

Yet he also said the perpetrators were not “true Muslims” and that it was right to say the attacks were “nothing to do with Islam”. Well if that’s so, why should Muslims need to do anything at all?

The West views Islam through its own cultural prism, which equates religion with spirituality. The problem is that Islam is as much a political ideology as a source of spiritual guidance.

In 2010 a German study, which involved intensive questioning of 45,000 Muslim teenagers from 61 towns and regions across the country, found that the more religious they were the more likely they were to become violent.

Sheikh Mohammad Tawhidi, a Shia cleric in Australia who campaigns against Sunni extremism, has said: “The scriptures are exactly what is pushing these people to behead the infidel. Our books teach the beheading of people.”

Of course, millions of Muslims don’t subscribe to any of this. Some are merely cultural Muslims who observe no religious practices. Some, such as the Sufis or the Ahmadiyya sect, are pious Muslims who are truly peaceful (and are themselves victims of the Islamists).

But political, aggressive, jihadist Islam, constrained for so long by both the Ottoman empire and western colonialism, is now dominant once again in the Muslim world. Which is why in 2015 Egypt’s President Sisi remarkably told the imams of Al-Azhar university in Cairo — the epicentre of Islamic doctrinal edicts — that Islam’s corpus of sacred texts was “antagonising the entire world”, that it was “impossible” for 1.6 billion Muslims to “want to kill the rest of the world’s inhabitants”, and so Islam had to have a “religious revolution”.

We should be promoting and defending such Muslim reformers in the desperate hope that they succeed. Instead we knock the ground from under their feet by saying Islamist attacks have nothing to do with Islam. Until and unless Islam is reformed, we need to treat its practices on a scale ranging from extreme caution to outlawing some of them altogether.

Mrs May said we need to make people understand that our “pluralistic British values” were “superior to anything offered by the preachers and supporters of hatred”.

The problem is, though, that Islamists believe their values represent the literal word of God. So to them, no other values can possibly be superior. As a result, you can no more deradicalise them than you could have deradicalised the priests of the Inquisition.

We must require Muslims to take responsibility for the actions of all in their community. An ICM poll of British Muslims two years ago found that nearly a quarter wanted Sharia to replace British law in areas with large Muslim populations.

Four per cent — equivalent to more than 100,000 British Muslims — said they were sympathetic to suicide bombers fighting “injustice”.

In other words, we must see jihadist Islam as at the extreme end of a continuum of beliefs which are themselves incompatible with British society.

So we shouldn’t just be stopping people coming back to Britain from Syria or Libya, or detaining terrorist suspects through control orders. We should also be closing down radical mosques, deporting those born in other countries who are involved in extremism, stopping foreign funding for Muslim institutions and banning the Muslim Brotherhood.

We should also outlaw Sharia courts because, since Sharia does not accept the superior authority of secular legislation, it inescapably undermines the core British value of one law for all.

The message should be that British Muslims are welcome citizens but on the same basis as everyone else: that they subscribe to the binding nature of foundational British laws and values. If not, they will be treated as subversives.

The chances of any of these measures being taken, though, are slim. There will be inevitable claims that judge-made human rights law, which has often protected the “rights” of extremists rather than their victims, cannot be set aside without “destroying British values”.

Jihadist terrorists, however, are not trying to divide us, destroy our values or stop the general election. They are trying to kill us and conquer us.

“Us,” by the way, includes Muslims. The Religion of Peace has been keeping a daily Ramadan Bombathon count. As of this morning, the website counts 73 bombings with 809 dead, mostly Muslims.

Unlike some people I know on the right, I am not reflexively anti-Muslim or anti-Islam. Nor is Donald Trump, who unlike his predecessor is willing to condemn radical Islam by name. But there are facets of Islam that are incompatible with Western values (for instance, this), and unless you count the wack jobs at Westboro Baptist Church as Christians (they’re not), Islam is the only major world religion with adherents killing in the name of their religion today. Like it or not, it is up to Muslims to defeat radical Islam.

More advice Democrats aren’t taking

William McGurn:

Nine years after Barack Obama accused small-towners of clinging to guns or religion, nearly three years after Jonathan Gruber was shown to have attributed ObamaCare’s passage to the stupidity of the American voter, and eight months after Hillary Clinton pronounced half of Donald Trump’s voters “irredeemable,” Democrats are now getting some sophisticated advice: You don’t win votes by showing contempt for voters.

In the last week or so a flurry of articles has appeared arguing for toning down the looking-down. In the New Republic Michael Tomasky writes under the heading “Elitism Is Liberalism’s Biggest Problem.” Over at the New York Times, Joan C. Williams weighs in with “The Dumb Politics of Elite Condescension.” Slate goes with a Q&A on “advice on how to talk to the white working class without insulting them.” Stanley Greenberg at the American Prospect writes on “The Democrats’ ‘Working-Class Problem,’ ” and Kevin Drum at Mother Jones asks for “Less Liberal Contempt, Please.”

None of these pieces are directed at Trump Nation. To the contrary, they are pitched to progressives still having a hard time coming to grips with The Donald’s victory last November. Much of what these authors write is sensible. But it can also be hilarious, particularly when the effort to explain ordinary Americans to progressive elites reads like a Margaret Mead entry on the exotic habits of the Samoans.

Mr. Tomasky, for example, informs progressives that middle Americans—wait for it—“go to church.” They have friends (“and sometimes even spouses”) “who are Republicans.” “They don’t feel self-conscious saluting the flag.” Who knew?

Most of these writers allow that there is at least some fraction of Trump voters who are not deplorable. What they do not appreciate is how condescending they can be while advising their fellow Democrats to be less condescending. Exhibit A: Mr. Drum’s recommendation that Democrats can “broaden [their] appeal” because these are “persuadable, low information folks.”

Still, Mr. Drum comes across as Gandhi when set against the writer at Slate who interviews Ms. Williams. The following question conveys the tone: “What attitude should we be taking toward people who voted for a racist buffoon who is scamming them?”

Ms. Williams, a University of California law professor who has written a new book on the white working class, generously avoids telling her interviewer he is a perfect instance of the problem. But the larger progressive dilemma here is that contempt is baked into the identity politics that defines today’s Democratic Party.

When Mrs. Clinton labeled Trump voters deplorable (“racist, sexist, homophobic, xenophobic, Islamophobic, you name it”) she was simply following identity politics to its logical conclusion. Because identity politics transforms those on the other side of the argument—i.e., Americans who are pro-life, who respect the military, who may work in the coal industry—from political opponents into oppressors.

Which is precisely how they are treated: as bigots whose retrograde views mean they have no rights. So when the Supreme Court unilaterally imposes gay marriage on the entire nation, a baker who doesn’t want to cater a gay reception must be financially ruined. Ditto for two Portland women who ran a burrito stand that they shut down after accusations of cultural appropriation regarding their recipes.

No small part of the attraction of identity politics is its usefulness in silencing those who do not hew to progressive orthodoxy. This dynamic is most visible on campuses, where identity politics is also most virulent. It’s no accident, in other words, that the mob at Middlebury resorted to violence to try to keep Charles Murray from speaking; after all, he’s been called a “white nationalist.” In much the same way identity politics has led Democrats to regard themselves as the “resistance” rather than the loyal opposition.

The great irony here is that this has left Democrats increasingly choosing undemocratic means to get what they want. From President Obama’s boast that he would use his pen and phone to bypass Congress to the progressive use of the Supreme Court as its preferred legislature to the Iran and climate deals that made end runs around the Constitution, it all underscores one thing: The modern American progressive has no faith in the democratic process because he has no trust in the American people.

Here it helps to remember the tail end of Mr. Obama’s snipe about guns and religion: it was a crack about voters clinging to “antipathy toward people who aren’t like them.” Sounds like a pretty accurate indictment of contemporary American liberalism, judging by all these articles begging progressives to be a little more broad-minded.

So good luck with the idea that the Democratic Party can restore its relationship with Middle America without addressing the identity politics that fuels it. Especially when it starts from the premise that the Americans they are condescending to will remain too stupid to figure it out.

Exhibit A would be the Wisconsin Democratic Party, whose seething, contemptuous hatred of Gov. Scott Walker has been so successful that the party has been losing elections left and left since Recallarama, which culminated in …


If you think spelling is hard now …

Because the National Spelling Bee took place this week, Google Trends promoted this map:

Notice that, according to Google Trends, Wisconsin is the only state whose biggest spelling problem is its own name. People (Hawaii’s most queried word) have been making fun of Wisconsin for that reason, but I think that is less appalling than those in Oregon lacking the sense to know how “sense” is spelled. And are people in Rhode Island lying when they claim they can’t spell “liar”?

As an alleged spelling expert, I find this to be a stereotype-breaking map. Most people probably think North Carolina is in the Bible Belt, so why would “angel” be difficult to spell? Is Mississippi so poor that its residents don’t know what a “nanny” is, or West Virginia and Connecticut so unfamiliar with Disney works that neither state’s residents can spell “supercalifragilisticexpialidocious”? (I’m sure you’ll agree that that is something quite atrocious.) Of course, as a former spelling bee contestant I can say that “Wisconsin” would never come up in competition, because proper nouns are not included in spelling bees.

One reason for Wisconsin’s difficulties with “Wisconsin” may have to do with what the always-accurate Wikipedia reports:

The word Wisconsin originates from the name given to the Wisconsin River by one of the Algonquian-speaking Native American groups living in the region at the time of European contact. French explorer Jacques Marquette was the first European to reach the Wisconsin River, arriving in 1673 and calling the river Meskousing in his journal. Subsequent French writers changed the spelling from Meskousing to Ouisconsin, and over time this became the name for both the Wisconsin River and the surrounding lands. English speakers anglicized the spelling from Ouisconsin to Wisconsin when they began to arrive in large numbers during the early 19th century. The legislature of Wisconsin Territory made the current spelling official in 1845.

The Algonquin word for Wisconsin and its original meaning have both grown obscure. Interpretations vary, but most implicate the river and the red sandstone that lines its banks. One leading theory holds that the name originated from the Miami word Meskonsing, meaning “it lies red”, a reference to the setting of the Wisconsin River as it flows through the reddish sandstone of the Wisconsin Dells. Other theories include claims that the name originated from one of a variety of Ojibwa words meaning “red stone place”, “where the waters gather”, or “great rock”.

So “Wisconsin” is a combination of Miami, Algonquin or Ojibwa and French, going from “Meskonsing” or “Meskousing” to “Ouisconsin” to “Wisconsin.” “Oui” means “yes” in French, for what it’s worth. Clear as mud (or “clair comme de la boue” en français), but the red references might explain the decision of the University of Wisconsin to adopt red (to be precise cardinal, as you know) as its color.

The more annoying issue that we residents of Red Water Rock (in order in French, “rouge,” “eau” and “roche”) face is national sports announcers’ inability to pronounce this state’s name correctly. Badger fans who didn’t get to Pasadena certainly enjoyed the 1994, 1999 and 2000 Rose Bowl wins, except for ABC-TV’s Bob Griese’s pronouncing “Wisconsin” with the accent on the first syllable instead of the second. (Griese is from Evansville, Ind., and played for Purdue. Some people argue that southern Indiana and southern Illinois are, or sound like they are, in the South, so perhaps that has something to do with it.)

If you think spelling in American English is difficult now, read Hannah Poindexter:

English has always been a living language, changing and evolving with use. But before our modern alphabet was established, the language used many more characters we’ve since removed from our 26-letter lineup. The six that most recently got axed are:

Eth (ð)

The y in ye actually comes from the letter eth, which slowly merged with y over time. In its purest form, eth was pronounced like the th sound in words like this, that or the. Linguistically, ye is meant to sound the same as the, but the incorrect spelling and rampant mispronunciation live on.

Thorn (þ)

Thorn is in many ways the counterpart to eth. Thorn is also pronounced with a th sound, but it has a voiceless pronunciation — your vocal cords don’t vibrate when pronouncing the sound — like in thing or thought.

Today, the same th letter combo is used for both þ and ð sounds. There is a pronunciation difference — thorn is a voiceless pronunciation and eth is voiced — but that’s just something you pick up as you learn to speak. Of course, you’ll never hear about this in school, because that’s English for you.

Wynn (ƿ)

Wynn was incorporated into our alphabet to represent today’s w sound. Previously, scribes used two u characters next to each other, but preferred one character instead and chose wynn from the runic alphabet. The double u representation became quite popular and eventually edged wynn out. Ouch.

Yogh (ȝ)

Yogh was historically used to denote throaty sounds like those in Bach or the Scottish loch. As English evolved, yogh was quickly abandoned in favor of the gh combo. Today, the sound is fairly rare. Most often, the gh substitute is completely silent, as in though or daughter.

Ash (æ)

Ash is still a functional letter in languages like Icelandic and Danish. In its original Latin, it denoted a certain type of long vowel sound, like the i in fine. In Old English, it represented a short vowel sound — somewhere between a and e, like in cat. In modern English, æ is occasionally used stylistically, like in archæology or mediæval, but denotes the same sound as the letter e.

Ethel (œ)

Ethel also once represented a specific pronunciation somewhere between the two vowels o and e, though it was originally pronounced like the oi in coil. Like many clarifying distinctions, this letter also disappeared in favor of a simpler vowel lineup (a, e, i, o, u) with many different pronunciations.


Fightin’ Greg Gianforte

I’m not sure why Montana holds special elections on Thursdays, but it did.

So CNN reports:

Republican Greg Gianforte has won the special election for Montana’s open US House seat, CNN projects, defeating Democrat Rob Quist and capping off a whirlwind final 36 hours of the campaign that saw Gianforte being charged for allegedly assaulting a reporter.

In his acceptance speech, Gianforte apologized by name to Ben Jacobs, the Guardian reporter who accused the Republican of “body-slamming” him and breaking his glasses.

“When you make a mistake, you have to own up to it,” Gianforte told his supporters at his Election Night rally in Bozeman. “That’s the Montana way.”

Saying he was “not proud” of his behavior, he added, “I should not have responded the way I did, for that I’m sorry. I should not have treated that reporter that way, and for that I’m sorry, Mr. Ben Jacobs.”

Members of the supportive crowd shouted, “You’re forgiven.” …

Gianforte was considered the favorite heading into Thursday’s election to fill the seat once held by Interior Secretary Ryan Zinke, but that was before the altercation with Jacobs on Wednesday. The Gallatin County Sheriff’s office later charged Gianforte with misdemeanor assault.

The congressional race in Montana pitted two diametrically opposed candidates against one another. Gianforte: an articulate millionaire and tech entrepreneur who sold his company RightNow Technologies to Oracle in 2012 for $1.8 billion. Quist: a first-time candidate and Montana folk singer who’d amassed moderate Montana fame in the 1970s as a member of the Mission Mountain Wood Band.

The early crowd of voters at Gianforte’s rally were standing by the candidate, unfazed by the events of the previous 24 hours.

“We whole-heartedly support Greg. We love him,” said Karen Screnar, a Republican voter who had driven all the way from Helena to support Gianforte. Screnar said she and her husband have known Gianforte for the better part of a decade. After Gianforte was charged with misdemeanor assault, Screnar said she was only “more ready to support Greg.”

“We’ve watched how the press is one-sided. Excuse me, that’s how I feel. (They’re) making him their whipping boy so to speak through this campaign,” Screnar said. “There comes a point where, stop it.”

Her husband, Terry, chimed in that he believed Gianforte was “set up.”

First: As my high school political science teacher pointed out, this demonstrates a major problem with early voting, which reportedly comprised 60 percent of the votes cast. Republicans have generally opposed early voting, but it may have benefitted the GOP this time since voters can’t change their minds once their early ballots are cast.

The biggest issue is brought up by Joe Concha:

Has the media become so unpopular that body-slamming journalists can actually be good for one’s reputation?

That appears to be the case for Rep.-elect Greg Gianforte (R-Mont.), who lost his temper and allegedly became physical with Guardian reporter Ben Jacobs less than 24 hours before voters went to the polls in a closely-watched special election.

Gianforte went on to win comfortably in the reliably red state. But here’s the real kicker: The candidate reportedly raised $100,000 in the aftermath of Jacobs’ claim that Gianforte “body-slammed” him and broke his glasses.

So one would think the backlash against Gianforte, who local police charged with misdemeanor assault, would be fairly swift. Except it’s anything but. …

The Drudge Report also has as its lead headline Friday morning, “FIGHTING FOR MONTANA!” with a link to an ABC News story of Gianforte winning.

Celebrating this kind of behavior cannot stand. The general sentiment on social media from some on the right, and it’s apparent, is that Jacobs and the media as a whole somehow deserved this.

And while it has been noted in this space on multiple occasions that many in political media are absolutely biased, dishonest and narcissistic, the argument will not be won by resorting to violence.

Think about it: Who wins if this sort of thing happens again? Does the coverage improve? Does the media suddenly get scared straight into delivering the news straight? Of course not.

Bob Woodward said it best early this week in capturing the mood against the media and why so many Trump supporters were probably happy with Gianforte’s actions.

“I worry for the business, for the perception of the business, not just Trump supporters, they see that smugness,” Woodward said. “I think you can ride both horses, intensive inquiry, investigation, not letting up … at the same time, realize that it’s not our job to do an editorial on this.”

It’s hard to disagree with that sentiment from one of the few lucid and measured pros remaining in the business.

According to a September survey by Gallup, 86 percent of Republicans and 70 percent of Independents distrust the media.

Keep pointing out bias and deceit and unprofessional tone when you see it. Twitter and other forms of social media give everyone a megaphone.

But think twice before cheering Gianforte’s actions. Getting physical solves nothing. It could make the problem worse.

As someone in this silly line of work for 30 years who has been physically threatened, I must say the sanctimony on this issue is a bit much. Jacobs threatened a 16-year-old at a Conservative Political Action Conference and bragged about it on Twitter. If you believe in karma, Jacobs had it coming. (And perhaps reporters will finally grasp the virtues of the Second Amendment and concealed-carry rights.)

That’s pretty much what Kurt Schlichter argues:

I know it’s theoretically wrong for a Republican candidate to smack around an annoying liberal journalist, but that still doesn’t mean that I care. Our ability to care is a finite resource, and, in the vast scheme of things, millions of us have chosen to devote exactly none of it toward caring enough to engage in fussy self-flagellation because of what happened to Slappy La Brokenshades.

Sorry, not sorry.

And that’s not a good thing, not by any measure, but it is a real thing. Liberals have chosen to coarsen our culture. Their validation and encouragement of raw hate, their flouting of laws (Hi leakers! Hi Hillary!) and their utter refusal to accept democratic outcomes they disapprove of have consequences. What is so surprising is how liberals and their media rentboyz are so surprised to find that we normals are beginning to feel about them the way they feel about us – and that we’re starting to act on it. If you hate us, guess what?

We’re going to start hating you right back.

Cue the boring moralizing and sanctimonious whimpering of the femmy, bow-tied, submissive branch of conservatism whose obsolete members were shocked to find themselves left behind by the masses to whom these geeks’ sinecures were not the most important objective of the movement. This is where they sniff, “We’re better than that,” and one has to ask, “Who’s we?” Because, by nature, people are not better than that. They are not designed to sit back and take it while they are abused, condescended to, and told by a classless ruling class that there are now two sets of rules and – guess what? – the old rules are only going to be enforced against them.

We don’t like the new rules – I’d sure prefer a society where no one was getting attacked, having walked through the ruins of a country that took that path – but we normals didn’t choose the new rules. The left did. It gave us Ferguson, Middlebury College, Berkeley, and “Punch a Nazi” – which, conveniently for the left, translates as “punch normals.” And many of us have had personal experiences with this New Hate – jobs lost, hassles, and worse. Some scumbags at an anti-Trump rally attacked my friend and horribly injured his dog. His freaking dog.

So when we start to adopt their rules, they’re shocked? Have they ever met human beings before? It’s not a surprise. It’s inevitable.

Team Fredocon, when they aren’t “Oh well, I never!-ing” about Trump and his uncouth supporters, moan about the threat of “Whataboutism,” the tendency for people to explain their sub-optimal behavior by asking, “What about so-and-so? He did the same thing and you didn’t care.” But while “whataboutism” may be a logical fallacy, it’s still a devastatingly compelling argument.

Humans – especially normal Americans – won’t tolerate a double standard. But double standards apply all the time to liberals – they do it and it’s fine, but we do it and it’s Armageddon. The same jerks screaming for O‘Reilly’s scalp worship Bill Clinton and his drunken, perv-enabling pseudo-wife.

Or take the Trump-Russia black hole of idiocy – please. Remember how Obama whispered to the Russkies, “I’ll have more flexibility after the election,” and that was cool? But – according to an anonymous source reading a bar tab over the phone to some credulous WaPo hack – one of Trump’s relatives ordered a vodka once and it’s TREASON TREASON TREASON!!!!!!

It certainly applies to, “What about when they hit conservatives with a lock in a sock and the liberal media didn’t care?” Yeah, what about that? Where was the sackcloth and ashes act from Schumer, Pelosi, and Felonia von Pantsuit when our side was being bloodied and beaten? There wasn’t one, because the left supports us getting bloodied and beaten. It likes the zesty zing of violence. It makes them feel big and tough and edgy, except that it starts being a heck of a lot less fun when we right-wingers start adopting the same rules and punching back.

The left is shocked that the right has now stopped caring about the old rules, since for so long the left relied on the right to subordinate its human instincts and conform to those rules even when the left ignored them. We refused to stoop to their level, and for a long time, we were “better than that.” But you can only have one side being “better than that” for so long before people get sick of being the butt of the hypocrisy.

Hypocrisy is poison not because it makes people stop knowing right from wrong, but because it makes its victims stop caring about right and wrong. Ben Jacobs got smacked around, and millions of us just don’t give a damn.

We all know it was wrong for Greg Gianforte to beat up Ben Jacobs. But we also know the general attitude of the media is that when we conservatives get beaten up by leftists it’s perfectly excusable – even laudable – and thanks to the fact that Twitter is forever, we now know that Ben Jacobs himself specifically thinks it’s A-OK to slug conservative kids. So can someone tell me why anyone should be shocked that we conservatives refuse to devote one iota of caring to poor Ben’s wedgie?

This isn’t a good thing. This is nothing to be proud of. We should not be happy that our society is heading toward the lowest common denominator, which itself is in freefall. But the alternative is worse. Should we allow ourselves to continue to be figuratively and literally beaten up while smiling at our own purity, secure in the knowledge that even though our dignity and freedom are stripped from us, we have not fought back? Not happening. Letting these bastards play by their own rules, and thereby crush us, seems a pretty high price to pay just to gain the approval of the smug and sanctimonious David Frums and John Kasichs of the world.

We conservatives have been warning for a long time that liberals are not going to like it when everyone plays by the new rules, and – surprise! – they don’t. But guess what? Most of us don’t like the new rules either. Yet it’s ridiculous to expect human beings to remain in perpetual denial about the situation they face, and to forever live under a double standard that results in their faces getting pressed into the dirt.

The hypocrisy has become intolerable, and we have stopped tolerating it. This is just the beginning of the reaction, and – make no mistake – this entire situation is a bad thing. Our society is making choices that can lead only to ruin (and my new novel describing the consequences just dropped).

Lincoln mentioned “the better angels of our nature” – also at a time when Democrats were rejecting the rule of law in order to promote their subjugation of those they considered lesser beings – and the important thing to note is that “angels” is plural. You need two angels, not one angel and one demon. But that’s what we have, and if it doesn’t change we’ll have two demons, and everyone should care about that.

Speaking of Lincoln: This country is not as divided as it was around the Civil War. However, hatred of the news media is at unprecedented levels. And hatred of our fellow man based on political views different from our own is going in the wrong direction, even within the parties (Hillary! supporters vs. Comrade Bernie’s apparatchiks, NeverTrumps vs. Trump’s fan club), let alone between the parties. I predicted some time ago that we were winding up toward assassinations of politicians and their supporters. I suspect you can add journalists to that group too.

If Schlichter is right, de-escalating has to start somewhere. Will someone have to die first?


Moore. Roger Moore.

The James Bond I grew up with died earlier this week.

The 007 franchise, the longest currently running in movies, is now on its sixth Bond, Daniel Craig. Sean Connery started the series, left for two (the original “Casino Royale” and “On Her Majesty’s Secret Service”), and returned for one (plus another not from the series’ producers).

It may be that “Diamonds Are Forever,” but Connery was not. His replacement was Roger Moore, who had already played a similar role on British TV that was picked up by NBC, “The Saint”:

I saw Connery’s Bond on TV (generally ABC’s Sunday Night Movie). I saw Moore’s Bond in theaters.

Between that and the fact that Moore played Bond more than any of the other Bonds (besides Connery there was one-Bond George Lazenby, parody Bond David Niven, Moore’s successor Timothy Dalton, Pierce Brosnan and now Daniel Craig), Moore has always been Bond to me. Connery may be more popular, and Craig may be more the Bond that author Ian Fleming intended, but Moore’s Bond is who I think of.

This story has been circulating on the interwebs:

James Freeman adds that the story …

… squares with what this column heard from a source who occasionally had the pleasure of Moore’s company. The actor was witty and well-read, but often preferred listening to others rather than telling stories of his own. The Times of London notes that despite his huge celebrity, Moore remained self-deprecating:

Sir Roger Moore may not have been the best Bond, indeed by his own estimation he was the fourth best, but off screen he was undoubtedly the most endearing of the actors who played the role. This likeability had much to do with his unwillingness, perhaps inability, to take himself too seriously. When he was cast in the 007 role, for example, he was asked what he thought he could bring to it. More brooding menace than Sean Connery, perhaps? More sex appeal than George Lazenby? He replied: “White teeth.” And when critics accused him of being a one eyebrow actor, he countered that this was unfair because he was, in fact, a two-eyebrow actor.

“The eyebrows thing was my own fault,” he once said in an interview. “I was talking about how talentless I was and said I have three expressions — eyebrow up, eyebrow down and both of them at the same time. And they used it — very well, I must say.”…

When he first took the role, the films’ producer Cubby Broccoli told him he needed to “lose a little weight and get into shape”. He replied: “Why didn’t you just cast a thin, fit fellow and avoid putting me through this hell?”

Elsewhere in the U.K., the Gloucestershire Echo reports that “touching tributes are pouring in” for Moore and that among those with fond memories is a hotel manager named Olivier Bonte. Mr. Bonte tells the paper: “He was a very nice person to look after, unlike some of the other A-listers we entertain. He was a true gentleman: polite and traditional.”

While this column appreciates the talent of Mr. Connery, Moore was the James Bond that your humble correspondent grew up watching. Leave it to the indispensable Kyle Smith to make the case that “Roger Moore Was the Best Bond”:

Sean Connery, with his big shoulders and his swaggering physicality, his touch of cruelty and menace, was a much larger screen presence than Moore. But it was Moore’s lighter touch — the arched eyebrow, the deadpan sense of humor, the movements graceful rather than aggressive — that was perfect for the times, when the ideal of screen manhood evolved from the irony-impervious scowl of John Wayne to the sardonic smirk of Burt Reynolds and the puzzled uncertainty of Warren Beatty.

If Connery’s Bond was a fantasy figure who projected British might, Moore’s Bond was a synecdoche for the new role of Britain — no longer the lion of the globe, it would measure its influence in soft power. For more than half a century, Britain has exerted its primary influence not through its troops and warships but in its popular culture, particularly in pop music, which without its British elements would scarcely be recognizable today. Moore’s Bond, like his country, had to be clever because he could no longer be overwhelming.

Mr. Smith, lion among movie critics, describes the new Bond finesse in the signature films of that era:

Who can forget how, in The Spy Who Loved Me (1977), Moore’s Bond socked Richard Kiel’s steel-toothed thug Jaws in the midsection as hard as he could — and was rewarded by being picked up and rammed against the ceiling by his much larger foe? Yet Bond won that round when he used Jaws’s deadliest attribute against him — by electrocuting his mouth with a lamp. What better illustration is there of the superiority of fancy footwork over brawn than in Live and Let Die (1973), when Moore’s Bond finds himself on a rock in the middle of a pond full of ravening crocodiles, uses the beasts as stepping stones and smartly walks away from them without even loosening his tie?

This column hasn’t mentioned the famous Bond girls and of course any discussion of this film franchise is bound to raise complaints, often justified, about the treatment of women at the hands of 007. But this week Jackie Bischof gamely argues that the Roger Moore films were distinctive for their strong female characters, including KGB Major Anya Amasova.

At least in a fictional story on film, here was a case where there really was collusion between a western power and a Russian state actor. Mr. Smith describes the closing moments of “The Spy Who Loved Me”:

Escaping from certain death with Russian spy Barbara Bach in a submersible pod that doubled as a ’70s love nest at the end of the film, Bond disdained to comment on the havoc around him and turned his attention to a surprise stowed in the pod. “Maybe I misjudged Stromberg,” he says. “Anyone who drinks Dom Perignon ’52 can’t be all bad.” With a single line (“Let’s get out of these wet things”), he convinces the foe sworn to kill him to sleep with him instead, then closes the curtain on his bosses as they peer through the window.

Sleep well, Sir Roger.

The irreligious Trump and his religious fans

Michael Gerson on thrice-married Donald Trump and some of his biggest supporters, who you’d think wouldn’t approve of three marriages, two of which ended in divorce:

In the compulsively transgressive, foul-mouthed, loser-disdaining, mammon-worshiping billionaire, conservative Christians “have found their dream president,” according to Jerry Falwell Jr.

It is a miracle, of sorts.

In a recent analysis, the Pew Research Center found that more than three-fourths of white evangelical Christians approve of Trump’s job performance, most of them “strongly.” With these evangelicals comprising about a quarter of the electorate, their support is the life jacket preventing Trump from slipping into unrecoverable political depths.

The essence of Trump’s appeal to conservative Christians can be found in his otherwise anodyne commencement speech at Liberty University. “Being an outsider is fine,” Trump said. “Embrace the label.” And then he promised: “As long as I am your president, no one is ever going to stop you from practicing your faith.” Trump presented evangelicals as a group of besieged outsiders, in need of a defender.

This sense of grievance and cultural dispossession — the common ground between The Donald and the faithful — runs deep in evangelical Christian history. Evangelicalism emerged from the periodic mass revivals that have burned across America for 300 years. While defining this version of Christianity is notoriously difficult, it involves (at least) a personal decision to accept God’s grace through faith in Christ and a commitment to live — haltingly, imperfectly — according to his example.

In the 19th century, evangelicals (particularly of the Northern variety) took leadership in abolitionism and other movements of social reform. But as a modernism based on secular scientific and cultural assumptions took control of institution after institution, evangelicals often found themselves dismissed as anti-intellectual rubes.

The trend culminated at the 1925 Scopes Monkey Trial, in which evolution and H.L. Mencken were pitted against creation and William Jennings Bryan (whom Mencken called “a tin pot pope in the Coca-Cola belt and a brother to the forlorn pastors who belabor half-wits in galvanized iron tabernacles behind the railroad yards”). Never mind that Mencken was racist, anti-Semitic and an advocate of eugenics and that Bryan was the compassionate progenitor of the New Deal. Fundamentalists (a designation adopted by many evangelicals) lost the fundamentalist-modernist controversy, even in their own minds.

After a period of political dormancy — which included a discrediting slumber during the civil rights movement — evangelicals returned to defend Christian schools against regulation during the Carter administration. To defend against Supreme Court decisions that put tight limits on school prayer and removed state limits on abortion. To defend against regulatory assaults on religious institutions. Nathan Glazer once termed this a “defensive offensive” — a kind of aggrieved reaction to the perceived aggressions of modernity.

Those who might be understandably confused by the current state of evangelicalism should understand a few things:

First, evangelicals don’t have a body of social teaching equivalent, say, to Catholic social doctrine. Catholics are taught, in essence, that if you want to call yourself pro-life on abortion, you also have to support greater access to health care and oppose the dehumanization of migrants. And vice versa. There is a doctrinal whole that requires a broad and consistent view of social justice. Evangelicals have nothing of the sort. Their agenda often seems indistinguishable from the political movement that currently defends and deploys them, be it Reaganism or Trumpism.

Second, evangelicalism is racially and ethnically homogeneous, which leaves certain views and assumptions unchallenged. The American Catholic Church, in contrast, is one-third Hispanic, which changes the church’s perception of immigrants and their struggles. (Successful evangelical churches in urban areas are now experiencing the same diversity and broadening their social concern.)

Third, without really knowing it, Trump has presented a secular version of evangelical eschatology. When the candidate talked of an America on the brink of destruction, which could be saved only by returning to the certainties of the past, it perfectly fit the evangelical narrative of moral and national decline. Trump speaks the language of decadence and renewal (while exemplifying just one of them).

In the Trump era, evangelicals have gotten a conservative Supreme Court justice for their pains — which is significant. And they have gotten a leader who shows contempt for those who hold them in contempt — which is emotionally satisfying.

The cost? Evangelicals have become loyal to a leader of shockingly low character. They have associated their faith with exclusion and bias. They have become another Washington interest group, striving for advantage rather than seeking the common good. And a movement that should be known for grace is now known for its seething resentments.

Whether you approve or not, the cause of this is obvious — Trump’s predecessor in the White House. Barack Obama was as big a fan of abortion as Bill Clinton, and opposed religious liberty for conservative Christians. (See “wedding cakes.”) In these divisive days, you’re either for something or against something, and apparently doing what you say you’ll do matters more than what you do in your private life.


Creative class claptrap

I have written here previously about the false promise of community development strategies based on attracting the so-called “creative class.”

Now, its discoverer finds problems, as the Washington Post reports:

Richard Florida is rethinking things.

Since publishing the best-selling book “The Rise of the Creative Class” in 2002, Florida has used his considerable speaking and writing heft to push mayors, urban planners and company executives to cater to tech-savvy young professionals.

His argument, in short, was that in order to save themselves from post-industrial ruin, cities needed to attract the best young talent in computer programming, engineering, finance, media and the arts so their towns could build economies based upon the venture capital and start-up companies the new workforce would produce.

Often taking a cue from Florida’s mantra, real estate developers dialed up hip but tiny apartments designed for creative millennials and outfitted them with coffee bars, gyms, pool tables, bocce courts, pool decks and fire pits. Mayors invested in better sidewalks, bike lanes and business incubators aimed at nurturing the new arrivals and keeping them around longer.

Somewhere along the way, however, Florida realized that the workers he so cajoled were eating their cities alive.

In places like New York, San Francisco, Seattle and arguably Washington, the mostly white, young and wealthy “creative class” has so fervently flocked to urban neighborhoods that they have effectively pushed out huge populations of mostly blue-collar and often poor or minority residents.

“I think, to be honest, I and others didn’t realize the contradictory effect,” Florida said Tuesday at a panel discussion. He said he realizes now that prompting creative types to cluster in small areas clearly drove living costs to such heights that low-income and oftentimes middle-income households have been forced elsewhere, creating a divide he did not anticipate.

“We are cramming ourselves into this limited amount of space. And at the same time that the super-affluent, the advantaged, the creative class — we could go on and on [with what to call them] — the techies, global super-rich, absentee investors, invest in these cities, they push others out … and it carves these divides,” he said.

How much of the boom American, Canadian and European cities have experienced can be attributed to Florida’s influence is difficult to discern, but the popularity of his book and its sequels, along with his founding of the CityLab website in partnership with Atlantic Media, plus numerous speaking gigs, made him a household name in planning and business circles. In 2007, for instance, he shared a star turn with futurist Alvin Toffler and Pulitzer Prize-winning columnist Thomas Friedman at a National Conference of the Creative Economy, hosted by the Fairfax County Economic Development Authority.

Last week’s event, held at Union Market in Northeast Washington, drew a crowd of more than 500 people and must have felt like something of a reunion for people who have reshaped Washington since dysfunction and governmental malfeasance drove Congress to temporarily put the city under the authority of a financial control board in 1995. Two former city administrators and three D.C. planning directors attended, dating back to former mayor Anthony Williams’s administration.

But as inequality has deepened in top cities, writers on class and poverty have begun to take sharper aim at Florida’s theory, calling the “creative class” a fallacy and a failed experiment, not because he was wrong that investing in cities would help draw the creative class, but because he argued that doing so would benefit cities at large.

So although he still champions investments in urban areas, at the panel event Florida said the criticism had made a mark. “To be seen as the neoliberal devil, foisting gentrification on cities, is not a situation I like to be seen in,” he said.

Like any good ideas man, Florida has a new idea to fix the old idea, and a book to go with it, called The New Urban Crisis. In an excerpt published on his web site, Florida explained the turnaround in his thinking.

It became increasingly clear to me that the same clustering of talent and economic assets generates a lopsided, unequal urbanism in which a relative handful of superstar cities, and a few elite neighborhoods within them, benefit while many other places stagnate or fall behind. Ultimately, the very same force that drives the growth of our cities and economy broadly also generates the divides that separate us and the contradictions that hold us back.

I’m going to repeat part of a post on this subject in 2012:

Florida has an ideological message here too, as Steven Malanga pointed out:

But most important, to a generation of liberal urban policymakers and politicians who favor big government, Florida’s ideas offer a way to talk economic-development talk while walking the familiar big-spending walk. In the old rhetorical paradigm, left-wing politicians often paid little heed to what mainstream businesses—those that create the bulk of jobs—wanted or needed, except when individual firms threatened to leave town, at which point municipal officials might grudgingly offer tax incentives. The business community was otherwise a giant cash register to be tapped for public revenues—an approach that sparked a steady drain of businesses and jobs out of the big cities once technology freed them from the necessity of staying there.

Now comes Florida with the equivalent of an eat-all-you-want-and-still-lose-weight diet. Yes, you can create needed revenue-generating jobs without having to take the unpalatable measures—shrinking government and cutting taxes—that appeal to old-economy businessmen, the kind with starched shirts and lodge pins in their lapels. You can bypass all that and go straight to the new economy, where the future is happening now. You can draw in Florida’s creative-class capitalists—ponytails, jeans, rock music, and all—by liberal, big-government means: diversity celebrations, “progressive” social legislation, and government spending on cultural amenities. Put another way, Florida’s ideas are breathing new life into an old argument: that taxes, incentives, and business-friendly policies are less important in attracting jobs than social legislation and government-provided amenities. After all, if New York can flourish with its high tax rates, and Austin can boom with its heavy regulatory environment and limits on development, any city can thrive in the new economy. …

Except that …

But a far more serious—indeed, fatal—objection to Florida’s theories is that the economics behind them don’t work. Although Florida’s book bristles with charts and statistics showing how he constructed his various indexes and where cities rank on them, the professor, incredibly, doesn’t provide any data demonstrating that his creative cities actually have vibrant economies that perform well over time. A look at even the most simple economic indicators, in fact, shows that, far from being economic powerhouses, many of Florida’s favored cities are chronic underperformers.

Exhibit A is the most fundamental economic measure, job growth. The professor’s creative index—a composite of his other indexes—lists San Francisco, Austin, Houston, and San Diego among the top ten. His bottom ten include New Orleans, Las Vegas, Memphis, and Oklahoma City, which he says are “stuck in paradigms of old economic development” and are losing their “economic dynamism” to his winners. So you’d expect his winners to be big job producers. Yet since 1993, cities that score the best on Florida’s analysis have actually grown no faster than the overall U.S. jobs economy, increasing their employment base by only slightly more than 17 percent. Florida’s indexes, in fact, are such poor predictors of economic performance that his top cities haven’t even outperformed his bottom ones. Led by big percentage gains in Las Vegas (the fastest-growing local economy in the nation) as well as in Oklahoma City and Memphis, Florida’s ten least creative cities turn out to be jobs powerhouses, adding more than 19 percent to their job totals since 1993—faster growth even than the national economy. …

It’s no coincidence that some of Florida’s urban exemplars perform so unimpressively on these basic measures of growth. As Florida tells us repeatedly, these cities spend money on cultural amenities and other frills, paid for by high taxes, while restricting growth through heavy regulation. Despite Florida’s notion of a new order in economic development, the data make crystal-clear that such policies aren’t people- or business-friendly. The 2000 census figures on out-migration, for instance, show that states with the greatest loss of U.S. citizens in 1996 through 2000—in other words, the go-go years—have among the highest tax rates and are the biggest spenders, while those that did the best job of attracting and retaining people have among the lowest tax rates. A study of 1990 census data by the Cato Institute’s Stephen Moore found much the same thing for cities. Among large cities, those that lost the most population over a ten-year period were the highest-taxing, biggest-spending cities in America, with per-capita taxes 75 percent higher than the fastest-growing cities. Given those figures, maybe Florida should have called his book The Curse of the Creative Class.

My favorite demographer, Joel Kotkin, added this after the 2010 election, which reversed much of the 2008 election that Kotkin had called “the triumph of the creative class”:

A term coined by urban guru Richard Florida, “the creative class” also covers what David Brooks more cunningly calls “bourgeois bohemians”–socially liberal, well-educated, predominately white, upper middle-class voters. They are clustered largely in expensive urban centers, along the coasts, around universities and high-tech regions. To this base, Obama can add the welfare dependents, virtually all African-Americans, and the well-organized legions of public employees. …

In contrast, the traditional middle class has not fared well at all. This group consists of virtually everyone who earns the national household median income of $50,000 or somewhat above. They tend to be white, concentrated outside the coasts (except along the Gulf), suburban and politically independent. In 2008 they divided their votes, allowing Obama, with his huge urban, minority and youth base, to win easily.

Since Obama’s inauguration all the economic statistics vital to their lives–job creation, family income, housing prices–have been stagnant or negative. Not surprising then that suburbanites, small businesspeople and middle-income workers walked out on the Democrats last night. They did not do so because they loved the Republicans but because the majority either fears unemployment or already have lost their jobs. Many were employed in the industries such as manufacturing and construction hardest hit in the recession; it has not escaped their attention that Obama’s public-sector allies, paid with their taxes, have remained not only largely unscathed, but much better compensated. …

The middle class is a huge proportion of the population. Thirty-five million households earn between $50,000 and $100,000 a year; close to another 15 million have incomes between $100,000 and $150,000. Together these households overwhelm the number of poor households as well as the highly affluent.

In contrast, the “creative class” represents a relatively small grouping. Some define this group as upward of 40% of the workforce–largely by dint of having a four-year college degree–but this seems far too broad. The creative class is often seen as sharing the hip values of the Bobo crowd. Lumping an accountant with two kids in suburban Detroit or Atlanta with a childless SoHo graphic artist couple seems disingenuous at best. In reality the true creative class, notes demographer Bill Frey, may constitute no more than 5% of the total.

As (apparently) a member of the creative class, I say that any politician who creates an economic development strategy based on 5 percent of the population deserves to be unemployed by the voters. (See Cieslewicz, Dave.) Official Madison has failed to notice that its quality of life is dropping like a rock due to the uncool issues of crime and schools, but on the other hand Madison is also increasingly unaffordable to live in. None of that is particularly friendly for families, regardless of how many parents they have in the house. Nor is substandard job growth.

There is only one demographic group worth pursuing: Families with children. No unit of government should spend 1 cent on attracting non-parents.


The latest from the culture wars


I was going to start this blog by saying that every once in a while news comes out of left field. However, that seems to be an increasingly common place from which news arrives these days.

For instance, did you think last week that Pope Francis was going to opine on libertarianism? If you did, play Powerball tonight. Breitbart reports:

Pope Francis had harsh words to describe libertarians Friday, saying they deny the value of the common good in favor of radical selfishness where only the individual matters.

“I cannot fail to speak of the grave risks associated with the invasion of the positions of libertarian individualism at high strata of culture and in school and university education,” the Pope said in a message sent to members of the Pontifical Academy of Social Sciences meeting in the Vatican and subsequently shared with Breitbart News.

“A common characteristic of this fallacious paradigm is that it minimizes the common good, that is the idea of ‘living well’ or the ‘good life’ in the communitarian framework,” Francis said, while at the same time exalting a “selfish ideal.” …

Francis said that libertarianism, “which is so fashionable today,” is a more radical form of the individualism that asserts that “only the individual gives value to things and to interpersonal relations and therefore only the individual decides what is good and what is evil.”

Libertarianism, he said, preaches that the idea of “self-causation” is necessary to ground freedom and individual responsibility.

“Thus, the libertarian individual denies the value of the common good,” the pontiff stated, “because on the one hand he supposes that the very idea of ‘common’ means the constriction of at least some individuals, and on the other hand that the notion of ‘good’ deprives freedom of its essence.”

Libertarianism, he continued, is an “antisocial” radicalization of individualism, which “leads to the conclusion that everyone has the right to extend himself as far as his abilities allow him even at the cost of the exclusion and marginalization of the more vulnerable majority.”

According to this mentality, all relationships that create ties must be eliminated, the Pope suggested, “since they would limit freedom.” In this way, only by living independently of others, of the common good, and even God himself, can a person be free, he said.

This isn’t the first time that the Pope has taken issue with popular social and political trends.

In March, Pope Francis told leaders of the European Union that the populist movements that are sweeping many parts of Europe and other areas are fueled by “egotism.” Populism, he said, is “the fruit of an egotism that hems people in and prevents them from overcoming and ‘looking beyond’ their own narrow vision.”

That prompted this response from Stephanie Slade:

“Pope Francis had harsh words to describe libertarians Friday,” Breitbart reports. That’s OK. I’m a Catholic libertarian, and I’ve had some harsh words to describe Pope Francis.

My main critique, which I published here at Reason on the eve of his 2015 visit to the United States, was that the pontiff’s ignorance of basic economics has led him to a bad conclusion about which public policies are best able to reduce the crushing yoke of poverty in the world. I went on to encourage him to consider that, as a matter of empirical fact, markets are the single greatest engine for growth and enrichment that humanity has yet stumbled upon.

I don’t doubt for a second that Pope Francis cares deeply about the least of his brothers and sisters. But I deny that his chosen prescriptions would do anything but make the problem worse.

This is not a bad time to be reminded that popes aren’t infallible, according to Catholic doctrine—instead, they are possessed of the ability to deliver infallible teachings on matters of faith and morals. As I pointed out in my piece, “In practice, such ‘definitive acts,’ in which a pope makes clear he’s teaching ‘from the chair’ of Jesus, are almost vanishingly rare.” Arguably, though, the pope’s remarks today to the Pontifical Academy of Social Sciences do pertain to faith and morals. He seems to be arguing that an outlook that places the individual above “the common good” is morally suspect.

As with his comments about capitalism, then, the problem is not so much that he’s speaking to issues that go beyond the scope of his office; the problem is his speaking to matters on which he is ill-informed. In this case, his statements betray a shallowness in his understanding of the philosophy he’s impugning. If he took the time to really engage with our ideas, he might be surprised by what he learned.

He might, for instance, be taken aback to discover that many libertarians hold beliefs that transcend an Ayn Randian glorification of selfishness (and that Ayn Rand rejected us, too, by the way). Or that what Pope Francis calls an “antisocial” paradigm in which “all relationships that create ties must be eliminated” (Breitbart’s words) is better known by another name: the liberty movement, a cooperative and sometimes even rather social endeavor among people who cherish peaceful, voluntary human interactions. Or that lots of us are deeply concerned with the tangible outcomes that policies have on vulnerable communities, and that libertarians’ support for capitalism is very often rooted in its ability to make the world a better place. Or that some of us are even—hold on to your zucchetto—followers of Christ.

Most of all, he would likely be startled to find that, far from thinking “only the individual decides what is good and what is evil,” few libertarians are moral relativists. (Except the Objectivists, of course. Or am I getting that wrong?) Speaking as a devotee of St. John Paul II, one of the great articulators of the importance of accepting Truth as such, this one is actually personal.

It’s hard not to wonder whether Pope Francis knows any libertarians. In the event he’s interested in discussing the ideas of free minds and free markets with someone who subscribes to them, I’d be happy to make myself available.

Were I Roman Catholic instead of just raised Catholic, I’d be torn about this. I’m more a fan of Pope Francis and what could reasonably be called certain liberal Catholics (including my favorite nuns) than I am of certain conservative Catholics, such as Madison Bishop Robert Morlino. (But you knew that.) But Francis speaks from either ignorance about libertarians or deliberate disregard of our God-given free will. (Which includes our free-will choice to go to church or not, or to go to a specific church or not.)

The Catholic Church is not a democracy, of course. A church also gets to decide its own rules, contrary to what “cafeteria Catholics” might like to think. Our choice is to attend and support a church, or not. I didn’t leave the Catholic Church for any political reason, but as I’ve written here before, my decision to leave has been validated numerous times since then.

I wonder what Catholics think about this news, from the Kansas City Star:

Saying that Girl Scouts is “no longer a compatible partner in helping us form young women with the virtues and values of the Gospel,” the Archdiocese of Kansas City in Kansas is severing ties with the organization and switching its support to a Christian-based scouting program.

“I have asked the pastors of the Archdiocese to begin the process of transitioning away from the hosting of parish Girl Scout troops and toward the chartering of American Heritage Girls troops,” Archbishop Joseph F. Naumann said in a statement released Monday.

“Pastors were given the choice of making this transition quickly, or to, over the next several years, ‘graduate’ the scouts currently in the program. Regardless of whether they chose the immediate or phased transition, parishes should be in the process of forming American Heritage Girl troops, at least for their kindergartners, this fall.”

American Heritage Girls, founded in 1995, has become an option for those who say Girl Scouts has become too liberal and has relationships with organizations that support abortion rights and do not share traditional family values — allegations the Girl Scouts deny.

Naumann also called for an end to Girl Scout cookie sales in the archdiocese.

“No Girl Scout cookie sales should occur in Catholic Schools or on parish property after the 2016-2017 school year,” he said in a letter to priests in January.

The action has angered some Girl Scout leaders and parents in the archdiocese, who say Girl Scouts is a respected program that helps raise strong girls who become good stewards. They call the move punitive and unfair and say it treats girls in their troops like second-class citizens.

“This is frustrating; parents are very irritated,” said Maria Walters, a former Girl Scout leader in the archdiocese and mother of two Girl Scouts. “I feel we should all be together as one in the community. This does nothing but divide us.

“I don’t know why you would take an organization out of a school when it provides an option for girls to feel like they’re part of a group.”

Walters said her parish has had a Girl Scout troop for at least 25 years.

“They’ve done a father-daughter dance that has been a huge success,” she said. “And they do service projects at Children’s Mercy, animal shelters, battered women’s shelters, the Ronald McDonald House and projects around the parish.”

Walters said the troop used to have about 100 members but now has around 75.

“We have lost some to American Heritage Girls,” she said. “We are still allowed to meet here, but I don’t know for how long. It’s frustrating when you have American Heritage Girls and Boys Scouts in the school newsletter, but no Girl Scouts. We are not allowed to recruit on campus, so we’re going to have to use Facebook and other technology to reach out to people.”

Deacon Dana Nearmyer, the archdiocese’s director of evangelization, told The Star that careful thought went into the decision.

“Several years ago, a number of Catholic school moms called us up and said, ‘We’d like to have a Christian program for our after-school girls’ program,’ ” he said. “So we did a bunch of research and tried to find the best mission fit for us, and American Heritage Girls seemed like that was going to be the best fit.” …

American Heritage Girls, based in Cincinnati, is described as “a Christ-centered character development program for girls ages 5 to 18.”

“We use the methods of scouting to achieve our mission of building women of integrity through service to God, family, community and country,” said Patti Garibay, national executive director and founder. …

The organization also was attractive to the archdiocese because of its opposition to abortion. Some of the troops have participated in protests and prayer vigils outside clinics that perform abortions. …

Some Girl Scout leaders disagreed, saying they were never consulted about the decision and that some of their girls had been bullied because they were involved in Scouts. …

The United States Conference of Catholic Bishops has studied the issue in recent years and said it held a lengthy dialogue with the Girl Scouts. It developed a resource guide for Catholics and concluded that the question of whether the church should sever its ties to Girl Scouts must be answered at the local level.

“Diocesan bishops have the final authority over what is appropriate for Catholic scouting in their dioceses,” the bishops’ conference said.

Last year, Archbishop Robert Carlson of the Archdiocese of St. Louis urged priests to drop Girl Scouts, saying the organization was “exhibiting a troubling pattern of behavior” and was “becoming increasingly incompatible with our Catholic values.” …

Naumann said Girl Scouts contributes more than a million dollars each year to the World Association of Girl Guides and Girl Scouts, which he called “an organization tied to International Planned Parenthood and its advocacy for legislation that includes both contraception and abortion as preventive health care for women.”

He also said that many of those who have been cited as role models by Girl Scouts “not only do not reflect our Catholic worldview but stand in stark opposition to what we believe.”

The Girl Scouts, which has 1.9 million girl members and 800,000 adult members nationwide, does not take a position or develop materials on human sexuality, birth control or abortion, according to its website. And despite what critics say, the organization says, it does not have a relationship with Planned Parenthood.

“Parents or guardians make all decisions regarding program participation that may be of a sensitive nature,” it says.

Girl Scouts officials say that each member organization of the World Association of Girl Guides and Girl Scouts creates its own programs that are based on the needs and issues affecting girls in its individual country. Girl Scouts does not always take the same positions or endorse the same programs as the world organization, they say.

Some parents in the archdiocese have nothing but praise for Girl Scouts. …

Some are wondering why the Catholic dioceses haven’t taken similar actions regarding Boy Scouts.

“I feel like we’re being discriminated against,” Walters said. “We’ve been wiped from the archdiocese website, and we have no leadership role in the church at all. There’s nothing like this going on with the Boy Scouts.”

The discrimination in the last paragraph isn’t because of sex; it’s because of viewpoint. As far as I know the Boy Scouts haven’t been supporting Planned Parenthood. (As you know, the Boy Scouts have taken considerable heat over their policy of not allowing atheist Scouts or homosexual leaders, to the point where United Way chapters pulled their funding of the organization.)

This could be seen as an argument about centralized versus decentralized organizational leadership, analogous to arguments about federalism. Apparently dioceses are free to create their own policies about the Girl Scouts, and Girl Scout parents are free to make their own decisions about “program participation that may be of a sensitive nature.”

The church has the right to create and enforce its own rules. It seems only logical that a church’s members have the choice to either follow those rules or leave. (And if more Catholics did leave, some of those rules might change.) Not being Catholic, I have the free will to agree or disagree with the pope or a bishop, and to not follow their instructions.


The problem with baseball is …

Andy McDonald claims:

I have the same conversation multiple times per year. “Ugh, baseball is so boring,” people tell me when I bring up ― what will always be ― the national pastime.

And every year I have to lay out the reasons why I think that, no, baseball is great, it’s you that’s boring.

I’m not going to dive too deep into the same tired arguments, so we’ll get those quickly out of the way.

“The games are so long!”

… They are as long as they’ve always been: nine innings. Sometimes that means it will go two-and-a-half hours. Sometimes that means four-and-a-half hours. It’s one of the reasons the game is so great. The clock has no impact on the field.

The average 2016 regular season NFL game was three hours and eight minutes, according to Pro-Football-Reference.com. According to the data from Baseball-Reference.com, the average 2016 regular season MLB game was three hours and five minutes.

“There’s not a lot of action!”

… This depends completely on what you consider “action.” Maybe you need people running around the field to prove to yourself that things are actually happening. …

“If we make the games shorter, people will more likely tune in!”

… You’re telling me that shaving 15 minutes off a baseball game will keep the average person interested in a baseball game? That was the issue this whole time??

Well, hand me a Pepsi can, who knew that was the answer!

Listen, I’m sorry, we can’t squish a Major League Baseball game into a time-slot comparable to “The Voice” for the casual fan who is called a “casual fan” for a reason.

Baseball is a game of thoughtful pauses and contemplation. It’s a game of conversation and debate. It’s a shared experience, whether you’re at the game or not.

When there’s a break in the action, that’s when the other fun-but-often-overlooked part begins: interacting with another human being. For baseball fans, the discussion of the game is sometimes as exciting as the game itself.

Which brings me to my ultimate point:

Why doesn’t anyone want to talk to you? Why are you bored when things aren’t happening?

Because, if you’re bored when the action on the field stops, it means that you’re a boring person.

For reference:

Baseball has stood at the forefront of larger national conversations for a hundred years. Baseball is fascinating, on and off the field, action or “no action.”

So, I’m sorry you had to find out this way, but I’m afraid you suffer from being a boring person.

Or at least a person who cannot entertain himself or herself without increasingly loud external stimuli.

There is obviously a difference in experience between watching a game on your favorite broadcast device and attending a game in person. The commercial breaks are for such activities as dragging the infield (the former province of Bonnie Brewer — remember her?), videos on the scoreboard, running to the concession stand or bathroom, etc. If you’re not doing anything, the between-innings period can get tedious, and for that you can blame TV.

It should be obvious that the billion-dollar entertainment center that is now a major league ballpark is (in addition to pulling as much money out of the wallets of fans as possible) an attempt to attract the non-hardcore baseball fan. That may be a hopeless cause, and one wonders why a sport would seek to attract non-hardcore fans at the risk of alienating its hardcore fans, who are much more likely to purchase season tickets than someone who might go to a game if he or she has nothing better to do.

The operating assumption is that hardcore fans won’t stop going to games as MLB tries to attract younger, less interested fans. How likely is that?


Higher education (if that’s what you want to call it)

In case you wonder how well your tax dollars are spent on higher education, begin with UW-Madison’s Daily Cardinal:

In classrooms across the country, students might be scolded for using “ain’t” instead of “isn’t.” But a UW-Madison student is working to erase the stigma against Ebonics, also known as African-American Vernacular English.

UW-Madison junior Erika Gallagher conducted research about code switching, also known as code meshing, in which people change their regular speech tendencies to fit into the mold of what is commonly accepted as appropriate.

Ebonics is a variety of English that is commonly found in the center of large cities that have been historically populated primarily by black people. It is commonly found in slam poetry, as well as hip-hop and rap music.

Gallagher, a Posse scholar, began her research during her time as an undergraduate Writing Fellow this semester. She said she realized, as she sat in her seminar class of predominantly white students, that she wanted to focus on standard written English and how it excludes marginalized groups.

“I want to center the voices of the people who need to be centered,” Gallagher said. “As a Writing Fellow, as a white-passing person, I have a lot of power and privilege that should be shared.”

Gallagher conducted much of her research through three interviews. She talked to UW-Madison student leaders from marginalized groups and asked how they felt about code switching. She said all three “overwhelmingly” said it felt oppressive—one said “it is the biggest form of cognitive dissonance that exists.”

She presented her research at the Collegiate Conference on Composition and Communication in Portland, Ore., earlier this semester. She was selected as one of roughly two dozen undergraduates from across the U.S. to participate in the conference, which is typically attended by graduate students and professors.

Gallagher said she hopes to develop her research into a nonprofit organization that “teaches teachers to teach,” with the goal that educators will eventually express disclaimers at the start of each semester that state they will accept any form of English that students are comfortable with.

She also hopes increased acceptance of different rhetoric will encourage the formation of a campus-wide diversity statement.

“Just because you speak a different way doesn’t mean you’re not smart, but there’s a huge stigma around it,” Gallagher said. “I want to teach [educators] a different rhetoric, teach them to be more accepting.”

A “white-passing person.” Really nothing more needs to be said after that.

Fortunately, not all of the Daily Cardinal’s readers are idiots, as demonstrated by these comments:

Using correct English doesn’t ‘exclude’ anyone. People choose to exclude themselves by refusing to use it. But hey, go ahead and stick it to the man by refusing the benefits of literacy: financial independence, career success, and the ability to think and reason.

Let’s just call it what it is, racist. I would have expected this in the 1960’s, not the 21st century. How does she explain the fact that immigrants can come to this country and speak perfect proper English in less than 10 years. Using Ebonics in places where it is never spoken would be detrimental to those speaking it.

BASED ON three interviews? Three? Really, just three? Has this young lady taken ANY courses in Statistics? Obviously not. This is complete and total nonsense. Three. Think about that.

‘A white-passing person’? Are you f-ing serious? If this is what tax payer-subsidized higher education has become in this country, it’s time for a national enema of this schools.

IMHO this isn’t logical; if children are to be afforded equal opportunity, then they must feel comfortable moving in all walks of life; It is difficult enough for a young adult to move out into the wider world without being saddled with ignorance of common social conventions; A child who is not taught basics in the home, such as table manners, forms of address, standard English, etc, will, in new social settings, be overloaded by the demands of unfamiliar social conventions, when they should be free to let their talents shine; “manners”, including a common tongue, are the lubricant that allows a diverse society to function smoothly. And, these things, and most particularly language, are most readily learned by the young.

But wait! There’s more, from the College Fix:

If you want to schedule a meeting at Clemson University that starts on time … well, that’s not going to happen.

The university warns faculty not to enforce start times for gatherings in an online training featuring “fictional characters,” made public by Campus Reform:

On another slide, a character named Alejandro schedules a 9:00 a.m. meeting between two groups of foreign professors and students. The first group arrived fifteen minutes early, while the second arrived ten minutes late [and wanted to “socialize” first]. According to the answers, it is wrong for Alejandro to “politely ask the second group to apologize,” or explain that “in our country, 9:00 a.m. means 9:00 a.m.”

It disrespects other people’s cultures to ask them to follow American conventions of appointments starting when they are literally scheduled to start, the slide continues:

Alejandro should recognize and acknowledge cultural differences with ease and respect. Cultures view many things, including death, prosperity and even colors, quite differently. Time may be considered precise or fluid depending on the culture. For Alejandro to bring three cultures together he must start from a place of respect, understanding that his cultural perspective regarding time is neither more nor less valid than any other.

Another slide explains hierarchies of privilege. A female hiring manager with a common white name is accused by a woman with an African American name of not giving her a job interview because her competition is a “white male.”

Hiring manager Stephanie should “reflect on her behavior to see if Tanisha is correct” and contact Clemson’s departments of human resources and “Access and Equity” about the African American woman’s accusation.

There is much more revealed in the training, created by compliance training provider Workplace Answers, which cost Clemson nearly $27,000. The invoice went to the department led by Chief Diversity Officer Lee Gill, who earns $185,850 per year.

Employees who do not complete the “inclusion awareness course” will get “two automated reminders,” according to emails to faculty from HR and the Office of Inclusion and Equity.

That prompted these comments:

Does the offensive line of Clemson’s football team have to show up at kickoff, or can they wander in during the first quarter?

Well football is important. This businessy stuff is just a bunch of nonsense anyway.

When they get fired for habitual tardiness they can thank the University for such poor guidance