What post-9-11 cockpit doors and the Patriot Act have in common

If reports on the cause of the crash of the Germanwings flight in the French Alps are to be believed, the co-pilot, Andreas Lubitz, intentionally crashed the plane. Authorities claim that Lubitz locked the pilot out of the cockpit after he left, refused him reentry, and initiated a descent by pushing the “descend” button. (Which sounds curious to me, as there was no “descend” button in the helicopters I once flew for the US Army, but admittedly, helicopters differ from airliners in myriad ways.)

I’m skeptical that such a radical explanation for what happened could be so quickly and conclusively drawn. I will await the fullness of time before casting my lot with what appears likely to become the official narrative, not least to calibrate for the human mind’s instinctive tendency to assign human malevolence as the causative agent of whatever mysterious effects arise. If the earth is getting warmer, then it must be the fault of some latent human evil. If a plane mysteriously crashes into a mountainside or ocean, the pilot must have been suicidal or terroristic or beset by some other character flaw. On the thinnest of evidentiary reeds, speculation always runs to ill intent on the part of human actors. But the plausibility of the narrative in this instance can’t be denied. The cockpit doors of airliners, redesigned in the wake of the 9-11 hijackings to secure the good guys inside and keep out bad guys who might do an airplane full of passengers harm, can readily be used for just the opposite: to keep in the bad guys and keep out the good guys. And whatever is possible eventually comes to pass.

Assume for a moment that the co-pilot did indeed do as is claimed, and intentionally flew the airplane into the ground in order to kill himself and all the passengers and destroy the airplane. The opportunity to do so arose directly out of efforts to prevent that very thing from happening. Much like the Patriot Act and the accompanying Authorization for Use of Military Force, passed in the wake of 9-11 to provide blanket authorization for military operations against al Qaeda and its affiliates and thereby prevent another orgy of death and destruction, made death and destruction more likely. The wars and military occupations embarked upon by the United States under that authorization have directly led to the rise of ISIS in the Levant, to civil war in Yemen, and to the Taliban resurgence in Afghanistan. The cost in lives and treasure of these wars long ago far exceeded the total cost of all the terrorist attacks on US soil. And now it appears the fortifying of cockpit doors to prevent the takeover of a plane like the ones taken over on 9-11 has caused the very thing it was designed to prevent.

The US has the reverse Midas touch when it comes to post-Cold War international relations. Everything it touches turns to shit. Nearly every effect it has expressly intended has yielded exactly the opposite result. Only the initial war with Iraq, in which the objective was limited to expelling Iraq from Kuwait, succeeded in achieving its goals. But that war was tantamount to grabbing the Middle Eastern tar baby with both hands. The more the US subsequently struggled to get free, the more it got stuck in the tar baby’s embrace.

Just today, the New York Times reported that three Iraqi militias had decided on their own initiative to pull out of the battle for Tikrit against ISIS because they don’t believe American help is necessary or desired. What sort of state is Iraq, when its military units can decide on their own initiative whether or not they will fight? Saddam Hussein, a native of Tikrit, may have ruled with an iron fist, but he wouldn’t have allowed his military to dictate its own rules of engagement. Through a quarter century of American military involvement in the Levant, the Iraqi state has been reduced to little more than a loose coalition of warlords generally allied with Iran, whose main difference from ISIS is that it enjoys official US sanction. Iran must be incredulous, watching as the US expands and protects the frontiers of its empire.

But it’s all good. The US gets to create a new crop of heroes for the adoration of its cravenly dull population, while at the same time generating the next Middle Eastern conflict in which to become embroiled. Once ISIS is eliminated as a threat to Iranian hegemony, the US will be forced to undo what it did in the Levant through its support of Iran’s allies. It will execute an “about face” and turn its guns eastward, towards Persia. Thus will the US’s existential angst be salved. The country can’t be happy unless it has an enemy to fight. Otherwise, where would Dancing With the Stars get its supply of deformed and disabled humans tugging at the heartstrings by somehow managing to overcome their disabilities and deformities so that they can dance?

Yemen is another American disaster, though a bit more opaquely so than Iraq. The US has been waging a drone war in Yemen ever since it developed the capacity for drone warfare. The existing Yemeni government, the recently toppled one, supported the US effort to weed out its version of al Qaeda. The rest of Yemen was not all that happy with its government’s alliance with the US, a reality that completely escaped the braintrust of the US intelligence community and its chief intelligencer, President Obama, who as recently as last summer touted Yemen’s drone war as an American interventionist success story, something about as likely to exist as a flying unicorn. Now Saudi Arabia and Egypt are allied to fight against the new Yemeni government, which incidentally is backed by Iran. While the US was busy fighting to expand Iran’s imperial reach in the Levant, Iran slipped through the Arabian desert to expand its hegemony a bit more, threatening to encircle Saudi Arabia with its influence by taking over the southwestern tip of the Arabian peninsula. The US, of course, is backing Saudi Arabia and Egypt in the Yemeni operations, so it may well come to pass that the US and Iran are allies on one field of battle (the Levant) and enemy combatants on another (Yemen), which would have to constitute the flying unicorn of international diplomatic history. Surely nothing as foolish has ever before happened.

But wait, it gets better. Iran is trying to develop a nuclear weapon. The US desperately wishes it wouldn’t, for reasons not readily obvious, except perhaps that Israel (which already has nuclear weapons) doesn’t want a nuclear Iran to upset a favorable power balance in the Middle East. There have been relentless negotiations that so far add up to nothing. Iran is officially under some sort of weak-kneed economic sanctions, which it naturally wants lifted, but the US wants some assurance that Iran will abandon its nuclear fuel processing program (building a bomb is relatively simple; enriching weapons-grade uranium is quite difficult). John Bolton, former US Ambassador to the UN, advises that to prevent Iran from getting the bomb, Iran must be bombed, preferring that Israel do the job, but willing to have the US do it if Israel won’t. So the scenario could very well play out like this: while the US continues to fight to protect Iranian hegemony in the Levant, it commences battle in Yemen to expel an Iranian ally that took power in a coup, even as it conducts deep bombing raids in Iran to destroy its uranium enrichment capability. It would be impossible to make this up. No spy novelist could get a plot like this past his publisher. It’s just too ridiculous to be believed.

If the crash investigators prove correct, fortified cockpit doors didn’t prevent the takeover of the Germanwings flight; they were in fact critical to the plane’s takeover and intentional crash. Likewise, the Patriot Act hasn’t eliminated terrorism; the exercise of its extra-constitutional powers has in fact magnified the threat and cost many times more in lives and treasure than the terrorists could have dreamed of causing. Something tells me that the battle with ISIS in the Levant, the support of Saudi Arabia in the Yemeni civil war, and the negotiations with Iran over its nuclear capabilities won’t end well for the US either.

Unless the purpose of US actions in the Middle East has been to foment unrest and opposition so that the US might remain indefinitely engaged militarily in the area (not an implausible scenario), nothing it has done has turned out well. Is there any reason to think that its battling ISIS, supporting the Saudis in Yemen, or deterring Iran’s nuclear ambitions will turn out any better?

Angelina Jolie is at it again

For some reason, the ritual of having cut myself and feeling the pain, maybe feeling alive, feeling some kind of release, it was somehow therapeutic to me.

Angelina Jolie, in an interview with Paula Zahn on CNN, June 9, 2005.

She’s at it again. After experiencing the therapeutic release that came with the cutting of her breasts (clinically, a bilateral prophylactic mastectomy), done out of fear that her breasts might turn cancerous, she’s cutting herself again, this time to remove her ovaries and fallopian tubes (clinically, a bilateral prophylactic salpingo-oophorectomy).

Jolie has the “breast cancer gene”. Except there is no such thing. A pair of genes, dubbed BRCA1 and BRCA2, have been identified which, when defective, i.e., deleteriously mutated, fail to do their assigned task of DNA repair, raising the likelihood that a strand of DNA goes unrepaired and the cell carrying it becomes immortal, i.e., cancerous. The specific mutations of BRCA1 and BRCA2 that have been shown to be associated with higher rates of breast and ovarian cancer are inherited. For all the hoopla about the magic of genes and how we will one day treat disease by rewriting DNA code, very few gene-to-cancer correlations have been identified. The BRCA1 and BRCA2 correlations with breast and ovarian cancers represent two of the few.

Even so, having the mutations does not guarantee that cancers of the female reproductive organs will arise (and though men who carry BRCA1 or BRCA2 mutations get breast cancer at a higher rate than men without them, the rate is still so low as to be negligible). Having the mutation simply means that the probability of being afflicted with those sorts of cancers is significantly increased. The mutant genes therefore do not “cause” breast and ovarian cancers; if they did, everyone who had them would get the cancers. These aren’t cystic fibrosis or Huntington’s chorea type genetic malfunctions.
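
To make that distinction concrete, here is a minimal sketch using approximate lifetime-risk figures of the sort the National Cancer Institute has published; the exact numbers vary by study and are my assumptions here, not claims from this essay:

```python
# Illustrative only: approximate lifetime-risk figures of the kind the
# National Cancer Institute published circa 2015; exact values vary by study.
lifetime_risk = {
    # group: (breast cancer risk, ovarian cancer risk), as probabilities
    "general female population": (0.12, 0.013),
    "BRCA1 mutation carriers":   (0.60, 0.39),
}

for group, (breast, ovarian) in lifetime_risk.items():
    print(f"{group}: breast {breast:.0%}, ovarian {ovarian:.1%}")

# Even at a ~60% lifetime breast-cancer risk, a carrier has a ~40% chance of
# never developing the disease. An elevated probability is not causation,
# unlike a true single-gene disorder such as Huntington's, where inheriting
# the mutation makes the disease a near certainty.
```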

Angelina Jolie’s mom died of ovarian cancer, a tragedy of epic proportions in an age when Google executives and Israeli historians are claiming that eternal life, or at least really long life, is an imminent possibility. Jolie’s female self-emasculation is proof positive that Google executives and Israeli historians are premature in their prognostications. Jolie cut herself out of fear she might die of a cancer she knows she has a predisposition to. If eternal, or even very long, life were reliably around the corner, there would be no worry over getting cancer. Whatever therapy had been devised to prevent the body from growing old and dying would also necessarily prevent it from suffering latent diseases like cancer, else there wouldn’t be much point in the expectation of immortality.

Jolie hopes to eliminate the risk of death by cancer of the female reproductive organs. Even that’s not possible. She could still get cancer in the parts of her reproductive system that must be left behind. And there’s no guarantee she won’t get cancer of another type. Or get run over by a bus. Death is inevitable; only its timing is uncertain. Besides all that, she is cutting herself this time over the possibility that an uptick in a metric used to monitor patients susceptible to ovarian cancer indicates she might have the disease. But the National Cancer Institute specifically states on its website that there is no reliable test, short of an actual biopsy of an ovarian tumor, for determining that metastatic ovarian cancer has developed.

But once the cutting is through, she will undoubtedly feel as full of life as she did way back when she was young, when she not only decided that a cutting was indicated but performed the procedure herself. She at least has one thing right in all this. Life is pain. Without it, there is very little way of knowing it’s real. And what sort of pain could possibly trouble someone who has everything, except the chance that it might be lost? She might instead try pinching herself, like people who aren’t sure they aren’t dreaming do, but that likely wouldn’t suffice.

For me, I like to inflict the pain by taking a good long run. I really know I’m alive somewhere about the eighth or ninth mile. But to each their own.

Book Review: Sapiens: A Brief History of Humankind (2011, Israel; 2015, US)

This is the rarest of books for me: one I bought in hardcover, before it came out in paperback. I am in no mad rush to read the latest books, and even after a lifetime of reading, there are still constellations of great books I’ve yet to read, practically all of which can be purchased in convenient and inexpensive paperback form. I actually prefer paperbacks for reading. They don’t look as good on the bookshelf, but I buy books to read, not to impress people with what’s displayed on my bookshelf, and the physical act of reading is easier with a paperback. So it was quite unusual that I bought this book in hardback. But then, I used a Christmas gift card, so I didn’t feel so bad about the extra money. And from everything I could tell before purchase, the book promised to be just the sort of read I enjoy. So the hardback version of Sapiens was a holiday gift from me to me.

Turns out, I was pretty good to myself with this one. The binding is quite luxurious, with the heavy-stock, shiny paper with which coffee table books are bound. But that is the only resemblance it bears to a coffee table book. Unlike the average coffee table tome, and more important than the binding, this is quite a profitable read. It has a few pictures and illustrations interspersed throughout, but not enough that its gist might be discerned by flipping through it like a catalog.

It is not an academically challenging book. The writing is crisp and clear and free of jargon. And while it seems there is no subject too arcane or erudite for Harari to explore in the history of H. sapiens, the intellectual byways he travels are readily accessible to the layman. The aptly named chapters run the gamut from “A Day in the Life of Adam and Eve” to “The Law of History” to “The Law of Religion”. Few stones are left unturned so far as the saga of mankind’s history and development is concerned.

Harari lumps human development into three eras he calls revolutions: the Cognitive Revolution, which he claims kick-started history 70,000 years ago; the Agricultural Revolution, which sped it up about 12,000 years ago; and the Scientific Revolution, which started about 500 years ago.

What Harari calls the Cognitive Revolution is the period that began with mankind leaving its East African Eden to disperse throughout the globe, reaching most of it within 50,000 years. It was during this era that mankind learned to make and use sophisticated tools and became artistic and more linguistically inclined. Harari believes that during this period man learned to think abstractly: as he puts it, to conjure fictitious entities like gods and devils, and to fabricate stories about his and their origins and histories. Like so many others, Harari thinks that the use of the mind to imagine things that don’t exist, and the use of language to communicate ideas both abstract and concrete, are what make H. sapiens who he is. He acknowledges the similarities H. sapiens shares with all the other Homo species, but believes language usage and mental capacity distinguish it.

But Harari’s claims raise a few questions. Why 70,000 years ago? What changed to yield this explosion in cognitive power 70,000 years ago, when modern H. sapiens had been quiescently living in East Africa for ages before then without a historical murmur? It almost had to be genetic, because we know the environment didn’t significantly change. Was it a chance mutation that swept through the small H. sapiens population like an Ebola virus, only opposite in effect, enhancing survival and propagation prospects by some mechanism? Nobody really knows. But the notion that a Cognitive Revolution occurred, without any attempt to attribute a cause to it, is something like knowing that Napoleon lost at Waterloo but not knowing why. It’s history’s version of kissing your sister (something Napoleon might actually have enjoyed, as he was suspiciously close to his sister, if the darker aspects of his history are to be believed).

The idea of an Agricultural Revolution is much more widely embraced because the reason for its development is readily understood: through the domestication of plants and animals, mankind figured out how to make food come to him instead of having to chase after or forage for it. As he gained skill in domestication and cultivation, he was able to produce many more calories than he alone could eat, freeing others of the community to spend time on activities that didn’t involve food production. Thus was society stratified and specialized. There arose ruling classes and intellectual classes (mostly shamans, medicine men and fortune tellers at first) and soldierly classes, and so on. Always near the bottom were the producers, the class upon which all the others depended. So has society been illogically stratified ever since the first field of wheat was sown.

The Agricultural Revolution is with us yet today, and it is directly responsible for what came next, the Scientific Revolution. How so? It took a secure, well-fed, critical mass of thinkers to begin, in the late Middle Ages, the train of questions that would finally lay bare the myths and lies and superstitions cluttering the human psyche about the nature of the universe. Without agricultural surpluses, the Agricultural Revolution would not have been a revolution, and the Scientific Revolution would not have been possible.

Harari justifiably points out that the Scientific Revolution was European in origin. It wasn’t in Asia or Africa or the Americas that gods were cast aside for the clarity of reason in explaining the nature of nature. It was driven by European thinkers, which is one reason Europe came to rule the world within a few hundred years of the revolution’s inception, an era generally called today the Renaissance or early Enlightenment (the other reason being that Europe wanted to rule the world, whereas, for instance, China and India had little interest in what lay beyond their naturally bounded empires).

The Agricultural Revolution was not an unmitigated good. Mankind had to fit its hunting and foraging genes into the straitjacket of domesticity in a period of time far too short for its genetic code to substantially change to accommodate the new way of life. When mankind domesticated corn and wheat and sunflowers and dogs and goats and sheep, what he really got in the bargain was the domestication of himself, a fact of which I am reminded anytime I see someone being led around by their dog, at the ready to scoop up the dog’s excrement when it decides to go wherever, of course, it wants. Nor was the revolution detrimental only to man. It devastated biodiversity, killing off vast numbers of species in favor of only those useful to H. sapiens. What species extermination the Cognitive Revolution failed to complete as mankind sprawled to every corner of the globe, killing off other human species and large mammals along the way, the Agricultural Revolution often finished.

My main quibble with the insights and prognostications Harari offers arises from his linear, progressive view of history. A great many people, professional historians and lay people alike, view history as a story about the fulfillment of some purposeful or meaningful aspect of mankind’s journey through time. But so far as we know, this is not a journey, and mankind has no purpose beyond simply being, a truth Harari admits late in the book, after having spent most of it pandering to a popular audience that he knows is reading precisely to find meaning and purpose in the confusion of existence through the sort of knowledge a history of mankind imparts.

Harari predicts that mankind might one day conquer the biochemical frontiers of the body so effectively that we become what he calls “a-mortal”: not immortal, because stepping in front of a bus will still kill us, but a-mortal, because old age and disease won’t. He thinks this might happen in his own lifetime (he’s about 40). I seriously doubt any such thing happens, mainly because it’s not clear that being a-mortal would offer any great improvement to the human condition. In fact, as the angst and ennui wrought by the surpluses of the Agricultural Revolution attest, the further removed we become from our core existence as mammalian animals of the genus Homo, the less content with our lives we seem to become.

People in the developed world already have little worry over where their next meal will come from. The ease with which food is acquired is as much bane as benefit. Obesity and boredom abound. There is little point to life when its continuation is more or less assured, which is how it has come to pass that the most valuable commodities these days are ways to infuse meaningless moments with purpose and passion. It is no accident that iPhones and football-playing skills are highly desirable items in the wake of the Agricultural and Scientific Revolutions. When acquiring food is no problem, what other way is there to fill the meaningless hours of a day except through personal and collective entertainments?

With a-mortal life providing a new layer of existential certainty, and thereby meaninglessness, it’s not hard to imagine how utterly despondent life could become. E.O. Wilson pointed out that the central problem of collective human activity is that there is no point to it beyond the immediate continuation of the individual gene-carrying vessel, the body, through time and space. The point of being is being, and a too-easily-assured being robs being of its purpose. When the continuation of being is difficult and fraught with uncertainty, every moment is purposeful and full. Ask a soldier in combat, or a mother whose child is dying of starvation, whether their moments are meaningful.

If what Harari predicts comes to pass, and individual human beings are afforded the opportunity to live an a-mortal life, it may well signal the end of civilization as it is now known. To be sure, civilization was not designed to accommodate hunter/gatherer genes, but it has nonetheless managed to survive. How much less was it designed to accommodate a-mortal human beings? Imagine the severe mismatch that would obtain between modern life and our ancient-history-besotted genes. There is no way the human genome could have prepared for such a contingency. The body is designed to assume that time is always limited, so the mind knows things no other way. What if it weren’t? The results might not be pretty. Still, I don’t expect to lose any sleep over it. Harari seems a bit overly enamored, sort of like the financial markets about now, with the potential of biochemistry to change the essence of the H. sapiens condition.

Harari ends with the poignant observation that if the newly discovered biochemical powers humans now possess are unleashed, then never mind a-mortality: humans could be designed from the ground up for genetic superiority. Which implies a genetic-perfection arms race could ensue, not unlike the nuclear arms race of the just-ended era. If genetic perfectionism becomes acceptable, H. sapiens could ultimately go extinct by evolving into a different form. But that’s not so radical an idea. In fact, unless H. sapiens is the lone exception among all living creatures, it is guaranteed that he will go extinct, either by dying out or by evolving into a new and different species.

Set aside a few quibbles like those just explained, and Sapiens is an outstanding book. For the reader generally unfamiliar with the history of H. sapiens, from its humble beginnings to its world-altering present, the book should enlighten and entertain. For people with a good foundational knowledge of the basic contours of anthropology, sociology, economics, history, biology, philosophy, etc. (as I believe myself to be), the book offers a good refresher course while simultaneously presenting original insights and perspectives in those fields. As I was reading, I couldn’t help thinking that this is the book I have always wanted to write. Perhaps one day. For the expert in the various fields, Harari does a good job of presenting all the extant and viable theories of how things were, and why.

In short, it is a very, very good book. Maybe even a masterpiece. Harari is brilliant and witty and insightful, and it shows on practically every page. Everyone should read it. Don’t wait for the paperback.

Musical Review: The Book of Mormon

I wondered what the people streaming out of the theater in Birmingham, Alabama on a mild February evening might have been thinking after having watched what is surely one of the raunchiest, most irreverent, most sacrilegious and blasphemous Broadway productions ever to come down the pike. This is the Bible Belt. No. This is the buckle of the Bible Belt. The very next day, a larger percentage of the local population than almost anywhere else in the United States would be taking their places (implicitly reserved) in pews all over the fair city, the faint sheen of sin at having watched and laughed at the musical still clinging to them like the sweaty essence of a clandestine lover. There’s a thin line, as Jimmy Buffett says, between Saturday night and Sunday morning.

Were they feeling a bit of guilt at having seen the Mormon Church being mercilessly ridiculed?

Did they figure the Mormons weren’t real Christians like them, making it okay to single them out for mockery and contempt?

Did it occur to them that had it been mainstream Christianity in the dock, the perpetrators of the ridicule would, at the very least, have been severely ostracized, while the uproar amongst the faithful would have been deafening? Had it been Islam or Judaism, aside from ostracism and howls of derision from the faithful, political correctness scolds would have had a field day. There would have been marches against religious bigotry, with echoes of the Holocaust reverberating against city walls. There might even have been violence, maybe even of the type that struck Charlie Hebdo in France.

But what came of all this laughter at the Mormons’ expense? Nothing, really. Not a peep from the Progressive/liberal/politically-correct crowd, yet it would be hard to imagine anything more politically incorrect, illiberal and retrograde than making fun of a whole theology’s religious beliefs and practices. But the Mormons’ pain was my gain. Never mind the laughter (and don’t get me wrong, the musical was funny in a slapstick sort of way); I scored a copy of the real Book of Mormon on my way out of the theater, as the church had strategically placed volunteers outside to distribute free copies after the show, sort of like the Congressional rebuttals that come after a President’s speech. So I finally got to learn a bit of what the religion is about.

I don’t believe in any of the origin mythologies that mankind has devised and propagated in order to create a purposeful narrative of existence that might ameliorate the terrible lightness of being that sentience has bestowed upon us. I don’t believe that God created the earth and its geologic features and its life in six days and rested on the seventh, as the Hebrew, Christian and Islamic holy texts provide. Likewise, I don’t believe that the universe began with a Big Bang and is today filled with some mysterious Dark Matter and Dark Energy that keeps the heavens quiescently cycling along, as goes the origin myth propagated by people (mainly theoretical physicists) claiming to be scientific materialists but who are in fact more akin to Pythagorean mystics. And I most certainly don’t believe that God led a lost tribe of Israel to American shores some six hundred years before Christ, a tribe that was then visited by Christ after his resurrection and that wrote the tales of its journey on golden tablets buried in what became known as Wayne County, New York, awaiting the arrival of seventeen-year-old Joseph Smith, who dug them up on the instructions of an angel of God and transcribed them from Reformed Egyptian into what became The Book of Mormon. No, I do not believe that nonsense at all. It is just as fanciful a creation myth as one that requires 96% of the universe to be composed of forces and stuff that no one has ever seen or detected.

To my reckoning, the myths are just that, but that’s okay. People need to believe in something. People will believe in something. It might be secular, such as the scientific materialists’ claim that the Big Bang began time itself, with no need of a creator with supernatural powers. Or it might be religious, like the Judeo-Christians, who worship a paternalistic God that watches over his flock the way the wandering tribes of shepherds from whom the myths originated watched over their sheep and goats. Or it might be quite outlandish, such as that a lost tribe of Israel buried a bunch of gold tablets in what became the United States, which were then dug up to provide another testament of Christ, translated into English in roughly the year 1830. One and all, the beliefs fill a void of meaning and purpose in people’s lives. They’re all myths spun from fantasy, but myths that help keep people engaged in the ridiculousness of a life that has no apparent cosmic purpose, eternally struggling to roll that boulder of eternal life up the hill while knowing full well that as soon as it’s let go, it will roll right back down.

And I say that’s just fine. If whatever is believed helps people get through the day without harming themselves or others, I say believe to the limit of the heart’s desire. Pretty much everyone tells themselves and others a dozen or more lies and rationalizations by lunchtime just to get through a morning immersed in civil society. If belief helps make the lies palatable, so much the better. I respect whatever a person believes unless and until their beliefs begin harming me.

As for me, my only belief is that I should afford others the same respect and consideration that I hope they will afford me. Which is why the Book of Mormon musical was troubling. I don’t ridicule the beliefs of others and don’t see that an artistic piece should either. Even the famous “Piss Christ” photograph didn’t ridicule Christianity. It plaintively rejected Christianity for all its sins, profaning it for not living up to its promise, which is quite a different thing than ridiculing a religion for its peculiar catechism.

Mormon beliefs are quirky, there is no doubt, which makes them an easy punch line, and the playwrights took advantage at every turn. There was not a parcel of low-hanging fruit left on the Mormon mockery tree by the time the musical was through. And a good deal of it was sophomoric and vulgar. Imagine all those God-fearing Christians showing up for church the next day having seen a musical that got things really going with a rousing rendition of “Hasa Diga Eebowai”, which means “Fuck You, God”, sung by a group of Ugandan natives led by General Butt-Fucking Naked. The show was written by three guys, Trey Parker, Matt Stone and Robert Lopez, the first two of whom bestowed upon American culture the animated vulgarity and irreverence of South Park, and it shows.

I don’t have the slightest problem with the sentiments expressed in “Fuck You, God”, where the natives lament their misfortunes and blame it all on God. In fact, if the Judeo-Christian notion of an omniscient, omnipresent, omnipotent God is carried to its logical end, God well-nigh has to be blamed, and credited, for everything, good, bad and indifferent, that happens. That’s what it means to have all those infinities. I have many times looked in the mirror during particularly unfortunate stretches of my life, giving God the old Bronx salute, blaming him for my misfortune. (Yes, the best place to see God is in your own mirror: being created in God’s image is quite the same thing as creating God in one’s own image. So the best place to cuss Him is right there, in the mirror, with Him looking back at you.) But if the Christians and Jews and Muslims in the audience weren’t offended, it can only be because they were among the rare souls who actually understand the implications of God’s purported attributes. Most Christians think of God as something of a Superman who is concerned for their welfare, willing to swoop into a telephone booth to don his cape anytime troubles call; a God who is there to do battle for whatever the individual defines as good. But no omniscient, omnipotent, omnipresent entity ever needs to battle anything, or even can, unless it could be imagined that he would battle himself.

The litany of vulgarities included a song, “Baptize Me”, that was hardly about baptism but was instead an extended double entendre, with “baptize” basically meaning “to have sex with”. The laughs never quit coming. No, that’s not a raunchy pun.

The plot line is predictable. Two young guys, just out of high school or college, are sent, “two by two”, to proselytize the world on behalf of the Church of Jesus Christ of Latter Day Saints, pursuant to the requirements of the faith for the faithful. One is an All-American Boy, determined to do well by doing good for his church (a Mitt Romney kind of guy), who is aghast to find he has been assigned to Uganda with the nerdiest loser of the bunch as his partner. Of course, Nerdy Guy ends up succeeding where All-American Boy fails, and precisely because of his nerdiness; All-American Boy can’t succeed because he can’t think outside the box generated by his own selfish biases.

Nerdy Guy hardly knows anything of the Book of Mormon, having never read it, so when it comes time to proselytize and convert the natives, he just makes stuff up. But he does it with empathy and genuine concern, attributes All-American Boy desperately lacks. Thus Nerdy Guy wins (including the girl, who sings “Baptize Me” to him), ironically by behaving in just the manner the founder of Christianity taught. Christ repeatedly pointed out the wickedness of the Pharisees’ overwrought legalism, claiming that the measure of one’s life is found not in following rules but in human compassion. It’s not clear whether the playwrights intended the message of the musical to comport so closely, if tacitly, with Christ’s message; it would be a delightful irony if they did so unwittingly.

The Book of Mormon, the musical, was seven years in the making. The Book of Mormon itself took quite a bit longer, both in its transcription and in its adherents’ winning of acceptance. The church was founded in western New York State in the 1830s, and in practically every place the Mormons tried to settle, they were run out of town by locals fearful they aimed to establish a theocracy. They finally gathered on the shores of the Great Salt Lake in 1847, there to found Sal Tlay Ka Siti (obviously, “Salt Lake City”, the musical’s name for the place where natives who became Mormons would finally find happiness). The area was uninhabited upon their arrival. Brigham Young halted the wagon trains of adherents fleeing Nauvoo, Illinois (their last forlorn refuge), saying “this is the right place.” There the Mormons did as the Israelis in Palestine, and made the desert bloom.

The New Yorker magazine recently reported on a Hindu ceremony, a Shivaratri, taking place at the Broome Street Temple in SoHo. It is a ceremony of all-night wakefulness and fasting honoring the efforts of the ancient Indian noblewoman Parvati, who had a crush on the Hindu god Shiva, an ascetic. She took up asceticism to impress upon him the seriousness of her urges, which eventually worked: he came down from his mountaintop and married her. So now Hindus honor her efforts with a Shivaratri. The one recently held at the Broome Street Temple was attended by over a hundred fifty adherents, one of whom was supposed to be Madonna, though she sent her regrets late in the night. And nobody made fun of the silliness and absurdity. The Hindu legend is essentially a throwback to Greek mythology, when humans and gods were believed to regularly interact in controlling or affecting mankind’s fate. But nobody questions it. In fact, it is considered chic to be into yoga, if for nothing other than the pants, and if yoga participation is extended into the actual Hinduism from which it arose, so much the hipper. Does Madonna ever do anything that’s not Queen-of-Pop hip?

People wouldn’t think of ridiculing Hindu rituals practiced by confused Westerners. Yet the Mormon Church is fair game for all sorts of scornful and contemptuous mockery. Mainstream Christianity doesn’t mind the Mormons being mocked because it feels threatened by Mormonism’s successes and its quirkiness. Non-believers love it when any religion’s ox is gored, but particularly one as quirky as Mormonism.

But the Jews should be particularly troubled by the ridicule, because Mormonism feels like a neo-traditional Judaism. Like the ancient Jews, Mormons live in society but apart from it. Devout Mormons refuse alcohol, tobacco and caffeine, much as the ancient Hebrews (and a select few today) refused pork and shellfish, each in their day representing things that could impair health and vitality. At about 2.5%, today’s Jews and Mormons represent an almost identical proportion of the American population. And like the Jews, Mormons are vastly overrepresented in the professions and the upper strata of society.

I don’t know how those theater-goers were feeling that night last month in Birmingham as we left the show. I can only imagine they might have felt a bit like I did. And I felt a bit sleazy and naughty and repulsed for having had such a good time. I thought the show was funny, but of a low-brow, slapstick variety of humor, almost like one of those movies that parody movie genres, except this was a parody of religion.

I imagine that many of Alabama’s devout Christians felt about the same after attending The Book of Mormon as the people who flocked to theaters to see Fifty Shades of Grey felt after watching it. The Fifty Shades people had to have suffered twinges of guilt at experiencing pleasure watching a grown woman’s bottom being spanked as an act of foreplay. Likewise for the theater-goers seeing the Mormons spanked with intense satire and ridicule in the musical.

The culture seems to have crossed an unseen barrier with the Book of Mormon and Fifty Shades of Grey phenomena. Things people might once have felt but would never have said or done are okay if enjoyed from the anonymous comfort of a theater seat. It has to feel similar to attending pornographic movies back in their heyday, when everything displayed was taboo in public but okay when presented to an audience in a darkened theater. People in polite society would never have been caught attending a XXX feature film. Nowadays, without the taboos experienced in darkened theaters, there would be little to talk about around the water cooler on a Monday morning.

Is there another tech bubble? Perhaps, if the euphoric possibility of eternal life through biochemistry grips your soul

Google executive Bill Maris believes humans can live 500 years or more.  Not quite eternal life, but quite a substantial increase over the worldwide average lifespan of a measly 68 years.

Maris, at the age of 40, when men start seriously contemplating their own mortality, probably doesn’t grasp the irony in his having picked 500 years as the age to which human life could be extended.  It was almost exactly 500 years ago that Spanish conquistador Juan Ponce de Leon set sail for Florida seeking the Fountain of Youth, among other things.

So, in a sense, the soul of Ponce de Leon survives, 500 years on, in the person of Bill Maris, who is quite sure the Fountain of Youth will be found, not in Florida, but in immune system genetics.  Time, as always, will tell.

In the meantime, it’s probably a safe bet that, like Ricky Bobby observed in Talladega Nights (to paraphrase), “over 98% of everyone alive today will one day die.”  And they will do so well before their 500th birthday.  Though their cockamamie dreams of eternal life will surely carry on.

The Executive Summary, March 1-7, 2015

Economics

China

“The downward pressure on China’s economy is intensifying.”

Chinese Premier Li Keqiang, Thursday, March 5, 2015.

Li’s remarks came at the most important political meeting of the year (the National People’s Congress), where he also pegged the economic growth target at 7% for the current year, the lowest in a quarter century.  Growth last year came in at roughly 7.4%, slightly missing the anticipated 7.5% and itself marking the lowest rate in a quarter century.

The People’s Bank of China (such an Orwellian name that the US Federal Reserve ought to adopt something similar) has been aggressively trying to thwart the monetary fallout (deflation, mainly) from declining economic activity, cutting its headline interest rate three times in as many months.  It has also loosened bank reserve requirements multiple times lately.  Its stated aim is to fight deflation, or more aptly, a decline in the inflation rate that threatens soon to slip into negative territory.  CPI ran about 2.06% for 2014 but has now fallen well below 2%.  Declining inflation increases real borrowing costs, so the PBOC’s moves have served mainly to forestall the negative monetary consequences of declining economic activity, but have not been enough to juice activity going forward, or so The Economist believes.
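
The mechanism is the Fisher relation: the real cost of borrowing is roughly the nominal rate minus inflation. A minimal sketch with hypothetical numbers (the rates below are my assumptions for arithmetic’s sake, not actual PBOC figures):

```python
# Fisher approximation: real rate = nominal rate - inflation (roughly).
# The rates below are hypothetical, chosen only to illustrate the arithmetic.
nominal_lending_rate = 0.056  # assumed benchmark lending rate (5.6%)

for inflation in (0.021, 0.014, 0.008):
    real_rate = nominal_lending_rate - inflation
    print(f"inflation {inflation:.1%} -> real borrowing cost {real_rate:.1%}")

# As inflation falls from ~2% toward zero, the real burden of the same
# nominal loan rises, which is why a central bank cuts nominal rates just
# to keep real borrowing costs from effectively tightening.
```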

It’s hard to ever tell what is really happening inside China.  The Middle Kingdom has always been something of an enigma to Western observers, but this much is clear: China’s economic activity is slowing.  By how much its activity is slowing is not apparent, but the evidence is overwhelming that the days of double-digit expansion are, for now, over.

It is also obvious from glancing over the headlines of official Chinese newspapers (specifically, China Daily) that the Chinese view aggregate economic growth as a matter of national pride, something of a collective enterprise in which all Chinese must endeavor to do their part.  When Li speaks of a 7% “target”, what he really means is a 7% goal for the Chinese people to attain.  This could be a vestige of the communist ideology of collectivism, or just a cultural tic, or something more sinister.

It is not often in the developed world these days that aggregate economic growth is considered a matter of national pride, but that has not always been the case.  Before their imperialist adventures in the early twentieth century, Germany and Japan, among others, conflated their value as a people and nation with growth in their aggregate national output.  Individual desires were subordinated to state prerogatives, and the main state prerogative was economic modernization and growth, the profits from which were then funneled to military spending.  There’s no reason to think that China’s rapid modernization might also end (like, e.g., Japan’s and Germany’s) with a violent attempt at imperialist expansion.  But there’s also no reason to think it won’t.

Though China’s economic growth has declined to the high single digits, military spending will mark its fifth straight year of double-digit growth in 2015.

Spotlight on Argentina

Buenos Aires (literally, “good air”), the capital of Argentina, is reputedly the Paris of Latin America.  On a per capita basis, Argentina was, in the early part of the twentieth century, the richest country in the Western Hemisphere.  It has a population of roughly 43 million, of which some 13 million live in the capital.

Argentine income today is roughly $18,000 per person, about 75th worldwide, behind the likes of South Korea, Hungary and Portugal, among many others.  The reasons for its relative decline, or more aptly, its failure to keep up, would make for a decade’s worth of Ph.D. economics theses at leading graduate schools.  Roughly speaking, Argentina seems to do exactly what it shouldn’t in terms of economic policy whenever presented with a situation demanding action.

Argentina is, like the US, a nation of immigrants, most of whom in its case hail from Spain or Italy.  Though it infamously served as something of a post-WW2 Nazi hideout, people of German descent make up only a small portion of the population.  Argentina is not a belligerent country.  It has fought only one very minor war, with Great Britain in the early ’80s, over a windswept group of craggy islands off its Atlantic coast where sheep outnumber humans in an ordinary year.  It lost, but didn’t much seem to care.

Argentina’s economy imploded in the early 2000s, when it defaulted on its external debt and was forced to abandon its one-to-one peg to the US dollar.  But it quickly rebounded, growing at Chinese-esque rates of 8.5% per year for several years before slipping back into recession in 2009, along with the rest of the world.  Today it faces the specter of high inflation, with all its accompanying travails, due to chronic deficit spending by the government.  In characteristic Argentine fashion, instead of attacking the fiscal imbalance causing the inflation, it has slapped price controls on a great many imported goods while throwing up extensive barriers to free trade.  Exactly the opposite of the appropriate economic policy in the premises.

Argentina is interesting for a number of reasons, but one stands out: why has it never sought empire?  What about its national psyche kept it focused inward rather than outward, content to fight among its own rather than seeking new worlds to conquer?  Is it the Latino heritage?  That wouldn’t make sense.  Spain and Italy have quite extensive historical experience with empire building, if not lately.  Is it the lack of British influence?  The British seem to have a genetic predisposition to colonization and empire building.  To someone in the US, which was largely settled by the British, the idea that being a significant force internationally may be less than appealing seems foreign, ridiculous even.  What Manifest Destiny, Argentina?  For me, it’s something of a pleasant anomaly that Argentina, possessing such beauty and riches that its first city is compared to Paris, doesn’t seem to have caught the imperial bug and all the belligerence normally accompanying it.

Tellingly, through all of Argentina’s economic problems, even though most of them are self-inflicted, the Argentine people haven’t suffered anything remotely approaching, e.g., the Irish Potato Famine, or the famine that killed millions during China’s Great Leap Forward.  Perhaps Argentinians have a different view of the economic purposes of government than that offered by the West or the East.

Compare Argentina’s debt default to Greece’s.  The world economic system barely shrugged in 2001 when Argentina defaulted on all its external debt.  It is again barely paying attention as another round of defaults looms.  Yet Greece, a tiny country of about 11 million with a GDP only a third as large as Argentina’s, has kept the world, and particularly the manic-depressive financial markets, enthralled for years over whether or not it would “Grexit” and bring the whole euro experiment down.  Short of German invasion, Greece can’t and won’t pay back its creditors, a roughly $350 billion haircut.  Or, about a month’s worth of spending by the US government, or a small percentage of the total annual output of the eurozone.  Yet Greece’s problems get the press coverage, while Argentina only makes the news when its political class proves again to be particularly venal, as when a prosecutor preparing to indict President Cristina Fernández de Kirchner was suspiciously murdered.

Nobody pays much attention to Argentina.  I get the sense that Argentinians prefer it that way.

295,000 new jobs added in February, unemployment rate falls to 5.5%

Reversing the downward trend since November’s blowout numbers, the economy added 295,000 new jobs in February, a gain of 56,000 over January’s revised 239,000.  The economy has added an average of 288,000 jobs per month over the last three months.

Curiously, I could find no explanation pinning the quite robust jobs numbers on the weather, which would surely have been the case had the report come in weak.  Apparently weather (or, for that matter, climate change) doesn’t matter except for explaining anomalously poor results.  Following the logic of commentators who blame the weather for every bad report (for example, the increase in first-time unemployment claims, noted below, was blamed on the weather), it could be argued that this strong a report in the face of relentless blasts of wintry Arctic weather means the jobs market is really going gangbusters.  Perhaps that’s why the stock markets didn’t like the report.

Stocks and bonds fall; dollar soars

What a fine mess of things the US Federal Reserve has made.  Ever since its initiation of ZIRP (zero interest rate policy, or as I like to call it, “free money for bankers”), good economic news means all the financial metrics go to shit.

Stocks dropped by over a percent Friday, the day the jobs report was released.  Ten-year bond yields zoomed skyward, up over 6% in one day and, after bouncing around all week, also for the week.  The dollar has climbed steadily this year, and on a trade-weighted basis has now well exceeded the level it enjoyed as a “safe haven” currency during the financial crisis.
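
A note on scale: “up over 6%” refers to the relative move in the yield itself, not a six-point jump. A quick sketch, with starting and ending yields that are hypothetical rather than the actual quotes:

```python
# A move in a bond yield quoted in percent terms: a 10-year yield going from
# 2.11% to 2.24% (hypothetical figures) is only 13 basis points, but a ~6%
# relative jump in the yield itself.
start_yield = 2.11  # percent, assumed
end_yield = 2.24    # percent, assumed

basis_points = (end_yield - start_yield) * 100
relative_change = (end_yield - start_yield) / start_yield
print(f"{basis_points:.0f} bp move = {relative_change:.1%} jump in the yield")
```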

The fact that good economic news means bad financial news makes the Fed’s plan to raise interest rates later this year just that much more complicated.

First-time claims for unemployment benefits rise

For the second consecutive week, first-time claims for unemployment benefits rose by 7,000, bringing the seasonally adjusted total to 320,000.  As noted above, the increase was attributed to bad weather, which apparently failed to negatively impact the jobs market, if the pundits and prognosticators are to be believed.

New orders for manufactured goods slipped 0.2 percent in January

The decline comes on the heels of a 3.5% decrease from November to December.

Banking and Finance

All 31 Too Big To Fail Banks passed the quantitative portion of the Fed’s Stress Test

The US Federal Reserve released the results of its latest stress test of the 31 financial institutions it has deemed systemically too important to ignore (or to fail, though it would never say as much).  It says the intent of the stress tests, which Dodd-Frank requires, is to ensure that losses due to bank failures are borne by shareholders and not by taxpayers (as they were in 2009).

Imagine the level of hubris that must obtain to believe every possible contingency has been accounted for in creating a model that predicts whether or not banks will fail at some point in the future. The Fed can model risk.  It can’t model uncertainty.  Russia, anyone?

The Fed’s stress test modeling is of a piece with the hubris-besotted idea that models can be created to reliably predict the earth’s future climate, even as no weather model yet devised can predict with any reasonable precision whether precipitation in Alabama, or the South generally, will fall as snow or sleet or freezing rain or just plain rain when the area receives a blast of Arctic air.  The weathermen down here are something like zero for eight in their winter weather predictions going back to last year.

Not surprisingly, all of the too-big-to-fail banks passed this quantitative portion of the test. Whew!  We can all rest easy, at least for a short while.  The qualitative portion of the test results will be issued next week.

Monetary velocity in the US reaches an all-time low

The velocity of money sounds more complicated than it is.  It is simply a measure of how many times a unit of currency changes hands in a given period.  For the measure used by the Federal Reserve, it is the number of times each dollar changes hands each quarter.  The latest reading, for the 4th quarter of 2014, was 1.53, down from the previous quarter.  Except for a very brief interlude in 2010, the velocity of money has been steadily declining since 2006, a couple of years before the recession began.
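
For the arithmetic behind that 1.53 reading: the Fed’s M2 velocity series is just annualized nominal GDP divided by the M2 money stock. A back-of-the-envelope sketch, with dollar magnitudes that are my rough approximations of end-2014 figures rather than numbers from the release:

```python
# Velocity of money: V = nominal GDP / money stock (the Fed's M2V series).
# The magnitudes below are rough approximations of end-2014 figures,
# assumed here for illustration.
nominal_gdp = 17.8e12  # annualized nominal GDP, ~$17.8 trillion
m2_stock = 11.6e12     # M2 money stock, ~$11.6 trillion

velocity = nominal_gdp / m2_stock
print(f"M2 velocity ~ {velocity:.2f}")  # ~1.53, matching the Q4 2014 reading

# Rearranging the equation of exchange, M * V = P * Q: if the money stock M
# balloons while nominal spending P * Q grows slowly, V must fall, which is
# exactly the pattern described above.
```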

What does money velocity tell us?  It might indicate (if rising) the potential for inflation, because as prices rise, each dollar must change hands more frequently to buy the same amount of goods.  Or it may indicate (again, if rising) that more goods and services are being purchased while prices remain relatively stable.  What is unusual today is how low velocity has dropped, even as economic activity has picked up and the overall money supply has ballooned.  More money combined with lower velocity indicates sluggish activity and prices that rise very little, if at all.  In other words, pretty much what is going on about now.  Declining money velocity ameliorates the impact of loose monetary policies that tend to increase the money supply.

As the Fed should be learning, even given its vast infusion of cash into the US economic system over the last five years, prices that want to fall eventually will (see, e.g., oil).  Money has sneaky ways of distorting or altogether avoiding the Fed’s monetary policy prerogatives.

And now, news from the social cesspool

A pair of articles explain that there’s nothing wrong with the world that women can’t fix

Writing in the New York Times, Sheryl Sandberg, Facebook COO and self-appointed social scold to women whose posture remains centered, asserts that it is in the best interest of men to promote female participation and equality in the work force.  From the article:

Studies reveal that women bring new knowledge, skills and networks to the table, take fewer unnecessary risks, and are more inclined to contribute in ways that make their teams and organizations better.

This is a bit confusing to me.  Does Sandberg advocate that women be given more opportunities because they are the equals of men, or because they are superior to men?  Because if the latter were true, then a whole lot of human history is nothing less than a confusing anomaly.  If women are superior to men, why aren’t women the superiors of men, more or less all the time?  And please, don’t tell me it’s because men have been holding them back.  How could an inferior hold back a superior?

Sandberg goes on to say that the sexiest thing a man could do for a woman is a load of laundry, calling it “choreplay”.  Oh, how cute.  It is more than a little condescending that Sandberg would think guys are that easily manipulated.  Or that the price of sex, especially sex with one’s wife, is that cheap.  Marital sex is a whole lot more expensive than a load of laundry.  Every husband knows that.

Not to be outdone, the Wall Street Journal also published an op-ed weighing in on the gender wars, this one by Melvin Konner, a professor of anthropology at Emory University.  It is a subject about which he is apparently ill-informed.  From the article:

Our own species hasn’t always suffered from male supremacy. Among our hunter-gatherer ancestors, living in small, mobile communities, group decisions were made face to face, among men and women who knew each other intimately. Men tried to dominate, but it wasn’t easy. They could show off by hunting, but war, that universal booster of male status, wasn’t common.

The notion that our hunter/gatherer ancestors were noble, peaceful savages has by now been so thoroughly debunked that it almost seems pitiful that this poor man, supposedly an expert in the field, somehow failed to hear of it. Even Steven Pinker, who is not an anthropologist but knows a thing or two about the human animal in society, debunked the idea in his book-length treatise on the decline of violence, The Better Angels of Our Nature: Why Violence Has Declined. Among the extant hunter/gatherer societies that have been studied, violence is endemic, a natural part of life. And practically all are patriarchies. Even E.O. Wilson, the famed Harvard biologist and as devout a believer in the progressive ideal of non-violence as anyone, plaintively answered ‘yes’ to the question of whether human beings are innately aggressive (in On Human Nature, page 99).

The premise behind Mr. Konner’s argument is that women are better suited for leadership, at least in today’s world, than are men, and to allow them to so lead would return us to something more akin to our hunter/gatherer social organization.  He gets the hunter/gatherer social organization wrong, so the rest that follows is bunk.

And his argument raises the question: If men and women were at one time relative equals in leading hunter/gatherer clans, then what happened to vault men to the top? How was it that an equal became an inferior? Did it happen without a fight, because men were more athletically and aggressively gifted? If so, then men and women weren’t equals and never have been, which, given the substantial sexual dimorphism in human biology (men being taller, heavier, stronger, etc.), sounds plausible. Which means that Konner gets not only the social organization wrong, but also the human biology behind it.

Konner argues, unintentionally ironically, that women are better adapted biologically to leadership in today’s information-driven society basically because they lack the testosterone-fueled aggression that is innate to men. But female leaders are just as capable of all the depredations attributed to male leaders. The only reason it seems they aren’t is that they haven’t been given sufficient opportunities to prove their equally maleficent character. To cite one of Konner’s examples, Cristina Fernández de Kirchner, the president of Argentina, very probably had a prosecutor assassinated as he prepared to accuse her of covering up Iranian involvement in the 1994 bombing of a Jewish community center in Buenos Aires. Something similar to what that most manly of men, Vladimir Putin of Russia, did with one of his political rivals.

The gender wars continue apace, though there can never be any winners. The battle is lose-lose on both sides. If women win and men lose, then women also lose. If men win and women lose, men also lose. The notion nowadays gaining purchase is that men are no longer needed in society, that all that aggression and testosterone just gets in the way. But the notion can only be entertained because men built a society so rich and so free and so secure that it can contemplate the luxury of discarding the attributes to which it owes much of its success. Men can’t be blamed for all the ills of society without also getting some credit for its many successes.

Robert Grenier, CIA station chief in Islamabad 1999-2002, counsels patience against ISIS

In an opinion piece in the New York Times, Mr. Grenier offers very simple and succinct advice about what not to do in engaging ISIS, from the article:

At the outset of the war in 2001, I argued that it was critical for Afghans to lead the anti-Taliban campaign. My advice was followed at the time because we were initially successful. Even if Mr. Karzai and Mr. Shirzai had failed, though, my advice would have been the same — though I doubt anyone in Washington would have listened to me. I would have counseled strategic patience: Do not try to do in place of Afghans what only Afghans can sustain over the long term. In the fevered post-9/11 political environment, patience would have been a nonstarter.

The impulse to do something about the ISIS barbarians who are slicing off heads for YouTube stardom grows with the posting of each new video.  But we should ask ourselves, why are they so openly publicizing their barbarity?  And the obvious answer is that ISIS is trying to provoke the West into doing something.  At least Mr. Grenier gets it, again from the article:

Sadly, America has learned very little from the experience in Afghanistan. Just listen now to the impatient voices emanating from the right concerning the Islamic State. Our allies in Iraq, they say, are hopelessly ineffective, and our allies in Syria practically nonexistent. ISIS poses a clear threat to American security, they insist: If others will not, or cannot defeat it, we should not be afraid to step forward ourselves to crush it.

These sentiments play to the instincts of many Americans, and they must be resisted at all cost. If the United States were to take the lead in the ground war in Iraq and perhaps eventually in Syria by introducing conventional combat forces, we would feed into a radical Islamist narrative that pits the invading armies of the crusader against the committed defenders of Islam. In the process we would only strengthen the appeal and the morale of our enemies, while weakening and demoralizing our friends.

This is, very simply, the most clear-headed thinking I have yet seen as regards ISIS.  We are fools to let them draw us into a full-scale war.  Doing so would only enhance their appeal, perhaps even to more of our own citizens.

Alas, full-scale combat operations are practically inevitable.  Almost two-thirds (62%) of Americans recently polled by Quinnipiac University want ground troops fighting ISIS, according to an article on Bloomberg.

I can only imagine that the reason so many want to rise to ISIS’s bait is that the notion that someone else does the fighting has by now become completely internalized in the American psyche.

Personally, I have a moral aversion to wars of choice, and this would be a war of choice.  But more importantly, the war would be ineffective, except at enhancing the prestige and allure of ISIS to our detriment.  Sure, we could beat them in battle, but doing so would only reinforce the ISIS narrative of the West as crusaders come to oppress the Arabic Muslim world and cause anti-Western sentiment to ramify into areas that today consider us friends.

Why do we always ignore the experts on the ground, except as a last resort?

It’s always Groundhog Day* in Alabama

(A reference to the movie of the same name, starring Bill Murray as a television reporter covering Groundhog Day festivities who finds himself condemned to an endless loop of the same Groundhog Day in Punxsutawney. He is condemned, something in the manner that a Buddhist is condemned, to endless rebirths until he becomes capable of seeing himself and his world as they truly are, so that he might reach Nirvana, which in his case included sleeping onscreen with his costar, Andie MacDowell.)

Like Bill Murray’s life in Groundhog Day, nothing ever changes in Alabama politics. The state wakes up with the flip of the digital clock to 6:00 am, Sonny and Cher come blaring into consciousness singing “I Got You Babe”, and Alabama’s political establishment finds a way to embarrass itself by standing alone in opposition to some federal government imperative. Thus it has always been. Thus it will always be.

The clock struck 6:00 Tuesday night (March 3, 2015) with the Alabama Supreme Court’s ruling that a federal district court judge in Alabama had no authority to rule on the constitutionality of the state’s marriage statute. So again, Alabama’s political landscape is revealed as a featureless desert of federal government opposition. Or, perhaps more aptly, as a Sisyphean struggle against the Leviathan of the federal government, whatever metaphor one chooses to describe a pointless, repetitive activity. The state starts its political day by rolling the boulder of federal government defiance up the hill, only to have it roll inexorably back down by evening.

The history of Alabama refusing to kowtow to, or even acknowledge, federal government power and preeminence dates to well before the Civil War, lasted through Reconstruction and Jim Crow, and reached its ultimate expression during the Civil Rights era. It’s a fair observation that the US Supreme Court used Alabama’s defiance as its guide in crafting the contours of its Civil Rights jurisprudence. There’s a litany of cases decided by the US Supreme Court during the era that used Alabama’s recalcitrance to set an example (e.g., Katzenbach v. McClung, the Ollie’s Barbecue case, to name just one that readily comes to mind). Essentially, Alabama was the US Supreme Court’s bitch during the Civil Rights era, slapped around to show the rest of the states how things would be.

Which points to an important aspect of the relationship between the two governing entities. Alabama needs the federal government like the federal government needs Alabama. Each is a foil to the other. George Wallace’s stand in the schoolhouse door was carefully choreographed political theater. It gave him segregationist credibility with Alabama voters, while allowing the federal government to show the power of its commitment to change. All the pols benefited vis-à-vis their respective constituencies.

So it was with the Alabama Supreme Court’s ruling yesterday on same-sex marriage. Alabama’s Supreme Court essentially told a federal district court judge to go to hell; that she didn’t have the authority to rule on the constitutionality of Alabama’s law defining marriage as between one man and one woman. The Alabama Supreme Court proved its political bona fides by fighting against the social scourge of gay marriage, while the federal government will soon enough get the chance to stomp on Alabama’s stiff-necked people again. It’s a win-win. Political symbionts, I believe, best describes the relationship.

Alabama’s Supreme Court went further than just claiming that a federal district court lacked authority to rule on the constitutionality of Alabama’s marriage laws. It also weighed in, quite heavily, on the question of whether gay marriage is a good or bad thing for society (surprise! it’s bad). It needn’t have. It was actually right (even a blind squirrel occasionally finds a nut) that a federal district court judge should not have the authority to rewrite state law according to her view of its constitutionality, and then attempt to have her ruling apply outside the limits of her territorial jurisdiction. Such a thing raises the specter of crowning every federal district court judge a king, not only in her own district but all across the fair land. There are hundreds of federal district court judges. Bedlam would ensue.

Stop for a moment to consider: would it be wise to allow federal district courts to rule on the constitutionality of deploying forces in combat? Should a federal district court judge carry more power than the Chief Executive of the country? Because that would be the effect of allowing federal district court judges the power to rule on matters of constitutionality in a manner that has broad application outside their local jurisdictions.

The parceling of power between state and federal courts, more so than whether there is a constitutional right to gay marriage, is what needs to be addressed. Lone federal district court judges should not be setting policy for whole states, and certainly not for the whole nation. Things have become a confused mess, particularly as federal district court judges have become activists, willing to tackle the tough issues of the day in the same manner that the now-chastened (since the disastrous ruling in Roe v. Wade) Supreme Court justices once did.

It may well be that the US Supreme Court is tacitly encouraging this sort of activism, allowing the results it seeks without having to take responsibility for the political fall-out. The Supreme Court is very sensitive after Roe to the limits of judicial legislating, but it can’t shirk its responsibilities indefinitely.  It will eventually have to rule on the matter, hopefully thereby settling the law across the land.

In the meantime, Alabama will continue to be the foil to federal government power, and the federal government will continue to be the foil to Alabama recalcitrance. Both Alabama and federal politicians thereby profit. Same shit, different day, as me and my Army buddies might have observed.

The two governing entities have something of the same relationship as American imperial ambitions and ISIS, or as Israel and Iran. In every instance, each party needs the other to give meaning and purpose to its existence. In every instance, each party enhances its image in the eyes of its constituency by fighting the good fight, for whatever moral truth it holds dear, in opposition to the other.

So maybe it’s not just Alabama where every day is Groundhog Day.

The gold/white, blue/black dress: How genes color your world

I saw a gold dress with white stripes. My daughter, looking at exactly the same image on exactly the same screen, saw a black dress with blue stripes, dispelling the notion that the differences in perception are due to variations in screen brightness. That was always nonsense. The differences in perception are due to very subtle differences in how each human brain calibrates colors to account for its subjective evaluation of the ambient light levels. Because there was no clue in the photo as to what sort of ambient light was illuminating the dress, the brains of the people viewing it picked a light source that made sense to them. The light level was ambiguous enough that about half of the brains picked a light source that made the dress look white and gold, and the other half picked one that made it look blue and black. Thus was the reality of the dress’s color created by each individual brain. You’re welcome, said your brain.
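
To make the mechanism concrete, here is a crude sketch of “discounting the illuminant” (von Kries-style adaptation), with entirely invented pixel and light-source values; it is a cartoon of the inference, not of any actual neural computation:

```python
# The same measured pixel yields different inferred surface colors
# depending on which light source the brain assumes. All values invented.

def discount_illuminant(pixel, illuminant):
    """Divide out the assumed light color, channel by channel."""
    return tuple(min(255, round(255 * p / i)) for p, i in zip(pixel, illuminant))

pixel = (160, 155, 185)  # hypothetical bluish-grey sample from the photo

# Assume cool, shadowed daylight: the surface is inferred warmer and whiter.
print(discount_illuminant(pixel, (170, 180, 220)))  # (240, 220, 214): "white and gold"

# Assume warm artificial light: the surface is inferred cooler and bluer.
print(discount_illuminant(pixel, (255, 230, 170)))  # (160, 172, 255): "blue and black"
```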

There is no way to perceive an “objective” reality, except through understanding the brain’s mechanisms and recalibrating the subjective view presented by the brain to account for its biases and calibrations. Even then, the objectivity is only dimly glimpsed, like looking through a glass, darkly. To achieve some measure of objectivity in perception, the conscious mind must force a disengagement from the subjective perspective while simultaneously being immersed in it. But the brain, so powerfully devoted to presenting us with a useful view of reality, is more or less capable of the feat, probably because understanding that the brain is a reality-distortion machine is itself adaptively advantageous. Knowing that our brain often lies to us does exactly what the brain is for: it enhances our prospects for survival and propagation.

Look around you. Take a large object, like a sofa or a table or a desk, and consider the shape of the image your brain is presenting to you. I’m looking at a square table, but my brain is showing me something that looks for all the world like a diamond, sitting as it does cater-corner from my position behind a desk. But I am consciously aware that the diamond is a square. So, too, is my brain aware that the diamond is a square. From the diagonal view, at an angle of about twenty degrees, the brain turns a square into a diamond that appears a bit oblong, left to right. Creating the image in this manner helps the body orient itself properly in space. Imagine if the square table always looked square, no matter your position relative to it. It would be a most disorienting view, like floating through a dimensionless space, and disorientation is not conducive to continued survival.
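
The square-to-diamond trick is just projection. A toy sketch, using a simple orthographic projection for brevity (a real eye adds perspective on top), with the twenty-degree viewing angle borrowed from the paragraph above:

```python
# View a square tabletop corner-on (rotated 45 degrees, corners pointing
# toward and away from you) from about 20 degrees above the table plane.
# Depth compresses; width does not: a flattened, oblong diamond.
import math

corners = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # square top, corner-on layout
elevation = math.radians(20)

projected = [(x, round(y * math.sin(elevation), 2)) for x, y in corners]
print(projected)  # [(1, 0.0), (0, 0.34), (-1, 0.0), (0, -0.34)]
# Two units wide, about 0.68 units deep: oblong, left to right.
```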

Except perhaps to the woman deciding whether or not to wear it to a cocktail party, it doesn’t matter whether the dress was “actually” gold or blue. But the principle revealed by the differences in perception matters a great deal. And by ‘principle’ I don’t mean the notion that our brains often lie to us in presenting reality to us. That’s easy and obvious. By principle I mean the impetus the brain has for lying to us. The brain lies to us in its presentation of reality because it exists not to reveal truths, but to present a version of reality that will most advance the body’s twin prerogatives of survival and propagation. As reason is the hand-maiden of emotion, the brain is the hand-maiden of the body. It exists to ensure the body’s continuation in space and time. And as the brain is the hand-maiden of the body, the body is only a temporal capsule for the genes, or gonads. The system is something like a set of Russian Matryoshka (nesting) dolls, with the brain being the outermost doll, the body next and the genes at the core, with complexity decreasing as the dolls decrease in size the further into the nesting one goes. We see what the body tells us to see, and the body tells us to see according to what the genes have told it to see.

Thus is revealed the even deeper truth nesting at the core of the differences in the perception of dress color. Genes drive the perceptual train. But what drives genes? Evolution. Evolution made us see the dress as variously gold and white or blue and black. Evolution drives everything, yet like the perceptual biases and calibrations that have us seeing a dress in two different color schemes, we mainly choose to ignore it, sort of how reason is used only as a last resort before conflict, after all other avenues have been exhausted. Perhaps willful ignorance is also an adaptive trait. If the article on ‘you’re welcome’ in the New York Times Magazine this week is any guide, it may well be.

“You’re welcome” literally means ‘you are a welcome guest’. Welcome comes from Old English, wilcuma, meaning welcome guest. Not long ago, it was considered the standard reply to an expression of thanks. It was the coda to whatever transaction had transpired between two people. One person orders a hamburger happy meal; the other delivers the meal and takes payment; the first thanks the server; the server replies, “You’re welcome.” On to the next customer.

But like all greetings and cordialities, there is a latent hostility to the phrase, because between any two people who aren’t identical twins (and sometimes not even then), there is a latent hostility suffused in any interaction. Each person has a set of genes, those things that selfishly seek only to survive and propagate, the ones that we’ve established caused the dress to look variously gold or black. Each person’s genes know that everything else in the environment, other people most of all, represents a potential threat to achieving their prerogatives. They also know that some things in the environment, including other people, represent opportunities to be exploited to their advantage. So the question a gene has in greeting another gene of its same type is, “Are you wid’ me or agin me?”

In the fast food transaction, there is an assumption of cooperation: that the two sets of genes will enhance their individual fitness by helping each other out. Each is ‘wid’’ the other. But still, there is an element of hostility. One set of genes orders the food it needs for sustenance. The other set must scurry to get it if the bargain is to be consummated. Once the server delivers the food and accepts the payment, the transaction is complete. If everything goes well, both sets of genes leave satisfied, but the server’s set has to scurry on to the next set of genes that wants food if he is to achieve his ultimate end of obtaining the necessaries of life. He is under pressure to dispense with the transaction quickly and efficiently. After he’s invariably thanked, “You’re welcome” does it. The cordiality to initiate the closure of the transaction could as well have been, “Fuck you, I’m glad this stressful little aspect of my life has concluded.” And the response would have been, “Fuck you, too,” meaning much the same.

Cordialities are the ways our genes disguise, to their own advantage, the latent hostility they harbor for everyone except themselves. Allow me a couple of examples.

I remember being at a little café at the Brooklyn Produce Terminal in New York City early one morning (don’t ask), getting some coffee (regular, black, with cream; I don’t even know how to order at Starbucks) and a bite to eat, when a guy walks in the door to his own great fanfare. He hops in line and yells across the café in a thick Brooklyn accent, “Yo, wassup, Joey!”, talking and waving to the guy at the cash register taking orders. Joey barely looks up from punching numbers into the machine and responds, “Yeah, why the fuck you want to know, you writing a fucking book?” It was the most beautiful exchange of cordialities I have ever witnessed. It accomplished the end of greeting someone in a manner that establishes they are meant no harm, while at the same time being plainly honest about the innate hostility that is felt from being in their presence.

This sort of brutally honest greeting is far more prevalent among men greeting each other than it is among women or when men and women meet.

Where I live, in the South, the culture of faux niceness prevents, except among the closest of male friends and then only in private, the sort of brutal honesty displayed by Joey. And failing to greet someone is considered ruder still. So the neighbor who I don’t like and who doesn’t like me greets me with a brusque, “Good Morning” when I happen to see him outdoors. His ‘good morning’ sounds about the same as saying “Fuck you” and viscerally meaning it. I return the favor with a reply just as vituperative. We might as well be snarling at each other like a couple of dogs. But the culture demands that we at least pretend not to be openly hostile to each other.

Women in the South condemn each other with protestations of fake concern, covering for their hostility while skewering each other with insults: “Bless her heart, she’s so stupid she’d have to go to school just to learn how to be blond.”

The New York Times article stumbles and bumbles along its way to almost finding the true nature of greetings when it discusses how Tina Fey in “Mean Girls” figured out what verbal interactions among teenage girls were really about:

Ten years ago, Fey intuited that among girls, even the most banal conversational exchanges could be wielded as weapons. Regina is so studied in the art of verbal manipulation that every compliment she gives is a sneaky bid to amass more social capital:

Indeed, not just compliments, but greetings and cordialities even, are sneaky bids to amass more social capital, especially among women, for whom social capital is so important when it comes to surviving and propagating. Female genes know that the path to their eternity lies through lies, the more cleverly and relentlessly proffered, the better.

The takeaway from the dress story and the ‘you’re welcome’ story is therefore perhaps not what you think. The same dress can look different to different people because our minds play tricks on us. Big whoop. We knew that already. Words can carry any number of meanings depending on intent and context. Another big whoop. The takeaway is that we are nothing but packaging for a selfish little set of genes that have as their sole concern their own survival and propagation. Everything else, from telling us what to see, to the relentless friend-or-foe inquiries, to the ceaseless striving for status, is animated, directed and consummated by the genes, of the genes, and for the genes. And so now you know. You’re welcome.

The Executive Summary for the week of February 22-28, 2015

Several regional manufacturing surveys missed expectations, some by a large margin, over the last two weeks, from Bloomberg:

  • Feb. 17: Empire State Manufacturing Survey: 7.78, down from 9.95 previously. Slight miss on expectations of 8.0.

  • Feb. 19: Philadelphia Fed Business Outlook: 5.2, down from previous 6.3. Far below expectations of 9.0.

  • Feb. 23: Dallas Fed Manufacturing Activity: -11.2, down from previous -4.4. Far below expectations of -4.0.

  • Feb. 24: Richmond Fed Manufacturing Index: 0.0, down from previous 6.0. Far below expectations of 6.0.

  • Feb. 26: Kansas City Fed Manufacturing Activity: 1, down from previous 3. Below expectations of 3.

  • Feb. 27: Institute for Supply Management–Milwaukee: 50.32, down from previous 51.60. Below expectations of 54.0.

  • Feb. 27: Chicago Purchasing Managers: 45.8, down from previous 59.4. Far below expectations of 58.0.

But at least the War on Terror is showing signs of growth:

‘Jihadi John’, the star of ISIS’s execution videos, is revealed to be a British citizen

Mohammed Emwazi, nicknamed ‘Jihadi John’ by captives who were soon to lose their heads, was born in Kuwait (I knew Kuwait was never worth saving way back then). His parents moved to London when he was six. He was a more or less regular kid, graduating from the University of Westminster in 2009. Now he’s an internet video sensation, lecturing the West on its evils in his prim and proper British accent before lopping off the head of another of its unfortunate souls.

The renowned British Intelligence Service, MI-5, which is different from MI-6 in some way that I don’t care to figure out, did not untangle the enigma of Mr. Emwazi in time to prevent his radicalization.  They did detain him a couple of times as he was maybe trying to join the jihad, but ultimately, that just radicalized him more, or so said CAGE, a British outfit opposed to the War on Terror.

It’s not clear whether Mr. Emwazi will be able to convert his newfound fame into wealth. It’s hard to make a living off of videos distributed for free, as Google’s YouTube is finding out.

YouTube is losing money even though it has over a billion annual users

Google bought YouTube for $1.65 billion in 2006. Even with over a billion annual users, the site has yet to add anything to Google’s bottom line. It brings in about $3.8 billion per year, but costs more than that to operate. Its roughly 150 million unique monthly visitors are about double Facebook’s. But Facebook sort of makes money. Sometimes. Google could have invested that $1.65 billion in Apple back in 2006 and made oodles of dough, enough to juice the following numbers even more.

Ultra-luxury automobile sales have done just fine since the Great Recession, thank you very much:

In case you were wondering how the rich fared in the wake of the financial crisis, the following chart from Bloomberg provides a poignant snapshot. There’s you some inequality. What’s surprising is how low sales dipped during the crisis. Did the super-rich lose confidence in their ability to manipulate government and society to their advantage during that time? Poor little rich people.

Jobless claims jump by most since December 2013

First-time claims for unemployment benefits were up by 31,000 in the week ended February 21st over the previous week. What does it mean? Practically nothing, and not because, as commentators have variously offered, ‘there was a floating holiday that fell in the week’, or ‘this reflects single-company issues’ or ‘it was really cold in Boston’. Okay, I made that last one up. The reason it doesn’t matter is that it is just one data point. It does go along with a trend of declining job gains over the last three months, from a peak of 423,000 jobs added in November to only 257,000 added in January, a decline in job growth of roughly 40% in just three months. But one data point does not a trend make, not even when it correlates with data points measuring similar economic phenomena.
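
For what it’s worth, the arithmetic behind that roughly-40% figure, using the payroll numbers cited above:

```python
# Percent decline in monthly job gains, November to January.
november_gain = 423_000
january_gain = 257_000

decline = (november_gain - january_gain) / november_gain
print(f"{decline:.0%}")  # 39% -- call it 40% in three months
```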

There is little doubt in my mind that the US economy is slowing its expansion a bit, perhaps even contracting a trifle even as I write. The 2.2% growth rate for the fourth quarter of 2014 announced this week by the Commerce Department, down from the third quarter’s 5%, bears that out. But there is nothing to suggest the bottom is about to fall out. Contrary to what stock prognosticators assume, growth that isn’t the product of human speculation (as obtains with stock prices, where the growth is often only illusory) does not go up in a straight line. There is a ‘natural’ growth rate in any living organism of about e, or 2.718, a magical number. And that’s about the medium-term average that a mature living organism known as an economic system is capable of. Sort of like an adult crocodile. In the long term, growth rates must always trend to zero (a mathematical certainty), and in the short term they can fluctuate wildly. For the medium term in which a human life is lived, about 3% is a fair estimate of what to expect.
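
Since e has been invoked, a brief aside on where it comes from: e is the number that falls out of continuous compounding, and even modest compound growth adds up over a lifetime. A sketch, with the 3% figure borrowed from the paragraph above:

```python
# e emerges as the limit of (1 + 1/n) ** n as compounding becomes continuous.
for n in (1, 4, 12, 365, 1_000_000):
    print(n, (1 + 1 / n) ** n)  # approaches 2.71828...

# Compounding ~3% a year over a 40-year working life:
print(1.03 ** 40)  # ~3.26: the economy more than triples
```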

Apple Computer coyly announces a coming out party for its new watch

The invitations say ‘Spring Forward’. The ‘Special Event’ is the day after the country goes on daylight saving time, springing its clocks forward. Get it? Apple doesn’t actually mention the watch in the invitation to its announcement, but then, does it have to? You’re smart and hip and all over anything Apple. You get it.

The event is to announce their new watch.

Incidentally, they’re also counting on you not remembering that they announced the new watch when they announced the new iPhone 6.  But then they know they can count on you, gullible American public. Isn’t Apple just the coolest company in the world!  I don’t know how I could tolerate my miserable life if they didn’t provide me with a new toy on a regular basis.  Okay, maybe I could survive so long as I still had the Kardashians, which thankfully, I will.

NBC Universal announces a new three-year contract for “Keeping up with the Kardashians”

But not for the $100 million it was rumored to be, a figure NBC Universal vociferously denied. And soon my world will be complete. I will have my new Apple Watch buzzing to remind me when it’s time to tune in, and I’ll have Kris, Kim, Kourtney, Khloe, Kendall and Kylie and all the rest (except gender-confused Bruce and the perfect asshole, Kanye) to fill the existential void that arises from having so much time and so little to do. But if the Kardashians fail me, there’s always Taylor.

Taylor Swift intentionally releases photos revealing her bellybutton

Anything that a woman refuses to reveal (her bellybutton, her hands, her eyes, her ankles, her shoulders, her neck) instantly becomes the most erotic, desirable part of her body. Judging by the fashions worn by today’s women, not many understand this. But Taylor Swift does. Until her recent Hawaii vacation, she had refused to reveal her bellybutton. Then some paparazzi threatened to do it for her, catching her sunbathing in a bikini. So she took some photos and posted them on Instagram before they had the chance. And it set the world a-twitter. And me. I just love T Swizzle. And she’s got the cutest little bellybutton. Man, does it get any better than this?

Why the dress is black and blue for some, but is gold and white for others

Which do you see? A black and blue dress or a white and gold one? The explanation is not as exotic as you might imagine. According to Wired, the brain’s automatic calibration of color for ambient light has nothing to do with it; it is, instead, the differences in brightness of the screens on which the image is viewed. Or maybe not. My daughter saw a black and blue dress and I saw a white and gold one when we looked at it on the same television last night. Instead, it may be that the dress’s illumination sits right at the tipping point where our brains begin calibrating for lower light, as we do when looking at the Rubik’s cube in shadow, below.

Both the top middle square and the side square are brown, but we see orange/yellow in the side square because of the perceived shadow. Though the example is frivolous, the point is profound. Our brains concoct reality for us. Acknowledging as much does not mean accepting the claims of some philosophers that reality is strictly a figment of the imagination. No, there are objects that absorb and reflect light of differing wavelengths that throw light onto the retina, which the brain then calculates and calibrates into a useful image, the key word being ‘useful’. The brain is the body’s handmaiden, doing what the body needs in order to survive, because the body is the vessel for the DNA. That’s the profundity buried in all that frivolity.
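
The cube illusion is the brightness version of the dress trick. A minimal sketch with invented numbers: divide the measured value by the brain’s estimate of the illumination, and an identical brown patch comes back brighter, as orange, when it is believed to sit in shadow:

```python
# Lightness constancy, cartoon version. The same measured patch is inferred
# as a brighter surface when the brain assumes it is dimly lit. Values invented.
measured = (120, 72, 8)  # the same brown, for both squares

def infer_surface(measured_rgb, assumed_light_level):
    """Divide out the assumed illumination to estimate surface reflectance."""
    return tuple(min(255, round(c / assumed_light_level)) for c in measured_rgb)

print(infer_surface(measured, 1.0))  # in full light: (120, 72, 8), brown
print(infer_surface(measured, 0.5))  # in shadow: (240, 144, 16), orange
```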

Kevin Garnett returns to the Minnesota Timberwolves to end his career where it began

KG has been either loved or loathed by fans. The fans of the teams he plays on have loved him; the fans of teams playing against him have loathed him. That’s about the best coda to an athletic career anyone could hope for. Minnesota, where KG started in the NBA as a skinny teenager, loves him still. He came home this week, traded in the flurry of All-Star Week, to heartfelt adulation. Welcome home, KG. As for me, I root for the teams he’s on just because he’s on them. He always shows up to play.
