Trump’s “dangerous babble” on foreign policy maybe isn’t really dangerous or babbling

The New York Times editorial board must be very concerned, frightened even, that they and their fellows in the liberal intelligentsia (an oxymoronic term if ever there were one), who desperately believe that only Hillary has any claim to the presidency, might be wrong.  They must worry that Hillary’s accession to the office of the President, which she and they feel is rightfully hers, might just be usurped by a bombastic, bullying billionaire.  How else to explain the diatribe in today’s newspaper (March 29, 2016, “Mr. Trump’s dangerous babble on foreign policy”) against Donald Trump’s mostly off-the-cuff foreign policy utterances?  If they weren’t afraid he might win, then why bother?  (Except, of course, to keep to the daily quota of Trump-bashing articles, providing eyeballs for their advertisers by making noises in their echo chamber to attract the Trump-hating acolytes who comprise the vast majority of their readers.)

Trump is not the president right now.  What he says on the campaign trail is for the purpose of getting elected.  The present occupant of the Oval Office promised on the campaign trail that he’d close Guantanamo and end the war in Afghanistan.  Things look different from the perspective of Commander-in-Chief.  Guantanamo is still open and we are still actively engaged militarily in Afghanistan.  And the Guantanamo and Afghanistan promises were made eight years ago, time aplenty for them to have been fulfilled by the candidate making them.  Incidentally, the one foreign policy promise Obama kept, if excruciatingly slowly—getting our troops out of Iraq—backfired miserably with the rise of the Islamic State.  So now we’re sending troops back to Iraq.  It’s only a dribble right now, but will surely be a flood soon enough.  Especially if Hillary is elected.  As Secretary of State, she never saw a foreign policy problem she believed couldn’t be made better through muscular US military intervention.  If she’s elected, expect to see divisions of American infantry campaigning from the shores of Tripoli to the cradle of civilization in the Levant to the no-man’s-land of the Hindu Kush.

But the irony is that nothing Trump has said on foreign policy is outrageous or babbling.  It doesn’t even sound much like campaign promises he has no intention of keeping.

The NY Times is aghast that in an interview on a Bloomberg political show, Trump refused to rule out the use of tactical nuclear weapons against ISIS.  He said, “I’m never going to rule anything out — I wouldn’t want to say. Even if I wasn’t, I wouldn’t want to tell you that because at a minimum, I want them to think maybe we would use them…”

What is troubling about that?  Seasoned diplomats fully understand that self-imposing limits on one’s options is a sure way to lose in a negotiation.  The other side needs to wonder what the response might be if negotiations fail.  Even Obama, when he unwisely claimed that Syria’s use of chemical weapons would cross a “red line,” did not explain what he might do in response to the transgression.  As it turns out, the Assad regime (maybe) called his bluff, as Obama had no answer when they (maybe) went ahead and used chemical weapons.  Even Assad had to believe that if he used chemical weapons in defiance of a US president’s warning, he’d be punished for it (which is why I don’t believe his regime deployed them, but that the rebels used them to try to draw the US into the war on their side).  In any event, saying but not doing is very bad diplomatically.  Doing without saying or explaining is much more advantageous.

It is a principle of leadership few understand that the most effective means of motivating others to do your bidding is inconsistency.  Being like the God of Abraham, arbitrary and capricious in the way you treat those subject to your power (which, in the case of a US President, is the entire world), will compel awe at your strength and fear of your wrath.  People worship (and fear) that which they don’t understand.  Be forthright and others have the advantage.  They know you, while you don’t know them, always a losing strategy in poker.  Trump is hinting at this proven, if Machiavellian, principle of power when he says that he wouldn’t want to say one way or another about using tactical nukes against the Islamic State.  Let them wonder, indeed.  In the Darwinian jungle of international relations, it is much better to be feared than loved, and there is nothing more fearsome than a nation led by someone who is unpredictable in how he might deploy the terrible power at his disposal.

For example, why did the Iranians start negotiating the release of their US Embassy hostages practically the moment Ronald Reagan was elected?  It was not to curry favor with the new Reagan administration from a position of strength.  No.  It was because they were afraid of what the new cowboy president might do.  They knew what Carter would do—he’d already tried all he had the will to do militarily and had miserably failed.  They weren’t so sure about Reagan.  He might well melt the Iranian sand.  So the hostages were freed on the day he was inaugurated.

The NY Times editorial board remarked on Trump’s comment regarding nuclear weapons and ISIS, saying that, “Even if Mr. Trump, the leading Republican presidential candidate, doesn’t really believe that nuclear weapons should be used against a terrorist group, the fact that he has voiced it lends weight to this insane notion and could make it easier for other nuclear-armed states to think about that possibility…The consequences of using a nuclear weapon in terms of lives lost, physical destruction and cost to American moral standing would be devastating.”

Which shows just how little the NY Times editorial board understands of diplomacy.  Any country that takes the position that it has too much to lose to do whatever is necessary to win risks ultimately losing all that it has.  The Jin Dynasty in ancient China undoubtedly thought it had too much of its civilization to lose if it became as barbarous as the Mongol hordes invading from the north in the 13th century.  The Mongols had nothing to lose and everything to gain through their barbarity, the more the better.  They perfected the tactic of total war, killing and destroying everything and everyone—women, children, the aged and infirm—in their path.  And the Jin Dynasty vanished from the face of the earth.  The lesson is that depravity must be met with overwhelming depravity.  Bomb our cities by turning airplanes into bombs?  It wouldn’t hurt if they imagined we might just be crazy enough to turn their host country to rubble.  Which we did, but then it was already pretty much rubble to start with.

The NY Times editorial board bemoaned Trump’s having described NATO as “obsolete”.  But isn’t it?  The Cold War ended a quarter century ago with the collapse of the Soviet Union.  NATO was created specifically as a military counterbalance to Soviet power.  What part of obsolete doesn’t the NY Times editorial board get?  The board seems reflexively conservative, hoping for the continuation of institutions that long ago lost their relevance.  Perhaps that’s because the NY Times editorial board is itself an institution that has long suffered from declining relevance, beginning, in what is apparently only accidental coincidence, at about the same time as the collapse of the Soviet Union.

Besides, by the Times’ own admission, Trump didn’t say he wanted to abolish NATO, he said in its present configuration—solidified when Europe was on its knees after WWII and the US was the only power capable of confronting the Soviet Union—it is unfair to the United States; that its European members need to pull a greater share of the burden of defense.  Isn’t that what Barack Obama more or less just said in an interview in “The Atlantic”?  There are no empires in history, save that of the United States, that didn’t expect some sort of tribute for the privilege of being brought under its protective wing.  Is Obama guilty of “dangerous babbling” as well?

There is something of a grass-roots democratic revolution afoot in the United States.  It rejects the economic and political status quo for a wholesale rethinking of priorities and processes that have grown long in the tooth (such as whether NATO is still relevant or is yet another military boondoggle).  Its supporters believe we have gone as far as we can with the old power structures, seeing them as having grown corrupt and sclerotic for having held power for so long.  Organs like the NY Times editorial board (and a great many other organs of the American empire like it—the Wall Street Journal editorial board, e.g.) championed and supported democratic revolutions across the globe—from Poland’s rejection of communism a quarter century ago, to more recently, the Arab Spring, to the splitting of Sudan.

But they seem to believe that democracy, with all its messy processes, with its proclivity to emotional pendulum swings, and its potential for usurping long-standing power arrangements, is not good for America.  So they disparage Trump (as a clear and present danger to the status quo) and ignore Sanders.

I believe otherwise.  I think democracy is as good for America as it is anywhere else (which is to say, that it is the least-worst form of government).  Power corrupts and absolute power corrupts absolutely.  Donald Trump’s candidacy, the first since Reagan’s to augur any sort of real shift in political power arrangements, is a breath of fresh air in what has become a stultifying political climate.

Donald Trump is not “dangerously babbling” on foreign policy (or anything else), except to the defenders of the status quo who are terrified of the challenge he and his popularity present to their worldview.  When Trump speaks on foreign policy, he makes more sense than a NY Times editorial board that apparently didn’t get the memo, inter alia, that the Cold War ended.  Besides, as an alternative to shoot-‘em-up Hillary’s known foreign policy proclivities, I’d take Donald Trump’s dangerous babbling any day.  He would be far less dangerous to America’s fortunes internationally than would she.


Trump and Sanders: The enemy of my enemy is my friend

Everyone loves to hate, mock or ridicule Donald Trump.  If you want a quick buzz before bedtime—okay, maybe a blacked-out drunken stupor that precludes the possibility of even finding your own bed—take a shot of whiskey every time you hear Trump’s name mentioned by any of the three or four late night talk show hosts during their opening monologues.  You won’t have to flip channels to cop a buzz.  They’re all Trump, all the time.  Considering that roughly 40% of Republican Party voters support him—maybe 40 million people or so—it’s quite remarkable that no one seems to mind.  Maybe the Neanderthals supporting Trump really aren’t so stupid as the talk show hosts implicitly believe, and get that he’s a bit of a bombastic, bellicose buffoon, and think it’s kinda fun to make fun of him, but still see him as the only candidate worth supporting in the Republican field.  That’s roughly how I feel about him.

Bernie Sanders, the upstart septuagenarian socialist Senator from Vermont by way of a secular Jewish upbringing in Brooklyn, New York, just won caucuses in Alaska, Hawaii and Washington by enormous margins over the favored Democrat (or perhaps I should say, ‘Democratic favorite’), Hillary Clinton.  Nobody much makes fun of Sanders.  Maybe because he’s so seriously focused on spreading his socialist message.  It kind of spoils the fun when someone appears to be genuinely sincere in their political proselytizing.  Sanders’ seriousness recalls the skit on Saturday Night Live about the girl at the party you regret having talked to.  Or maybe he gets a pass from late night comedians because he’s lighthearted enough to occasionally make fun of himself, even through all the seriousness.  But probably it’s because no one really believes he has any real chance to be elected president, which, it might be pointed out, was the same thing people thought of Hillary Clinton’s last opponent for the Democratic nomination.

Trump and Sanders together surely enjoy a plurality, if not an outright majority, of support among registered voters.  Which isn’t all that surprising, since the candidates are more alike than they are different, and their supporters have in common one very important attribute:  they are overwhelmingly White and American.  By “White” I mean non-Hispanic, non-Black, non-Asian people of European descent, including Northern European Jewry—a real melting pot that would have no hope of cohesiveness or collective identity except that they are opposed by Blacks and Hispanics and Immigrants.  By “American,” I mean people whose ancestors came here long ago—generally second generation or more.  Relative to Immigrants, by which I mean those who recently arrived or who were born here to recent arrivals, Americans are far longer established in the country.  They comprise the nation Trump is referring to when he says he wants to “Make America Great Again.”  ‘America’, for him and his supporters, does not include people who don’t speak American (not British) English as their first language.  It does not necessarily preclude Blacks, but they have up to now been firmly behind Hillary Clinton.

Trump’s is a nativist campaign, so it necessarily sounds xenophobic and racist at times.  He rails against Immigrants for having taken American jobs.  Sanders rails against the real costs of free trade—lost jobs, a poorer environment, worker exploitation at home and abroad, and ever-increasing inequities in income distribution.  Thing is, they’re bashing different sides of the same thing—the New World Order (for lack of a better euphemism) that arose after the collapse of the Soviet Union and the end of the Cold War.

Characterizing this New World Order are a) declining or stagnant wages for American laborers; b) a rapidly concentrating share of income and wealth in the top percentiles (the rich getting richer); c) muscular American military engagement across the world for the benefit of American capitalists (not, no matter how often it is pretended, for American civic values); d) capture of each major political party establishment by capitalist elites spending enormous sums directly and indirectly for access and influence; e) lack of immigration control; f) rampant offshoring of jobs to American trading partners operating without even rudimentary labor or environmental protections.

The bottom line:  The New World Order screwed over American workers.

This election cycle, American workers are finally beginning to see which way the wind’s a blowing.  The Republican rank and file are coming around to the realization that the culture wars the Party tries to sell them are actually a diversionary tactic promulgated by the Republican elite who sold their souls to imperial capitalism and its practitioners as soon as the Berlin Wall fell.  (By ‘imperial capitalism’ I mean capitalism that ranges the world over, something like the ships of the Dutch East India Company or the British East India Company, seeking out profitably exploitable opportunities, including people, nations, environmental resources, etc., with no regard for the costs left in their wake.  Imperial capitalism is capitalism unfettered by social responsibility.  It is free-rider capitalism of the worst sort.  It sends soldiers to their deaths to protect its ability to peddle its wares; child laborers to its mills at the break of dawn to trudge at hard labor all day long; pollution into the water and land and sky without regard.  It is 17th century capitalism in the great tradition of British joint stock companies commissioned by the Crown, but in the 21st century.)

Before the fall of the Berlin Wall there was a sense that we were all in this thing together, or at least, among Republicans, a sense that all the white people who didn’t consider themselves intellectual and emotional snobs were in this thing together.  Rank and file Republicans were accustomed to hierarchical social structures, so they understood that some among them would be richer and more powerful than others.  But they didn’t imagine that there would develop an elitist element in their midst so unabashedly focused on power and wealth that they’d throw the rank and file under the bus to achieve it.

The scales fell from their eyes, in stages at first, then all at once, when a charismatic leader not beholden to elitist establishment Republicans finally began speaking to their fears over what unfettered imperial capitalism had wrought.  Turns out that it never really mattered all that much to them whether the state did or didn’t allow gay marriage, or that their candidate couldn’t competently quote scripture, so long as their economic future was secure.  But now they realize their economic security has been intentionally sacrificed on the altar of imperial capitalism, allowing people whom they considered their cultural brethren to get rich at their expense and to their disenfranchisement.  So now they simply ignore the culture war totems to focus on the economics.  They understand they’ve been dished a bowl of thin cultural gruel by their Party’s establishment in exchange for providing political support to a Party controlled by rich capitalist overseers who are the very people ripping the economic rug from underneath them.  Their anger is expressed as opposition to immigration and immigrants and offshoring and what they see as one-sided trade treaties.

Some are even beginning to realize that post-Cold War military engagements (Iraq and Afghanistan, mainly), supposedly bringing order and liberal democracy to benighted places around the world, are really just excuses to enrich the military-industrial complex capitalists while providing the muscle needed to protect the ventures of imperial capitalists.  The elite get richer and more powerful while the rank and file offer up their sons and daughters for slaughter.

The Democrat rank and file are experiencing much the same revelation, but being a less racially homogenous group, are expressing it less through xenophobia and racism than through anger at the rich capitalists many of its members protested through the Occupy movements.  They, too, are against free trade and unfettered immigration—in short, the same imperial capitalists and their methods that Republicans oppose–but are more likely to couch their argument in terms of foreign worker exploitation and the broken communities it creates back home.  But they are resolutely against our post-Cold War military engagements, many of them from the very beginning.

Politics makes strange bedfellows.  Pick any political cliché and it’s apt to apply to the situation here, which, while unusual for American politics, is quite ordinary in the context of history (else we wouldn’t have the clichés).  Trump and Sanders supporters have as their common enemy imperial American capitalism and capitalists.  Both groups can see that their common enemy is slowly eroding their power and relevance.   A common enemy can operate like an electromagnet, supplying the charge that pulls opposite poles together.  Through their political exploitation and economic disenfranchisement of the American working class since the end of the Cold War, imperial American capitalists and capitalism could provide the charge that pulls the quasi-fascist elements of the Republican Party and the quasi-socialist elements of the Democratic Party together in a tight bond in opposition.  It’s not as ridiculous as it sounds.  Before the Cold War, the Soviet Union was our sworn ally.  Hitler provided the electromagnetic charge.

On the Democratic side, the only thing saving Hillary Clinton is the Black vote.  Blacks have monolithically supported Democrats since Lyndon Johnson and the Civil Rights Era.  They have reflexively supported Ms. Clinton in the primaries, presumably in recognition of Bill Clinton’s reputation as the first “Black” president.  But their economic interests are more closely aligned with the Trump/Sanders supporters than with establishment Democrats, who are basically no different than establishment Republicans, except their diversionary tactic is to pretend they believe Black lives matter when in fact they only matter on election day.  What has their monolithic support of Democrats done for them economically?  Not much, it could be argued.  They are as likely as Whites to lose a job to an immigrant laborer or to offshoring.  Obama’s election was surely an emotionally satisfying experience, but really, what changed as a result?  Not much of anything except Black expectations, which, after briefly soaring to lofty heights, crashed firmly back to earth when it was realized that all they got in Obama was another establishment politician who happened to sport an African tan.

I doubt either Trump or Sanders will be nominated by their respective parties.  Trump will win the largest number of delegates, but perhaps not enough to give him an outright claim to the nomination, which will throw the Republican convention into convulsions, which may result in someone being drafted to run.  Because of superdelegates, Sanders was mathematically eliminated from competition before he started.  But once the two political revolutionaries are rejected by their parties, it wouldn’t be at all surprising if they joined forces for an independent bid for the White House.  Because if they did, they would very likely win.  The enemy of my enemy is my friend.



The graduation speech I wish my daughter would give…

(My daughter is graduating high school in a couple of weeks. She’s very academically gifted. She’s ranked third in a class of 250 students from one of the richest [hence generally smartest] school districts in the state of Alabama.   She, along with seven other honor grads, has the opportunity to give a speech at her commencement. She has two minutes at the podium. This is what I wish she would say.)

What a whirlwind of activity to close out my high school career. There have been parties and picnics (even crawfish boils) and practices and performances and awards and baccalaureates, etc., etc., etc. It almost makes it feel as if I’d actually done something remarkable simply by virtue of graduating from high school. I haven’t.

And I certainly haven’t done anything remarkable to warrant your being forced to listen to me speak for a couple of minutes. I stand before you today because my academic record compares favorably to that of my peers. But so what? That only means that I was lucky enough to draw from the genetic deck a hand that makes my mind slightly more efficient or a split-second faster at processing information than some others. I was born with it. I had nothing to do with it. It’s like having red hair or something—sort of remarkable in its infrequency but really nothing to make a big deal over. True, I didn’t squander the unwarranted blessing. I applied myself and got good marks. But is that really so remarkable? And can I even be sure that the same batch of fortuitously good genes didn’t motivate me to make the most of my academic talents? If so, then what is there to celebrate? It couldn’t have happened any other way.

Besides, nobody but me could possibly know whether I did the best I could with the intellectual endowment provided me. What of others who tried harder and did more with what they had? There’s no way to tell, outside of people with obvious outwardly visible handicaps, how deserving of accolades any particular student is for how they performed relative to their potential—always the true measure of achievement. Nobody can possibly ever know about anybody but themselves. So, it’s not that I’m not happy with myself. I happen to be very satisfied with my performance. But my having lived up to my own expectations is not something that can legitimately be externally rewarded. And anyway, it need not be. The satisfaction and self-respect are reward enough.

As much as my academic success depended upon an accident of genetic probabilities, the relationships I have formed during my years in the Homewood school system are also accidents, but of space and time. My friends and I happened to arrive at this time and place by independent paths we had no, or very little, input in choosing. I am leaving for the University of Georgia in Athens later this year. I don’t know of anyone else in our class who is also going. Our family is moving out of Homewood to a hipster condo in the city later this week. I don’t know of anyone else’s family who is joining us. How many of these accidental relationships formed during my time here will stand the tests of time and geography—tests they will face almost as soon as the ink dries on my diploma? I don’t know, of course, but it will be interesting to see. My parents met in high school, but didn’t get married until almost ten years later, after going their separate ways for several years. Some relationships persevere. Others don’t. That’s about all I know for sure about such things right now.

So, what has been the point to all this celebratory madness? Why make such a big deal of things? As a practical matter, it appears to have less to do with me and my fellow students, and more to do with the parents and educators seeking affirmation of their status in the community and validation of their educational efforts. But really, is there a point to it all, even from their perspective?

Consider the insights of one of the greatest minds Alabama has ever produced, the Harvard biologist E. O. Wilson. He started out studying ants as a boy in the bayous and backwaters of the Mobile and Tensaw River Deltas in South Alabama, but eventually made his way to Harvard University as a Ph.D. biologist and Pulitzer Prize-winning author. He practically invented the field of what is now called evolutionary psychology by applying the lessons learned studying ant colonies to human social behaviors. In his 1978 book, On Human Nature, he made this observation about the human condition:

The first dilemma, in a word, is that we have no particular place to go. The species lacks any goal external to its own biological nature. It could be that in the next hundred years humankind will thread the needles of technology and politics, solve the energy and material crises, avert nuclear war, and control reproduction. The world can at least hope for a stable ecosystem and a well-nourished population. But what then? Educated people everywhere like to believe that beyond material needs lie fulfillment and the realization of individual potential. But what is fulfillment and to what ends may potential be realized? Traditional religious beliefs have been eroded, not so much by humiliating disproofs of their mythologies as by the growing awareness that beliefs are really enabling mechanisms for survival. Religions, like other human institutions, evolve so as to enhance the persistence and influence of their practitioners. Marxism and other secular religions offer little more than promises of material welfare and a legislated escape from the consequences of human nature. They, too, are energized by the goal of collective self-aggrandizement.   The French political observer Alain Peyrefitte once said admiringly of Mao Tse-tung that ‘the Chinese knew the narcissistic joy of loving themselves in him. It is only natural that he should have loved himself through him.’ Thus does ideology bow to its hidden masters the genes, and the highest impulses seem upon closer examination to be metamorphosed into biological activity.

And perhaps that is what all this is. For today anyway, parents and educators can love themselves by casting glory upon the achievements of the graduates. But there is no particular point to any of it that transcends our biological nature. It seems that all we can really know about the point to life, from which the point to this celebration is ultimately derived, is that nothing of it transcends our biological nature. And it is the happenstance of biological endowment that is considered the most singular of achievements in this orgy of celebration.

I gained a lot of knowledge in my four years here, but not a lot of wisdom, as the first prerequisite of wisdom is humility, and this celebration, along with pretty much everything through the years leading up to it, stands as the antithesis of humility. But today, as I stand in awe at the vastness of the world outside these walls, I find myself humbled at my own insignificance. And maybe that is the point to all this, at least for me.

In the end, though I may have no particular place to go, still, I am for some reason quite eager to get out of here in order that I might get there. Thank you.

A dad’s eye view of a daughter’s senior prom

She bought a $400 dress, or more accurately, her mom and I bought her a $400 dress, and before she even had a date. But there was never really any doubt she’d get a date. In her clique of friends, everyone always gets a date, whether it’s to the prom or to the homecoming dance or to the Sadie Hawkins dance or to any other big and official event comprising their social calendar. The identity of the date means little. They are arranged through friends and are between casual friends or acquaintances. Romance hardly matters. It is the rarest of things when a couple attends an event on the social calendar because they actually feel some romantic attraction for each other. So not only was there little danger she wouldn’t get a date, there was practically no chance the $400 dress would end up washed away in a tidal wave of teenage hormones. So at least she might be able to wear it again, even if I know that’s not likely to ever happen.  What else are social calendar events for except buying new dresses?

The dress was poppy-colored, like her Mom’s. She’d seen the old pictures. Maybe she was trying to emulate, and one-up, her Mom. But she doesn’t know anything of her Mom’s experiences at the prom except what the official prom photos tell, and they tell nothing. I know something of her experiences (I was her date), but I’m not telling. But I don’t know my daughter’s motivations. I don’t, or didn’t, even know that “poppy” is another way to say bright pink. Or, why my daughter knows the color of a poppy bloom. Isn’t that the bloom that provides the world with heroin? The drug Keith Richards called the most aptly named in the world, because she is a real bitch? Does my daughter have a fucking poppy garden tucked away somewhere?

The kids at my daughter’s high school generally pair off for their prom dates with people who are their opposite-sex friends, people for whom they think they feel some emotional, rather than sexual, attraction, a category of friendship which, as Billy Crystal’s character explained in When Harry Met Sally, does not actually exist. The kids’ confusion is a function of their not knowing their own minds, which is understandable; after all, they are just baby adults, the enfants terribles of the reproducing caste. It’s not that two people can’t be emotionally attracted to each other, but that they can’t be of the opposite sex and be solely attracted emotionally as friends—opposite-sex relationships of any significance always have a sexual component. Or, perhaps I should say, opposite-gender relationships between two heterosexual people, one of whom identifies as a male and the other as a female, cannot be purely platonic. The attraction always resolves to something sexual. From where would the emotional attraction otherwise arise? Out of the platonic blue? When it comes to (heterosexual) men and women, there is no such place. Most (heterosexual) men think (heterosexual) women are batshit crazy and only begrudgingly ever engage them in conversation. Most women think men are assholes, perhaps because of what they correctly believe about what men think of them. Being from what almost seems two different species, neither trusts the other. All that can ever be hoped when the two nonetheless come together because of the desire for sex—the most powerful of attractants for H. sapiens, because it must be, else the species would surely and quickly die out—is that some sort of uneasy truce arises, with skirmishes relegated to only the periphery—the periphery of the sex, not the friendship. The friendship really doesn’t matter, and never did. It was always about the sex.

The kids my daughter hangs with save their rendezvous with those they find sexually attractive for casual hookups, not events on the official social calendar. These casual affairs usually involve alcohol, or some other drug (perhaps a poppy derivative?), because shortly after kids become self-conscious about their bodies with the arrival of puberty and the development of mature sexual organs (after the fall, if you will, when they begin wearing fig leaves to cover their nakedness), they figure out that drugs like booze, weed, coke, meth, dope, etc., have the marvelous capacity to impair their inhibitions. Tequila, as the song goes, makes her clothes fall off. Events on the official social calendar are all about inhibitions remaining firmly stuck in place (or so it goes for these children of affluence, who are ever mindful of status—the official social calendar wasn’t so stuffy for me back in the day). So the only way they can scratch the itch is in clandestine meetings in illicit places while under the influence. One of the favorite hangouts of my daughter’s group is an old landfill, where they build a bonfire and pass the bottle around like a bunch of hobos to screw up their courage, and then wander off to the cars two-by-two.

I know all this because my daughter isn’t afraid of telling me most anything. I won’t say she tells me everything, but she tells me a lot—more than I really want to know. I think she feels safe doing so because I’ve never tried to pretend I was an angel at her age. I’ve told her that I drank, smoked weed occasionally, and got as much sex as a seventeen- or eighteen-year-old boy could rustle up. I had as good a time as I could get away with. And isn’t that the whole point of life, after all?

The prom festivities began on a Saturday afternoon at 4:30 pm, when the daughter’s prom group was to meet at the idyllic campus (ivy-covered walls and all) of a local university, where they would present themselves to their parents for pictures, because what’s the point of having kids if you can’t photograph their every breath? (The prom goers had split themselves into groups of ten or fifteen couples whose membership was usually established along female clique lines, and not male pack lines, presumably because females are more catty and bigoted towards non-clique members, particularly if the woman not belonging to the clique is a stranger and is pretty. Males are more readily accepting of other males, at least when they aren’t seen as rivals for females, and since the males would only belong to the group if they were going with a female of the clique, there would be less tension among them. Besides, if there were ever a day when male packs got to overtly decide much of anything in H. sapiens society, that day is long past.) It turns out that all the prom groups in the high school were doing the same thing. So by the time they all arrived, around 5:00 pm, there were about 250 kids dressed to the nines in formal evening wear milling around the campus of a local university (which had not given permission to be invaded in such a manner, but apparently had no objections, probably considering it good public relations with potential future customers), with their parents following them around with cell phone cameras, and even a few camera-only devices (the mono-function things that are just called cameras, if I correctly recall), having them pose so they could take pictures. Thus was the whole reason for the prom fulfilled.

Or, at least that’s what the mother of one of the girls in my daughter’s clique said. While her husband followed (or stalked, your preference) the kids around, clicking photos with his real camera that had a big, zoomy lens and a strap from which it might occasionally just dangle from his neck instead of being grasped and at the ready as it was the whole time I was there, she confided in me that she hated all these picture-taking ceremonies—that she thought it ridiculous that nothing ever happens anymore without someone taking a picture of it and posting it to the internet. She said that the picture taking had become the event; that no one actually ever did anything. Nobody had the time to actually live life; they were too busy photographing it. All people did anymore was pose at doing things so they could record it on digital film, the ultimate end of which is as yet to be ascertained. Basically she said the world was full of phonies and posers. Holden Caulfield couldn’t have said it better.

And I agreed. From the fake motorcycle gangsters hilariously parodied in the movie Wild Hogs starring Tim Allen, et al.; to the guys riding bicycles in a contrived peloton on a busy thoroughfare through my neighborhood, each pretending to be Lance Armstrong, who was himself a fraud; to the very existence of a thing called a selfie stick; to the whole life and times of the Kardashians, and especially the Bruce Jenner subplot of the pointless tale; to every other faux reality show out there, each proving in its own way, like an unhappy family, that people are like electrons—the act of observing them changes their behavior; to the faux outrage generated by racialist hustlers over black men being gunned down in the streets when cops are doing the killing, while ignoring the multitudes more who are killed by their own; to the petit-bourgeois bureaucrat and technocrat parents of these prom children who thought that a picture imbues an event with meaning, getting things exactly bass ackward for the umpteenth time in their lives; etc., etc., ad nauseam: she was right. It is a nation of posers and phonies. There aren’t any real events or real people anymore; there are only events contrived to make good pictures, “good” meaning anything that casts the subject in a favorable light. Authenticity died with Facebook and iPhones and Instagram. Lives are contrived to fit this or that perceptual slot in the public psyche, meaning that reality-distorting media like Facebook and iPhones and Instagram deform lives to the shapes that will fit them. Social media taught people to lie, continuously, and hammered home its necessity. So yes, I agreed, the whole thing was a vapid, vacuous fraud. But while doing so, I sort of wondered how well she got along with her husband as he strode around the premises, clicking that camera’s shutter at everything he could coax to stillness for a moment or two. He was all in. But neither she nor I was in at all.
We were revolutionaries, kindred spirits. I maybe woulda found a friend, had she not been of the opposite sex. And married. Even if her reproductive years were well behind her. That’s the irony of the Sleepless in Seattle rule. It has everything to do with sex but almost nothing to do with reproduction, so it doesn’t matter whether a couple would be reproductively viable. There still can’t be any deep emotional attachment between heterosexual men and women that doesn’t turn sexual, at least in the imagination of one of them.

Actually, the festivities didn’t really start with the picture taking. They were officially kicked off the prior Friday afternoon, when all the girls in my daughter’s clique checked out of school early to get their nails done. Yes, you read that right. They checked out of school to get their nails done. There is a place in my brain that makes it somehow hard for me to imagine, even though I know it happened and have now written about it, that I let my daughter check out of school so she could get her nails done for a prom that wasn’t even until the next evening, of a day in which she had nothing better to do than her nails. What will be left for her to look forward to if she ever gets married? How much more pampered could an entitled American daughter possibly be? Extravagances like these make me long for hard times, for something more severe even than the Great Depression. I want these pampered little princesses my daughter runs with, her included, to experience times so hard that maybe they have to one day choose, as women in Greece during the Nazi occupation did, which child to feed and which to let die. When I see the contented confidence with which these American brats regard the future, I want times so bad that anarchy reigns, that there is rape, pillage and plunder on a scale to exceed the collapse of the Western Roman Empire fifteen hundred years ago, when Roman women didn’t contemplate whether their being drunk prevented their ability to consent to sex, but instead contemplated whether having been gang-raped, stone-cold sober, by bands of roving barbarians meant they should commit suicide. In short, I want these narcissistic, selfish, entitled little bitches to learn that though theirs might be a rarefied and exalted existence today, everything can change on a dime, so they ought to appreciate it.

But I know it’s terribly gauche to think such thoughts. And probably pointless, too. It’s highly doubtful they’ll get their comeuppance in my lifetime. As outrageously easy as they have things now, I doubt they have quite reached the pinnacle of pamperedness. But their rarefied status is not a birthright except in the imagination of the majority culture into which they were born. Other cultures beg to differ, sometimes violently. And the universe doesn’t care.

Yet hope is the antidote to my despair. I hope for my daughter and her friends that one day American women won’t be so unfortunate as to live existences so banal and purposeless that they think checking out of school to get someone else to paint their fingernails is their birthright and obligation.

I must admit, my apocalyptic vision of what would benefit these women didn’t arise fully formed on the day I let my daughter check out of school to get her nails done. It has been simmering in the kettle of my subconscious for some time. It gurgled into consciousness, and not for the first time, a couple of weeks before the prom, when my daughter came home with the news that we owed her a new car and a dog to compensate her for our having so audaciously sold the house where she grew up. Really, who does she think she is? That’s silly. She knows who she is. She’s an American Woman. I never had any doubt that the song of the same name by the Guess Who, a band from Canada who presumably could see things a bit more objectively than others from the States, had things about right. “American Woman…stay away from me”. Indeed. Thankfully, the daughter’s going off to college later this year. Now I know why college is so expensive. The college administrators have come to the realization that people will pay almost anything for someone to take their teenaged kids, especially their teenaged daughters, off their hands.

And no, I don’t feel even a twinge of guilt for feeling this way. For one, it’s perfectly natural to develop a healthy loathing for kids as they approach some semblance of adulthood. Parental/offspring violence at the time when the offspring are ready for independence is one of the most common sorts of violence among mammals (if generally not the most deadly). Besides, I learned a long time ago what neuroscientists and philosophers and theologians are only coming around to now—the mind can’t control what the heart feels. It can only, at best, explain why the heart is feeling as it is. But the mind need not be controlled by the feelings—the ability to have an impulse and not act upon it is a uniquely human attribute. I figure pretty much everyone feels this way to some extent or another, but that only the rare few are brave enough to admit or act upon it. Perhaps some don’t mention such feelings because they want to make them go away by ignoring them. It never works. The way to make a feeling go away is to accept it into the heart and mind and experience it, with the requisite understanding that it is perfectly natural for parents to feel this way about their young adult children when it’s time for them to leave. There’s a reason God makes teenagers so generally unlovable to their parents. And for me personally, at the conflux of my feelings for my daughter just now is how utterly loathsome I find the American culture she is more or less being forced to embrace as she prepares to leave.

My new kindred spirit and I (the woman who can’t be a friend because of the Sleepless in Seattle rule) also discussed how the logic of every event on a child’s social calendar must now be followed to its illogical, extravagant end. If a night at a sit-down restaurant was something nice for kids to have done at prom fifty years ago, why not a night at a dining club sitting atop the highest building in town at well over $200 a head for dinner? If a limo to drive the kids to the prom and afterwards was nice twenty-five years ago, why not a bus to ferry whole groups of them, so that the journey becomes a part of the party? And if we get them a bus, then why not rent them a lake house where they can continue with an all-night after-party that lasts well into the next day (duly chaperoned, of course)? The prom is no longer just a night on the official social calendar. It is a weekend gala celebration.

The absurdity of showering all these resources upon a bunch of dumbass teenage girls would not have been complete without some drama, which was supplied by the refusal of the boys’ parents, who paid for the bus and the lake house, to allow any of the parents of the girls to come along as chaperones to the after-party. A tizzy developed, oddly enough led by the parents of one of the boys, the one who happened to be my daughter’s date (probably some sort of behind-the-scenes power struggle to which I was not privy), over whether thirteen virginal (maybe) young women should be sent to a party hosted solely by a group of young men and their dads (the moms were to be at a separate lake house). It did seem kind of creepy. And I did experience pangs of what it meant to have the responsibility, as dads once did, of controlling sexual access to their daughters.

It is said that with great power comes great responsibility. I say that the cliché has causation backward—great responsibility requires great power. If I had lived in a time when I was responsible for delimiting sexual access to my daughter (which was more or less the case in most cultures until the modern era), it would have been necessary for me to have had great power of control over her, because without it I could not have met my responsibilities as a dad. No dad so burdened would have allowed his daughter to go spend the night at a lake house with a bunch of boys and their dads, not even in the company of twelve other girls. I saw in the brief kerfuffle why, in cultures where dads still suffer the burden of controlling access to their daughters’ sexuality, the daughters are kept covered up and away from boys and men as much as possible. It’s just easier to control things that way. The penalty for a dad failing in his responsibilities is family shame and the likely obligation of having another mouth to permanently feed, because the daughter can’t be married off. I can also see where dads would want to blame the daughter if she has illicit sex, whether forced upon her or not (yielding the honor killings that Western sensibilities so abhor). The daughter’s sexuality is not nearly as valuable as a marriage commodity when it is known she has had sex, and the dad is the one who has to deal with the consequences of her besmirched reputation, whatever its source.

I am a laissez-faire, and fairly lazy, sort of guy, so I’m glad that I don’t suffer the responsibility of controlling access to my daughter’s vagina. Besides, in American culture it would be a responsibility impossible to fulfill. It would just make me more of a blame sink than I already am. The same cultural forces in the West that effectively relegated the family to irrelevancy over the last hundred years would make controlling sexual access to a daughter well-nigh impossible for a dad. Even as the imagery of a father with a shotgun sitting on a porch waiting on his daughter’s suitor still resonates, especially among country music fans (if the lyrics of the songs are any indication), that world, like most of the world that country music sings about, is long past. Now the responsibility for a young woman’s sexuality is delegated to the young woman herself. With the responsibility comes the power. And there is no power and responsibility greater than that of the womb. So the world is basically now the charge of teenaged women, which probably goes a long way to explaining why young American women are so contumaciously impossible to live with. It’s not clear that any of these developments should be construed as progress, except perhaps in a selfish way, for me. Since my daughter has initial and final say on her sexual life, I don’t have the burden of culling suitors for suitability, or really even of offering an opinion on the matter. Society goes to hell, but I get off scot-free.

I observed to my platonic new kindred spirit (though I must admit that she started looking prettier and prettier somehow while we talked—unintentional proof that the Sleepless in Seattle rule is robust) that the utter banality of American society, along with the emasculation of its men, particularly in their roles as husband and father, might explain why young men and women in this country and other similarly benighted countries are romanticizing and even volunteering for ISIS and other fundamentalist sects that are fighting against what they see as a morally bankrupt West. She disagreed. Or, perhaps, she had never considered it. It is perhaps something of a radical thought to imagine that a bit of moral bankruptcy in mainstream society might yield a romantic fervor in youth to fight against it, the costs be damned.

But I think it perfectly plausible. From where might a life of meaning be derived when the sum total of social developments over the last ten years can be summarized in three proper names: Apple, Facebook and Google? Especially since all that the three (and, of course, there are others) have done is make the culture phonier and more banal, magnifying the very existential angst they are in some measure in the market to relieve?

Hell, if I weren’t fifty-two years old, I might volunteer to fight myself (in fact, maybe because I’m fifty-two I ought to volunteer to fight for one side or the other—I’ve never understood why we waste our youth on the battlefields while old men sit idly by, directing the effort and waving the flag. Ants are smarter. The oldest members of the ant colony end their days defending it). Even if I wouldn’t go so far as to enlist in ISIS, I certainly would never have agreed in the Cold War era to serve in the US Army, protecting and defending the Constitution of the United States, had I known that the next thing we’d do after we ripped down the Iron Curtain and supposedly made the world safe for democracy would be to fight a tin-pot dictator for his oil, in the process getting us embroiled in a never-ending war in a place we’ve got no business being.

I can understand, even if I can’t condone, Westerners who volunteer to fight for ISIS, or for that matter, against it. There really is no better way to banish existential angst than the prospect of death. And the spirit of H. sapiens, particularly the male portion of it, needs a good fight because it is built for a good fight. And I can see where women follow along because they have always been romantically inclined to favor fighting men. They don’t want emasculated men any more than the men want to be emasculated.

Alas, none of those kids going promming last weekend are likely ever to fight for anything. They’re so immersed in American culture they don’t even know it’s a toilet bowl full of shit in which they’re swimming. They don’t need to keep repeating to themselves, Dory-like, David Foster Wallace’s “This is water”; instead it should be, “This is sewage…this is sewage.” They’ve grown up as heirs to the great American Imperial Fortune, in a worse way even than the Baby Boomers, and believe it their birthright to leave school to get their nails done, just because. They sneer at the Vietnamese women who primp and pamper them in the nail salon, thinking themselves superior to lowly immigrants who can’t even speak English. They just don’t get (or simply refuse to consider) that only a blink of the historical eye has passed since their ancestors were also lowly migrants, scraping to eke out an existence in this sometimes harsh but always potentially bountiful land. (Neither do they get that the manicurists can speak English well enough to understand their disparaging comments and attitudes—it is because the customers don’t understand Vietnamese that the manicurists speak it among themselves.) It’s doubtful any of these American brats will realize and appreciate their good fortune until it is taken from them. I’d say that day couldn’t come too soon.

Should the State of Iowa send Henry Rayhons to prison for life for having had sex with his dying wife?

Over three hundred years ago, René Descartes observed that we know we exist because we are aware of the fact that we are pondering the question. But he took pains to point out that as much is all we can know for sure. We can’t be certain the whole rest of the world, including all the people in it, isn’t just a phantasmagorically grand illusion (even as he didn’t think as much was true, because why would God be so cruel?). I think, therefore I am; that is all I can know for sure.

Modern science has partially proved and extended this Cartesian reductionism, empirically showing that we are only ever aware of a small proportion of what our minds are consumed with thinking. Consciousness comprises only a small part of our neural activity. We know we exist because we can think about existing, but we know almost nothing of anything else we might be thinking outside of our conscious thoughts, and our subconscious thoughts comprise the vast majority of our thinking. It hardly needs remarking that if we know almost nothing of what we ourselves are thinking, we can hardly be expected to know what others think, or whether in fact others actually exist. Yet an assumption that we have the ability to ascertain the thoughts of others forms the foundation of sexual assault law and jurisprudence as it has developed over the last half century in the West.

Sexual activity is okay, or not, depending on the consent of the people engaging in it. And each party must guess at the other party’s consent. If either party is deemed to have ignored the unwillingness of the other and to have proceeded to engage them sexually, then today’s jurisprudence provides that a sexual assault of some sort occurred. Failing to successfully read another person’s mind in the boudoir or bedroom can yield a life sentence. And let’s not be naïve. The mind-reading which must take place is that which a man must do of a woman. Men who can’t read the female mind, the vast majority of men one can reasonably presume, risk life and liberty when engaging a woman sexually.

And it is not just casual affairs in which men risk all just for the temporary bliss of intercourse. Even married men who seek relations with their wives—women who had presumably consented to sex with them and only them when they agreed to the marriage—must gain their assent if they are to protect themselves from charges of rape or sexual assault, as Henry Rayhons found out when he was charged with sexual assault for allegedly having sex with his wife.

Henry Rayhons is a 78-year-old Iowa farmer and state legislator. After his longtime first wife died, he took up with a widow, Donna Lou. They were each in their early seventies when they married. After a few years together that were by all accounts blissful and loving, Donna Lou was diagnosed with dementia/Alzheimer’s/senility (or whatever the name du jour is for the mental deficiencies that arise with old age).

Eventually Mrs. Rayhons had to be admitted to a nursing home for full-time care. Her daughters from her first husband made the arrangements. Mr. Rayhons did not like the meddling of the daughters, but went along with them. (I would speculate that one or both of the daughters had Mrs. Rayhons’ power of attorney, not unusual for the adult children of people who remarry late in life; otherwise he could have ignored their suggestions.) A part of the care plan devised by the staff at the nursing home and the daughters (but not Mr. Rayhons) provided that Donna Lou was incapable of consenting to sex. (Why exactly the staff needed to worry over whether a nearly 78-year-old woman had sex or not doesn’t seem to confound anyone but me. My guess is that it wasn’t the nursing home staff but the daughters who demanded the acknowledgement of her incapacity for consent.) Mr. Rayhons knew of the treatment plan, and maybe (the evidence is sketchy) had sex with his wife anyway. There is no evidence that she objected, or for that matter, even knew it happened. That’s sort of what being mentally incapacitated is all about. But there is no question that she had consented to sex with Mr. Rayhons many times before. They were, after all, married. For all anyone knows, what little part of Mrs. Rayhons’ conscious mind remained was eager for the physical closeness that accompanies sex (which, really, is all that sex for a woman of her age is good for anyway). It could just as well be imagined that Mrs. Rayhons didn’t recognize her husband, and for all she knew was having sex with a stranger. There is no way to know exactly what she was thinking, and not just because such things are philosophically unknowable. At some point a dementia patient won’t even meet Descartes’ reductionist view that thinking is all we know for sure of being.

Two weeks after the alleged incident, Mrs. Rayhons died. A week later, Mr. Rayhons was indicted for the ordinary litany of nonconsensual sexual contact crimes. The State of Iowa was Johnny-on-the-spot, moving quickly to protect its citizens from this sexually ravenous beast.

Less than two hundred years ago, wives were considered the husband’s property, and not just in the Antebellum South, but also in supposed havens of progressivity like New York. The notion that a husband could rape his wife was so absurd as to not even be within the realm of contemplation. When does property have the right to object to its use? And hadn’t the wife anyway pledged her consent to having sex with her husband as part of the marital transaction?

And ‘transaction’ aptly describes what a marriage was all about. A father owned his daughter until he gave her to her husband. Her sexuality was a valuable commodity, her womb an asset of the family to be bartered and bargained away for advantage (there are historically some cultural differences as to whether a daughter was on the whole perceived as a liability or an asset, but all cultures, until recently, recognized the value of sexual access to fertile females, and rarely left it to a young woman to decide for herself how to use it). Until around the turn of the twentieth century in the US, wives were practically treated as the husband’s chattel, as his personal property, to do with as he pleased. (As an aside, it is quite remarkable how women romanticize the Antebellum South—their status then was roughly tantamount to that of a moderately well-regarded plantation slave—but such is the nature of the female heart. The popularity of both Gone with the Wind and Fifty Shades of Grey among women is neither coincidental nor anomalous.)

The Talmudic Hebrew culture was among the first to afford wives and women a significant, if mainly unofficial, status as a partner, not property, in marriage. While men could divorce their wives without their assent and wives did not have a reciprocal right, according to Everyman’s Talmud, Hebrew society ensured that the inequities that might inhere in such rules were otherwise ameliorated. That it took so long for Western societies to come around to the idea of affording women and wives an equitable footing with men speaks to how overblown our sense of progress is, if nothing else. And to the reality that in the US the state was initially very weak while the family, customarily headed by the husband, was quite strong. Wives were property when the state wasn’t strong enough to make them full-fledged citizens.

By the fin de siècle, women in the US were rapidly becoming the Talmudic equivalent of men. By World War Two, women had been exercising the right to vote for decades. A few more decades after the war’s end, Roe v. Wade came along, representing the idea that a woman’s womb was hers to do with as she wished. This was progress of a sort, even as it destroyed, in less than a century, thousands of years of the means by which culture had settled sexual relationships between the males and females of H. sapiens.

Except in the Soviet Union and a few other Eastern bloc countries, it wasn’t until well after the War, roughly about the time of Roe v. Wade, that the West finally recognized the crime of marital rape (Communist Eastern Europe, having stronger and more focused state apparatuses, recognized the crime much earlier—as early as the 1920s in the Soviet Union). It stood to reason that if a woman had final say over her sexuality as an unmarried woman, she didn’t give up the right to refuse sex just because she had voluntarily entered a covenant to love, honor and respect, even one whose main purpose was the establishment of an exclusive sexual relationship.

But the right to refuse sex in the marital relationship was of only dubious value. If the husband forced himself on the wife, an accusation of rape was tantamount to a suit for divorce. The recognition of the crime of marital rape by the state was only relevant to crumbling, abusive relationships that the wife wanted to dispense with anyway. Even in Talmudic Hebrew culture women had ways to escape abusive, unhappy marriages. Where we might see progress, others with a more objective perspective of history would see a cyclical rediscovery of well-tempered wisdom.

But poor Henry Rayhons. The new rights afforded to women, extending to women as wives, were never meant to destroy a life and the memory of a marriage well lived. His wife could not have consented to sex in the way it is generally assumed a woman might consent to sex. But she wasn’t a drunk college girl getting ravished by horny frat boys, either. She had consented to sex with Henry when she married him. She couldn’t have withdrawn or affirmed her consent. She was mentally incapable of doing either.

Hard cases make bad law. Rayhons’ is a hard case. He is not the sort of rapist the law has in mind when it provides a cause of action against forced sex, not even in the context of marital rape. He wasn’t an abusive husband. He just did (maybe) what he and his dying wife had done before as an expression of their love. It certainly had no reproductive repercussions, the main point of established rape law and jurisprudence.

The truth of the matter is that Henry wouldn’t be in the dock except that his new wife had daughters who did not care about the law’s proper intent, but were intent on punishing someone they saw as an interloper in their relationship with their mother. So Henry now faces the utterly ridiculous prospect that his life will end in ignominy, a convicted rapist for having maybe made love to his dying wife.

Progress, even in the expansion of women’s rights, is not an unmitigated good.


Will an Apple Watch make you happy?

“Here we go again,” I thought, when my college junior son announced he “needed” a new guitar because of his unpaid gig playing in a praise band at the church he attends while at school. (Yes, there are college kids who don’t drink or do drugs or have casual sex, and who regularly attend church and are involved in campus religious organizations. While I wasn’t that college kid when I was in school, that’s my boy, the one who’s had two bone marrow transplants. It’d seem to me he might be mad at God, for all the hell God’s put him through. But he’s sanguine in his faith. I think he figures God picked him because He knew he was one of the few who could handle it. I sure as hell couldn’t have.) The boy could not possibly need a new guitar. He has about five guitars, a banjo, a ukulele, and a trombone (from his marching band days). Of the guitars, he has two straight acoustics, one straight electric, and two acoustic/electrics. One of the latter two is a Yamaha that is more acoustic than electric; the other is a Gibson Les Paul that is more electric than acoustic.

I got the Gibson for him shortly after his being diagnosed a second time with leukemia. At $700, it was a pretty penny to pay for a guitar for an amateur guitarist who would probably be dead within the year. I did it out of guilt, feeling bad for having been so audacious as to think that having a kid would be the one thing I did that didn’t turn out disastrously, as if this would be the one instance when I didn’t have the reverse Midas touch (no, I don’t generally feel this way, but you have a kid who is stricken with leukemia the second time, and see how that makes you feel about whatever else in life you may have accomplished). I knew the boy loved the process of acquisition, and offered to let him pick out the new guitar to give him something better to do than mull over his fate. His acquisitive soul settled on the most expensive guitar he figured he could get away with wanting (of course). He knew how bad I felt about the leukemia coming back. His calculus was spot on. I bought the expensive guitar for him as penance for my sins, though I doubt it worked, as I still felt as guilty as ever, and for what exactly, I didn’t really know. Five years later, and the guitar has practically never been taken out of its also-quite-expensive case. Yet it is practically identical to the one he says he now wants for playing with the praise band.

I know my son. It’s the getting that keeps him going, not the having. At least while he was shopping for the guitar, it managed to focus his mind on things other than his leukemia. And with this latest acquisitive obsession, it is the getting, not the prospect of having, that is driving him forward. He’s always happiest when he’s figuring a way to get something. Cleaning out his room lately in preparation for a pending move, the reality of his acquisitive impulses was hammered home through the outsized garbage bags overflowing with the products of his successful forays that had to be carted out, first for a garage sale and later for the dump. He feverishly acquired according to whatever fad caught his fancy, so there are collections of Pokemon cards, Beanie Babies, Legos and much more.

My son is not unique in what might seem a quirk of human nature—that happiness is not to be found in the achievement or the acquisition, but in the yearning. Most people have an idea in their mind of some idyllic future place where they will finally find peaceful bliss.   But the human psyche is not designed for peaceful bliss. It is designed to impel us forward, to keep us continually striving, no matter how much is achieved or acquired. If it needs to dangle the carrot of peaceful bliss in front of our noses to keep us striving, that’s exactly what it will do. If it needs to inculcate the idea of an eternal life of happiness in heaven as our reward for continually fighting a battle against entropy that we know we will ultimately lose here on earth, then it will do that, too. The human mind was created to keep us alive long enough that the genes we carry make it into the next generation. It will allow brief interludes of pleasant feelings we sometimes call happiness, as the reward for achievement of some goal or objective, but it has no interest in allowing any particular achievement to quell the ceaseless striving that it evolved as a forager to believe was necessary for survival.

There is no off switch to the yearning impulse, just a pause button. Apple Computer figured this out long ago, and cynically exploits it with each new product launch or upgrade. And to be fair to Apple, the whole of capitalist endeavors directed at the consuming public depend on this principle of continuous striving. In every consumer market, from car companies, to homebuilders, to that fancy new restaurant down the street that serves hardly identifiable yet lyrically-described food, capitalists exploit the yearning impulse.

What humans in fully-developed, wealthy societies yearn for the most these days is status (survival being already more or less secure). The yearning for status sells cars, boats, motorcycles, cell phones, kitchen appliances, paintings, spirits (wine and beer and vodka and tequila and bourbon and scotch), televisions, vacations, jewelry, clothes (especially women’s clothes and shoes), the services of manicurists and tanning salons, etc., ad nauseam. Two-thirds of the economy is consumer spending. Probably two-thirds or more of consumer spending nowadays goes towards achieving or protecting status. Even when a consumer good is purchased for its utility, more, sometimes vastly more, is paid for the product if it has a status-enhancing quality (e.g., Apple’s cell phones versus Nokia’s). The only economic sectors not overly concerned with selling status are producers of commoditized goods and services that are not sold in end markets. Big agriculture, mining, transportation and delivery—these are not goods which can be directly employed to enhance a consumer’s status among consumers. They therefore sport some of the lowest profit margins in the economy. A bushel of corn or a barrel of oil doesn’t fetch a premium for how its ornamental display around the neck or on the feet might enhance the wearer’s status. It’s just a bushel of corn or a barrel of oil.

There are really two (and perhaps more) sides to the question of whether it is possible to buy some happiness with the purchase of an Apple Watch. Buying the watch to satisfy the yearning impulse as I’ve been describing might buy a transitory period of contentedness that will fade something like gravity, by the inverse square of the distance, the further away in time the purchase was. So from the perspective of the yearning impulse, the answer is “yes”, qualified by the notion that the happiness will be quite fleeting, after which another yearning will arise in roughly the same mental space to take the place the Apple Watch previously occupied. Don’t worry though, Apple is well aware of the inverse square rule of contentedness, and will time its improved Apple Watch for release at about the time the contentedness from buying the first one will have just fizzled out.

But is there a deeper, less transitory happiness that might be achieved through owning an Apple Watch? It seems unlikely. Several reviewers of the watch pointed out that it is intended to operate as something of an extension of the iPhone, so that the user finds liberation from their cell phone by strapping an Apple Watch around their wrist. This is a dubious claim. First, the watch has much too small a screen for anyone who isn’t an avian predator to be able to read and manipulate it on a routine basis. Second, the watch will likely do exactly the opposite of liberating, tethering its owners ever more tightly to their electronic devices (surely Apple’s intended outcome), as the Apple Watch doesn’t do much except in tandem with the iPhone. Where before there was one electronic device, now there will be two.

And it is a readily observable truth of human nature that people who aren’t interested in change don’t change. The Apple Watch won’t simplify the life of anyone who doesn’t want their life simplified; for people who are “addicted to their iPhone,” as one reviewer described himself while raving over the watch’s simplifying potential, the watch will most likely simply substitute one addiction for another.

In short, except for a very brief period just after purchase that quickly fades away, buying an Apple Watch won’t buy happiness.

Which begs the question: Is there any material thing whose purchase or acquisition can bring happiness? Economists know that people get less unhappy as their income increases past that required for subsistence, but only to a point (which was, a few years ago, deemed to be about $40,000 per year for a family of four). Achieve subsistence and just a bit more, and you will have eliminated as much unhappiness from your life as is possible. Eliminating unhappiness is requisite to enjoying happiness, and there is some happiness to be achieved just by eliminating things that make us unhappy, but again, the happiness thus achieved is transitory. More income (past subsistence and then some) can yield even a bit more happiness, but the law of diminishing marginal returns makes income-generated happiness like heroin. People who become addicted to ever-increasing income levels find that it takes bigger and bigger increases to get the same high.

Ignoring the status enhancement that comes with having the latest technologies, does owning any labor-saving and/or communications devices hold any promise of happiness through increased efficiency, i.e., through less expenditure of effort to achieve the same ends? Remember my striving son? It wasn’t the guitar that he was really after, it was the striving to acquire the guitar that he really sought (even if he kidded himself otherwise—a necessary bit of self-deception if the objective is to be achieved). How much happier might he have been had his objective been harder to achieve?

And so it goes with so many of the labor and time saving devices that modern man considers necessities. The human body has been magnificently constructed to spend its days searching for food, even when it is not hungry, because hunger was never more than a few days away no matter how successful the latest hunt. The yearning impulse never abates; it is only sometimes paused. But with his belly full and his mind barely taxed, modern man has little for which to yearn. Over ten thousand years of technological development, from irrigated agriculture in the Tigris and Euphrates River valleys to chatting with strangers across the globe in real time, as life got easier, achieving happiness got harder. We spend our lives thinking we will find happiness in the satisfied glow of achievement or acquisition, but have been tricked by our neural hardware. What really makes us happy is the striving.

At about $400 a pop for the cheapest Apple Watch, acquiring one won’t be so challenging to most people in the West as to provide much happiness in the striving. And as Apple well knows, the contented glow of ownership quickly fades.

So no, buying an Apple Watch won’t much make you happy. In fact, you weren’t made to achieve happiness. You were made to strive for happiness, because that kept you striving to survive and propagate. But once you have overcome every challenge, accomplished every goal, achieved every objective that you thought stood between you and happiness (I’ve finally got an Apple Watch!), you still won’t have found, as the U2 song goes, “…what you’re looking for.” Because you had what you were looking for when you were looking for it. And what you were looking for was lost as soon as you thought you found it.

The Apple Watch: Reviewing the Reviews

No, I have not been provided an advance copy of the Apple Watch. I’m not a pseudo-publicist, i.e., a tech journalist, whom Apple graced with an advance copy. I regret to admit that I can’t transmit fresh, first-hand information about its delights to the huddled masses of Applytes (my word for Apple acolytes) who have been patiently waiting, waiting, waiting, since the death of their patron saint Steve Jobs, for Apple to develop a new product. The thin gruel of having to rely on iPhone updates and iPad tweaks has left them thirsting for anything new from Apple. They’re now to the point of needing a splashy new product launch in a new category in order that they might affirm their status as pioneers in the consumption of hip new consumer electronics. The Apple Watch, as utterly unnecessary and productivity impeding as it surely will be (features, not bugs, in the minds of the Applytes), will doubtlessly slake their thirst, and the angst it generated, at least for a while.

Apple is a so-so technology company but genius when it comes to understanding the wetware of human beings.  Like a coy virgin teasing her bridegroom with glimpses of flesh before the wedding day, after the announcement heralding the watch’s arrival a few months ago, Apple revealed only enough of the watch’s details to pique the public’s interest and keep it continually smoldering just on the edge of public consciousness. Now that the launch is only weeks away (April 24th, according to Apple), it has bestowed upon a select few technology journalists the great honor of providing them an advance copy of the watch in order that they might issue glowing reports on the watch to the wistful, longing public. Ostensibly, of course, their reports should be paragons of journalistic objectivity, because that’s how journalism works, no?   But take a moment to consider who picks the journalists and how important to a technology journalist’s career it is that they be among those picked by Apple to review a new Apple product. Apple, after all, is the most valuable company, tech or otherwise, the world has ever seen, which says a lot about the frivolities of the age, but not so much about the historical importance of Apple. It’s hard to be historically important in a mainly frivolous and irrelevant age, except perhaps as a good example of the tenor of the times in question.

But given the journalistic circumstances, what are the odds there will be any bad reviews? It’s as if Apple hired its own official parade observers to ensure all parade goers are on board in admiring the naked emperor’s lavish clothing.

As a public service aimed at reading between the journalistic lines (writers sometimes have subtle means of revealing their true inclinations) I will review three reviews, those from the Wall Street Journal, The New York Times and Bloomberg, my three main sources of daily news. I figure these three are good, general purpose news organizations that have reasonably competent tech writers. Let’s start with the Wall Street Journal.

The headline of the Wall Street Journal’s review, written by Joanna Stern, tells it all:

What the Apple Watch Does Best: Make You Look Good.

And then, to drive the point home, the subheading:

Apple isn’t just selling some wrist-worn computer, it’s selling good looks and coolness, too.

Well, what more do you need to know?   Especially if good looks and coolness matter to you. Do you seriously expect me to believe that good looks and coolness don’t matter to you? Of course, though the headline implies the watch is good looking and cool, it doesn’t explicitly say so; it just says that’s what Apple is selling. Not to worry, Stern raves about how the artifact appears on her wrist, and the article is peppered throughout with pictures of the artifact on, presumably, her wrist.

Personally, I think the watch looks about as good as an early-days Casio, one of those boxy whiz-bang electronic gizmos that came out somewhere around the early eighties (if the mists of time haven’t too terribly occluded my memory). Casio watches were for geeky types who hung out at Radio Shack on the weekends. And a bit for greasy disco wannabes in Members Only jackets (remember those?). I haven’t a clue as to how Casio managed to appeal to both of those disparate groups.  The Apple Watch is boxy and electronic-looking, nothing like a classy Rolex or even Seiko or Timex (the ones with a traditional face). But Stern raves:

Like many Apple products of the past decade, the watch is a status symbol, a sign of wealth and taste. But unlike a MacBook or an iPhone, this Apple product works to help you look—and feel—good.

I sought a simpler experience, turning it into a stylish watch to keep me on schedule and a workout companion to keep me moving.

I know what you’re thinking: Can’t I just buy a $150 fitness tracker for that? Sure, but it might end up in a drawer. The Apple Watch succeeds where the fitness trackers have failed. Not only does it provide more accurate data and a platform with big promise, but it’s an accessory I love to wear all day long.

And then we find, buried in the adulations, a few drawbacks, even when using the watch as a very expensive fitness tracker:

Ideally, the watch would automatically kill off notifications during workouts so your arm doesn’t vibrate so much; in reality, you need to put it on Do Not Disturb mode, which requires too much futzing. Even getting to the Exercise app is a challenge, it being one of many tiny circular icons on the watch’s app screen that makes me wish my fingers were the size of toothpicks.

There are other frustrations: Why is there an Exercise app on the watch, but the data lives in the iPhone’s Activity app? Why must I click “save” to keep a record of a workout? And why can’t the watch’s battery make it past 10 p.m. on days that I exercise?

Fingers the size of toothpicks? Batteries that won’t make it past 10:00 pm? Too much futzing required to shut the damn thing off so that you can do a workout in peace?

My, my. The emperor’s fleshy thighs seem to be peeking through, and perhaps his pasty hindquarters, too. Incidentally, in case you weren’t aware, people have been doing workouts for centuries without the need to have up-to-the-second feedback on the beats per minute of the heart or their respiratory rate or their caloric burn. If you think all that nonsense is required, then you are a) seriously narcissistic; b) obsessive-compulsive; c) both; or d) less interested in getting a workout than you are in flaunting a hip new electronic gadget. But then, in this age of selfies and social media, you are most likely all of the above.

By the end of Ms. Stern’s review, her true feelings, quite contrary to those which were headlined, managed somehow to slip the editor’s hatchet:

But the prompts to stand up every hour got downright annoying. I don’t stand enough, I know, but I don’t plan to change that in the middle of a meeting, or after I’ve burned 300 calories at SoulCycle. (I did leap out of my seat…when I found out how to turn the stupid prompts off.)

In the end, she advises that people shouldn’t buy the watch, because it’s not as good as it inevitably will be, a sentiment playing to the hearts of those Applytes, if in a backhanded way, as Apple represents for Applytes the idea that human progress unfailingly marches ever upward and onward. She knows the Applytes are gonna buy the watch no matter what she advises, and is just hedging her bets with everyone else in case it turns out to be a marketplace dud.

Over on Bloomberg, Joshua Topolsky’s review starts out with a headline that’s about as ambiguous and confused as the Wall Street Journal’s was celebratory:

Apple Watch Review: You’ll Want One, But You Don’t Need One: The Company has succeeded in making the world’s best smartwatch.

This begs the question: If the Company has succeeded in making the world’s best smartwatch, then how could it be that there is anyone alive who doesn’t need one? Unless perhaps the headline is an underhanded compliment, something like pronouncing that GM had built the world’s best moonbeam-powered car when nobody really wants a moonbeam-powered car and wouldn’t know what to do with a moonbeam-powered car (especially during the day and when the moon is new) if they had one.

And it isn’t clear exactly what the headline means by “you’ll want one, but you don’t need one”. Is an Apple Watch like a Krispy Kreme doughnut? Is it somehow bad for you, in a delicious sort of way?

Topolsky’s review was a give and take, replete with praise for Apple and the Apple Watch at one moment, while decorously pointing out the watch’s defects in the next. For example, he observes that the watch keeps impeccable time, as it is true to something called Coordinated Universal Time within 50 milliseconds, a feature which can be really cool (that notion again). If you put Mickey Mouse faces (a software option) on all the watches your group of friends own, together you can watch all those legs on Mickey synchronously tap away the seconds of your lives that are passing you by while you marvel at the novelty. But, Topolsky points out, if you don’t tap the screen to get the time, which itself can be aggravating in its uncertain effectiveness, the only way you can find out the time is by an exaggeratedly violent movement of the arm that is likely, even more so than is normally the case when checking the time in the presence of others, to rudely scream out that you really wish you could be shut of whoever is in your presence. Thus the Apple Watch has the potential for allowing someone to do more expressively, and by only using its watch features, that for which smart phones have always been useful—shunning the people around you.

Topolsky buried his headline, but not very deeply, putting it in the very first paragraph:

I’m in a meeting with 14 people, in mid-sentence, when I feel a tap-tap-tap on my wrist. I stop talking, tilt my head, and whip my arm aggressively into view to see the source of the agitation. A second later, the small screen on my new Apple Watch beams to life with a very important message for me: Twitter has suggestions for people I should follow. A version of this happens dozens of times throughout the day—for messages, e-mails, activity achievements, tweets, and so much more. Wait a second. Isn’t the promise of the Apple Watch to help me stay in the moment, focused on the people around me and undisturbed by the mesmerizing void of my iPhone? So why do I suddenly feel so distracted?

The cell phone industry, and particularly Apple, as the biggest and baddest of all cell phone providers, abolished the utility of watch wearing, perhaps intentionally, perhaps incidentally, by making cell phones (all of which tell time) necessary accouterments, condensing the function of watches to jewelry that might confer status if made by the right company. For people who don’t care about the status-conferring potential of jewelry watches (like me) and who therefore abandoned watch wearing, since they have to carry around a clunky clock all the time anyway (like me), Apple now must coax them into believing that wearing a watch AND carrying a cell phone is the way to be, because, as Topolsky observes, the watch isn’t really smart—it’s just quite intricately connected to the smart phone that is. That’s gonna be a hard sell. If I can’t wear a watch to replace the clunky, annoying cell phone in my pocket, then what’s the point? Oh, yeah, I forgot—it’s cool (Topolsky) and makes me look good (Stern).

Topolsky ends by heaping the device with praise, while explaining that he really doesn’t want one:

So Apple has succeeded in its first big task with its watch. It made something that lives up to the company’s reputation as an innovator and raised the bar for a whole new class of devices. Its second task—making me feel that I need this thing on my wrist every day—well, I’m not quite sure it’s there yet. It’s still another screen, another distraction, another way to disconnect, as much as it is the opposite. The Apple Watch is cool, it’s beautiful, it’s powerful, and it’s easy to use. But it’s not essential. Not yet.

The New York Times headline of its review by Farhad Manjoo was also something of a backhanded compliment:

Apple Watch Review: Bliss, but only after a steep learning curve.

Manjoo is far and away the most awestruck of the Apple Watch’s reviewers, but then there may be more than just a correlative relationship (i.e., there might be a causal connection) between his opinion on the watch’s potential and his self-admitted addiction to his iPhone. Anyone who could be described as “addicted” to an existing consumer electronics artifact is perhaps a poor choice for reviewing a new consumer electronics artifact, especially when the two artifacts are made by the same company and are intended to be used synchronously. To Manjoo, the watch is finally a seamless extension of his mind and body. It even opened the door to his Starwood hotel room, using a door key app. And it served as his boarding pass. And it bought him groceries (Stern used the watch to buy an iced latte, because, would anyone writing a review of an Apple Watch buy anything but an iced latte with it the first time, after, of course, their SoulCycle session?).

Manjoo may have loved the watch, and may have quickly become its dutiful slave, but judging from the comment boards accompanying the articles on both the Wall Street Journal and New York Times websites, not much of anyone else did. My favorite was from “Bob,” a New Jersey patent attorney:

I would like to see a review from someone who does not regularly get comments about his “addiction to [his] smartphone,” and whose wife isn’t pleasantly surprised that he “seem[s] to be getting lost in [his] phone less than in the past.” It is hardly surprising that a guy who can’t control his impulse to see every update is smitten by a device that allows him to deepen that dependency another level. But how about his wife (and kids, assuming he has them)? How about someone who does not see an addiction to a smartphone as something to make light of? We keep falling all over ourselves to find reasons to “love” these technologies, and seemingly never step back and give them an honest assessment. I mean seriously — opening a hotel door? Paying for things? Presenting a boarding pass? Since when did we need a solution for these things? Who ever had a problem with key cards, credit cards, or a paper boarding pass? I guess you can tell my personal bent on this subject, but let me dispel any notions that it is due to my age or background. I’m 34 years old. I’m an engineering graduate and a patent attorney. My day-to-day life is all about technology, and yet I still don’t understand why people are so tickled by it. Even more so, I don’t understand how people can call these things revolutionary. One of my colleagues has had an Android watch for a while now. You know what [he] does with it? He gets distracted during lunch while the rest of [us] talk face to face.

It is a rare event for me that someone else captures my sentiments exactly, but Bob about says it all. Indeed. When will we ever step back and give these devices an honest assessment? Have social media and smart phones done anything but complicate our lives? How is society so much better now that we can stay constantly connected to some ephemeral idea of it through a four by six, rectangular, half-inch-thick artifact? How will being able to stay connected to it through a small, square, thin piece of metal strapped to our wrist make things any better? Have we finally reached the endpoint of technologies that have the potential for improving the human condition?

Judging by the comments, it may well be that the almost instinctive belief in the value of technological advancement is finally fading. Technological innovation has only ever provided bare improvements, and only sometimes, to the human condition; it could be that the Apple Watch launch will reveal that a majority of people have become innovation skeptics, rather than believers. Or it may be that the Apple Watch is just a lousy, useless innovation, and people will see it as such. Or, it may be that all seven or so billion souls on the planet (minus one—me) will be sporting a new wristwatch by the end of the decade.

As you might have by now figured out, I have no desire to purchase, or even test, an Apple Watch. I only have an iPhone because I got my daughter’s hand-me-down, and I don’t use it to connect to the internet. I don’t do Facebook or any other social media. I’m not a Luddite; I just don’t see the benefit to everyone knowing every little thing I do, or to me knowing every little thing everyone else does, and so don’t care to be that confusedly, confoundedly and constantly connected to the world. Although it’s hard to tell, as I refuse to Tweet a question in 140 characters for an instantaneous response, or garner a thousand Facebook friends to gauge opinion, I’ve got the feeling that fewer and fewer people are enamored of social media, and that growing numbers (like me) downright loathe it, so that the market for a device worn on the wrist that practically demands continuous social media connection might be a tad weak. Time will tell, even if an Apple Watch can’t very usefully tell time.

Any thoughts you might have on the subject are welcome.

Executive Summary, March 29-April 4, 2015

It’s another holiday ritual week (doesn’t it seem like yesterday that we were in the midst of holiday jolly with Christmas and New Year’s?). It is Holy Week for Christians and Passover for the Jews. Sunday the 5th is Easter, and Friday (today), the 3rd, is the first day of Passover (which lasts until the 11th). It also happens that the daughter’s high school had spring break this week. So this is a week abounding with religious and pagan ritual and mythologies.

According to the Christian catechism, last Sunday, Christ entered Jerusalem on the back of a donkey; the crowds lined the streets, waving palm fronds to show their love and appreciation for him. The Christians celebrate the day as Palm Sunday. As Christ was neither a Pharisee nor a Sadducee (the rabbis and keepers of the Jewish law) and taught an alternative way to the legalism of the elders, this adulation was most upsetting to them. By Maundy Thursday (meaning the day of commandment, which refers to the command that Christ issued to the disciples at the Last Supper to love one another), Christ knew the jig was up. The Pharisees and Sadducees had turned the people against him. According to the gospels, Christ repeatedly predicted he would die and then rise from the dead. The next day, Good Friday, it finally came to pass that he was crucified and died. On Sunday, he was resurrected from the dead.

The day wasn’t always known as “Good” Friday. Until the Middle Ages, it was known as Black Friday, because it was the day that Christ died, and Fridays were anyway considered particularly unlucky. And, of course, no one really knows what day of the week Christ was crucified, but for him to rise from the dead on the third day (Sunday), he had to die on Friday, if Friday is counted as one of the days. And the Christians desperately wanted Christ to rise on Sunday, instead of his resurrection happening on, for example, Saturday, which is the Jewish Sabbath. Christ was a Jew, but the early Christian church preferred to downplay that aspect of his heritage in order to broaden his appeal, sort of like Barack Obama is fully half-white, but only when it garners more votes to acknowledge as much.

Tonight is the first night of Passover for the Jews, when the first, and perhaps only (depending on the sect of Judaism), Passover Seder is held. The Seder, meaning literally “order or arrangement,” is a ritual retelling of the Exodus story that God commanded of the Hebrews after having forced the Pharaoh to release them from captivity. The name Passover refers to an episode steeped in blood, when the Lord passed over Hebrew households if they had the blood of a yearling lamb smeared over the sides and top of their doorframe, on his way to killing the first-born child and animal of every Egyptian household. We would today consider this genocide.

Passover lasts until April 11th this year. The first two days—from sundown today until sundown two days hence, are, for observant Jews, full-fledged non-working days. So the timing of the holiday this year is fortuitous, as it starts on a weekend, and one in which not much work happens anyway. For seven days, Jews are to eat only unleavened bread with their meals, thus Passover can also be called, as it is in the Bible, The Feast of Unleavened Bread.

The first day of Passover doesn’t always coincide with Good Friday, and the period of celebration doesn’t always (at least partially) coincide with the Christian Holy Week culminating in Easter Sunday. It just happens that it does this year. But it is not happenstance that the two Judeo-Christian holidays take place at roughly the same time of year. They come at it from different calendars, but for the Christians anyway, arrive at about the same place purposely. The Christians wanted to distinguish their theology from that of the Jews (and Pagans), but keep it similar enough that it would be attractive for individuals of either group to join as followers and believers. The same could be said of Christmas, at least with regard to the Pagan holidays around the winter solstice.

Spring break, that most Pagan of education-calendar rituals, coincides, like Easter and Passover, with the beginning of spring and the rebirth of the world under a warming sun. It’s fitting that all the holidays overlap this year. They all derive from the same place in the temperate-climate mammalian heart that rejoices at the warmth and reawakening that come with the lengthening days and more direct sunlight of spring. But on the education calendar, spring break originally had as its justification the need for children to help out on the farm to get the fields ready for planting. Very obviously, that is a quaint bygone. In these easy, post-modern times, its celebration recalls the Roman Bacchic festivals of boisterous and riotous revelry and drunkenness. Towns along Northwest Florida’s Emerald Coast have come to loathe as much as love the Bacchanalian spring break rituals. They sell their souls for the money and raucousness spring breakers bring.

In economic news, the big number out today—the March payroll report—was a disappointment. From the Bureau of Labor Statistics:

Nonfarm payroll employment increased by 126,000 in March, and the unemployment rate was unchanged at 5.5 percent. Employment continued to trend up in professional and business services, health care, and retail trade. Job losses continued in …

Incorporating the revisions for January and February, which reduced nonfarm employment by 69,000, monthly job gains have averaged 197,000 over the past 3 months. In the 12 months prior to March, employment growth averaged 269,000 per month.
This continues a downward trend that began after the blowout numbers of November of last year, when payrolls grew at a monthly pace of over 400,000.

Here’s what I said about the developing trend in February:

The 257,000 number [in January] is a decline from November and December’s numbers, which were 423,000 and 329,000, respectively, after revisions.   Though it would never be spun in such a manner on the long-only news outlets (Bloomberg, The Wall Street Journal, i.e., basically any mainstream business media), the numbers point to a declining level of gains. From November’s 423,000 to December’s 329,000 is a drop of 94,000, or about a 22% reduction in monthly gains. From December’s 329,000 to January’s 257,000 is a drop of 72,000, also a 22% reduction in additional jobs.   If the trend of dropping about 22% each month continues, the gains to employment in February will barely tip 200,000, which is a perfectly meaningless observation to make, except that it does a fine job of helping flip the idea that January’s numbers were great on its ear. Employment gains will turn negative by the end of the year if current trends continue.

In February the trend reversed (temporarily?), and payroll gains increased from January’s 257,000 to 295,000. With the numbers declining to only 126,000 last month, the downward trend has returned and accelerated, dropping 57% in a month. From the details of the report, most of the declining growth can be attributed to the mining industry (oil and gas extraction and services pertaining thereto), which lost jobs again this month. Most other industries saw declining growth, but none actually lost jobs.
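The month-over-month arithmetic behind these percentages is easy to verify. A minimal sketch (the payroll figures are the ones cited above; the helper function is my own illustration, not anything published by the BLS):

```python
# Monthly nonfarm payroll gains cited in the text, in thousands,
# November 2014 through March 2015 (after revisions).
gains = {"Nov": 423, "Dec": 329, "Jan": 257, "Feb": 295, "Mar": 126}

def pct_drop(prev, curr):
    """Percent decline from one month's payroll gain to the next."""
    return round(100 * (prev - curr) / prev)

# Nov -> Dec and Dec -> Jan each work out to about a 22% reduction,
# while Feb -> Mar is the accelerated 57% drop noted above.
print(pct_drop(gains["Nov"], gains["Dec"]))
print(pct_drop(gains["Dec"], gains["Jan"]))
print(pct_drop(gains["Feb"], gains["Mar"]))
```

Extending the earlier 22%-per-month extrapolation one more step (257 × 0.78 ≈ 200) is what produced the February guess of “barely tip 200,000” quoted above.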

What does all this mean? It could be that the cyclical stage of this expansion peaked around the end of 2014. Quite a few other economic metrics are coming in weak, including last month’s automobile industry sales volumes (a big economic bellwether), which had been on a tear. It could be that this is just a hiccup. Or, it could be the start of something more ominous. No one would have predicted in 2007 that a few defaults in subprime residential real estate mortgages would kick off an economic and financial conflagration for the ages. Time will tell.

The only other newsworthy event of this week is the putative nuclear arms deal the US has struck with Iran. The US is gravely concerned that Iran might get the bomb because Iran would surely commence dropping bombs on US cities just for the hell of it if it were so endowed. It’s not really clear whether Iran wants the bomb, or just wants to negotiate concessions for agreeing not to build one. Iran is rapidly expanding, with the US’s help, its imperial reach in the Middle East. It can now add Iraq and Syria to areas under its hegemonic purview, and the US is helping it defeat ISIS in the Levant, which will expand its influence even further. In Yemen, it has so rattled Saudi Arabia by its support of the Houthi rebels who recently overthrew the Yemeni government that the Saudis commenced bombing Yemen, but with plain old conventional bombs.

The first thing to know about the Iran nuclear deal is that it is tentative. It is to be drafted in writing and signed by June 30th, an eternity when it comes to international relations. Under the agreement, Iran is to give up two-thirds of the weapons-grade uranium it now has, convert a heavy-water reactor so that it is incapable of making plutonium, and submit to inspections of some sort for fifteen years. It would take an expert in nuclear technology who was also expert in international relations to know whether the agreement will actually work to limit Iran’s nuclear ambitions. In return for agreeing to the deal, the US and other world powers agreed to lift economic sanctions, allowing Iran to again sell oil directly on the world markets, which ought to help those US employment numbers even more. What’s not to love about more oil for an already glutted market?

Nobody likes this deal, except President Obama, who championed its negotiation. He apparently sees it as a potential legacy maker for him. But a nuclear-armed Iran is really not any more dangerous to American interests than is a non-nuclear Iran. And, ironically, with US help, plain old conventional-bomb Iran has expanded its sphere of influence dramatically since Obama came to office. It may be up to Hillary to beat back the Persian hordes.

The party that most loathes this deal is, surprise, Benjamin Netanyahu, the Prime Minister of Israel. But the US does not exist to do Israel’s defensive bidding. Israel is not the 51st US state. Israel is an independent sovereign that is ever and always out for what every other independent sovereign is out for—its own interests. An Iran that retains enough bomb-making material to cobble one together in a year, as this deal provides, is no nuclear threat to the US at all. And not legitimately much of one to Israel, either, but the Israelis have grown to believe that they among all the peoples and nations of the world deserve to be allowed to proactively eliminate any and every possible threat they face. Israel has the bomb, but no one hears Lebanon caterwauling about its compromised security as a result.

Regardless of what else happens between the US and Iran and the whole Middle East situation, I sense that the US is heading to war, ultimately with Iran. I hear it from the right and the left, both of which worship heroes that we created in Iraq for no other real purpose than that we needed heroes to validate our collective spirit. The US really gets confused about its reason for being unless it is embroiled in some conflict abroad, whereby it is inevitably “defending freedom”, or is engaged in some rights-expanding exercise at home (gay marriage, e.g.). The US depends on some notion of progress continually progressing, else the whole thing collapses in on itself. That’s what really makes it “exceptional”, as the neocons like to say. Most other nations are content to just survive and thrive. Not so, the US. It must survive, thrive and conquer. So it relentlessly searches for things to conquer, as surviving and thriving have proved fairly easy.

In the meantime, Happy Easter, Passover, Feast of Unleavened Bread or Spring Break, depending on your religious and secular affiliations!

Book Review: Philosophy in the Flesh—The Embodied Mind and its Challenge to Western Thought, by George Lakoff and Mark Johnson (1999)

Being over fifteen years old, and intricately involved with explaining the philosophical impacts of discoveries about the mind in the field of cognitive science, this book may seem a bit dated. But not really. Its fundamental premise, that there is no mind or soul or spirit or consciousness without a body, and that the body determines the mind, not the other way ’round, is hardly original to 20th century psycholinguists and cognitive scientists.

Lakoff and Johnson never once mention him in elucidating the embodied mind and its challenge to Western thought, but Baruch de Spinoza (incidentally, a contemporary of Descartes, he of the disembodied mind and Cartesian duality, which Lakoff and Johnson spend hundreds of pages and gallons of ink debunking) came up with this idea of an embodied mind roughly 350 years ago. In the Scholium to Proposition Thirteen of Part II of Ethics, his magnum opus completed in 1675, Spinoza very clearly explains that there is no mind independent of the body:

From the above [i.e., Proposition Thirteen, which states, ‘The object of the idea constituting the human mind is the body—i.e., a definite mode of extension actually existing and nothing else.’] we understand not only that the human Mind is united to the Body but also what is to be understood by the union of Mind and Body. But nobody can understand this union adequately or distinctly until he first gains adequate knowledge of the nature of our body.

Without making this a treatise on Spinoza, suffice it to say that Lakoff and Johnson’s radical “challenge” to Western thought is about 350 years too late. And what’s more, it’s not as good or as powerful as Spinoza’s, because Spinoza had to overcome beliefs demanded by the culture of his time, when refusing to believe in the duality could cost a heretic as painful a death as the Inquisitors could muster.

Lakoff and Johnson set up a straw man with the Cartesian idea of duality and then pretended as if they were the first to ever knock it down. They weren’t. There is nothing at all radical about the idea that the mind is part of the body. It is instead a radical notion to believe that it isn’t (hence the necessity for official sanction and public persecution and punishment for those heretics who refused to believe the two are separate). Just as it is a radical notion to believe that one certain living human was God in the flesh who died a human death but was three days later revived in the flesh. The reward for believing these radical notions, according to the Christian catechism, is that one’s disembodied spirit will live in heaven with God until it is reunited with a reconstituted body after the end days, after which it will enjoy eternal life in heavenly bliss with God.

The Christian necessity for a disembodied spirit or soul to depart for heaven to await the end days with God is what drove the silliness of the Cartesians, who went so far as to identify where the soul or spirit resided in the body in life: the pineal gland. That is a bit ironic, because we today know the pineal gland to be intricately involved with reproduction, which is the actual means to whatever trifle of eternity, or continuation in time and space, is possible for human beings and all other creatures, rather than some disembodied soul that floats away to heaven at death. So Descartes was at least in the ballpark, if unwittingly, when he located our potential for eternal life in the pineal gland.

Lakoff is known for believing that the mind thinks in metaphors, attributing the use of different metaphors to, for example, different political beliefs. He explains that liberals and conservatives are distinguished by the metaphors they use to describe life in political society. Both analogize political governance to metaphors for the family. For conservatives, theirs is a hierarchical Father Knows Best metaphor (Lakoff and Johnson capitalize every positively identified metaphor, as if some great principle were involved in explicating them) in which it is the duty of a strict father (the government) to raise the children (the people) to be responsible adults who should need little supervision, care or help once having proved their responsibility and maturity. For liberals, governance fits the Nurturant Parent metaphor of the family, where the mother and father work to keep the essentially good children free from harm, protected from the potentially corrupting and harmful influences of pollution, poverty, injustice, etc.

There is precious little doubt that humans think in metaphor. Metaphor and analogy are the means by which the mind categorizes the world in order to make readily accessible and intelligible sense of it. Language is, as Guy Deutscher points out in The Unfolding of Language, built on a reef of dead metaphors. Words start out describing very physical, fully-embodied concepts (to “go” to the barn, an example of the original meaning of “go”, literally means to move one’s physical presence from its present location to the barn). They slowly become metaphorized (my word) to describe all sorts of abstract things (e.g., “I am going to think of an idea for his birthday party”, using a variant of go that doesn’t describe movement of any physical object, but movement of the will through time). So, Lakoff is undeniably correct that humans think in metaphors, at least so far as their thinking is done through language. There is a vast amount of thinking, however, that takes place outside of conscious, linguistic purview (“outside” in this sentence is an example of a word used as a metaphor derived from its original meaning of being physically located somewhere other than inside some sort of container, i.e., “out” of the “side”). It is not clear that metaphor animates subconscious thinking, especially since the subconscious thinking that dominates our mind is done closer to the non-metaphorical stuff that matters to our continued survival. For our subconscious thinking, it’s hard to imagine that the mid-morning hunger I’m feeling right now is anything more or less than just that. No metaphors are involved or necessary to categorize the way I feel. But in writing this essay, I am hungry to get at the truths, if any, that are to be found in Lakoff’s opinions. It’s not the same hunger as I feel for lunch, but it’s similar, and that’s the point. Without physically embodied concepts, it would be very difficult to describe our abstract thinking to others. We would still think abstractly, but communicating those thoughts would be much harder, if not impossible.

The problem with Lakoff’s argument that we think in physically-derived metaphor (actually, Lakoff didn’t go so far as to point out, as I do here, that the base of all metaphorical activity is physical embodiment) is that it explains nothing to say that liberals and conservatives use different metaphors to describe the process of governance. Of course they use different metaphors. People think using metaphor. If they used the same metaphor for views of governance that sit at either end of the political spectrum, their ability to understand and distinguish their views would be greatly diminished. But as Steven Pinker noted in The Language Instinct, liberalism and conservatism are heritable traits—the metaphors with which people are thinking about politics are selected more or less by the genes, not by political party affiliation. Yet even knowing that doesn’t explain much. What caused nature to select for genes that in some people yielded a nurturing, trusting and egalitarian perspective of what governance is about, but in others yielded a hierarchical and disciplinarian perspective? Nobody really knows, but that’s the only question worth asking. To answer it by saying some people use this metaphor or another when thinking through their political views is to end up where the inquiry began, without actually answering anything. (Incidentally, my guess is that the genetic differences arose because some people lived in environments that favored socially collective and egalitarian cultures, and some lived in environments that favored more individualistic and hierarchical cultures, which then raises the question—which environments?)

In the end, Philosophy in the Flesh comes across as an intricately (practically Rube Goldberg-esque, which is to say, poorly) argued rationalization for Lakoff and Johnson’s political impulses, which, to further detract from its value, is quite poorly written. Lakoff is something of the psycholinguist version of politico-economist Paul Krugman, who, when he uses economics at all, uses it as rationale for his left-wing political impulses rather than as a means of discovering truths about the material world. Except that Krugman writes well and clearly and Lakoff doesn’t. Lakoff’s psycholinguist contemporary, Steven Pinker, has reached very similar conclusions about the nature of the mind and body, and therefore at least about the epistemological aspects of philosophy in light of today’s cognitive science, but from a decidedly less political viewpoint (though it is imagined that most of Pinker’s theories find their greatest acceptance among those with right-wing proclivities). It seems Lakoff is to Steven Pinker what Paul Krugman is to Milton Friedman.

To conclude by returning to the first prominent philosopher of the modern (post-Renaissance) era to espouse the view that mind and body are inseparable: Baruch Spinoza also had this to say regarding the unity of mind and body, from the Scholium of Part III, Proposition Three, Ethics:

Now surely all these considerations go to show clearly that mental decision on the one hand, and the appetite and physical state of the body on the other hand, are simultaneous in nature; or rather, they are one and the same things which, when considered under the attribute of Thought [mind] and explicated through Thought, we call decision, and when considered under the attribute of Extension [body] and deduced from the laws of motion and rest, we call a physical state.

There really is not much in the way of new ideas in this age about the nature of human beings that wasn’t already deduced by philosophers long ago. Through cognitive science and modern medicine we have learnt a great deal about the intricacies of the human body and the mind inhabiting it, but the big picture remains by and large the same—body and mind exist as a unitary entity devoted to getting the genes for which they are the vessel into the next generation. There is no understanding of mankind unless he is understood as an animal that owes his existence, like all others, to evolution by natural selection (a theory which Spinoza adduced, a couple hundred years before Darwin, in setting out his argument for the nature of God in a beautiful essay comprising the Appendix to Part I of Ethics). Every bit of man, including his magnificently developed brain, has to be evaluated and examined with the clarity of evolutionary impulses in mind, or only confusion can arise.

So the central premise of Philosophy in the Flesh, the notion that mind and body are inextricably connected, is hardly revolutionary, except perhaps to a political shill trying to garner attention for the freshness of his old ideas. Don’t read Lakoff if you want to better understand the embodied mind and what it means for philosophy. Read Spinoza. Or, Pinker. And read Spinoza anyway. Even accounting for the empirical discoveries of modern cognitive science, he’s still the Michael Jordan of philosophers. Lakoff isn’t. Pinker maybe is the Scottie Pippen.

Goodbye, Homewood

It’s been seventeen long years now, on this corner of Roxbury and Huntington, a place that started as a practical compromise between a couple of places we had to be for work, that almost became a place where the heart could live, until the sentiment died aborning when the first kid turned seven and got sick for the first time. A couple of years of illness that felt like decades later, and it seemed more like a prison, a repository for bad memories, a place where the struggle to survive might well fail. Not a happy place. Definitely not a place to thrive, except in spite of everything. And what sort of thriving is that?

I worked hard after that. So hard. Every day to make money and every night and weekend to make the place a home, if not for me, then for everyone else who lived there, and for maybe some other someone one day who wouldn’t think the whole pitiable existence an albatross, a sign of good fortune that had somehow been destroyed. Then it came back. All that work at making it work, at pleasing the mad gods controlling our fates, proved for naught. The kid got sick again. What had I done that these people around me should deserve this fate, this suffering? The second leukemia was magnitudes worse than the first. I quit hoping. Quit caring. I blamed myself, but I couldn’t help what I’d done. Because I didn’t even know what it was.

And so now, it’s time. The ship has finally reached shore. The last kid is almost out of school. The “good” (code in Alabama for “majority white”) school system’s boot heel is slipping from my neck. The first kid, the leukemic kid, survived a second time, but barely, and left this place to thrive. And now I can leave this place and all those memories bouncing around these old plaster walls. There ain’t many good ones. Leaving won’t lose them, but at least a new place won’t be like a minefield, hair-triggered to set off memory bombs with every step.

Leaving means more than just leaving behind the sadness of an unfortunate life poorly lived. It means getting rid of the hassle. The place was old when it was new. Nothing worked right. The basement flooded every time it rained. The windows were all painted shut. They needed repainting every couple of years just to stay that way, a Panglossian circumstance, the best of all possible worlds. Water dripped from every faucet and drain. Sixty-year-old leaded paint chipped and peeled from every wall and window. The roof leaked. Water pooled in the yard. And it was all mine to maintain and repair. The house was an aging parent to care for, before Social Security and Medicare made the burden bearable. Things constantly broke, and then broke again soon after being repaired. Taking care of the dump on the corner of Roxbury and Huntington was nothing to taking care of a double transplant patient, but still, it weren’t fun. Not a bit.

The neighborhood wasn’t so much a neighborhood as a real estate investment club. People came and went in a blur; it seemed every last person in the greater Birmingham metropolitan area decided that living here would be the perfect thing, for a few years. They all rode the real estate tram to the top of the real estate investment hill until the flimsy infrastructure inevitably failed, leaving a whole carload of passengers stranded. For a few years after the crash, there was stability in the burg. Nobody could leave because nobody had the money to buy their way out. A few years later, it all roared back to life—the banal search for yield. Everyone became again a speculator, day-trading real estate lots. I quit bothering even trying anything more than passing cordiality with the ever-spinning roulette wheel of “neighbors”, more aptly described as bare-fanged venture capitalist wannabes, using other people’s money to fund their real estate fantasies, just like the real ones in Silicon Valley.

The house on the corner is a fishbowl. The three-sided visibility made our business everyone’s business. All the trials and tribulations of the seventeen years were lived out like reality theater for a bunch of people with their faces pressed hard against the glass, peering in, dispensing judgments in murmurs too faint to be clearly heard but too loud to completely ignore. Nothing was off conversational limits. From the new awning on the front porch to the reason our kid had cancer twice, there was nothing but cruel, condescending judgments whispered in the leafy subdivision about the house on the corner of Roxbury and Huntington and the people who lived there (the cancer was our fault, by the way, because it’s always the parents’ fault, because that’s just how the puny human mind works—it has to attribute nefarious effects to human agency in the absence of any real evidence, of which, for cancer, there is practically none).

The corner lot where the fishbowl sits is the neighborhood doggie poop park. The biggest, flattest, best yard in the neighborhood that I bought for my kids to play in hardly ever saw a child frolic or run or kick or swing a bat; it mainly just saw dogs crouching to shit. While the kids played video games or watched the Disney Channel or had bone marrow transplants, the neighborhood dogs had a field day. They were walked to the doggie poop park surrounding the fishbowl, or let free to run, so they could poop in a place their owners didn’t have to clean. The dog neighbors weren’t even worthy of the appellation “frenemies”. There weren’t anything friendly about them. The guy up the street with a Ron Paul sign in his front yard was the worst. He didn’t quite get that the libertarian freedom Paul espoused critically depended on strictly enforced property rights. Libertarians don’t let libertarians let their dogs shit in their yards, so I shouted him off the property, twice, and quite threateningly so the second time. He really has no idea how close he came that second time to grave bodily injury (I had a pistol stuffed in my shorts, just in case), but I didn’t grow up in a place where faux comity was used as a pretense to shit on someone else’s shoes, literally or figuratively.

A bigger fool agreed to buy my little slice of hell on the corner of Roxbury and Huntington, and for almost three times what I paid for it. So there may be some consolation to seventeen years living in a fishbowl on a corner lot used as a doggie poop park. The arc of the moral universe is long but it sometimes bends towards restitution, which isn’t quite justice, but is better than nothing.

And now, I’m moving to the country, to twenty-five acres on the top of Lookout Mountain that I bought just before the financial system collapsed the first time. I’m figuring on building a house right in the middle of the woods overlooking the pasture. It may well be that painted-shut windows are the best of all worlds in the world that I’m leaving. But I am going to a place where I must cultivate my garden.

After seventeen years indentured to a crappy school system, trapped in a fishbowl of a dump on a doggie-poop-park lot at the corner of Roxbury and Huntington, it’s finally time to go. Goodbye, Homewood.

And good riddance.
