Saturday, March 17, 2018


Chagall: The Sacrifice of Isaac, 1966


A man goes up a mountain. He is moved by what he believes. He sees the climb as necessary, as no way out but through. He brings his son, who watches; son who still hasn’t caught on. Son who has followed. Son who thinks one day he will inherit. Son who acts as if without brothers. Son who says yes to whatever is before him. Son who waits by the old rock, the low bush. Son who brought nothing but the rucksack he was given. Son who slightly moved beneath the knife. Son who saw the end of day as ecclesiastic, as blaze. Son who in time made all other sons listen to the story of the old man who got all the way up and who without looking back went over to the other side. Who disappeared as if searching for other sons. As if done. Son who walked in quiet and calm, having come back down, alone. Son for whom nothing was changed, was changed, and in the changing changed the world.

~ Sophie Cabot Black, “The Exchange”

Poets and writers take mythological stories and typically change them. This tendency to revisionism, including radical revisionism, is quite pronounced in modern literature. Here the poet follows the midrash tradition of filling in the details — at what point did Isaac figure out he was to be the sacrifice? Maybe even as Abraham was tying him down, he still thought this simply could not be: he would just be taught a new ritual. Cabot Black presents Isaac as completely trusting and obedient. At most, he “slightly moved beneath the knife.”

“Son who still hasn’t caught on. Son who thinks one day he will inherit.” How crafty the poet is in making Isaac real, his trust heart-breaking.

And the poem is all the more interesting because of all the repetitions of “son who.” Far from making the writing tedious, they make it more dramatic and unnerving. And the repetitions are also a musical element, crucial to making it poetry.

(Here is a prompt for the brave: try to tell a story using one or two words in precisely this manner. “Daughter who” is the first phrase that came to my mind.)

And then, Abraham does not return home with the boy and tell Sarah (according to one midrash, Sarah found out anyway, while Abraham was still climbing the mountain, and died of grief). He climbs down the other side of the mountain and disappears. Perhaps he can’t face Isaac, who in turn develops problems with his eyesight, eventually becoming blind. Again a midrash suggests that Isaac went blind when he saw his father’s knife over him. This is not the original story, but interpretations develop, even if it means changing the details.

How does Hebrew mythology differ from all other mythologies, even those from which it borrowed? It is more disturbing. The horror of the story of Abraham and Isaac can’t be shaken off. As the two climb the mountain, (Isaac carrying the wood; Christians love to point out that this prefigures Jesus carrying his own cross), the boy’s asking, “But where is the lamb?” is a stab of the knife in both Abraham’s and the reader’s heart.

And Cabot Black won’t even give us the relief of the original “happy ending.” There can be no father-son relationship between Abraham and Isaac after this. And yet the world is changed because our modern ethics developed in part in response to this story.

Not changed enough, it has been claimed. The sacrifice of Jephthah’s daughter happened afterwards — one wonders if some lucky loophole in Jephthah’s vow would have been found had it been a son. But worst of all, the spin that made Jesus the human sacrifice that was to be the vicarious collective “bloody ransom” for the sins of humanity going back to Adam and Eve was a strange setback in the evolution of religion away from barbarity (see “blood magic” in the article that follows).

Not that the nuns who told nine-year-olds “Every time you sin, you put a nail into the flesh of Jesus” saw anything wrong with this approach to teaching ethics to children. Your relationship with an invisible supernatural being counts so much more than your relationship with mere humans.

More important, it has been pointed out that fathers still send their sons to be killed for the Tribe or Nation, or democracy (supposedly — some say it’s the price of oil), or Islam. Have we truly abjured the very essence of archaic religions? We know that for thousands of years there was no happy ending — the son did get sacrificed to the invisible, fictional King of Kings in the Sky. (Well, Abraham heard a voice commanding him to kill — hearing voices? a modern lawyer would have little trouble getting a Not Guilty verdict.)

But the poet chooses a different ending: Abraham apparently doesn’t kill, but simply continues alone down the other slope of the mountain. He chooses to disappear, bearing the agony for the rest of his days. He chooses to become irrelevant, to sacrifice his own interests: let the son inherit now. That’s one solution, though not a very convincing one.

I prefer the actual myth as it has come down to us. It chooses to invent a happy ending that appears to announce that there will be no more killing of sons (virgin daughters are another story), no more offering them to an abstract, nameless deity — offering them as a “holocaust,” the body burned entirely, nothing held back. A huge psychological change has happened, making the original ritual abhorrent. Child as more precious than blind obedience and tradition? Father who is in the end more like a mother? (Note that we never even consider that it might be Sarah who would be summoned.)

This is a part of the larger problem created by religions: is our greatest obligation to fellow human beings, or to the deity? Christians recoil at human sacrifice if it’s “pagan” — obviously, those gods are fictional. What a horror to destroy a human (or even animal) life as an offering to mere fiction. But what if their god is also fictional?

Then you can make the “leap of faith.” This is the solution offered by a widely admired philosopher, Søren Kierkegaard, who glorified Abraham’s willingness to sacrifice his son. Only recently have critics been willing to take another look at Kierkegaard’s choice between human and “divine” obligations. But then, only now are we beginning to respect human beings and not see them as innately evil — obviously, it’s no big deal to sacrifice human lives to a god or to a cause if humans aren’t worth much to begin with.

And “god” can be defined as anything a person finds “the highest” — it can be an ideology, it can be one’s special group (extending to one’s nation).

Note, finally, that we are still obsessing over this myth, just as we aren’t done with Adam and Eve, or, from another mythology, Orpheus and Eurydice. Centuries of trying to resolve disturbing ancient stories that don’t seem to go away. Why not?

Adam and Eve, Abraham and Isaac, and, to a lesser degree, Lot’s Wife, are stories of disobedience versus blind obedience, blind faith. These are the foundational stories of the Abrahamic religions, and of the authoritarian tradition in culture in general. The unquestioning following of commands is glorified. Following the voice of reason, of curiosity, of love — watch out, the penalty can be horrific. 

Titian: Abraham and Isaac, 1544

“The very first crack in the dike was when it occurred to me, as a father myself, just how immorally twisted the idea of a father blood-sacrificing a son was. And to think this was the best idea a God of the Universe could produce is less than silly.” ~ a former Baptist minister

Blood sacrifice as the foundation of Christianity and precondition for entry to heaven is the greatest put-off for me too. How barbarous! I'm curious if Christianity could finally reject its worst idea: “Jesus died for our sins.” I'd love to live long enough to see if a debate about it can at least be started. Perhaps not, since it seems that only the most conservative denominations will be left. But reality never runs out of surprises.


~ “What struck me the other day was how quick Christians are to dismiss this story as an example of “Old Testament thinking.” When confronted with the horrific brutality of both Yahweh and his people in the Bible, it has always been customary to explain it away by citing progressive revelation, as if such primitive thinking were not the foundation of the early Church’s profession of faith. But it was. Far from condemning Abraham’s belief that it’s ever okay to sacrifice your own child, James extols this act of Abraham as an example for all of us as if it were the quintessential act of worship which pleases God. Reading this at my current stage in life, I see that it’s wrong on so many levels that I hardly know where to begin.

For this daunting task, I’ll call on my friend Brian, who blogs over at A Pasta Sea, to help me parse these things out.  He starts off by speaking to the heart of the matter:

    ~ I think this chapter strikes at the very core of exactly how the traditional expressions of the Abrahamic faiths are able to make otherwise good people do terrible things in the names of their gods. ~

Indeed it does. In fact, as I re-read this story today I keep making terrible discoveries about biblical faith which I never saw before, because the eyes of faith remain blind to things like this.

Biblical morality is relativistic to the extreme. Counter to the monotonous charge that atheists have no basis for moral reasoning, the story of Abraham being told to kill his son illustrates that if “whatever God says is good is good” then nothing can consistently be called “bad,” not even child sacrifice.
Another point that’s not typically dealt with by most expositors of this passage is that God has given opposing commands. He’s instructed Abraham to kill Isaac and then later commands him not to harm the boy. We can therefore conclude that any command of God might be countermanded. This presents a problem for any moral argument that makes God out to be the “objective standard” of what is right. Under this view, it was morally right for Abraham to desire to kill Isaac in obedience to the command of God and then three days later it was morally wrong. Not because the situation had changed, but simply because God said so.

Killing something is central to this religion. First and foremost, there’s the killing of Jesus, which the Bible says was necessary to appease God’s wrath.

Old Testament aside, to an outside observer, it’s pretty clear that human sacrifice is undeniably central to much of Christian theology, given that its most important figure is thought by many of its adherents to have been sacrificed to appease its god’s wrath. Try to nuance it all you want. It’s simply inescapable when the main symbol of a religion is the very instrument of death its founder was supposedly sacrificed upon.

Blood magic is all over the Bible in both the Old and New Testaments and is celebrated all over Christianity’s traditional hymnody (e.g. “Power in the Blood”, “Nothing but the Blood”, “Are You Washed in the Blood”, “The Blood Will Never Lose its Power”, etc.) and blood magic is either ritualistically performed or memorialized every time participants drink their shot of wine/grape juice or have a priest do it for them.

Furthermore, there’s the daily metaphorical “crucifixion” of the human personality which evidently this deity requires.

It’s fitting that human sacrifice plays such a central role in Christianity. When certain Christian doctrines are taken seriously — doctrines that proclaim an individual’s depravity, uncleanness, corrupt reason, unworthiness of anything good, worthiness of eternal torture and utter inability to do anything about those things apart from the divine intervention that only comes when one blindly surrenders in faith — it leads to exactly that. It leads to the sacrifice of one’s own humanity.

Wherever the Bible seems to present God in a repulsively negative light, you can just make stuff up to sanitize the story and people will uncritically eat it up.  The writer of Hebrews struggled with this at some level, and he suggested (rather ad hoc, I might add) that Abraham must have figured Yahweh would just bring Isaac back from the dead.  Which he knows because…um…because…uh…

    ~ Second, Isaac was to be a whole burnt offering, meaning after Abraham slaughtered Isaac, he was supposed to burn him. The smoke from burnt offerings was to rise up to heaven and be a pleasing aroma. This would point to the totality of the sacrifice and the rising up of the essence of whatever it was toward heaven. There’s not going to be a body, bones or anything else left to be “raised” and the writer of Hebrews doesn’t seem to pay any heed to that little detail….I’m merely pointing out that it would be incredibly unnatural for Abraham to conclude that a pile of ashes would be raised back to life. Such a belief would require a highly developed theology that’s completely foreign to the Old Testament and unprecedented in any Biblical example of resurrection . . . and most importantly, there is no mention in the text of Genesis itself that Abraham believed that God would raise Isaac from the dead . . . The writer of Hebrews is either offering this resurrection belief up as his own supposition or is repeating some other tradition, but it’s nowhere in the text of Genesis. ~

Abraham wasn’t praised because he didn’t kill his son; he was praised because he was going to. It wasn’t the sparing of Isaac which made Abraham the ultimate model for devotion to Yahweh; it was his willingness to do absolutely anything—even kill his son—that won him the prime spot in the Hall of the Faithful. Brian puts it bluntly:

    ~ It is astonishing and baffling to me that this act of Abraham’s is so highly regarded in Judaism, Islam and Christianity. This “test” would better serve as a means of weeding out psychopaths who unquestioningly obey the voices in their heads, rather than as a way to show how much someone fears God or how much faith they have. Instead, this is supposed to be the epitome of faith and the act which actually justified Abraham. ~

Rembrandt: Abraham and Isaac, 1654


Every time I return to the story (and I hope this is the last time, but a writer can’t predict future themes), I discover a new angle. This time it’s the BLOOD MAGIC. I am very grateful to Neil Carter, the author of the article, for making it explicit, especially in the context of communion. Yes, when a priest raises the chalice filled with wine upward toward heaven, he’s offering the blood of Jesus to his father — according to the Catholic doctrine, the wine has literally changed into blood.

Why does the priest offer the blood of a murdered son to his alleged father? Because “without blood there is no forgiveness”? Such archaic, cruel, and non-Christian words are still spoken in the 21st century?


In a previous post, I related that it suddenly occurred to me that the story got sanitized by later scribes: most likely, Abraham really did kill Isaac. We know that human sacrifice was practiced in the Ancient Near East (and of course not only in that region). It was interesting for me to learn that in a few medieval midrashim, Isaac does get killed. His body was supposed to be burned so that the “pleasing” smoke of the sacrifice would rise up to heaven and fill god’s nostrils (Yahweh definitely had a body in the early stories — and even the not so early ones, come to think of it; he does mention his nostrils to Isaiah, this time disdaining the “stench” of the people’s offerings).


The story of Abraham and Isaac is a brutal one. Blood sacrifice and blood magic, a god who demands blood sacrifice and delights in the smell of burnt offerings: what but tribal affiliation distinguishes this god from Baal, these sacrifices from the purely barbaric?
Oh, yes, god relents at the last minute, but disobedience would have meant Abraham failed the "test." And this is a violent and vengeful god, this Yahweh. But the Father of the New Testament still requires the blood sacrifice, of his own son, to satisfy his wrath. And the doctrine is that in communion, we partake of the body and blood of that sacrificed son.

This is not a metaphor, although metaphor is fundamental in us. The doctrine of transubstantiation requires the belief that this is actually the body and blood of the crucified god we line up to swallow. When my granddaughter made her first communion, the class went through a "rehearsal" of the ceremony, complete with receiving the unconsecrated host. After the actual ceremony she told me she was surprised the consecrated host didn't "taste better" than the unconsecrated, ordinary wafer they had in rehearsal. Surely the flesh of god should be remarkable!


Thanks for sharing this wonderful story! Yes, the flesh of god should taste fabulous, better than the best lamb chop . . .

Perhaps to prevent such thoughts, we were told it’s a terrible sin to bite into the host (from Latin hostia, meaning “victim”). As kids we wondered if blood would pour out of the wafer if someone did that. Yes, talk about literal!

Even as a young teen I felt terribly anxious over possible trespass whenever the wafer would stick to the roof of my mouth due to dehydration (we weren’t supposed to eat or drink on the morning of the communion) and I had to use my tongue and, in stubborn cases, the tip of my finger to dislodge it. This is funny now, but was quite an ordeal back then!

In general, when I remember how seriously I took certain absurdities, I have trouble believing it. I understand that this is common among atheists: they tend to go through a stage of being very earnest believers — and then they can’t believe that they once literally believed this stuff!


I wanted to tell you I remember the same worries about that communion host. If we bit into it, would it bleed?? And it was so dry and hard to swallow!! Sticking to the roof of the mouth just as you describe!

My sister and I were talking today about the problems children face in school these days . . . And she remarked that the only bullying she remembered was from the nuns. In addition to all the horror stories of sin and punishment, they were not averse to hitting us on the knuckles with rulers, a slap or a shove, or the wooden paddle to the legs. You could also be put in the corner for hours. This was only true for us in our grade school. It was an Italian parish, but a German order of teaching nuns.

Our high school nuns were Ursulines, and a whole different level of kindness and civility. Talking about our grade school we also remembered a male lay teacher who was very violent with the kids, would throw them up against the walls. You could hear it all from your classroom nearby. Everyone knew, but nothing was done as far as I remember, to stop it. Oh my. And now children are left unprotected against mass shootings.


I wasn’t in a Catholic school, so I merely heard stories of sadistic nuns — after coming to the US. Big sigh of relief — at least I was spared that. Of course worrying about those cauldrons in hell, filled with boiling tar and pitch, that was quite bad enough . . .

Nevertheless, at least we didn’t have to worry about getting shot.  That is just surreal in the worst way.


As noted, the liturgy, the hymns, all talk about that sacred blood we need to be washed in, to swallow, to consume so we can reach salvation. The massive human sacrifices of the Aztecs, that seem so overwhelmingly horrific, are part of the same kind of thinking: that there is power and magic in the blood, and the gods thirst for blood sacrifice, which can provide a connection to their power, their protection, their favor. 

I am also reminded of the terrifying, violent Christianity found in the work of Flannery O'Connor, whose characters move toward a vision of apocalyptic grace. It may be argued they've got it all wrong, but the potential for violent terror is there already, waiting to be embodied, expressed, fulfilled.


Yes, when we think of the Aztecs and others with particularly gory religions, it’s hard to remember that the same logic pervaded almost all religions: the main form of archaic worship was animal and/or human sacrifice. Think of Abel and Cain: Abel killed a lamb, and his offering was accepted, while Cain’s gift of grains and other “fruit of the land” was rejected. While no reason is offered in the text, one theological interpretation has been that god requires blood sacrifice. 

(By the way, that was the first lie told to us by the teaching nun — that Cain’s offering was rejected because it was of poor quality. The text gives no such indication [no wonder we were not allowed to read the bible by ourselves], but the story unfortunately teaches that you need to kill something as a gift to the deity.)

Caravaggio: Abraham and Isaac, 1604
Is there any turning away from the archaic brutalities once they get enshrined in the so-called “holy” scriptures? How is humanity ever to get beyond barbarism (of war in general, not just religious war) if true believers continue to find their duty to fictitious deities (including nationalism or a political ideology) more important than their duty to living fellow human beings? I find some hope in the article below.


~ “How is it that, as humanity as a whole seems to be evolving to be more inclusive and less dogmatic in general, certain religious strains are doubling down on their extremism? It’s possible to conceive of kernels of extremism as intrinsic within particular faith traditions. But it’s also possible to understand the current rise of extremism as a reactionary backlash against the overall liberalization of faith.
“We live in a world where every single person is challenging everything, where every single person has a voice,” Amanullah De Sondy told me. De Sondy is a senior lecturer in Contemporary Islam at University College Cork (Ireland) and author of The Crisis of Islamic Masculinities.

“The extremists want conformity and detest plurality and differences. Being different, being an individual who states that it is their individual relationship with the divine that matters, is a huge challenge to those who want the strict order of organizing society.”

Put another way, strict religious ideology requires strict conformity, and people aren’t conforming anymore.

The number of church-goers has dropped steadily for decades, but now there [is] also a lot of space in mosques around Europe. Recent data from the extensive European Social Survey (ESS) show that the number of Muslim immigrants who regularly go to the mosque drops significantly after they've lived in their new homeland for some time.

So how is it that in the face of declining religiosity, we nonetheless find ourselves swept up in almost unprecedented magnitudes of religious struggle—from the brutality of Daesh (as ISIS hates being called), or the far less extreme yet still perpetual hostility of Christian fundamentalists toward the gay, lesbian, bisexual, and transgender community?

“The three major Abrahamic religions—Judaism, Christianity, and Islam—all have groups that espouse some type of eschatology, or belief about the end of time,” says Valerie C. Cooper, associate professor of Black Church Studies at the Duke Divinity School. “Among these groups, eschatological fears that the end times are near may be stoked by perceptions that the group is being persecuted.”

That sense of persecution can come from the fact of declining religiosity. Or, say, a war being launched against an entire religion — whether it’s the supposed “War on Christmas” or a kind of “War on Islam” that some on the far right call for.

In this context, it’s reasonable to interpret any surge in fundamentalism within a given denomination as a reactionary backlash to the overall trend of liberalization.

And so, unable to propagate their narrow view through ideological cohesion alone, dogma resorts to force—in mild forms like pro-discrimination laws against LGBT people pushed by Christian extremists in the United States, or murderous forms like the brutality of Daesh, which is disproportionately used to punish other “unfaithful” Muslims.

In fact, like other fundamentalist religious groups in this era, Daesh is overreacting to a shifting global climate in which its ideas are increasingly marginalized. The trick to defeating Daesh is to see it for what it is—a desperate backlash by a declining ideology.” ~

Photo: Oliver Sacks



“Trump’s strength-worship and contempt for “losers” smack more of Nietzsche than of Christ. Blessed are the proud. Blessed are the ruthless. Blessed are the shameless. Blessed are those who hunger and thirst after fame.” ~ Michael Gerson


I haven't seen this before. Sure, identifying Trump with the anti-Christ, that's common enough. But specifically picking up the current of Nietzschean immoralism is a fresh insight. Nietzsche's unfortunate admiration for the “master morality” versus the “slave morality” of Christianity was a source of inspiration for the Nazis (though they would have found their justification without Nietzsche — even so, here was an impressive, erudite philosopher they could quote).

I realize that ideally a whole essay should follow. And this is taken from an essay that details how Trump tramples on ethics. But that’s belaboring the point. It would take a Nietzsche scholar to write a philosophical analysis of Trump as an exemplar of Nietzsche’s “master morality” of the strong (aka the rich — obviously already for many millennia the real power has been wealth). For me the brief paragraph I quote is enough.

How did Nietzsche become so unbalanced as to fall into the trap of “might makes right”? I don’t feel qualified to cite philosophical influences. I'm not even sure Nietzsche’s hatred of Christianity can explain much, given that it was shared by many who nevertheless held on to values such as compassion. Rather, I suspect a personal reason (and Nietzsche would be the first to agree that one’s philosophy is autobiographical and psychological in nature).

Nietzsche was a sickly, non-macho sort of man; according to friends who knew him most closely, he lived and died a virgin (the cause of his “madness” was most likely not syphilis but brain cancer). He was a sensitive, highly “civilized” person. In my observation, it’s not unusual for such a man to feel that he’s not a “real man,” and to compensate through fantasies of machismo. As for social pressures and cultural influences, when we complain about toxic masculinity and the glamorization of the military, for instance, let’s remember how much worse this was in the 19th century.


And now for the real detox: the quirky history of the English language

HORSE USED TO BE HROS (How errors in pronunciation have become standard English)

Words that used to begin with "n"

Adder, apron and umpire all used to start with an "n". Constructions like "A nadder" or "Mine napron" were so common the first letter was assumed to be part of the preceding word. Linguists call this kind of thing reanalysis or rebracketing.

When sounds swap around

Wasp used to be waps; bird used to be brid and horse used to be hros. Remember this the next time you hear someone complaining about aks for ask or nucular for nuclear, or even perscription. It's called metathesis, and it's a very common, perfectly natural process.

When sounds disappear

English spelling can be a pain, but it's also a repository of information about the history of pronunciation. Are we being lazy when we say the name of the third day of the working week? Our ancestors might have thought so. Given that it was once "Woden's day" (named after the Norse god), the "d" isn't just for decoration, and was pronounced up until relatively recently. Who now says the "t" in Christmas? It must have been there at one point, as the messiah wasn't actually called Chris. These are examples of syncope.

When sounds intrude

Our anatomy can make some changes more likely than others. The simple mechanics of moving from a nasal sound ("m" or "n") to a non-nasal one can make a consonant pop up in-between. Thunder used to be "thuner", and empty "emty". You can see the same process happening now with words like hamster, which often gets pronounced with an intruding "p". This is a type of epenthesis.

When "l" goes dark

A dark "l", in linguistic jargon, is one pronounced with the back of the tongue raised. In English, it is found after vowels, as in the words full or pole. This tongue raising can go so far that the "l" ends up sounding like a "w". People frown on this in non-standard dialects such as cockney ("the ol' bill"). But the "l" in folk, talk and walk used to be pronounced. Now almost everyone uses a "w" instead- we effectively say fowk, tawk and wawk. This process is called velarisation.


Your grandmother might not like the way you pronounce tune. She might place a delicate "y" sound before the vowel, saying tyune where you would say chune. The same goes for other words like tutor or duke. But this process, called affrication, is happening, like it or not. Within a single generation it has pretty much become standard English.

What the folk?

Borrowing from other languages can give rise to an entirely understandable and utterly charming kind of mistake. With little or no knowledge of the foreign tongue, we go for an approximation that makes some kind of sense in terms of both sound and meaning. This is folk etymology. Examples include crayfish, from the French écrevisse (not a fish but a kind of lobster); sparrow grass as a variant for asparagus in some English dialects; muskrat (conveniently musky, and a rodent, but named because of the Algonquin word muscascus meaning red); and female, which isn't a derivative of male at all, but comes from old French femelle meaning woman.

Spelling it like it is

As we've mentioned, English spelling can be a pain. That is mainly because our language underwent some seismic sound changes after the written forms of many words had been more or less settled. But just to confuse matters, spelling can reassert itself, with speakers taking their cue from the arrangement of letters on the page rather than what they hear. This is called spelling pronunciation. In Norwegian, "sk" is pronounced "sh". So early English-speaking adopters of skiing actually went shiing. Once the rest of us started reading about it in magazines we just said it how it looked. Influenced by spelling, some Americans are apparently starting to pronounce the "l" in words like balm and psalm (something which actually reflects a much earlier pronunciation).

~ this is from The Guardian, but I accidentally lost the link — maybe because I was thinking of shiing in Norway. 

Oriana: Note, by the way, that Shakespeare uses "nuncle" and not "uncle." "Hros" sounds wonderfully Old Anglo-Saxon.


Norway is a magnificent country as long as you’re dressed for the climate. (In spite of the name, the Norwegian forest cat is a special breed of domestic cat.)

Norwegian forest kitten

“Revolutions don't work. They're breeding grounds for sociopaths and know-it-alls, especially when they need to defend the revolution against the inevitable backlash.” ~ Jeremy Sherman


“The true focus of revolutionary change is never merely the oppressive situations that we seek to escape, but that piece of the oppressor which is planted deep within each of us.” ~ Audre Lorde

I certainly discovered that when trying to liberate myself from Catholicism. "A recovering Catholic" is no joke. It's a lifelong journey to stop seeing oneself as a sinner.



~ “Long past his humble beginnings, President Andrew Johnson would speak proudly of his career as a tailor before he entered politics. “My garments never ripped or gave way,” he would say.

On the campaign trail, a heckler once tried to embarrass him by shouting about his working-class credentials. Johnson replied without breaking stride: “That does not disconcert me in the least; for when I used to be a tailor I had the reputation of being a good one, and making close fits, always punctual with my customers, and always did good work.”

Sometimes, on the road to where we are going or where we want to be, we have to do things that we’d rather not do. Often when we are just starting out, our first jobs “introduce us to the broom,” as Andrew Carnegie famously put it. There’s nothing shameful about sweeping. It’s just another opportunity to excel — and to learn.

Everything we do matters — whether it’s making smoothies to save up money or studying for the bar — even after we’ve already achieved the success we sought. Everything is a chance to do and be our best.

An artist is given many different canvases and commissions in their lifetime, and what matters is that they treat each one as a priority. Whether it’s the most glamorous or highest paying is irrelevant. Each project matters, and the only degrading part is giving less than one is capable of giving.

Steve Jobs cared even about the inside of his products, making sure they were beautifully designed even though the users would never see them. He was taught to think like a craftsman by his [adoptive] father, who finished even the back of his cabinets though they would be hidden against the wall. In every design predicament, Jobs knew his marching orders: Respect the craft and make something beautiful.

Every situation is different, obviously. We’re probably not inventing the next iPad or iPhone, but we are making something for someone — even if it’s just our own resume. Every part — especially the work that nobody sees, the tough things we wanted to avoid or could have skated away from — we can treat the same way Jobs did: with pride and dedication.” ~


This is somewhat too absolutist: there are cases when something is worth doing even badly. And there can be excessive dedication to what doesn’t really matter, resulting in crippling perfectionism.

But on the whole, seeing a task as an opportunity to cultivate the habit of excellence is generally good advice — even if, as with the back of the cabinet, no one will see it. Why? My argument, based on personal experience, is different from the thrust of the article: because it’s a great pleasure to do things well.

My argument is “from happiness” — doing things well makes us happy. The cook who serves an excellent meal is a happy person. So is a ballerina who has given an exquisite performance. But so am I if I do a Pilates pull-up in perfect form: very slowly and mindfully.

Now, just because Leonardo did not do a sloppy job on the Mona Lisa (to put it mildly) doesn’t mean that EVERYTHING is to be approached with the aim of producing a masterpiece. But doing things well is one of the best ideas that humanity has ever conceived.


~ “Let’s start in the 1930s, when an American nutritionist named Clive McCay designed a low-calorie diet for his lab rats at Cornell that gave them all the nutrients they needed but kept them as thin as supermodels and (presumably) ravenous. The diet seemed to act like a time machine, and Dr. McCay’s hungry rats maintained their dapper, glossy coats of fur and frisked about their cages; their well-fed counterparts doddered about in shabby coats and then died. “In the laboratory today are two male white rats that are the equivalent in age to men more than 130 years old,” Dr. McCay announced, promoting the benefits of calorie restriction.

A gentleman farmer, Dr. McCay applied his theories to himself, nibbling on morsels from his own fields. But he didn’t make it close to 130. Though trim and athletic, he had two strokes and died at 69.

Over the decades that followed, research teams would repeat his experiments and confirm that calorie restriction almost always prolonged the lives of lab animals. One of the most prominent of those scientists, Roy Walford, showed that a strict diet could double the life span of mice. Dr. Walford himself stuck to a 1,600-calorie-a-day diet. In the 1980s, he wrote “The 120 Year Diet” and then followed it up with even more misery and abnegation in “Beyond the 120 Year Diet.” He became a cult figure to thousands of CRONies (“calorie restriction with optimal nutrition” enthusiasts) who hoped to live past 100. But he himself died of A.L.S., or Lou Gehrig’s disease, at age 79.

Some of the biggest names in dieting, organic agriculture and preventive medicine died at surprisingly young ages. The wild-foods enthusiast Euell Gibbons was far ahead of his time in his advocacy of a diverse plant diet — but he died at age 64 of an aortic aneurysm. (He had been born with a genetic disorder that predisposed him to heart problems.) The nutritionist Adelle Davis helped to wake millions of people to the dangers of refined foods like white bread, but she died of cancer at 70. Nathan Pritikin, one of the foremost champions of low-fat diets, died at 69, nearly the same age as Dr. Robert Atkins, who believed in the opposite regimen.

Then there is Jerome Rodale, founder of the publishing empire dedicated to health. In 1971, Dick Cavett invited Mr. Rodale onto his TV show after reading a New York Times Magazine article that called him “the guru of the organic food cult.” Mr. Rodale, 72, took his chair next to Mr. Cavett, proclaimed that he would live to be 100, and then made a snoring sound and died. (The episode never aired.)

There are obviously things you can do to improve your health. Give up cigarettes and start walking — that kind of common-sense lifestyle redo can deliver good results. But there are diminishing returns. My travels in the obituary section convinced me that the more esoteric personal choices — and diets based on the latest scientific findings — have far less of an effect on our own health than we may think.

Even those pioneers who did everything “right” were buffeted by circumstances that they couldn’t control on their own — like bad genes, accidents or exposure to smog or pesticides.

It’s the decisions that we make as a collective that matter more than any choice we make on our own.

Beginning in the 1970s, activists and governments collaborated to outlaw leaded gasoline worldwide and to reduce other sources of lead exposure. It is one of the best “lifestyle choices” that we humans have ever made. Average lead levels in our blood dropped by more than 80 percent — a huge health benefit, because lead exposure can increase the risk of heart disease, kidney disease and probably also dementia.

Unfortunately, we have yet to tame many other pollutants, like the particulate matter spewed by diesel engines and coal plants. And the damage from dirty air begins long before any of us can make our own health choices: A study released in January, for instance, suggests that babies exposed to high levels of air pollution in the womb may be at risk of premature aging.

It’s the things we tend to ignore, like our exposure to pollution, that will affect us far more than the things we obsess about, like whether to eat gluten.

 According to Dr. Thomas Frieden, the former director of the Centers for Disease Control and Prevention, “since 1900, the average life span in the United States has increased by more than 30 years; 25 years of this gain have been attributed to public health advances.”

Today, the greatest threat to your life span may be the Trump administration’s assault on public health and medical research. Dr. Robert Phalen, a new appointee to the Environmental Protection Agency’s Scientific Advisory Board, has said that he believes our air is “too clean.” Next year, we’re likely to see drastic cuts to mental-health research, oil-spill remediation and clean-water programs.

The founder of Bulletproof Coffee recently bragged that he hopes to live to age 180, in part by sipping one of his company’s signature drinks made with “Brain Octane Oil.” But aging isn’t some kind of competitive sport you play against your peers. When it comes to staying alive, we’re all in it together.” ~

And each of us probably knows stories of not-so-famous health nuts who died young: runners who died of a heart attack, people on funny diets (e.g. “fruitarians”) who died of cancer, herb enthusiasts who ended up with liver failure, and the like.

Jung took a dim view of health nuts. Even though his statement is dated, it’s still interesting: 

“People who go wrong psychologically are often health fanatics. They are always seeking the right food and the right drinks, they don't smoke and they don't drink wine, they need a lot of salts and are drug-store fiends. Always some new scheme and never very healthy. It is a fact that the sinner generally feels better than the righteous one, for the weeds always thrive better than the wheat. All virtuous people complain about that. Those people who take such care of themselves have always a tendency to become morbid. That amazing energy for drinking a certain water, for instance, comes from a continuous fear which is in them, and that is the fear of death. It is because something in them says, 'For God's sake don't let me die because I have not lived.’”  ~ C.G. Jung, Dream Seminars, 5 February 1930

Of course these days we know better about smoking. We also know about the benefits of moderate drinking (so many things that were supposed to be bad for you turned out to be good for you, and vice versa). But the main point is valid — health nuts who are motivated by the fear of death can end up imprisoned in lifestyles that keep them from living.


I think Jung is right about the health gurus — the primary motivator is fear of death, along with fear of aging and disability. The other part of it is the need for control, which can involve all kinds of crazy systems and requirements, dietary restrictions of every imaginable type — low fat, no fat, keto, paleo, Atkins, low carb, fasting, purging, swallowing supplements, getting rid of gluten, all fruit, no fruit, etc.

The ideas about increasing activity are more moderate and sensible than all that dietary confusion, and universally regarded as beneficial. But all these attempts to control and protect ourselves can't assure good health or long life — too much is accidental, or genetic, or determined by the social environment — pollution, lead pipes, regulation and protection of the food supply, availability of preventive medical care, the costs of maintaining health, and the presence of infectious and parasitic agents . . . And so much more. We creep along and bit by bit improve the circumstances of our lives, achieve longer and healthier lives, and better living and working environments for the population as a whole. For the individual, however, there are not, and cannot be, any guarantees.

And I should be ashamed to say I found the story of Rodale's demise on the Dick Cavett show hilarious. Pride goeth, etc.


All you say is right on — there isn’t very much we can do about aging. Those with centenarian genes ruin it for the rest of us — some smoke for decades, others get no exercise more vigorous than knitting. Meanwhile a lot of people die young because cancer genes run in the family.

Some factors influencing life expectancy happen to be interesting, though. IQ has a strong correlation with longer life, and the amount of autonomy one enjoys — bosses tend to live longer than subordinates. Marriage seems to offer a health and life expectancy advantage to men but not to women.

Robert Atkins. What he didn't realize was that excess protein gets converted into glucose, which in turn activates insulin, the fattening hormone. Fat cannot be converted into glucose, but it can be converted into energy-giving ketone bodies, the secret of the ketogenic diet.

~ “Since the earliest days of psychiatry, “alienists” have tried to understand the motives and acts of criminals. Toward the end of the nineteenth century, French pathologist Jean Alexandre Eugène Lacassagne believed that insights would show up in personal details, so he instigated what he called “criminal autobiographies.”

Initially, Lacassagne tried to gather information with interviews, but then he devised what seemed a more productive idea that would also benefit the offender. He identified those who wished to express themselves in writing or art and encouraged them to do so.

Lacassagne directed them to address their writings to him. Each week, he visited the prison to check their notebooks, correcting the writing and sometimes guiding an offender into more productive directions. If they filled a notebook, he gave them another, and often he'd publish their work in his professional journal. Occasionally, he even paid them.

From these inmates, both male and female, Lacassagne collected dozens of manuscripts, averaging about twenty-five pages (although one man, set for execution, had filled six notebooks). Many inmates were keen to work with a scientist to try to understand themselves.

Over time, he acquired a rich trove of information. Lacassagne learned that many inmates' family histories were full of violence, tension, poverty, and disease. Most of the inmates had little education and only transient means of self-support. Their marginality contributed to their impulse to commit crimes – especially if they had limited options.

One mass murderer claimed that while the professionals who evaluated him attributed his offenses to greed, he saw the influence of a childhood head injury, lifetime substance abuse, and a sudden blinding sensibility that preceded each stabbing event. No one had even considered these items as causal. In this, said the offender, they were remiss. From what we know today, he was right.” ~


Women inmates’ stories about their lives seem especially unbearable: so filled with childhood abuse, then continuing with more abuse. Seems all they’ve known is violence — emotional, physical, sexual. But I need to remind myself that this is of course true for male inmates too.


~ “Doing lots of exercise in older age can prevent the immune system from declining and protect people against infections, scientists say.

They followed 125 long-distance cyclists, some now in their 80s, and found they had the immune systems of 20-year-olds.

Prof Norman Lazarus, 82, of King's College London, who took part in and co-authored the research, said: "If exercise was a pill, everyone would be taking it."

The researchers looked at markers in the blood for T-cells, which help the immune system respond to new infections.

These are produced in the thymus, a gland in the chest, which normally shrinks in size in adulthood.

'Out of puff'

They found that the endurance cyclists were producing the same level of T-cells as adults in their 20s, whereas a group of inactive older adults were producing very few.

The researchers believe that being physically active in old age will help people respond better to vaccines, and so be better protected against infections such as flu.

Steve Harridge, co-author and professor of physiology at King's College London, said: "Being sedentary goes against evolution because humans are designed to be physically active."

"You don't need to be a competitive athlete to reap the benefits - or be an endurance cyclist - anything which gets you moving and a little bit out of puff will help."

Prof Harridge and Prof Lazarus believe that highly physically active older people represent the perfect group in which to analyze the true effects of biological aging.

A separate paper in Aging Cell found that the cyclists did not lose muscle mass or strength, and did not see an increase in body fat - which are usually associated with aging.

Aged just 64, Jim Woods is a comparative youngster in the group. He averages 100 miles a week on his bike, with more during the summer.

He said: "I cycle for a sense of well-being and to enjoy our wonderful countryside."

Cycling 60 miles or more may not be your idea of fun, but these riders have found something that gives them pleasure, which is a key reason why they continue.” ~

ending on beauty:
If the universe is—this is the latest—
bouncing between inflation
and shrinkage, as if on a trillion-year
pendulum, why wouldn’t

an infant’s sobbing, on the exhale,
have a prosody
as on the inhale have the chemistry
of tears and seas

~ Ange Mlinko, “This Is the Latest”


  1. Hello Oriana, I am always so riveted by your commentary, perspectives, poetry and storytelling. I "met" you in an online poetry workshop through the Poetry Barn. I was always intrigued by your "posthumous" comments. I found your blog from way back and added it to my favorite links. Reading your blog is always an education. You can visit my blog

  2. Hi Deborah,
    Anyone who remembers my "posthumous" self-description would be dear to my heart. Now my left knee is posthumous as well -- I've had knee replacement and, alas, so far only a partial recovery -- though I do love my walker (one of those posthumous surprises).
    I'm thrilled to hear from you -- and hope you'll comment again.