Saturday, April 28, 2018


Monet: Houses of Parliament, stormy sky, 1904

One day in the street my grandmother
stopped before another grandmother.
Both stammer: “It’s you —
you — in Auschwitz — ”

Turning to me: “She and I shared
the same blanket. Every night
she said, ‘You’ve got more than I’
and pulled, and I pulled back —

and so we’d tug across the bunk — ”
and the two grandmothers laugh.
In the middle of the sidewalk,
in old women’s dusk,

widows’ browns and grays,
they are laughing like two schoolgirls —
tears rain down the cracked
winter of their cheeks.

On Piotrkowska Avenue,
in the busiest street,
they are tugging that thin blanket.
They are pulling back.

~ Oriana

My grandmother's first ID photo after Auschwitz

~ “1) To begin with, all the quarrels over the uniqueness of the Holocaust strike me as futile. I readily accept the crude definition by the nameless woman who called the Holocaust “the worst thing that ever happened.” Some writers want to give it a kind of sacred status as an ineffable mystery, which makes limited sense. (Such critics might well dismiss my effort to see "lessons" in the Holocaust as a naive trivialization.) But the Holocaust ultimately has a place on the vast continuum of evil that it shares with the other horrors of the 20th century and before, from Turkey to Rwanda. And we can talk about it, even though the term "Holocaust," as has often been pointed out, is utterly inappropriate. A burnt offering? “Sanctifying the Name” (kiddush ha-shem, the traditional phrase for martyrdom)? What did God have to do with it? Better Shoah or Hurban, but these have yet to become common parlance in English and doubtless never will.

When I do talk and think about it, I find myself reeling from one aspect of the Holocaust to another: from the sheer numbers of the slain (one might hope that Hilberg's original “low” estimate of 5.1 million is right) to the grisly variety of the forms of death (deliberately fomented starvation, exhaustion and disease; hanging, shooting, burning and gassing; elaborate torture and summary execution) to the incomprehensible madness of slaughtering people instead of just pragmatically robbing and enslaving them, to the cold-blooded bureaucratization of monstrous procedures (as opposed to eruptions of blind hatred), to the massive contribution by hundreds of thousands of “Hitler's willing executioners,” to the way the destruction consumed not just people, but culture, language, and every material and spiritual feature of the past while spawning a hideous new culture and language (the Lingua Tertii Imperii dissected by Victor Klemperer).

All of these things were horrible enough in themselves, but when fused into a whole, linked in a sort of row of scythed chariots, they reduced the spectator to helpless grief and rage. The world after the Holocaust had henceforth to be defined as the sort of place in which these sorts of things could always happen.

2) Grisly as all the stories were, from the Einsatzgruppen in 1941, to the Wannsee Conference in 1942 (four days before I was born), to the liquidation of the Warsaw Ghetto in 1943, to the annihilation of the Hungarian Jews (which claimed my wife's aunt and other relatives) in 1944 (around the same time my great-uncle Otto Feilchenfeld was gassed in Auschwitz), to the death marches in 1945 (when the war had long been obviously lost), they didn't tell the worst: the survivors' (or liberators' or narrators') perspective always skewed things too positively. They could never tell first-hand about the excruciating final moment when the Nazis triumphed, of how it felt when the bullet blasted through the back of the neck or when the Zyklon B pellets turned to gas and the agony of asphyxiation began. People talked a lot about Auschwitz, but that was because it was actually the more "benign" part of a death camp (Birkenau), whereas the pure death camps of Treblinka, Sobibor, Belzec, Chelmno and Majdanek were swallowed up in obscurity. Most of the witnesses to murder there had been murdered themselves. Few people visited them, and there was little to see at the now greened-over sites.

But wasn't this typical? The worst moments of the worst terrors and torments had always gone unrecorded: the countless thousands of slaves and rebels crucified by the Romans, the Aztec prisoners butchered by the myriads (Inga Clendinnen pointedly reminds us of them in Reading the Holocaust), the Africans who died on the Middle Passage, the victims of Stalin and Mao and Pol Pot. There were no scribes writing, no cameras rolling when they were swallowed up in the black hole. While staring down into this particular abyss, my class and I often groped for something halfway adequate to say about it. I sometimes asked them to consider for a moment the banal, but indisputable, fact that here, as elsewhere, what all but a tiny handful of the perpetrators of the Holocaust had in common was, not their anti-Semitism, their Christian roots or their cruelty, but their testosterone. They were all, God damn them, men.

3) The Holocaust definitively abrogated the covenant between God and his people. Not that God hadn't failed to keep his word in the countless pogroms and persecutions before then; but this was the limit. Jews (and Christians) often claimed that memory was redemptive (liturgical practice, among other things, was based on that notion). Not this time: the “sacred history” (Heilsgeschichte, from Abraham to the rebirth of Israel) that theologians liked to talk about had been replaced by suffering-history (Leidensgeschichte, from the Jewish War to the 1946 pogrom in Kielce, and beyond) — but without the aesthetic comforts of tragedy.

It was a grim fulfillment of the scenes in The Trial where Joseph K. keeps finding childishly lettered signs or cheap pornography or stupid faked portraits, instead of the beautifully inscribed Torah he is looking for — a scrawl, not a scroll.

Of course, the notion that God had ever chosen a people (for no particular reason, judging from Genesis) and then watched over its destiny, punishing here, rewarding there, macro- and micro- managing everything behind the scenes, was never more than an ingenious conceit, perhaps even an obnoxious delusion. But it was pleasant to entertain it; and there was something quasi-miraculous about the stubborn survival of the Jews. Well, that was by the board. As Itsik Manger said, "Nor mir di galitsiener mekhn dikh oyf eybik oys,/ fun der eyde emese oyeve-yisroel" ("But we the Galicians forever exclude you [God]/ from the congregation of the true lovers of Israel.") God had a chance, and he failed — over and out. In some poems about the Holocaust God was still needed, but only to pour abuse on.

4) In a broader sense the Holocaust put paid to all the genial anthropomorphic visions of a world where there was some grand Judge in charge and some sort of long-term justice. Of course, we didn't need the Holocaust to realize this. The notion that one had to wait for the Holocaust to abandon traditional theism would have made Voltaire and Freud, among others, smile ruefully. But again the Holocaust was the clearest, most vivid demonstration of God's impotence. All Jeremiah's and Job's and Qoheleth's complaints, it now turned out, barely scratched the surface. "Behold, the tears of the oppressed, and they had no one to comfort them" (Eccl. 4:1). There wasn't even delayed justice, as the great majority of Nazi criminals went unpunished and died in their beds.

5) Throughout all this, one undeniable fact about the Holocaust was its power to fascinate. If ever there was such a thing as the pornography of violence, this was it. As a stunning set of limit-situations, where human experience was pushed to every conceivable extreme, it turned us all into rubber-neckers, like motorists passing a spectacular chain-collision. Seventy or so years later, the questions still burned: what would you have done if you were ... (a Jew ordered to dig your own grave before the "nape-shot"? a member of a Judenrat? part of a Sonderkommando at a death camp? a sympathetic gentile in Poland? a would-be assassin of Reinhard Heydrich? FDR?) No use pretending, class: it was infinitely more interesting than Milton's theology or Le Cid or Romantic alienation or the poetry of W.B. Yeats. Where else in the world could one find so astonishing (and weirdly comic) a tale as this one, reported in Martin Gilbert's The Holocaust (pp. 200-01), about a survivor of an Einsatzgruppe massacre in Ejszyszki, Lithuania,

 ... the sixteen-year-old Zvi Michalowski, who had fallen a fraction of a second before the volley of shots which killed those next to him, including his father. Later he had heard the chief executioner, the Lithuanian Ostrovakas, singing with his fellow executioners as they drank to their successful work.

Just beyond the Jewish cemetery were a number of Christian homes. Michalowski knew them all. Naked, covered with blood, he knocked on the first door [wait, was this a fairy tale? — PH]. The door opened. A peasant was holding a lamp which he had looted earlier in the day from a Jewish home. "Please let me in," Zvi pleaded. The peasant lifted the lamp and examined the boy closely. "Jew, go back to the grave where you belong!" he shouted at Zvi, and slammed the door in his face. Zvi knocked on the other doors, but the response was the same.

Near the forest lived a widow whom Michalowski also knew. He decided to knock on her door. The old widow opened the door. She was holding in her hand a small burning piece of wood. "Let me in!" begged Michalowski. "Jew, go back to the grave at the old cemetery!" She chased him away with the burning piece of wood as if exorcising an evil spirit.

Michalowski, desperate for shelter, returned. "I am your Lord, Jesus Christ," he said, "I came down from the cross. Look at me — the blood, the pain, the suffering of the innocent. Let me in."

The widow crossed herself and fell at his bloodstained feet. "Bozhe moj, Bozhe moj," "My God, my God," she kept crossing herself and praying. The door was opened.

Michalowski walked in. He promised the widow that he would bless her children, her farm, and her, but only if she would keep his visit a secret, and not reveal it to a living soul, not even the priest. She gave Michalowski food and clothing, and warm water to wash himself. Before leaving the house three days later, he once more reminded her that the Lord's visit must remain a secret, because of His special mission on earth.

This being a miracle story, Michalowski went on to join the partisans and survive the war. The exception proved the rule. The Holocaust, as Lucy Dawidowicz said, was the war against the Jews; and in many crucial ways the Nazis won.” ~

(Source: Facebook, the page of Peter Heinegg; also available on M. Iossel’s page. Peter Heinegg used to teach the “Literature of the Holocaust” at Union College in Schenectady, New York)


Jesus really did exist at least for that moment.


The Holocaust (the term is indeed inappropriate, but it has become standard usage) will be analyzed again and again for decades to come. But right now, for me, the unforgettable part is the story of Zvi Michalowski’s narrow escape by claiming he’s the Second Coming of Christ. It’s a powerful story because it contains an element of tragic truth: that WAS the Second Coming. When Michalowski says, “I came down from the cross. Look at me — the blood, the pain, the suffering of the innocent,” he is not lying. Not in the deeper sense. 

American Nazis near Newnan, Georgia, April 21, 2018; photo: Spencer Platt

~ “On March 23, 1971, the Soviet Union set off three Hiroshima-scale nuclear blasts deep underground in a remote region some 1,000 miles east of Moscow, ripping a massive crater in the earth. The goal was to demonstrate that nuclear explosions could be used to dig a canal connecting two rivers, altering their direction and bringing water to dry areas for agriculture.

The nuclear bombs, it turned out, weren’t that effective for building canals, though they did create an “atomic lake” in the crater formed by the blast. But the tests had another lasting consequence, all but forgotten until now: They set in motion the first U.S. government research on climate change — a far-reaching project that has continued into this decade.

On the surface, the reaction to the Soviet tests was somewhat muted. Western countries, including the United States, detected the explosions and lodged a protest alleging a violation of the Limited Test Ban Treaty. Moscow wouldn’t publicly acknowledge the tests for several years.

 But in the national security community in Washington, the blasts sparked panic. When intelligence officials briefed Stephen Lukasik, the director of the Pentagon’s secretive Defense Advanced Research Projects Agency, he had an immediate reaction: “Holy shit. This is dangerous.”

The Soviet Union, it turns out, had for more than a decade been studying ways to use nuclear weapons to create massive canals to reroute water for irrigation, and the plan involved hundreds of nuclear detonations. “The Soviets wanted to change the direction of some rivers in Russia,” Lukasik, now 87 years old, told me recently in an interview. “They flow north where they didn’t do any good for them and they wanted to turn them around so they would flow south.”

The Pentagon didn’t particularly care which way rivers ran in the Soviet Union, but it cared about how this ambitious act of geoengineering, which would affect waters flowing into the Arctic Ocean, could potentially alter the world’s climate. Lukasik decided that DARPA needed to start a climate research program that could come up with ways to model the effects. The name of this climate program, highly classified at the time, was Nile Blue.

At first glance, DARPA might have seemed like an odd place to study climate change. The agency was created in 1958 as a response to the Soviet Union’s launch of Sputnik, to help the United States get into space. But in those years, DARPA was also deeply involved in nuclear issues. It had created an extensive monitoring system precisely to tip off the Pentagon to secret tests like the Soviet effort in 1971.

The [climate] research program for the first time was drawing together [computer] modelers, paleo-climatologists, radiation experts, and meteorologists. The program created an interdisciplinary field, according to Warren Wiscombe, who credits the agency for transforming him from an applied mathematician into a climate scientist in the 1970s. “All of the sciences then that later contributed to climate science were very separate and they had brick walls between them,” he said. “They were what we call stovepiped now.”

As DARPA was building up its Nile Blue program, another government effort that would alter the course of climate research was taking place behind the scenes. In December 1972, George J. Kukla, of Columbia University, and R.K. Matthews, of Brown, wrote to President Richard Nixon expressing their concerns about “a global deterioration of climate, by order of magnitude larger than any hitherto experienced by civilized mankind.”

Their concern was not global warming, but cooling, which they feared could lower food production and increase extreme weather. It was a preliminary result (and one that would later be used by critics of climate change in a simplistic fashion to argue that climate predictions were wrong). The letter caught the attention of Nixon, who ordered an interagency panel to look at the issue. The recommendation, according to William Sprigg, who helped set up the national climate program, was “that the government should have some kind of a program, a plan that would set goals and determine who should be doing what.”

In the end, the Soviets abandoned their grand plan to alter the course of rivers, but by the time DARPA finished its research in 1976, the foundation of climate research was firmly in place: a community of scientists dedicated to the issue, and a political atmosphere conducive to continuing the research. DARPA, whose mandate is for fixed-term research, ended its climate program, but the National Science Foundation and the National Oceanic and Atmospheric Administration picked up the work, eventually leading to the establishment of the national climate program.

More than 40 years after the end of Nile Blue, former DARPA officials like Perry and Lukasik still get together for a monthly lunch, where they reminisce about their days at the pioneering agency. Lukasik recalls Perry telling him: “You know, Steve, the work started in DARPA and continued by me in the National Science Foundation became the foundation for all of the understanding of global warming.” ~

Delta of the River Lena, Siberia. If you are reminded of Lenin (April 22 was his birthday), you are on the right track.


~ “Dr. Winthrop Kellogg and his wife, Luella, had adopted Gua from Cuba at 7-and-a-half months to see if a chimp would act like a human if raised with their 10-month-old son, Donald, and surrounded by other people — a bizarre thesis in any era but especially in 1931, when chimps were rarely used for behavioral research. For nine months, Kellogg, his wife and other researchers meticulously observed the two babies, an experiment that today would alarm scientists, animal rights activists and child protective services.

Two years prior to the child-chimp procedure, Kellogg had received his doctorate in psychology from Columbia University and returned to his alma mater, Indiana University, to begin teaching. Early in his career, he was fascinated by wild children. “What would be the nature of the resulting individual who had matured … without clothing, without human language and without association with others of its kind?” he asked in his 1933 book, The Ape and the Child. There had been a few instances of feral children allegedly appearing from the woods, but those cases weren’t scientifically sufficient to answer a major issue in psychology at the time: Is nature or nurture more important in shaping an individual’s life?

Kellogg’s experiment was conducted during the heyday of the eugenics movement, which held that mental and intellectual deficiencies were always nature, always genetic. That contention was bolstered in 1927, when the U.S. Supreme Court ruled in Buck v. Bell that the intellectually disabled could be forcibly sterilized. The ape and the child experiment set out to disprove this theory by showing that environment was more important than genes — that nurture was key.

Although the Kelloggs claimed they treated Donald and Gua the same, the parenting wasn’t always loving. They tapped Donald’s and Gua’s heads with spoons to hear the difference in the sound of their skulls; they made loud noises to see who would react faster; they tried to convince Gua not to eat soap bubbles by shoving a bar of soap into her mouth; and they spun Donald around in a high chair until he started crying — all in the name of science.

Today, the experiment would never pass an ethics board. “Experimenting on your own children is highly problematic,” says Jeffrey Kahn of the Johns Hopkins Berman Institute of Bioethics. “Anytime you do an experiment with your own family and your own life, it’s not scientific in the same way as a laboratory study.”

This was only one of the problems with the premise. “N=1 [a trial with one participant] can only give you so much information,” says Kahn. “How much can you generalize from one case? And how do you do that experiment? You can’t raise the same chimp in two different scenarios.”

Kellogg touted how much Gua learned and how many human qualities she seemed to develop over the nine months: She walked upright, used a fork and had humanlike facial expressions. But Kellogg’s attempt to instill the power of speech in the grunting Gua was a nonstarter. “That was one last shot to see if you could teach a chimp to talk,” says Andrew R. Halloran, author of The Song of the Ape: Understanding the Language of Chimpanzees. The species’ physiology and brain development simply don’t allow for human communication.

“It could have been an interesting psychological study about how children and chimpanzees both crave companionship,” says Halloran. “If there’s a way to not be isolated, chimps will find it, and that’s what they share with humans. Chimps need companions just like humans need companions.”

But Luella Kellogg had other concerns that ended the experiment — namely, that Donald was becoming more chimp than Gua was becoming human. Gua and Donald wrestled in a way that looked more like chimp play than how babies interact. Gua taught Donald how to spy on people beneath doors. Donald started biting people. Donald crawled like Gua even after he could walk, and began grunting and barking like his “sister” when he wanted food. This might have been something the Kelloggs should have expected. “If you raise a baby with a puppy, you don’t expect the puppy to learn human traits,” says Kahn, but who hasn’t seen toddlers crawling around on the floor and barking like dogs?

After Luella pulled the plug, Gua was taken away, caged to be the subject of another experiment and died of pneumonia months later. Donald reached adulthood, became a doctor and killed himself at age 42.” ~


The penultimate paragraph is so funny: the human baby “becoming more chimp than [the chimp baby] becoming human.” “Donald crawled like Gua even after he could walk, and began grunting and barking like his ‘sister’ when he wanted food.”

[from another source: “though Donald had learned to walk before Gua joined the Kellogg family, he regressed and started crawling more, in tune with Gua. He'd bite people, fetch small objects with his mouth, and chewed up a shoe. More importantly, his language skills were delayed. At 19 months, Donald's vocabulary consisted of three words. Instead of talking he would grunt and make chimp sounds.”]

And then the brutal brevity of the last paragraph.

If you go to the full article and watch the movie that comes with it, you’ll likely feel that the parents were unpleasant people. Note that the write-up mentions that both the boy and the chimp would be “fed, diapered, and punished just like any children”; words like love or affection are not mentioned. However, we need to remember that the parents who attempted the “humanizing” experiment didn’t know what we know now, after more research and further progress in human cultural evolution: social animals thrive on affection, and need it as much as they need food. It’s not surprising that after the trauma of losing her companion, poor Gua (who should never have been separated from her mother and her social group to start with) would become ill and die. And for a while at least, the little boy would likewise miss his “sister” — but at least he didn’t lose his home and parents.

Still, what most stays in my mind — aside from the image of little Gua, frightened by the sound of a gunshot, jumping into the arms of the nearest human, seeking to be hugged — was the irony that instead of “humanizing the ape,” the Kelloggs ended up with an “animalized” toddler.

Mary (on various posts):

This week seems to find the history of horrors, the Holocaust, the various genocides and instances of mass murder, from the Aztecs and the Romans to Armenia, Rwanda, Stalin's Russia and Pol Pot's Cambodia, rife with irony and, finally, more akin to the theater of the absurd than to classic tragedy. The massive numbers of the victims and the industrialization of murder make it almost impossible to see the singular human intimacy of each particular death — that moment when the bullet enters the flesh, when the gas rises and chokes the lungs. The depersonalization particularly present in the Nazi bureaucratization of genocide, everything counted, weighed, measured and recorded, teeth and shoes and hair — no longer personal, human, unique, just so much stuff, so many numbers, to enter into the ledgers kept by the bookkeepers of death.

This is a world not where god is dead, but where he is irrelevant. Useless. All the meaning is gone, shaken out of things, rolled over, fractured beyond repair. The story of Zvi's survival is the perfect illustration. Why does he survive? No reason . . . simply accident . . . he falls before the bullets take everyone down, and he is covered with their blood, overlooked long enough to flee. Naked, freezing and bloody,  he knocks on the doors of three Christians, begging shelter. Of course there are three, that is always the number for storytelling. But in the usual story, there are two fails, either out of wickedness or stupidity, and then the third succeeds, because of goodness, wisdom, intelligence or grace. That is where Zvi's story so perfectly reveals the absurdity of our world. He is saved, not because the woman is good or kind or wise, but because he tells her he is Christ himself come down from the cross, and she believes him. So, accidents, lies and superstition allow his survival. Not justice, certainly not an act of god, simply chance, accidental, without purpose or agency, or anyone to thank.

Follow this with the story of the Soviet attempt to use underground nuclear explosions to reverse the flow of rivers: at first too ridiculous to believe, but apparently true. What hubris!! But even more, what a circus, what a riot of clowns mucking about with powerful forces, ones they neither understand nor respect. Not only clowns, dumb clowns. With enough power to actually go forward with their experiment for some time before abandoning it. Hundreds of nuclear explosions over more than 10 years. Didn't reverse any rivers, but surely did some damage.

The third story, also an absurdist folly: raising a child and an ape together to see if the ape would become more human. Result: the child becomes less human and more like the ape in behavior. Uh-oh . . . experiment ended — an experiment astonishing for its unethical nature. And the end result for both victims, death, sours the comedy.


I'm especially grateful for your comment on how Zvi’s story differs from the typical stories. Let me simply quote that part — it’s worth re-reading:

“Naked, freezing and bloody,  he knocks on the doors of three Christians, begging shelter. Of course there are three, that is always the number for storytelling. But in the usual story, there are two fails, either out of wickedness or stupidity, and then the third succeeds, because of goodness, wisdom, intelligence or grace. That is where Zvi's story so perfectly reveals the absurdity of our world. He is saved, not because the woman is good or kind or wise, but because he tells her he is Christ himself come down from the cross, and she believes him. So, accidents, lies and superstition allow his survival. Not justice, certainly not an act of god, simply chance, accidental, without purpose or agency, or anyone to thank.”

Well, in a way he has himself to thank. This man was incredibly resourceful. I think he fell deliberately just before the shot was fired; I read of another case like that, when a man saw that his would-be Lithuanian executioner was drunk, so there was a chance to get away with it. It was extremely difficult to get out from under the bodies that fell on top of him, but this man was strong enough — and of course motivated enough — to manage. If I remember correctly, the first challenge was simply breathing — he knew he had to wait until the massacre was over and all the Nazis and their collaborators were gone. Eventually he did crawl out and made his way into the forest. Alas, I forget the rest.

But Zvi counted on human help, and eventually proved super-resourceful that way — or should we say that he had unbelievable nerve? A Christian, or even a former Christian, unless a schizophrenic, would not have dared to say “I'm your Lord Jesus Christ.” Childhood indoctrination and remnant fear of hell would have prevented that. And it was precisely this childhood indoctrination that made the woman believe him and shelter him. Not her goodness, nor Zvi’s goodness, but the woman’s not wanting to take a chance on refusing Jesus — in case this really was Jesus and the Second Coming.

To be sure, the element of sheer luck was huge in both stories. The first man decided to fake being killed when at the last moment he became aware that the executioner was drunk — which may have been the case with Zvi’s guy as well. And the number of bodies that fell on each man just happened to be fewer than would have made it impossible to crawl out from under them. One can easily imagine a horrible scenario where either man would have gotten crushed and/or suffocated.

So yes, luck rather than meaning. But in terms of the Christian story, each victim could indeed be said to be Jesus in the sense of being innocent yet condemned to death for being, well, not exactly “the King of the Jews,” but simply Jewish and thus guilty of made-up crimes, e.g. “all wars were started by the Jews.”


I am with you on the horrible hubris of the Soviet leaders who conceived the project of diverting the Siberian rivers. Scientists and engineers probably knew the idiocy and danger of it, but didn’t dare disobey.

And then we get the generic human hubris of trying to “humanize the ape.” By the way, this wasn’t the only such attempt, though probably the only one in which a human baby was also involved. And it took the mother rather long to pull the plug . . . but at least she did act in the end to prevent more developmental damage to her son. I have a feeling that the father would have persisted much longer — imagine how ambitious he was, dreaming of fame, blind to the most obvious thing that was happening. You see, he was obsessed with signs of Gua’s “becoming human” — for instance, she learned to eat with a spoon.

By the way, eventually it turned out that it’s possible to teach a chimp sign language. A female named Washoe mastered almost 350 signs in ASL. One gorilla and one bonobo likewise learned to “speak” in sign language. I duly watched the movies that boasted, for instance, of how well these intelligent primates could make the sign for “dog” when a dog passed by or a dog’s bark was heard. The problem, as I saw it, was that these animals had nothing interesting to say — or at least nothing that would be interesting to us humans, with our capacity for abstract thought and not just a large vocabulary, and our considerably different human reality.

Obviously animals communicate quite effectively in their own way within their social group, and the attempt to “humanize” a few primates by teaching them ASL showcases hubris once again. The ancient Greeks who came up with the concept of hubris did not think humans could ever be cured of it. History has shown the correctness of their insight with monotonous regularity.


~ “Partnership used to be practical, then it got hyper-romantic and then we opened it up, dreaming that we could have it all, the practical benefits of an honest friendship and a mutual admiration society, a straight-shooting buddy and a devoted bunny all rolled into one.

We dream that with the right partner we’d be free to be ourselves warts and all, and still be reliably adored. 

That’s a tall order, not that some don’t succeed in pulling it off. But fewer than the romantic ideal promises. Most couples have to settle for compromise – less romance, more tact, a tireless effort to find what poet Philip Larkin calls “words at once true and kind or not untrue and not unkind”.

Partnership is an ever-simmering crucible that often feels too close for comfort with no respite in sight. You can’t afford to lie to each other. If you’re caught, you may never live it down. And you can’t afford to be totally honest either. Too blunt and you may never live it down.

Not that breakups are any picnic. Unshackled, you might start dreaming of a perfect union again with someone better or just better-suited. You’ll want another crack at the ideal, as though the problems in your last partnership were caused by your partner or just a bad match, not by the internal inconsistencies of the romantic ideal itself. Like gamblers who don’t get that the deck is stacked against them, singles often dream that they’ll be dealt a good hand next time, not like the last, a full house of freedom and safety, love and honesty, being yourself and being appreciated.

This may sound like a pretty dark interpretation of partnership. It’s meant to be kind and optimistic. If we can sober up on how drunk we get on romance we no longer have to convert our mutual admiration society partnerships into mutual accusation societies when they go sour.

Were you at fault? Was your partner? Was it bad chemistry? Maybe, but above all, the problem may just be that we expect more from partnership than it can deliver.

Sobering up about the drunkenness of romance frees partners to escape the threat of romantic blackmail: “If this doesn’t work, I’m going to hold you responsible for the failure.  If you don’t love me right, you’re a narcissistic pig.”

Many couples ease their way into romantic sobriety over time. Often they’re the couples that partnered early and sustained it such that, 30-plus years in, they’re at ease with each other, warts and all (warts do accumulate with age).

Sure, they fell in love as God and Goddess. Nice to have had that temporary delusion fueled by the hormonal certainty of youth. Nice state to visit, but they know that one can’t live there, so they no longer try. They are buddies to each other and it works just fine.

Some of us take romantic drunkenness as real, and seek it through endless dating, unable to sustain the high, the endless quest to find a super-human partner, disappointed again and again by only finding people who are also looking for a super-human partner. And you don’t qualify.

And some make it work through simplicity. They don’t expect much. They partner because people partner.

For the kind of people who read Psychology Today, the best partnerships might be an honest merging of ambivalences, two people who admit they each want conflicting things, a bunny and a buddy, brutal honesty and tactful kindness, and can laugh together about the predicament of trying to get that from one person for life.

A partner of this kind laughs at you.

With you.

And vice versa.

Which requires two people who can each laugh at themselves and the predicaments they find themselves compelled to enter, for example, romantic partnership.” ~

 Rembrandt: Artemisia, 1634 (probably a portrait of Saskia)


“In the beginning, there was nothing. And God said, “Let there be light.” And there was light. There was still nothing, but you could see it a lot better.” ~ Woody Allen
Image: Woody as a high-school senior


~ “In defending the cause of Christ, Luther was uncompromising. No one, he wrote, should think that the Gospel “can be advanced without tumult, offense and sedition.” The “Word of God is a sword, it is war and ruin and offense and perdition and poison.”

In Luther’s famous dispute with Erasmus of Rotterdam over free will and predestination, the renowned Dutch humanist suggested that the two of them debate the matter civilly, given that both were God-fearing Christians and that the Bible was far from clear on the subject. Exploding in fury, Luther insisted that predestination was a core Christian doctrine on which he could not yield and that Erasmus’s idea that they agree to disagree showed he was not a true Christian.

In his later years, Luther produced venomous attacks on groups he considered enemies of Christ. In his notorious On the Jews and Their Lies, he denounced the Jews as “boastful, arrogant rascals,” “real liars and bloodhounds,” and “the vilest whores and rogues under the sun.” In Against the Roman Papacy, an Institution of the Devil, he called the pope “a true werewolf,” a “farting ass,” and a “brothel-keeper over all brothel-keepers.”  

When in 1542 a Basel printer was preparing to bring out the first printed Latin version of the Quran, Luther contributed a preface explaining why he supported publication. It was not to promote interfaith understanding. By reading the Quran, he wrote, Christians could become familiar with “the pernicious beliefs of Muhammad” and more readily grasp “the insanity and wiles” of the Muslims. The learned must “read the writings of the enemy in order to refute them more keenly, to cut them to pieces and to overturn them.”

Luther arrived at his own interpretation of the Gospel after experiencing years of debilitating doubt as an Augustinian friar. The prescribed rituals and sacraments of the Roman Catholic Church—designed to offer a clear path to salvation—provided little relief. No matter how often he went to confession, no matter how fervently he prayed the Psalter, Luther felt undeserving of God’s grace. Sometime around 1515, while lecturing on Paul’s Epistle to the Romans, Luther had his great intellectual breakthrough: Salvation comes not from doing good works but through faith in Christ.

Upon discovering this truth, Luther later wrote, “I was altogether born again” and “entered paradise itself through open gates.” In thus describing his sudden spiritual transformation, Luther provided a model for millions of later Protestants seeking similar renewal. Being born again is one of the defining characteristics of evangelicalism, and it was Luther who (along with Paul and Augustine) created the template.   

[Luther retreated] from his early radicalism into a reactionary intransigence in which he opposed all forms of resistance to injustice and maintained that the only proper course for a Christian was to accept and acquiesce. He took as his watchword Romans 13: “Let everyone be subject to the governing authorities.” It was the individual who had to be reformed, not society. Luther also believed in the concept of the “two kingdoms,” the secular and the spiritual, which had to be kept rigorously apart. Christ’s Gospel was to apply only in the spiritual realm; in the secular, the government’s role was to maintain order and punish evildoers, not to show compassion and mercy. The Lutheran churches in Germany and Scandinavia (like most established churches in Europe as a whole) became arms of the state, developing a top-heavy bureaucracy that bred complacency, discouraged innovation, and caused widespread disaffection.

Not so in America: With no established churches to confront and freedom of worship guaranteed by the Constitution, American Christians have been free to create their own spiritual pathways. Over time, Luther’s core principles of faith in Christ, the authority of Scripture, and the priesthood of all believers became pillars of American Protestantism—especially of the evangelical variety.

The message from evangelical pulpits is overwhelmingly one of self-reliance, personal responsibility, individual renewal, scriptural authority, and forging a personal relationship with God and Christ. American evangelicalism has further assumed the populist stance of the young Luther. His rebellion was directed at the dominant institution of his day — the Roman Catholic Church. He denounced the ordained clergy, anointed theologians, and university scholars who, appealing to custom and tradition, sought to silence and discredit him. Protestantism, in short, arose as a revolt against the elites, and Luther’s early appeals to the common man and his disdain for the entitled lent the movement a spirit of grassroots empowerment that remains alive to this day. His insurgent nature further implanted in the faith a reflexive adversarialism — a sense of being forever under siege.

Luther’s rebelliousness was, however, paradoxically joined to an opposition to real-world change. While rousing the masses, he refused to endorse measures that would concretely address their needs. This combination of incitement and passivity is apparent in contemporary American evangelicalism, with both its ceaseless agitation against the centers of power and its shunning of any real program to address the underlying sources of resentment and dissatisfaction. In accord with Luther’s doctrine of the two kingdoms, many evangelicals see the proper role of the government to be imposing order, not showing mercy.

Donald Trump has followed this approach. On the one hand, he has played on the conviction of evangelicals that they are an oppressed minority who have been prevented from practicing their religion as they see fit. He has vigorously defended the right of the faithful to say “Merry Christmas,” of pastors to speak freely in their pulpits, of church-run hospitals and health-care organizations to refuse to offer contraceptives. He has also appointed judges committed to those principles (and adamantly opposed to abortion, a key issue for this group). At the same time, Trump has carefully avoided taking on the powerful financiers and magnates who have helped to create the economic system that has inflicted such hardship on his base. Trump’s insults, invective, and mocking tweets against enemies real and perceived seem a long way from the Sermon on the Mount, but they very much mirror the pugnacity, asperity, and inflammatory language of the first Protestant.

Luther throwing an inkpot at the devil

Of course there was more to Luther than his inflammatory speech and inability to compromise. An article like the one above is by its nature selective in what it highlights. Another article might focus instead on Luther’s astonishing courage in standing up to the Catholic church. Of course he knew there’d be a price on his head.

Another bad thing about Luther was his doctrine of sola fide — one is saved by faith alone. That made conduct of lesser importance, since nothing you did could “earn” heaven — or lose it, if you happened to be among the Elect. In practice, having correct faith got translated into belonging to the right church. Sins? They didn’t matter all that much if the person was “one of us,” i.e., a member of the same denomination.

“He who doesn’t give up all that he has cannot be my disciple,” is one of the most extreme pronouncements of Christ (Luke 14:33). Though it’s entirely in line with other teachings condemning wealth, it is generally ignored. Luther, so eloquent in condemning the wealth of the Catholic church, found himself on the side of the rich and against the peasants during the peasants’ uprising. Protestantism eventually fully embraced the idea that wealth is a sign of being among the Elect.

This is a fascinating article — Martin Luther is hardly the figure we think about in connection with the current administration. And yes, certain parallels exist. But Luther was neither cynical nor greedy. A devoted husband and father, he certainly was not a one-person summary of the Seven Deadly Sins.

True, he was prone to wrath, but I think his worst fault, at least in modern eyes, was his literal, fanatic belief. He shifted submission from the Catholic church to the text of the bible — or rather, his own translation and interpretation of that text, opening the way for much sectarian craziness to come — unfortunately with political repercussions.


A better parallel for Trump is Yahweh himself, the polar opposite of Jesus. See my blog from two years ago:

Let me quote a brief comment of mine from that blog:

~ We shrug and say that’s just the archaic tribal mentality — boosting “us” — the chosen people, the greatest country in the world, the exceptional nation among inferior humanity — against the “other.” We say that god is dead — meaning specifically the vengeful, wrathful, nationalist, sexist, narcissistic, petty, jealous, infantile, and altogether distasteful Yahweh. But when this mentality suddenly crops up in the 21st century, there is reason to wake up — and lament. ~


In retrospect, the first person to plant the “mythology” idea in my head was my catechism nun! She was trying to explain the opening of Genesis, the waters below divided from the waters above by a solid firmament (thanks to my father I already knew words like stratosphere and ionosphere), and she said, “This comes from Babylonian mythology.”

I was thunderstruck. Dividing the waters from the waters — Babylonian mythology! This was a nun!! The seeds of my seeing the Judeo-Christian tradition as mythology (one full of borrowings, too, but aren't they all?) were sown by a nun. They'd take seven years to germinate fully.

Still thinking back to the nun, she seemed vaguely embarrassed. She mumbled a bit as she went over this “dividing the waters from the waters.” Maybe she suspected that in the second half of the twentieth century not even children could swallow the biblical creation story. It wouldn’t surprise me if this particular nun later joined the mass exodus of clergy from the Catholic church during the 1970s. She knew too much about the human origins of those stories.


It’s the amount of gold on display in this church in Madrid, belonging to the Convent of the Barefoot Trinitarians, that struck me as overwhelming. I couldn't help thinking, “Inca gold.” Of course it could also be Aztec or from yet another region, but the Inca filling a large room with gold in the hope of providing ransom for their captured king Atahualpa was the first one to come to mind. The gold was taken and Atahualpa executed anyway.

More "Inca gold" (or so I call it, to make it stand for gold looted from the New World).  Imagine if Peru demanded the gold back?


ending on beauty:

The House Lies Silent

In the distance, an occasional whistle from a passing train. There's a lone firefly flitting among the morning glories & a lone man with a thin cigar wandering in the yard. Now & then you might imagine a woman too, one with long delicate fingers & a wan smile who walks with the man & maybe holds his arm. There's no telling what they might say to one another, no, that's their secret & who would want to intrude – a rejected lover – his or hers – one with a grudge who can't abide happiness, one who thinks spite rules our lives? & as the man makes his way to the gate & out the tree-lined street you might think he feels loss & regret & as he makes his way to the corner & into town where a trio is playing something like Autumn Leaves or Moonlight in Vermont, you might think he's come to this place to reflect, a place that tells him time is no longer running far behind but is already running alongside & he must keep up or, for the last time, lose his way.

~ Roger Aplon

Saturday, April 21, 2018


Eagle Dance, Emil Bisttram (American, 1895-1976), 1934



Permitted by the Nazis: 

eight hours of school a week.
The children were to learn
enough German to understand commands,
and to count to one hundred.
Felicja was the village schoolmistress.

She arranged “afternoon handicrafts.”
While the girls stitched,
the boys chipped blocks of wood,
she lectured the forbidden
subject: history.
Above the bent heads of children,
she read poems.
On graduation day,
they covered windows with black cloth
and sang the national anthem.

She did fear arrest.
People noticed her religious care
never to touch,
never to rest her hand
on the small and much-folded pages
of the underground newspaper,
“Poland Lives.”

Thus, if interrogated, she could say
“I have never
held it in my hand,”
and speak the truth,

Jesuitical and absolute:
would raise her eyes and see,
dazed by the bare light bulb,
not the spiders of the swastikas,
the black uniforms, the death’s-heads,
but the commandment

shining above,
taking her in like a daughter.

~ Oriana

I wonder if I’ve written about even ten percent of the family wartime stories. Many are simply too overwhelming to write about, too filled with horror and close escapes (not everyone managed to escape; these were the stories of survivors who learned to recount the more bearable parts). This one was told to me by my mother, who happened to be mostly amused by it. But for me it was a revelation: I knew Felicja mainly as the hostess of an annual garden party and family reunion. She did a lot of home pickling: mushrooms, tomatoes. Until my mother, chuckling, told me the story, I had no idea about this “ordinary” woman’s wartime courage.

The black uniforms and the death’s-heads indicate the SS, notorious for the use of torture during interrogation.

Removing the Nazi Eagle, Berlin, 1945


An eye-opening article contrasts the West’s generosity toward Poland in 1989 with its vengeful attitude toward Russia, reminiscent of the attitude toward Germany after WWI.

~ “As William Faulkner remarked, "The past is never dead. It's not even past." WW1 and the fall of the Wall continue to shape our most urgent realities today. The wars in Syria and Iraq are the legacy of the closure of WW1, and dramatic events in Ukraine are unfolding in the long shadow of 1989.

In 1919, at the end of WW1, the great British economist John Maynard Keynes taught us invaluable and lasting lessons about such hinge moments, how decisions of victors impact the economies of the vanquished, and how missteps by the powerful can set the course of future wars.

With uncanny insight, prescience, and literary flair, Keynes's 1919 The Economic Consequences of the Peace predicted that the cynicism and shortsightedness at the core of the Versailles Treaty, especially the imposition of punitive war reparations on Germany, and the lack of solutions to the roiling financial crises of the debtor countries, would condemn the European economies to continuing crisis, and would in fact invite the rise of another vengeful tyrant in the coming generation.

One morning, in September 1989, I appealed to the US Government for $1bn for Poland's currency stabilization. By evening, the White House confirmed the money. No kidding, an eight-hour turnaround time from request to result. Convincing the White House to support a sharp cancellation of Poland's debts took a bit longer, with high-level negotiations stretching out for about a year, but those too proved to be successful.

The rest, as they say, is history. Poland undertook very strong reform measures, based in part on recommendations that I had helped to design. The US and Europe supported those measures with timely and generous aid. Poland's economy began to restructure and grow, and 15 years later it became a full-fledged member of the European Union.

The story of the end of the Cold War is not only one of Western successes, as in Poland, but also one of great Western failure vis-a-vis Russia. While American and European generosity and the long view prevailed in Poland, American and European actions vis-a-vis post-Soviet Russia were much more like the horrendous blunders of Versailles. And we are paying the consequences to this day.

Where Poland had been granted debt relief, Russia instead faced harsh demands by the US and Europe to keep paying its debts in full. Where Poland had been granted rapid and generous financial aid, Russia received study groups from the IMF but no money. I [the author of the article, Jeffrey Sachs, an eminent economist] begged and beseeched the US to do more. I pleaded the lessons of Poland, but all to no avail. The US government would not budge.

The West had helped Poland financially and diplomatically because Poland would become the Eastern ramparts of an expanding Nato. Poland was the West, and was therefore worthy of help. Russia, by contrast, was viewed by US leaders roughly the same way that Lloyd George and Clemenceau had viewed Germany at Versailles — as a defeated enemy worthy to be crushed, not helped.

"The Cold War ended," said Putin, "but it did not end with the signing of a peace treaty with clear and transparent agreements on respecting existing rules or creating new rules and standards. This created the impression that the so-called 'victors' in the Cold War had decided to pressure events and reshape the world to suit their own needs and interests.”

We live in history. In Ukraine, we face a Russia embittered over the spread of Nato and by US bullying since 1991. In the Middle East, we face the ruins of the Ottoman Empire, destroyed by WW1, and replaced by the cynicism of European colonial rule and US imperial pretensions.

We face, most importantly, choices for our time. Will we use power cynically and to dominate, believing that territory, Nato's long reach, oil reserves, and other booty are the rewards of power? Or will we exercise power responsibly, knowing that generosity and beneficence builds trust, prosperity, and the groundwork for peace? In each generation, the choice must be made anew.” ~ Jeffrey Sachs


John Maynard Keynes “joined the Treasury during WW1, and in the wake of the 1919 Versailles peace treaty, published The Economic Consequences of the Peace, criticizing exorbitant war reparations demanded from Germany, claiming they would harm the country's economy and foster a desire for revenge” (the BBC)

John Maynard Keynes


Indeed "We live in history" and the past is never over. It's not enough, either, to cast a small net — we are still feeling the consequences of the fall of the Ottoman Empire, and the bad decisions of Versailles. It is not enough to consider the last 50 years, or the last century, wisdom and clarity demand the long view, or we will come far short of understanding the forces at work today, and will make more disastrous decisions.

The Crusades, for instance, have neither been forgotten nor forgiven, and the hatred generated by those long ago events is today alive and well in the inheritors of those ravages and defeats.  We needn't even look that far — consider the still active results of our own Civil War, the mythologies still at work regarding that war's causes and results, the long shadow of slavery persisting, the long struggle for civil rights still far from won, the entrenchment of racism in our social and economic structure. This past that is not past at all.

The results of either generosity or vengeance are also evident. From the training of animals, through the raising of children and the treatment of criminals, it has been well documented that generosity, kindness, and reward are infinitely more effective in producing a positive outcome than reprisal, punishment and vengeance.

In fact, punishment reinforces and guarantees continued negative results — misbehavior, recidivism, war. Unfortunately, we have been able to effect changes in childrearing and animal training, but not in our treatment of criminals or national "enemy states." Maybe we will learn, even if it is a long slow haul . . . if not, more wars, more wasted lives, more of the same, but with better technology, better weapons, and greater potential for cataclysmic damage.


So utterly true . . . Funny how animal training has been the greatest showcase for the power of reward as opposed to the typically negative results of punishment. Generosity versus revenge — as individuals, we tend to understand which works better — but, as you point out, when it comes to both criminals and countries, revenge rules (disguised as “justice”). 

“Destiny” is a delusion, but it’s easy to understand why we cling to it. Why we cling to revenge is harder to understand, seeing that we’ve had endless lessons regarding its harm to both sides, and have coined sayings — “Revenge doesn’t pay” — and proverbs — “He who takes revenge digs two graves: one for his enemy, another for himself.” 

When will we ever learn? Eventually, eventually . . . I think. Moral progress is slow and full of setbacks, but the point is that it does happen. It may take a thousand more examples and a hundred more years to show that revenge doesn’t pay while generosity does, but in the end enough people “get it” and change the culture. I suspect that the primary factor is less abusive child rearing. Children raised in a loving way are less likely to become vengeful humans.



In a speech in 1889, Susan B. Anthony noted that women had always been taught that their purpose was to serve men, but “Now, after 40 years of agitation, the idea is beginning to prevail that women were created for themselves, for their own happiness, and for the welfare of the world.” Anthony was sure that women's suffrage would be achieved, but she also feared that people would forget how difficult it was to achieve it, as they were already forgetting the ordeals of the recent past.

Why am I posting this? Because most women I’ve known continue to behave as if their own needs and interests do not count — they are always sacrificing for their families. No, they have not absorbed the message that “women were created for themselves, for their own happiness, and for the welfare of the world.” I realize that this may finally be changing, at least in parts of the world.

(Here I am reminded of yet another family story: a Red Army soldier tried to rape my Aunt Lola, but she managed to resist. He called after her, “If you don’t want to [have sex], then what are you for?”)

“Isn't it enough to see that a garden is beautiful without having to believe that there are fairies at the bottom of it too?” ~ Douglas Adams, writer, environmentalist, atheist. He also created the counter-argument to “fine-tuning”: a sentient puddle who wakes up one morning and thinks, “This is an interesting world I find myself in — an interesting hole I find myself in — fits me rather neatly, doesn't it? In fact it fits me staggeringly well, must have been made to have me in it!”

This is a lynx. The forest suits his needs so well that who can doubt it was created especially for him?


~ “John Bew’s biography of Clement Attlee . . . is a study in actual radical accomplishment with minimal radical afflatus—a story of how real social change can be achieved, providing previously unimaginable benefits to working people, entirely within an embrace of parliamentary principles as absolute and as heroic as any in the annals of democracy.

Attlee was an unprepossessing man. “A modest man with much to be modest about,” Winston Churchill said of him once. Yet what emerges from this biography is a figure fully as admirable in his way—and, in some significant ways, more so—as the much-portrayed Churchill, who, teasing aside, came to admire Attlee as much as Attlee admired him. (Attlee actually fought at Gallipoli during the First World War, following Churchill’s maligned strategic initiative there—one that Attlee, though he saw it fail firsthand, always thought sound and daring, and undermined by its execution.)

After the war, Attlee went to work as what would now be called a community organizer in the London slum of Stepney, which remained his spiritual home for the rest of his life. Bew, a professor of history and foreign policy at King’s College, London, reminds us that Attlee came of age at a time when Marx was seen as only one, and not the most important, of the fathers of the socialist ideal. Attlee, who saw through and rejected the Soviet totalitarian model early, schooled himself on the British alternatives—on the works of William Morris and Edward Bellamy, who dreamed of rebelling against the regimentation that was implicit in the industrialized system rather than of simply switching around the hands that controlled it. William Blake was one of the names that Attlee most often cited. (It was he, as much as anyone, who made Blake’s mystic poem “Jerusalem” the anthem of the Labour Party.)

It was in the darkest days of 1940, though, that Attlee’s heroism and acuity came most to note. Attlee’s Labour Party had entered into a coalition government with Churchill’s Conservative Party when the Second World War broke out. Then, in late May of 1940, when the Conservative grandee Lord Halifax challenged Churchill, insisting that it was still possible to negotiate a deal with Hitler, through the good offices of Mussolini, it was the steadfast anti-Nazism of Attlee and his Labour colleagues that saved the day—a vital truth badly under-dramatized in the current Churchill-centric film, “Darkest Hour,” as it has been in many a history book. (There were many, perhaps even a majority, on the Tory right more interested in preserving the peace and the British Empire than in opposing Hitler.) Had Labour been narrower in outlook, or implicitly pro-Soviet—at a time when Stalin was still tightly allied with Hitler—as were so many on the French left, the history of European civilization would be very different.

Attlee remained Churchill’s chief ally throughout the war, but he was far from a complaisant one. When Churchill and Roosevelt were considering their declaration of the Atlantic Charter, it was Attlee, acting with a celerity and a clarity of purpose that belied his reputation for caution, who insisted on including “freedom from want” as one of its aims, making economic rights and, with them, a decent life for all, one of the official aims of the war. He was a mumbler, but he was no ditherer.

In 1945, he led Labour to a stunning victory over Churchill, not ceasing for a moment in his admiration for his wartime role, nor ceding for a moment to what he perceived as his partner’s reactionary vision. (Churchill had the very bad idea in the campaign of attacking Labour as a quasi-totalitarian party, which everyone knew was nonsense.) The achievements of the first Labour government are still rightly legendary: a government that actually contained as ministers seven men who had begun their adult lives as working coal miners, brought in national health insurance, made the provision of housing central to its ends, and fought and mostly won the battle against unemployment.

Imperfect as its accomplishments were—the virtues of nationalization proved less absolute than the ideologues imagined—it nonetheless empowered the working classes and, Bew writes, “set the ethical terms on which Britain’s new social contract was founded.” It is still a social contract in many ways intact, and was the background for the extraordinary cultural renaissance of working-class Britain in the nineteen-sixties and beyond. The Beatles begin here.

Of course, Attlee, like any leader in a democracy, was far from perfect. He was as baffled about what to do in the Middle East as everyone else, but his eventual insistence on a parliamentary model in an independent India did mean that India, with all its vagaries and ups and downs, emerged into modernity with a stable polity and what are, by historical standards, minimal civil violence, at least since the war of partition that was part of its birth—certainly compared to the massacres and imposed famines of the Chinese experiment.

After reading Bew’s book, one can’t help but think about the number of T-shirts sold here over the years bearing an image of Che (innumerable), compared with those bearing an image of Clem (presumably zero). Yet one was a fanatic who helped make an already desperately violent and impoverished region still more violent and impoverished—and who believed in “hatred as an element of struggle”—and the other a quiet man who helped make a genuine revolution, achieving almost everything that Marx had dreamed of for the British working classes without a single violent civil act intervening.

Attlee’s example reminds us that it is possible to hold to moral absolutes—there was no peace possible with Hitler, and it was better to go down fighting than to try to make one—alongside an appetite for conciliation so abundant as to be more prolific, in William Blake’s positive sense, than merely pragmatic. This might be a good year to start selling T-shirts with a picture of this modest man, and the word “Clem!” upon them.


~ “To Sir Ronald Aylmer Fisher, a 20th century statistician and geneticist who devoted much time to the proper design of experiments, the human tendency to spot spurious connections was a central problem even in research – to be flagged, noted, and avoided. “The ‘one chance in a million’ will undoubtedly occur,” he wrote in his 1935 book, “The Design of Experiments,” “with no less and no more than its appropriate frequency, however surprised we may be that it should occur to us.” With millions upon millions of people currently alive, one chance in a million becomes fairly high fairly quickly.

And yet, while we know this theoretically, while all of us are capable of reading and understanding Fisher’s logic, when it does happen to us, the counter-urge — to see deeper meaning rather than coincidence, to see, as Counter did, the hand of fate — can be overwhelming. We simply cannot accept that things just happen, and that their “just happening” can just happen without any good reason. We especially cannot accept it, as psychologist Ruma Falk pointed out in over a decade of research, when it happens to us, in our own lives — particularly when the coincidence seems to be, somehow, a meaningful one, confirming some large force that we just know exists. True love, fate, karma, whatever we call it: when it happens to others, we are capable of rational skepticism; when it happens to us, wishful thinking often wins out. For isn’t it far more pleasant than the cold rationality of that shudder-inducing word, “statistics”?

Still, in 1989, Persi Diaconis and Frederick Mosteller, Harvard University mathematicians both, formulated a theory of coincidence based on just such chilly math, methodically investigating the question of what “chance” really means. As an example, when we meet a group of people, we can, and inevitably will, experience a number of coincidences: jobs, names, birthdates, hometowns, hobbies, and the like. The chance of such coincidences, it turns out, is remarkably high. For instance, for the famous birthday problem — the chance that two people will share the same birthday — you need a mere 23 individuals for the chance to hit 50-50. If you have a group of 48, your likelihood of success jumps to 95 percent. For a triple-hit, the magic number is still quite low, at 88. Quadruple: 187. And if you want birthdays that are within a day of each other (something many of us would still see as quite the coincidence), all you need is 14 people. Even with as few as seven people, you have a 50-50 chance of a within-a-week match.
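The birthday figure is easy to verify with a few lines of arithmetic. A minimal sketch (the function name is mine; the model assumes 365 equally likely birthdays and independent people):

```python
def birthday_collision_prob(n, days=365):
    """Exact probability that at least two of n people share a birthday."""
    if n > days:
        return 1.0  # pigeonhole: a shared birthday is guaranteed
    # P(all n birthdays distinct) = (days/days) * ((days-1)/days) * ...
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days
    return 1.0 - p_distinct

print(f"23 people: {birthday_collision_prob(23):.1%}")  # just over 50%
```

With n = 23 this returns about 0.507, matching the 50-50 claim; the counterintuitive part is that 23 people form 253 distinct pairs, each a fresh chance at a match.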

Humanity is a large enough number — and even subsets of it, say, the residents of a state or city, are large enough numbers — that almost anything becomes possible, and some things are harder to dismiss than a shared birthday. Diaconis and Mosteller point to a then-recent headline in the New York Times: a “1 in 17 trillion” long shot, a woman who won the New Jersey lottery not once, but twice. It’s a coincidence so seemingly incredible that one almost can’t help but see the hovering hand of fate.
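The double-lottery case illustrates the same point: 1 in 17 trillion is the chance that one particular person wins two particular drawings, but the relevant question is whether somebody, somewhere, ever wins twice. A toy calculation (every figure below is hypothetical, chosen only for illustration, not taken from Diaconis and Mosteller):

```python
# Hypothetical inputs: none of these figures come from the article.
p_win = 1e-7                 # assumed odds of a single ticket winning
past_winners = 1_000         # assumed past winners who keep playing
tickets_each = 2 * 52 * 7    # two tickets a week for seven years

# Probability that no past winner ever wins again, across all their tickets
trials = past_winners * tickets_each
p_nobody_repeats = (1 - p_win) ** trials
print(f"chance some past winner wins again: {1 - p_nobody_repeats:.0%}")
```

With these made-up numbers the chance comes out around 7 percent — modest for one pool of winners over one stretch of years, but summed over every lottery and every decade, a repeat winner somewhere becomes nearly inevitable.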

“With a large enough sample,” Diaconis and Mosteller write, “any outrageous thing is likely to happen.”

Someone will seem telepathic. Someone will win the lottery twice. Someone will find a dream lover online. “In a culture like ours based heavily on determinism and causation, we tend to look for causes, and we ask, ‘What is the synchronous force creating all of these coincidences?’” Diaconis and Mosteller conclude. “We could equally well be looking for the stimuli that are driving so many people to look for the synchronous force. The coincidences are what drive us. And the world’s activity and our labeling of events generates the coincidences.”

Which means that even though all of this rational explanation may make perfect intellectual sense, we struggle to embrace that statistical certainty, even with those wonderfully convincing numbers staring us in the face. No, we yet counter. It’s the golden touch of luck. It’s fate. It couldn’t have happened by chance. It’s kismet. It would seem we are fated to believe in fate — and that’s a faith that will provide endless fuel to the con artists among us.


Now, everything happens due to a cause — or rather, multiple causes working together. Nothing happens for a “reason.” The words cause and reason are often used interchangeably, but there is an important distinction. “Reason” implies some mysterious destiny, whether the divine plan or something along the lines of a New Age understanding of the Universe.

No signs and wonders, no destiny, nothing supernatural — oh how that depresses some people . . .  They like to think that things are fated — perhaps even down to choosing this apple over that apple at the supermarket. I can understand this preference: less responsibility for your actions, less stress.


“When children are asked why, say, lions exist, they prefer teleo-functional explanations — “to go in the zoo.” ~ Jesse Bering, The Belief Instinct


The idea of “intelligent design” will probably be one of the last cognitive errors to go. You don’t have to posit a divine creator. Many of my friends believe in “something out there” and a purpose-ruled universe. Some believe that before conception each person (apparently already pre-existent) chooses some special task that only he or she can perform “on the earthly plane.” Oddly enough, all memory of this task is erased before birth, so all we have is some scattered clues — “Perhaps my love of animals indicates that I was born to be an animal-rights activist.”

It would be simpler to say, “I became an animal-rights activist because I love animals” — but that doesn’t satisfy our longing for a predestined purpose. According to Bering, even Sartre, who famously said that existence precedes essence (essence being some god-given purpose), confided in Simone de Beauvoir that at moments he couldn’t bring himself to believe that his existence was due to chance rather than predestined.

Here is more from Jesse Bering on the subject of “destiny”:

~ “In our heads, not only are we here for a reason, but also we (we, you, the lady next door, the clerk behind the counter, and every single one of the billions of individuals on this planet) are each here for an even subtler shade of the overall purpose.

To see how fantastically odd this highly focused degree of teleo-functional reasoning actually is, imagine yourself on a nice sunny farm. See that horsefly over there, the one hovering about the rump of that Arabian mare? Good. Now compare its unique purpose in life to, say, that other horsefly over there, the one behind the barn, waltzing around the pond algae. And don’t forget about the hundreds of larvae pupating under that damp log — each of which also needs you to assign it a special, unique purpose in life.

It’s hard enough to come up with a teleo-functional purpose for horseflies as a whole, such as saying that horseflies exist to annoy equestrians or to make the rear ends of equines shiver in anticipation of being stung. As Ogden Nash famously penned, “God in His wisdom made the fly / And then forgot to tell us why.” But to suggest that each individual horsefly is here for a special, unique reason — one different from that of every other horsefly that has ever lived or will live — by using our theory of mind to reflect on God’s intention in crafting for each its own destiny, may get us institutionalized.

Yet this is precisely what we do when it comes to reasoning about individual members of our own species; and, curiously, the concept of destiny doesn’t strike most of us as being ridiculous, or conceptually flawed at all.

This doesn’t imply that we are ‘accidents’, because even that term requires a mind, albeit one that created us by mistake. Rather, we simply are.” ~


Re: “destiny,” or seeing “reasons” instead of “causes” in things. Our tendency to think this way, to see agency and intent instead of chance, runs deep, and the temptation to find meaning and purpose where there is none is hard to resist. It is part of our desire to create meaning: to tell stories, to give life a plot, to give experience a meaningful shape. It is part of how we think and how we understand.

However, take this tendency to see agency and reason behind events to an extreme, and you have psychosis — everything has not only meaning, but particular meaning for and about you. There are messages everywhere, signals at every turn, voices in every random noise, all focused on you with sinister intent. There is no comfort here, as there might be at a lesser extreme. Here you are not the hero of the story, but a persecuted victim in a terrifying world.


Taken to the extreme, that’s the apophenia of psychosis: everything has a hidden meaning, and yes, it’s all about the psychotic person and his/her secret “destiny” (e.g. to save the world from Satan or the aliens). But the tendency to see a pattern where there is none is universal. True, we evolved that way because a sound in the brush might mean a predator. Alas, because of our complicated brains, that means that some will come up with conspiracy theories while others watch for signs that the world is coming to an end (and there are always such signs).

But speaking of accidents:

“Somewhere there's a typo more profound than anything ever intentionally written.” ~ Matt Flumerfelt

Oriana: That’s entirely possible. As a writer, I can attest to mistakes I’ve kept because I found them more interesting than what I originally intended.


“There is no revenge so complete as forgiveness.” ~ Josh Billings


~ “When Arthur Miller met Marilyn Monroe, she was crying. Or at least that’s the story he always told her, the one she repeats in footage used in the new documentary Arthur Miller: Writer: “As he describes it, I was crying when he met me.” As he describes it.

They met on the set of the 1951 movie As Young As You Feel. At the time Marilyn was broken up over the recent death of her agent and paramour Johnny Hyde, and she was also casually involved with Miller’s friend Elia Kazan. When he first shook Monroe’s hand that day, Miller later wrote, “the shock of her body’s motion sped through me.” Having watched a few of her takes, he told her he thought she should act on the stage. “People around heard him say it,” Marilyn recalled, “and they laughed.” But she suddenly felt she could tune them out: Here was someone seeing a side of her she had always wanted to be seen, a woman not just with luminous beauty but a potential to become a serious artist when her other powers inevitably diminished. She wrote about their encounter in her diary: “Met a man tonight … It was, bam! It was like running into a tree. You know, like a cool drink when you’ve had a fever.”

Though their fates would soon reverse, in 1951 Arthur Miller was more famous than Marilyn Monroe. He’d just won a Pulitzer Prize for Death of a Salesman and was enjoying a celebrity most writers can only, well, write about; Monroe was still a star on the rise, best known for scene-stealing supporting roles in All About Eve and The Asphalt Jungle. They parted ways for several years—Marilyn weathered a rocky union with Joe DiMaggio, Miller tried to work on his failing marriage with his first wife, Mary Slattery—but eventually they began an affair while Miller was still married. In 1956, Miller established residence in Reno, Nevada, long enough to be granted a divorce—as you did in those days. Not long after, in a no-frills civil ceremony, he and Monroe married.

At a glance, it’s one of the oddest celebrity marriages in 20th-century American history. The press called them “the Hourglass and the Egghead,” and one magazine dubbed their union “the most unlikely marriage since the Owl and the Pussycat.” Even today, after their deaths, their five-year union continues to baffle. “She was a sex symbol and he was an aloof intellectual,” the Daily Mail wrote with characteristic tact in 2008. “Why did Marilyn Monroe marry a misfit?”

It’s easier to understand from Miller’s perspective: What hot-blooded heterosexual American man of the 1950s wouldn’t have married Marilyn Monroe? But the more you know about Monroe—her brooding, contemplative nature; her often-fetishized love of reading—the more her attraction to Miller starts to make a poignant kind of sense. He saw not only her artistic potential, but a kind of brokenness about her that most men found convenient to ignore. In the documentary, an elderly Miller recalls something he said to Marilyn many years before their marriage: “I said, ‘You’re the saddest girl I’ve ever met.’ A smile touched her lips as she discovered the compliment I had intended. ‘You’re the only one who ever said that to me.’”


Karina Longworth points out that Monroe had endometriosis and that she was first prescribed the pills that would eventually kill her to manage severe menstrual pain. She acknowledges the history of molestation that many of Monroe’s early biographers cruelly doubted and makes a stunning observation about the dark side of Monroe’s charismatic sexuality: “At that point, nobody was able to see that so many of the things that made Marilyn ‘Marilyn’—the actual or implied easiness, the childlike voice and perspective, the lifelong search for male protectors—all of these things were, in fact, textbook long-term symptoms of child abuse.”


“What makes you so sad?” Clark Gable asks from beneath the brim of a cowboy hat. “I think you’re the saddest girl I ever met.”

“You’re the first man that’s ever said that,” a morose Monroe says in this scene from The Misfits, the final film both she and Gable would ever make. Like so many things in the movies, it’s a comforting lie. The actual first man that ever said that to her, Arthur Miller, wrote the script.

“I just thought it would be a terrific gift for her,” he says in Arthur Miller: Writer, “because she’d never had a part in which she was supposed to be taken seriously. And she really wanted to do that.” For reasons beyond just its melancholy script, The Misfits has got to be one of the saddest Hollywood movies ever made: Its three leads, Monroe, Gable, and Montgomery Clift, would all be dead within years of its release, each from their respective physical failures to live up to the impossibilities of their screen personas. Clift committed what has famously been called “the longest suicide in Hollywood history” by drinking insatiably, partially because of the pressure of hiding his romantic relationships with men; his face in The Misfits isn’t quite as expressive as it had been earlier in his career, since it had been disfigured in a 1956 car accident. Fifty-nine-year-old Gable had a fatal heart attack just days after The Misfits wrapped, and some blame his macho insistence on doing his own stunts in the film, especially during a harrowing sequence that involved roping wild mustangs.

But there’s something particularly poignant about Monroe’s performance in The Misfits: Here is (at least in Miller’s estimation) the kind of dramatic role she always wanted, and yet she was too dependent on pills and booze at this point to pull it off with confidence. She was chronically late to set, delayed shooting by endlessly running lines with her acting coach Paula Strasberg, and forced the production into a two-week hiatus when she had to go to rehab (although some contest that the cause of this delay was director John Huston’s out-of-control gambling debts; Marilyn was always an easy scapegoat). Miller’s only film script that was actually produced, The Misfits is a fascinating pop-culture time capsule. It’s an elegy not just to Miller and Monroe’s marriage (they split up during production), but to Monroe herself. She died the year after it was released and never completed another film.

Monroe was devastated when she came across some notes Miller was taking about their own relationship while writing The Misfits, as well as a journal entry of Miller’s in which he confessed to being “disappointed” with his wife and “embarrassed” by her in front of his intellectual friends. “I guess I have always been deeply terrified to really be someone’s wife,” she wrote in her own diary around that time, “since I know from life one cannot love another, ever, really … starting tomorrow I will take care of myself for that’s all I really have and as I see it now have ever had.”

There’s something endearing, revealing, and ultimately tragic about the fact that, at the height of her powers, Marilyn Monroe married a Pulitzer Prize–winning playwright and the world still refused to take her seriously.” ~

Marilyn Monroe and Clark Gable in The Misfits

“You have to be very fond of men. Very, very fond. You have to be very fond of them to love them. Otherwise they're simply unbearable.” ~ Marguerite Duras 

But I suppose men say the same thing about women.

Somerset Maugham at an official dinner, New York, 1941; John Phillips


~ “It turns out that countries with lots of immigration have historically relied more on nonverbal communication. Thus, people there might smile more.

For a study published in 2015, an international group of researchers looked at the number of “source countries” that have fed into various nations since the year 1500. Places like Canada and the United States are very diverse, with 63 and 83 source countries, respectively, while countries like China and Zimbabwe are fairly homogenous, with just a few nationalities represented in their populations.

After polling people from 32 countries to learn how much they felt various feelings should be expressed openly, the authors found that emotional expressiveness was correlated with diversity. In other words, when there are a lot of immigrants around, you might have to smile more to build trust and cooperation, since you don’t all speak the same language.

People in the more diverse countries also smiled for a different reason than the people in the more homogeneous nations. In the countries with more immigrants, people smiled in order to bond socially. Compared to the less-diverse nations, they were more likely to say smiles were a sign someone “wants to be a close friend of yours.” But in the countries that are more uniform, people were more likely to smile to show they were superior to one another. That might be, the authors speculate, because countries without significant influxes of outsiders tend to be more hierarchical, and nonverbal communication helps maintain these delicate power structures.

So Americans smile a lot because our Swedish forefathers wanted to befriend their Italian neighbors, but they couldn’t figure out how to pronounce buongiorno. Seems plausible. But there’s also something very  w i d e  about the classic American grin. Why is it that Americans smile with such fervor?

This could be because Americans value high-energy, happy feelings more than some other countries. For a study published last year, researchers compared the official photos of American and Chinese business and government leaders. After coding them according to their levels of “facial muscle movement,” they found that American leaders in all contexts were both more likely to smile and showed more “excited” smiles than the Chinese leaders did.

Later, they asked college students from 10 different countries how often they would ideally like to experience certain emotions—from happiness to calmness to hostility—in a given week.

Then, they looked at photos of legislators from those 10 countries. They found that the more a country’s college students valued happy, high-energy emotions, like excitement and enthusiasm, the more excited-looking the government officials looked in their photos. (The correlation held after controlling for economic indicators like GDP.) Interestingly, the amount that people in those countries actually felt happy didn’t matter. The leaders’ excitement appeared to reflect the ideal emotional states of their constituents, not their actual ones.

Like so many other daily practices, in other words, the American smile is a product of our culture. And it can be similarly difficult to export.


“You can hold yourself back from the sufferings of the world, that is something you are free to do and it accords with your nature, but perhaps this very holding back is the one suffering you could avoid.” ~ Franz Kafka
Audrey Hepburn 1959; Richard Avedon

“If you seek tranquility, do less.” ~ Marcus Aurelius, Meditations


“Since the rise of the novel to be our most popular literary form, we seem to have taken secular humanism for granted. Jane Austen’s characters are all of them Anglicans; but the world they inhabit has already become completely secular.” ~ Don Cupitt, The Sea of Faith

Jane Austen's writing table


~ “British fine wine, not so long ago an oxymoron, is now a thing. Coffee farmers in Indonesia, Ethiopia and Peru are venturing uphill. Across the Atlantic and the North Sea, U.K. trawlers see less cod and haddock for the nation’s fish and chips, and more squid and anchovies. The nation is importing its cod from Iceland, China and Norway.

“The very cold-water fish that our grandparents used to catch have moved further north, which means that we now import most of the fish that we eat,” said Dr. Stephen Simpson, an associate professor in marine biology and global change at Britain’s University of Exeter. “When we go on holiday in Spain, we often eat the U.K. fish.”

It’s not gloom for everyone, with mostly colder northern areas benefiting so far.

“The areas where foods are grown the most efficiently are shifting,” said Jason Clay, a senior vice president at the World Wildlife Fund, who has more than four decades of expertise on farming and fishing issues. The U.S. corn belt stretching from Ohio to the Dakotas is edging toward the border with Canada, which is already growing more crops than it used to in some parts of the country, he said.

Russia is enjoying bumper harvests of wheat, the world’s most widely grown crop, partly as record temperatures boost yields. That’s adding to the global glut of grains, pushing down prices. In the U.S., North Dakota now has a longer growing season, while some California farmers are planting coffee.

Off the coast of Maine, lobstermen have been catching more of the delicacy than ever before. While further temperature increases may go too far and erode lobster populations in coming decades, for now crustaceans are still breeding in great profusion.

English sparkling wine is winning international awards as the climate in some areas of the country begins to resemble France’s Champagne region, while Poland is growing chardonnay and finicky pinot noir varieties.

But for many, the changes are bad news.

Warmer temperatures are encouraging pests and fungus to develop. Growers in the U.S. and Canada have suffered increased levels of poisonous mycotoxins from fungi in their crops because of drought and humidity. Coffee farmers face rising threats from pests including berry-borer beetles, while disease epidemics such as leaf rust have hit Central America and, farther south, Colombia.

Extreme weather events from floods to droughts have taken their toll. In France, fickle weather has been a disaster for the vineyards of Bordeaux, with spring frosts damaging vines, and summer storms leading to grape rot in Champagne. The country’s production of wine overall hasn’t been this low in 60 years.

In California, wine country was ravaged by wildfires last year. Droughts swept across Africa, demolishing corn harvests from Ethiopia to South Africa two years ago. Brazil, the top coffee grower, has also been battling drought in the past few years that curbed crops. Researchers warn that the suitable area for the beans will shrink as temperatures rise.

“When extreme events occur, you’re in trouble,” said Lorenzo Giovanni Bellu at the United Nations’ Food and Agriculture Organization. “For sure, climatologists see increasing occurrence of extreme events, which is the worst for agriculture.”

Less immediately catastrophic is the effect on quality and flavor.

Arabica coffee beans, favored by cafe baristas, are the most sensitive to shifts in rainfall and temperature. Trees are usually grown at high altitude, where cooler temperatures allow the fruit to ripen slowly and develop more complex flavors of acidity and sweetness.

“When temperatures rise, as has slowly been happening in many coffee producing countries for years, the warmth causes the coffee to ripen too quickly, which means less flavorful beans,” said Jamal Gawi, a climate-change consultant in Jakarta. Java coffee is among those affected, he said.

For wheat, while some regions have benefited from larger harvests, parts of Europe and the U.S. have recently seen reduced protein in their grain (important for keeping bread airy) thanks to sudden downpours.

Even rising carbon dioxide that helps plants grow can flush out essential nutrients such as zinc and iron.

Whether through crop failures or price impact, changes in climate have serious implications for nations concentrated in equatorial and tropical regions, whose economies and people rely on agriculture more than others.

Natural disasters have cost farmers in poorer countries billions of dollars a year in lost crops and livestock, and it’s getting worse thanks to climate change. Many countries in sub-Saharan Africa are dependent on single crops—Ethiopia relies on coffee for a third of its export earnings and Malawi gets about half from tobacco.

Nations reliant on food imports, many also in the Middle East and Africa, are vulnerable to supply upsets thousands of miles away that ripple through global markets to push up the cost of household staples. Drought in the biggest growers, from the U.S. and Russia to Brazil, can have dramatic effects on international prices and in some cases threaten political and social unrest among exposed populations. As Europe is discovering, such desperate people can’t be contained by borders.

“There will be some winners, but I think there are going to be far more losers and many of them, if not most, are going to be in the tropics,” said Clay at the WWF. “The bigger issue is that everybody is going to have to adjust, and the question is how fast.” ~

~ “We know that CBD (the anti-inflammatory compound in marijuana that does not produce a “buzz”) binds to receptors in the brain but not on neurons. It binds to receptors on something called MICROGLIAL CELLS which are the cells that wrap around neurons and are responsible for some of the neuron's structure, holding them together. But they also have an immune function. They're sort of THE BRAIN'S IMMUNE CELLS. ... CBD also binds to cells in the immune system, so CBD receptors are fairly common in lymph nodes and also in areas of the body where there's a lot of immune activity like the [gastrointestinal] tract.” ~


It follows that cannabidiol (CBD) might help prevent brain diseases — which are just beginning to be recognized as autoimmune. But establishing an effective dosage range will take time.

The new trend is microdosing — under 10 milligrams of THC and CBD (some edibles offer a mix). The purpose is medicinal, not recreational. And older brains in particular may benefit from microdosing.


[Karen] Parker also makes sure to regularly step outside her lab and glean insights from parents of children with autism. “I’ve had a lot of parents come up and tell me things like, ‘I got a dog and my kid has dramatically improved,’ or ‘my kid has done equine therapy and they’ve dramatically improved.’” She knows, of course, that these are anecdotal observations, but she uses them to seed new studies. Perhaps, she muses, playing with a dog or interacting with a horse might cause autistic kids to produce more oxytocin or vasopressin.

Revisiting her work on hormones that influence social functioning, Parker began to investigate whether oxytocin played a role in autism. She discovered that on average, people with and without autism had the same blood oxytocin levels. But in the subset of people with autism and very low oxytocin levels, administering oxytocin improved their social functioning. She’s now looking at vasopressin, a hormone similar to oxytocin but one that’s more important to male social functioning (boys are nearly five times more likely than girls to have autism).

An associate professor of psychiatry and behavioral sciences and the director of Stanford’s Social Neurosciences Research Program, Parker published findings last year that indicate that autistic children with low levels of blood oxytocin show improved social functioning when they receive additional oxytocin in the form of a medication. The study was limited to just 32 participants, but her conclusions suggest that children with low oxytocin levels stand to benefit the most from oxytocin treatment, and point the way toward a more personalized approach to treating social deficits in children with autism.

ending on beauty:

A girl sleeps as if
she were in someone’s dream;
a woman sleeps as if
tomorrow a war will begin;
an old woman sleeps as if
it were enough to feign being dead,
hoping death will pass her by
on the far outskirts of sleep.

~ Vera Pavlova, tr Steven Seymour

Picasso: Sleeping Woman, 1931