Saturday, November 18, 2017


Methuselah tree: a 4845-year-old Great Basin bristlecone pine (Pinus longaeva) tree growing high in the White Mountains of Inyo County in eastern California. I have visited the White Mountains and got to see Methuselah and its ancient siblings, amazing sculptures carved by wind and scarcity of water and nutrients.


For a long time we have been in this car,
His hands on the wheel, the sun
Finishing behind the building

And a couple walks by, tucked into one coat
As if against a wind. I am not sure
If he has seen them, but he goes on

This talk of his and I do not watch
His body anymore, the light being
What it is, already going. He repeats

He wants to do this alone
But will do everything they tell him
And will do nothing more than that

While now and then traffic comes
From up behind then veers around
As we sit into dark, streetlight not yet

Started, his head against the window
Like a bird, waiting; we have been at this
For hours as the light changes, trying to love

What has not yet been written and then
We are still here when the couple turns back;
They were only walking around the block

Or maybe they return because the snow
Has begun and from this sudden world
The unbroken comes, and nothing is wasted.

~ Sophie Cabot Black, “The Exchange”


Why have I chosen this poem just before Thanksgiving? Because it praises life. Bear with me.

Other poems in the book make clear that the speaker’s partner is dying of cancer. The poems are a long death watch — though the surprise here is a sprinkling of poems about Abraham and Isaac, with the absence of “the animal that was supposed to save us” and other twists. Still, the main story is the long dying of a man still young, just settling into his career, the couple buying their first house. And then comes the diagnosis, and the whole world changes as the husband and wife (whether they are officially married doesn’t matter) begin to see what will no longer happen:

The meadow you meant to walk all year;
That part of the woods you’ve never been. 


Note how in this poem the setting of the sun and the growing dusk become symbolic, almost too conveniently so. When the speaker says

I do not watch
His body anymore, the light being
What it is, already going

we realize that it’s not so much the literal waning of the light as the waning of the body that is meant here; it’s painful to look at someone in the final stage of cancer. They tend to be emaciated like the victims of concentration camps; suddenly, as if overnight, they also look extremely aged. But Sophie Cabot Black is no Sharon Olds, who tends to wallow in physical detail (note all the critics who accuse Olds of “oversharing”). Black spares us the physical description of terminal cancer; when I mention “emaciated” and “suddenly extremely aged,” I speak on the basis of eyewitness experience. But the author of this poem collection about the dying of her partner avoids the physicality of disease. The reader is left to imagine the dying man as relatively young and probably attractive — healthy and active before the diagnosis.

Another interesting and unexpected detail here is the couple walking by, “tucked into one coat.” This is something somewhat childlike and “fun” that new couples may do, usually very young couples, at the beginning of their relationship. The shared coat symbolizes their unity and their sharing of whatever they have, their mutual nurturing. This is a brilliant detail — one relationship is being born while another is ending, not through anyone’s fault or lack of courage or kindness (“He wants to do this alone” implies that he doesn’t want to increase anyone’s suffering by having them watch his actual last moments; that's his last gift to others).

The poem ends on the continuity of life: the young lovers tucked into one coat return, and it begins to snow, making the world look “unbroken.” 

Note that here snow doesn’t function as a shroud. But snow has served as a shroud so often in literature that the metaphor is probably already in the reader’s consciousness. The speaker sets up the opposite: snow as unbrokenness, as the continuity of life. But the most potent symbol of that continuity is the couple tucked into one coat.

Can we believe that “nothing is wasted”? We want to. At the very least, we imagine that not everything is wasted. Some important traces and memories will remain for a while. But maybe that is not all that important. What is important is that life will indeed go on, with others continuing to fall in love, so love will go on. As Borges observed, others will be our immortality.  

Pierre Paulus de Châtelet (Belgian): November in Auderghem, 1905

“And reincarnation? Really? If that were real, wouldn’t there be some proof by now? A raccoon spelling out in acorns, ‘My name is Herb Zoller and I’m an accountant.’” ~ Bill Maher


A wild raccoon spelling out ANYTHING would make me reconsider my atheism.

I'm also reminded of a Kabbalist rabbi who said that because of the population increase we now have only one-eighth of a soul, compared to the good old days.


~ “For many patients with terminal diseases, Coyle has observed, this awareness [of imminent death] precipitates a personal crisis. Researchers have given it other names: the crisis of knowledge of death; an existential turning point, or existential plight; ego chill. It usually happens as it did with my mother, close to when doctors break the news. Doctors focus on events in the body: You have an incurable disease; your heart has weakened; your lungs are giving out. But the immediate effect is psychological. Gary Rodin, a palliative-care specialist who was trained in both internal medicine and psychiatry, calls this the “first trauma”: the emotional and social effects of the disease.

The roots of this trauma may be, in part, cultural. Most people recognize at an intellectual level that death is inevitable, says Virginia Lee, a psychologist who works with cancer patients. But “at least in Western culture, we think we’re going to live forever.” Lee’s advanced-cancer patients often tell her they had thought of death as something that happened to other people—until they received their diagnosis. “I’ve heard from cancer patients that your life changes instantly, the moment the doctor or the oncologist says it’s confirmed that it is cancer,” she says.

The shock of confronting your own mortality need not happen at that instant, Coyle notes. Maybe you look at yourself in the mirror and suddenly realize how skinny you are, or notice your clothes no longer fit well. “It’s not necessarily verbal; it’s not necessarily what other people are telling you,” Coyle says. “Your soul may be telling you, or other people’s eyes may be telling you.”

E. Mansell Pattison, one of the early psychiatrists to write about the emotions and reactions of dying people, explains in The Experience of Dying why this realization marks a radical change in how people think about themselves: “All of us live with the potential for death at any moment. All of us project ahead a trajectory of our life. That is, we anticipate a certain life span within which we arrange our activities and plan our lives. And then abruptly we may be confronted with a crisis ... Whether by illness or accident, our potential trajectory is suddenly changed.”

In this crisis, some people feel depression or despair or anger, or all three. They grieve. They grapple with a loss of meaning. A person’s whole belief system may be called into question because “virtually every aspect of their life will be threatened by changes imposed by the [disease] and its management,” Lee has written. In a small 2011 Danish study, patients with an incurable esophageal cancer reported that after their diagnosis, their lives seemed to spin out of control. Some wondered why they had received a fatal diagnosis, and fell into despair and hopelessness. “I didn’t care about anything,” one patient said. “I had just about given up.”

In the 1970s, two Harvard researchers, Avery Weisman and J. William Worden, did a foundational study on this existential plight. Newly diagnosed cancer patients who had a prognosis of at least three months were interviewed at several different points. At first, for almost all the patients in the study, existential concerns were more important than dealing with the physical impacts of disease. The researchers found that the reckoning was jarring, but still relatively brief and uncomplicated, lasting about two to three months. For a few patients, the crisis triggered or created lasting psychological problems. A few others seemed to face the crisis, then return to a state of denial, and then double back to the crisis—perhaps more than once. In the study, the researchers describe a patient who was told her diagnosis, only to report to interviewers that she didn’t know what it was—and then make it clear she wasn’t interested in receiving a diagnosis in the near future.

Palliative-care doctors used to think that a patient was either in a state of denial or a state of acceptance, period, Rodin says. But now he and his colleagues believe people are more likely to move back and forth. “You have to live with awareness of dying, and at the same time balance it against staying engaged in life,” he says. “It’s being able to hold that duality—which we call double awareness—that we think is a fundamental task.”

Whether or not people are able to find that balance, the existential crisis doesn’t last; patients can’t remain long in a state of acute anxiety. Coyle has found in her work that later peaks of distress are not usually as severe as that first wave. “Once you’ve faced [death] like that once, it’s not new knowledge in your consciousness anymore,” she says.

For most, figuring out how to adapt to living with a life-threatening disease is a difficult but necessary cognitive process, according to Lee. When patients do emerge on the other side of the existential crisis, she finds that many are better off because of it. These patients are more likely to have a deeper compassion for others and a greater appreciation for the life that remains.

To arrive there, they have to squarely face the fact that they’re going to die. “If you’re an avoidant person, and you don’t like to think about these things, that works better when life is going well,” Rodin says. “It just doesn’t work well in this situation because reality doesn’t allow it. It’s like trying to pretend you don’t need an umbrella or something, or it’s not raining, when it’s pouring. You can do that when it’s drizzling, but eventually, you have to live with the rain.”


I thought this would be an interesting follow-up to Sophie Cabot Black’s poem. After the shock and the crisis, people adjust to the thought of death and re-engage with what life remains.

For me Christopher Hitchens remains a model of how to die: he kept writing up to the very end. In spite of the pain and the horrible intrusion of chemotherapy and other torturous medical procedures, he kept on working, contributing. 

Oriana: In the long run, nothing matters. But we don't belong to the “long run,” much less to the “cosmic perspective.” We belong to the moment. And then it matters how we live and how we die. It matters because we are not isolated individuals: we touch the lives of others.

Aside from continuing productivity, even in the face of death there is also a desire for continued deep connection:


Jacobsen’s apprehension of his own mortality would manifest itself in perhaps his greatest work, the novel Niels Lyhne (originally titled The Atheist), which Jensen calls “the most death-haunted novel in European literature.” In its bizarre alloy of detached detail and dreamy, horrific awe, it is a novel “in which the strands of both realism and modernism are greedily imbricated.” Niels, the titular protagonist, loses his faith at the age of 12 following the death of a beloved aunt. Over the course of the novel he runs a harrowing existential gauntlet, accruing a series of other terrible losses: his friend, his wife, even his young child. At his son’s deathbed, Niels breaks down and prays to the God he has rejected; when the boy dies, Niels is left with his failure of spirit and the understanding, as Kierkegaard wrote, of “the agonizing self-contradiction of not being able to do without a confidant and not being able to have one.” It is an uncannily full and nuanced account of atheism, “not simply as an idea,” Jensen says, “but as a living, fluctuating belief.” The paralysis Niels feels at the novel’s end is the apostate’s natural condition, one Jacobsen knew intimately: that of an unwilling conversant with a deposed divinity.


Sometimes I do feel that yearning for a very wise, non-judgmental, empathetic person with whom I could discuss what my life has been about. And no, I wouldn't want to pay a therapist for listening. A loving friend. An all-understanding god, god as the supreme confidant? Sure, that’s a wonderful fantasy — a Protestant one, someone recently pointed out to me, since it’s Protestantism that emphasizes having a personal relationship with the deity.

But for me it's far from an all-consuming longing. What we do with whatever time is left is more important. And besides, as a writer I can always confide in writing — though I'm aware of the limitations and inevitable distortions. But any use of words, even talking to a loving friend, can't avoid limitations and distortions.

God as a lover, then? The typical god of the mystics? Just how explicitly erotic are we willing to get?


Thirty years ago, the art scholar Leo Steinberg published “The Sexuality of Christ in Renaissance Art and in Modern Oblivion,” a book that does much to explain the connection between Pope Francis’s passionate devotion to the poor and afflicted and his seeming openness to gay Catholics. In “The Sexuality of Christ,” Steinberg argues that as a result of the rise of the Franciscan order, around 1260, an emphasis on Christ’s nakedness, and, thus, on his humanity, joined compassion to an acceptance of the role of sexuality in human life.

A credo of the Franciscan order was nudus nudum Christum sequi (“follow naked the naked Christ”). It was a radical call to cast aside worldly wealth and belongings and acknowledge the fragile, fallen nature of all men and women. But in casting aside Christ’s garments, the Franciscans made Christ’s nude body a focal point. As a result, according to Steinberg, from about the middle of the thirteenth century until the sixteenth century artists lavished particular care on Christ’s penis, the part of Christ’s body that made him most mortal, and which proved his union with humankind. “One must recognize,” wrote Steinberg, “an ostentatio genitalium comparable to the canonic ostentatio vulnerum, the showing forth of the wounds.”

Trinity by Lucas Cranach; when I posted the painting in my blog a week ago, I certainly noticed the upward slant of the loincloth, but dared not think the obvious.

“The Sexuality of Christ” has changed the way we look at certain works of art. The “modern oblivion” of Steinberg’s subtitle was just that: centuries during which the central fact of Christ’s phallus in hundreds of Renaissance paintings was overlooked, denied, and, sometimes, bowdlerized. Steinberg adduces several examples of Christ’s genitalia being painted over or touched up to make them look like a mere blur. In one case, probably in the mid- to late nineteenth century, the Alinari brothers, famous for their photographic reproductions of paintings, blackened out the Christ child’s penis in their photograph of a fifteenth-century “Madonna and Child” by Giovanni Bellini. Such censorship, Steinberg believes, was meant as distraction from an uncomfortable theological premise: “A disturbing connection of godhead with sexuality.”

To bring to the surface this suppressed artistic trend, Steinberg reproduced dozens of paintings and drawings in which Christ’s genitalia are indisputably a central thematic concern. There are paintings of the Christ child touching his penis, and of the Virgin handling the infant Christ’s penis. In some pictures, the Christ child exhibits his genitals in a style similar to Venus displaying her sex. “Again and again,” Steinberg writes, “we see the young God-man parading his nakedness, or even flaunting his sex in ways normally reserved for female enticements.”

Many representations of the Three Magi show one of the foreign kings closely inspecting the infant Christ’s genitalia. Depictions of Christ on the cross and of the dead Christ lying in the Virgin’s arms clearly portray Christ with an erection. In some images, which Steinberg calls “psychologically troubling,” the divine Father touches his Son’s penis, “a conciliation,” Steinberg writes, “which stands for the atonement, the being-at-one, of man and God. For this atonement, on which hinges the Christian hope of salvation, Northern Renaissance art found the painfully intimate metaphor of the Father’s hand on the groin of the Son, breaching a universal taboo as the fittest symbol of reconcilement.”

“The Sexuality of Christ” takes up, to put it mildly, an ultra-sensitive subject. For that reason, Steinberg stresses that Renaissance artists’ emphasis on Christ’s penis is an esthetic choice guided by deep religious belief, though he occasionally hints that Renaissance artists could at the same time have been having sly fun with the subject. And it is hard to believe that in, say, quattrocento Florence, an epoch so liberated in its sexual mores—Fra Filippo Lippi, for example, lived openly with a defrocked nun, whom he used as a model for his Madonnas—artists could resist being simultaneously worldly and pious.

For Steinberg, however, theological motives were preëminent. He held that artists used the evidence of Christ’s genitals to prove that Christ submitted to becoming human before returning to the godhead. The revelation of his penis demonstrates, as Steinberg puts it, Christ’s “humanation,” that moment of incarnation which proved Christ’s love for humankind. And the many representations of the Christ child’s circumcision are important as foretellings of his crucifixion—the blood of Christ’s penis is fulfilled in the blood from Christ’s wounds.

Entering with obvious relish the realm of Christian sexual hermeneutics, Steinberg relies on St. Augustine, who emphasized his surrender to and then escape from the “fleshpots of Carthage,” to argue that Christ’s erection was a singular way to demonstrate Christ’s chastity. Without the capacity to yield to lust, Christ’s triumph over carnal desire would have no human meaning. Unlike men after the fall of Adam, who fell victim to lust, Christ willed his erection; it was not an involuntary physiological event. By both willing and resisting it, he declared his victory over the stain of sin bequeathed to humanity by Adam and Eve, and over the death that their carnal weakness brought into the world. That, after all, is the significance of the resurrection.

To drive this point home, Steinberg had to prove that during the late Middle Ages and the Renaissance the word “resurrection” could be used as a double entendre, connoting both the divine event and the humble mortal fact of an erection. Steinberg quotes from one of Boccaccio’s fourteenth-century tales in the Decameron, in which a pious young girl inflames the desire of a monk named Rustico, causing in the latter a “resurrection of the flesh.” Steinberg notes that it was not until modern times that the original phrase was accurately translated from Italian, a censoring that he sees as analogous to the later bowdlerization of Christ’s penis in Renaissance paintings.

The vulnerable component of Steinberg’s perspective was that it was almost entirely speculative. Steinberg does quote from some sermons of the time in support of his argument concerning the centrality of the circumcision, but he builds his case mostly on logic and on physical evidence. Christ’s penis is a prominent element of countless paintings in the Renaissance. That is undeniable, and a theological explanation is the only one that made sense to him.

Michelangelo: Cristo della Minerva, in the church of Santa Maria sopra Minerva, 1519-21 (the bronze loincloth was added during the Baroque period)

The skeptical response to Steinberg’s thesis was that the attention paid to Christ’s penis was merely the consequence of Renaissance naturalism. Steinberg had a convincing set of rejoinders: No children in actual life have been known to receive powerful kings shortly after their birth while smiling benignly and proudly displaying their genitals. It is not a medical fact that dying men experience an erection in the moment of their decease. And even if the emphasis on Christ’s penis in Renaissance painting were the product of fidelity to real life, Christ was no ordinary man.

The most cogent criticism of Steinberg’s book came from Caroline Bynum, a feminist scholar. Bynum pointed out that in medieval texts Christ was often portrayed in feminine terms, and she gave as evidence paintings in which a feminized Christ appears. Steinberg conceded that Christ was sometimes portrayed as both male and female—“In one category of metaphors, the wound [in his side] is said to lactate and give birth”—but responded that this did not diminish the universal resonance of phallic imagery, nor did it lessen the impact of the other paintings he offered as evidence. Steinberg’s and Bynum’s arguments do not appear to be mutually exclusive. An androgynous Christ with a highly symbolic phallus does not seem out of the question.

Steinberg also argued that the artists he was using as examples were not illustrating preëxisting texts. They were confronted by the entirely new artistic problem, made possible by the Franciscan emphasis on Christ’s nakedness, of how to portray Christ’s naked body. In response, they created their own theology, embedded in their representations of Christ. “Renaissance art,” wrote Steinberg, “harnessed the theological impulse and developed the requisite stylistic means to attest the utter carnality of God’s humanation in Christ.” Byzantine art had to prove the divinity of Christ in the face of schisms and iconoclasms; Byzantine artists had no special use for Christ’s naked body. But the more confidently situated Catholic artists of the Renaissance celebrated Christ’s carnal humanity.

Particularly striking now is the original book’s postscript, written by a Jesuit scholar named John W. O’Malley. In the course of defending Steinberg’s thesis, O’Malley writes that the “ ‘Renaissance theology’ ” of Christ’s penis that was put forward by the artists Steinberg discusses “was severely damaged, perhaps in large part destroyed, by the bitter controversies sparked by the Reformation and Counter-Reformation.”

The Jesuit O’Malley is talking about a time when Catholicism was under such siege that the freedom of embodying Christ’s love for humanity in his naked body, a freedom fueled by Franciscan piety, vanished, giving way to polemics and proselytizing. As James Carroll vividly demonstrates in his Profile of Jorge Mario Bergoglio, it is this very lapse into militancy that the present Jesuit Pope, inspired by Franciscan piety, is determined to correct. Pope Francis could well agree with Steinberg, who lamented that the human Christ disappeared “as modern Christianity distanced itself from its mythic roots; as the person of Jesus was refined into all doctrine and message, the kerygma of a Christianity without Christ.” That, Steinberg says, was when “the exposure of Christ’s genitalia became merely impudent.”

One might add that in our own epoch the Catholic Church’s denial of Christ’s sexuality runs parallel to its denial of human sexuality, taboos that resurface in one scandal after another.

In modern times, the Catholic Church has been under siege to an unprecedented degree, as much by internal rifts and abuses as by unbelief and competing Protestant sects. In response, its doctrine and its message have become all the more abstract and inflexible; all the more a Christianity without Christ. The current Pope, by heeding the call to “follow naked the naked Christ,” seems determined to make inseparable the alliance between the naked body that lives, works, suffers, and dies, and the naked body that was created with the capacity to experience physical love. If this is so, then Pope Francis has an ally in Leo Steinberg, the displaced Russian Jew whose modernist, heretical instincts led him to the grave, beautiful, profound, and, at times, playful depiction of Christ’s sexuality.


I have long been interested in the bodily aspect of divinity. Greek gods certainly had bodies and sexuality; Yahweh did have a body, at least in the earlier books, though his sexuality remains unclear (except to the Kabbalists). But the insistence that Jesus was fully human leaves us little choice but to take some stance on his sexuality.

As for Christ’s sexuality, it’s strange and incomplete at best: sexuality without sex. It’s telling that Jesus is shown without a partner — except for those who draw an unsurprising conclusion from the phrase “John, the Beloved Disciple,” or those who cling to the idea that Jesus was married to Mary Magdalene. In the main, there is no denying that the four gospels avoid the subject of Jesus’ sexuality, and the general impression was that Jesus lived and died a virgin. 

Insofar as Jesus is regarded as the supreme role model to be imitated, this is not a feasible model, but one bound to produce pathology. But the pathology goes back to Yahweh, a god without a mate, an angry father for whom no loving arms wait wherever “home” might be.

Somewhat on a tangent: Someone on Facebook argued that Jesus’ DNA was identical with Mary’s. That would make Jesus Mary’s clone, and necessarily female, since the Y chromosome comes only from the father. If Yahweh was the biological father of Jesus, then he somehow produced sperm carrying the penis-producing Y chromosome. The most logical solution is to suggest that Yahweh himself had genitals.


~ “The diminutive Yezhov, who was nicknamed the Bloody Dwarf and was a true sadist, had a curved leg and a limp, while suffering from “myasthenia and neurasthenia, anemia, angina, sciatica, psoriasis, and even malnutrition” and other ailments. During the Terror, his teeth began to fall out. He drank until he lost consciousness. One of his buddies (later arrested as a Polish spy) would bring him prostitutes, while another (whom Yezhov’s NKVD also arrested) joined him in farting competitions. In one report, Yezhov claimed to have discovered numerous interlocking conspiracies: one fascist plot in the NKVD, another in the Kremlin, a Polish espionage group, several Trotskyite groups—the list of plots goes on and on until Yezhov concluded: “I have enumerated only the most important.” But the specific charges really didn’t matter, since Stalin set quotas for arrests.

Interrogation virtually always involved torture, followed either by execution or a sentence in the Gulag. Those who knew they were about to be arrested—like Politburo members—often committed suicide to avoid the interrogation. Despite the purges in the NKVD, by 1938 it grew to over a million men.

Of course, Yezhov himself was eventually arrested and replaced by the still more sadistic Lavrenty Beria. When Yezhov’s apartment was searched, it turned out he had preserved as souvenirs the bullet casings with which Zinoviev and Kamenev, two of the original Bolshevik leaders, had been shot. His one regret was that he hadn’t killed more people. He promised he would die with Stalin’s name on his lips.

The event that was used as an excuse for the Great Terror was the assassination of a popular party figure, Sergei Kirov, who was shot by one Leonid Nikolaev. Mysteriously, the guards had been withdrawn from each floor of the building where Kirov worked and his personal bodyguard was absent. Even stranger, Nikolaev had already been caught trying to sneak into the building with a revolver, and had been released! Over the years, one story after another was promulgated, each involving an ever-growing network of spies. Especially in 1937–38, literally millions of people were accused of belonging to branches of a vast conspiracy whose main purpose was killing Kirov. Conquest’s classic book on the topic, Stalin and the Kirov Murder, cautiously concludes that circumstantial evidence points to Stalin as the instigator of Kirov’s killing.

Yezhov with Stalin before his fall from grace

One of Stalin’s decrees ordered the arrest of wives of traitors, just for being their wives, and in one famous toast he promised to destroy every enemy and also “all his kin, his family.” Other decrees made being late to work punishable by a term in the Gulag and the theft of even a minute amount of grain a capital crime. Any attempt to call such punishments excessive was denounced as “rotten liberalism.”

For Stephen Kotkin, this aspect of the regime—its destruction of its own most loyal members—constitutes something unprecedented in world history, and he gropes for reasons. Even if Stalin was afraid of other officials challenging him, he could sack or transfer anyone at will. But not only did he murder them or send them to slave labor camps, but, “in a huge expenditure of state resources, had them tortured to confess . . . not to being corrupt or incompetent, but to plotting to assassinate him and restore capitalism on behalf of foreign powers.”

Kotkin rightly dismisses explanations based on Stalin’s childhood or Georgian upbringing. Stalin’s character surely made a difference, but his character was itself shaped by his experience as a revolutionary and a dictator. As Kotkin notes, absolute power not only corrupts absolutely, it also shapes absolutely. Above all, Bolshevik ideology was crucial. It taught the inevitability of maximal brutality in class warfare and treated anything less—such as refraining from torture—as an impermissible, liberal lapse. For a Bolshevik, there is no such thing as “human values,” only “class values.” Killing millions not only posed no moral dilemma for Stalin; “on the contrary, to pity class enemies would be to indulge sentiment over the laws of objective historical development.”

The best proof that terror inhered in Bolshevism itself, Kotkin observes, “was the relative ease with which Stalin could foist the bloodbath upon the political police, army, party-state, cultural elites, and indeed the entire country.” He could count on the widespread acceptance of Marxist-Leninist ideology. “It was no accident . . . that a single leader had emerged atop a single-party system that, on the basis of class analysis, denied legitimacy to political opposition.”

Yezhov after his fall from grace


~ “From his perch as a linguist eavesdropping on Soviet-backed forces in Eastern Europe, [Jeffrey Carney] knew that Washington’s portrayal of the other side was a lie. The enemy wasn’t an unstoppable juggernaut preparing to invade the West. Its combat units were barely functional. And it was the U.S. that was trying to provoke the Soviets into an incident that could lead to war.

Depressed and looking for an escape, Carney bolted for Checkpoint Charlie, the gateway to Communist East Berlin, near midnight on April 22, 1983, and asked for political asylum. It didn’t work out as planned; within hours, East German intelligence agents blackmailed him into returning to his unit as their spy. If he refused, they made clear, they’d leak his planned “defection” to his bosses.

[The “Able Archer” military exercise] “This situation could have been extremely dangerous if during the exercise—perhaps through a series of ill-timed coincidences or because of faulty intelligence—the Soviets had mis­perceived U.S. actions as preparations for a real attack.”

That was exactly what worried Carney—that one shot would lead to another, and maybe even a nuclear war. “We underestimated the Russian psyche,” Carney says. “They were institutionally paranoid. The average American would not launch a rocket and shoot a plane out of the air. But they don’t think like we do.”

As Able Archer unfolded in the summer of 1983, Soviet state-controlled radio started making announcements “several times a day” suggesting a U.S. attack was imminent, the study notes. New street signs went up in Moscow and other cities showing the locations of air raid shelters. A Soviet air force unit in Poland began carrying out drills to speed up the transfer of nuclear weapons from storage to aircraft. Some in the Ronald Reagan administration worried that the Soviets were preparing for an invasion of Europe. In response to a Western attack, Moscow’s war doctrine called for the destruction of most European cities and ports using nuclear weapons, followed by a massive ground invasion that would put Soviet troops on the Atlantic in 14 days.

 “One misstep,” Reagan recalled years later, “could trigger a great war.”

Carney had no idea what he was getting into when he crossed into East Berlin in the spring of 1983. His access to some of the Pentagon’s most sensitive electronic-spying operations had driven him to reconsider his initial enthusiasm for the election of Reagan, who had dubbed the Soviet Union “an evil empire” bent on crushing the West. Newspaper reports at the time described the Russians as unstoppable. “Perhaps the first moment I realized there was a problem, a big discrepancy, was while I was waiting for the bus to go to work one day,” Carney recalls. “Stars and Stripes, the military newspaper, had an article about Soviet superiority in the European theater. I remember laughing with a friend, a Russian linguist, about the numbers and technical information cited in the report. It stood at complete odds with what we saw in our intel reports every day.”

The truth, he says, was that Communist-allied units were hampered by fuel and food shortages, alcoholism and even cholera, picked up by soldiers rotating into East Germany from the Soviet Far East. Soldiers were siphoning off brake fluid to get high. He doubted many were battle-ready. “Ronald Reagan,” Carney began to think, “was intent on making Russia an evil empire, whether it was evil enough on its own or not.”

Beginning in May 1983, Carney started looking for “important” documents to steal. The more he read, the more he was concerned about Washington’s electronic warfare programs and weapons, which could fry the Soviets’ command-and-control telecommunications. “[They] were mind-boggling in their reach and ability,” he says. “Many of them were purely offensive, and...would have only found use in a first-strike scenario.”

Later that year, Carney learned that U.S. warplanes were about to fly into Soviet airspace to simulate an attack on a sensitive military site and measure how the enemy responded. War jitters were already high with the impending deployment of U.S. Pershing ballistic missiles in West Germany. In September, the Russians shot down a Korean airliner that wandered over its missile testing area on the Kamchatka Peninsula, in the Soviet Far East. Fearing a similar result, Carney rushed to tell his Stasi East German handler what was coming.

He says another incident in particular, in the fall of 1983, drove him from an “unwilling to a very willing spy.” Since it’s still classified, he refuses to divulge it further, for fear it could land him back in prison. “It was an intentional, aggressive provocation of the Soviet Union in a very sensitive area,” he says, “that would have made [Russian radar monitors] flip out.”

He adds, “When it was explained to me, I said, ‘You’ve got to be kidding. You are going to push their buttons. People are going to be shot down.’”

But Carney has few regrets. “I regret the pain I caused people, I regret the fact that I was in a position where I didn't have the whole picture and I made decisions where I ended up hurting people,” he says. “Unintentionally, though, I think what I did—and there are hundreds and hundreds of people who did what I did, on both sides: American spies, Russian spies, German spies—all of us together made it basically impossible for a war to break out. And I think that's where the focus should be.” ~


In Poland we knew very well that the US had total military superiority. But in the US the fear-mongering went on, and the reckless militarism kept bringing the world to the brink of nuclear war.


The 1918 pandemic was unusual in that it killed many healthy 20- to 40-year-olds, including millions of World War I soldiers. In contrast, people who die of the flu are usually under five years old or over 75.

The factors underlying the virulence of the 1918 flu are still unclear. Modern-day scientists sequenced the genome of the 1918 virus from lung samples preserved from victims. However, this did not solve the mystery of why so many healthy young adults were killed.

The 1918 flu and World War I

While a mild flu circulated during the spring of 1918, the deadly strain appeared on U.S. soil on Tuesday, Aug. 27, when three Navy dockworkers at Commonwealth Pier in Boston fell ill. Within 48 hours, dozens more men were infected. Ten days later, the flu was decimating Camp Devens. A renowned pathologist from Johns Hopkins, William Welch, was brought in. He realized that “this must be some new kind of infection or plague.” Viruses, minuscule agents that can pass through fine filters, were poorly understood.

With men mobilizing for World War I, the flu spread to military installations throughout the U.S. and to the general population. 

The quest to understand the 1918 flu fueled many scientific advances, including the discovery of the influenza virus. However, the virus itself did not cause most of the deaths. Instead, a fraction of individuals infected by the virus were susceptible to pneumonia due to secondary infection by bacteria. In an era before antibiotics, pneumonia could be fatal.

Recent analyses revealed that deaths in 1918 were highest among individuals born in the years around 1889. An earlier flu pandemic emerged then, and involved a virus that was likely of a different subtype than the 1918 strain. These analyses engendered a novel hypothesis, discussed below, about the susceptibility of healthy young adults in 1918.

Exposure to an influenza virus at a young age increases resistance to a subsequent infection with the same or a similar virus. On the flip side, a person who is a child around the time of a pandemic may not be resistant to other, dissimilar viruses. Flu viruses fall into groups that are related evolutionarily. The virus that circulated when Adolfo was a baby was likely in what is called “Group 2,” whereas the 1918 virus was in “Group 1.” In fact, exposure to the “Group 2” virus as a young child may have resulted in a dysfunctional response to the “Group 1” virus in 1918, exacerbating his condition.

Support for this hypothesis was seen with the emergence of the Hong Kong flu virus in 1968. It was in “Group 2” and had severe effects on people who had been children around the time of the 1918 “Group 1” flu.

To 2018 and beyond

What causes a common recurring illness to convert to a pandemic that is massively lethal to healthy individuals? Could it happen again? Until the reason for the death of young adults in 1918 is better understood, a similar scenario could recur. Experts fear that a new pandemic, of influenza or another infectious agent, could kill millions. Bill Gates is leading the funding effort to prevent this.

Flu vaccines are generated each year by monitoring the strains circulating months before flu season. A time lag of months allows for vaccine production. Unfortunately, because the influenza virus mutates rapidly, the lag also allows for the appearance of virus variants that are poorly targeted by the vaccine. In addition, flu pandemics often arise upon virus gene reassortment. This involves the joining together of genetic material from different viruses, which can occur suddenly and unpredictably.

An influenza virus is currently killing chickens in Asia, and has recently killed humans who had contact with chickens. This virus is of a subtype that has not been known to cause pandemics. It has not yet demonstrated the ability to be transmitted from person to person. However, whether this ability will arise during ongoing virus evolution cannot be predicted.

The chicken virus is in “Group 2.” Therefore, if it went pandemic, people who were children around the time of the 1968 “Group 2” Hong Kong flu might have some protection. I was born much earlier, and “Group 1” viruses were circulating when I was a child. If the next pandemic virus is in “Group 2,” I would probably not be resistant.

It’s early days for understanding how prior exposure affects flu susceptibility, especially for people born in the last three to four decades. Since 1977, viruses of both “Group 1” and “Group 2” have been in circulation. People born since then probably developed resistance to one or the other based on their initial virus exposures. This is good news for the near future since, if either a “Group 1” or a “Group 2” virus develops pandemic potential, some people should be protected. At the same time, if you are under 40 and another pandemic is identified, more information would be needed to hazard a guess as to whether you might be susceptible or resistant.

ending on beauty:

Among twenty snowy mountains,
The only moving thing
Was the eye of the blackbird.

I was of three minds,
Like a tree
In which there are three blackbirds.

~ Wallace Stevens

Not having found any image of blackbirds that pleases me, I'm posting instead this fireball meteor seen on November 14 over Italy’s Dolomite Alps (Ollie Taylor)

Saturday, November 11, 2017


Europe (a part of it) at night from the Space Station


(member of Robert F. Scott’s Antarctic expedition; he slid into a crevasse when he could no longer walk, in order not to be a burden to his companions)

Lieutenant Oats!
You lie close to my heart,
there in the icy crevasse.
Your death
death: resignation
cold death in the polar winter
cold death in cold snow
cold death among friends
the friends with whom
you were planning to win the South Pole

I know — the harness of the sled
cuts into your fingers
your legs get stuck in ice-sharp snow
I know — on a sheet of ice
and everywhere else
and always
and for everyone
life has infinite worth

I know it’s hard to catch your breath
when snow-laden wind muzzles your mouth
I know
that the night
the space
the blizzard
erases the last trace
But I also know
everyone has the right to hope

I understand you Oats
I understand the silence that falls
the silence that can’t replace words
The glances that avoid your eyes
your motionless darkening eyes
you you
a universe laid on the sled
blind deaf mute
in the dead night
Is it easier to die lying down
than standing up

They wrote:
Here lies a brave gentleman
What else could they write
too heavy
terribly heavy is the sled
the sled with someone sick
heavier than graves
heavier than dead friends

But after all they too lie in the same grave
on the same white hearse
death is always the same
then was it worth it
was it worth it?

I understand you my Polar Viking
I an ordinary person
Life is always a blizzard
the hospital — a cliff in the polar desert
the pillow a sharp-edged boulder
the blanket the dust of an avalanche
It’s easy to slide into a crevasse
The wind will cover the tracks with snow

Comrades of Lieutenant Oats
who can no longer cope
you too will die without
fulfilling your dreams
death is always the same
we need to forgive

~ Elżbieta Fonberg (1920-2005), tr Oriana Ivy

Grotto in an iceberg, Antarctica, January 1911

Ela Fonberg was a Polish neuroscientist and my mother’s friend and colleague for a great many years at the Nencki Institute in Warsaw. Like my mother, she was a member of the Resistance and took part in the Warsaw Uprising. Only toward the end of her life did she admit to having always written poems, and she sent my mother a proverbial slender volume.

I knew her personally — her soft low voice combined with tall stature and athletic build, her chuckle, her off-color jokes, her gesture of slowly pushing up her glasses with one finger at the nosepiece, are all engraved in my memory. It amazes me that I have just translated a poem of hers into English, hopefully gaining a few readers she never imagined reaching.


I know the poem is ambiguous, with lines like "everyone has the right to hope." But I'm sure Oats wasn't simply giving up because he was weak. On the contrary, his suicide was an expression of moral strength.

It was clearly an altruistic suicide. I just remembered that even that kind is forbidden by religion, punishable with eternal damnation. What horrible nonsense.


This is not to deny that death is a terrible loss. The richness of a unique personality is lost. Hence:

you you
a universe laid on the sled

That universe comes to an end. So each of us faces “the end of the world.” What richness perishes with each person! As Emerson said, “What lies behind us and what lies before us are tiny matters compared to what lies within us.”


I'm pondering the line that we die without fulfilling our dreams. That's probably true for almost everyone. On the other hand, life gives us something else. As we look back, there is always something to mourn and regret, and something to celebrate. “I am in continual astonishment” that nothing in my life turned out as I originally expected . . .  but that somehow is the theme of my life: “nothing as expected.” 

By the way, it just occurred to me that this may be the only poem there is about Lieutenant Oats. It wouldn't surprise me if there exist a few forgotten poems about Captain Scott, whose diaries remain his best memorial. But this is probably the only poem ever about Oats — written in Polish by a Polish woman scientist, a former Resistance fighter, and just now translated into English by her friend's daughter. A minor event, I know, but it touches me to imagine this emotional chain: first Ela was moved by his story — by his heroic endurance up to the point when he saw that to persist was to be a burden to others and to lower their chances of survival; then by his heroic (in my eyes) self-sacrifice. Then Ela's poem reached me, moved me, and stayed in my mind for years, until, after much delay, I translated it after reading an article about Scott's doomed expedition. Once it's published in my blog, it may reach a hundred readers or so in several countries. While the number may seem modest, poets quickly learn to be grateful for every single reader, every mind they have the privilege to speak to.


On the possibility of an altruistic suicide, I think we are all too reductionist in our thinking about suicide. Like any other human act, suicide is not monolithic, but complex in both motivation and meaning, not always irrational, or "selfish" as so many would judge, not an act of weakness or lack of courage, certainly not a “sin.” It is probably often an act of desperation, but not always, and not always an unreasonable response to a bad situation. Traditional religion, which has “set its canon 'gainst self-slaughter,” is responsible for most of our cultural judgements on suicides.

At one of the lowest points in my life, severely depressed, having lost most of what I most valued, dulled and confused by medication and other medical “interventions,” I decided to go to confession, though I hadn't been a believer or practitioner for many years. I think I was seeking some kind of comfort, some place to feel “safe” — sanctuary, maybe, the way it used to be thought of. The priest asked me if I had committed the “sin of despair.” So hopelessness was not to be alleviated, but punished. Needless to say, no comfort there.


Ah, the “sin of despair.” It’s also called the “sin against the Holy Ghost,” the one sin which will not be forgiven. Yet given the teaching about how only the chosen few will enter heaven, how was I to feel about my chances? Of course sooner or later I’d decide that my continued doubt, for instance, clearly indicated that I was not among the chosen few, so hell was inevitable. But I also knew that to believe that meant you were sinning against the Holy Ghost, and would not be forgiven, so now you were going to hell for sure. 

Another reason I was going to hell for sure was that there was no way I could feel the slightest love for the creepy god that the church tried to jam into our heads, the first commandment being to love god above everything — and only then your neighbor. So I felt doomed — but remember, to feel doomed is to commit the sin against the Holy Ghost, which will not be forgiven.

In retrospect I'm just glad I didn’t go insane. I certainly felt tormented, though.

Religion is often defended and justified because it supposedly offers “consolation.” It offered none to you or me. Perhaps it offers consolation only to those who are able to kid themselves that they are going to heaven for sure.

 Lucas Cranach: Trinity, 1515. I love the baby cherubim. But also note the centrality of the loincloth, and its upward extension. But . . .  is there any question as to who is boss? The one who sends the plagues and floods and other disasters? I think Jesus is already dead, and in this painting at least, he remains dead.


 After the tragedy of Capt. Scott's Antarctic expedition, this provides a bit of a comic break: ~ “The discovery of a 106 year old fruit cake among the artifacts from Antarctica’s first building at Cape Adare went viral with more than 1700 media stories in 32 countries across the globe. Made by Huntley & Palmers, the fruit cake is still wrapped in paper and encased in the remains of a tin-plated iron alloy tin.

The cake was most likely taken to the ice by the Northern Party of Scott’s ‘Terra Nova’ expedition (1910 – 1913).

Finding such a perfectly preserved fruitcake in amongst the last handful of unidentified and severely corroded tins was quite a surprise. It’s an ideal high-energy food for Antarctic conditions, and is still a favorite item on modern trips to the Ice.” ~



~ “At any major juncture in life, Hollis argues, we should ask: “Does this path, this choice, make me larger or smaller?” There’s something uncanny about this question, which has seen me through several dilemmas since discovering his work. The usual question is “Will this make me happy?” – but few of us, if we’re honest, have much of a clue about what will make us, or our loved ones, happiest. Ask whether a choice will make you larger or diminish you, though, and surprisingly often the answer’s obvious.

Every choice, writes Hollis, demonstrating again his splendid refusal to be upbeat for the sake of it, represents a kind of death. So “when we get to junctures like that, we had better choose the dying that enlarges rather than the one that keeps us stuck”.

And anyway, who says that “happiness” — that shallow, elusive, rather narcissistic notion — is the best measure of a life in any case? Hollis quotes Rainer Maria Rilke: “The purpose of life is to be defeated by ever greater things.”

Hollis is a follower of Carl Jung, so his view of the mind is that the ego — the conscious “voice in the head” that we take to be ourselves — is only a tiny part of the whole. Sure, it has all sorts of schemes it believes will make us happy and secure, usually involving large salaries, public acclaim, or flawless partners or children. Yet in reality (Hollis writes elsewhere) the ego is nothing but a “thin wafer of consciousness floating on an iridescent ocean called the soul”. The vast forces of the unconscious — the psyche, or “the gods” when Hollis is feeling more lyrical — have their own plans for us.

These days we try to just ignore this deeper level. But when suppressed, it always surfaces somewhere eventually, as depression or insomnia or bad dreams. “When we are off track, psyche protests,” Hollis writes. “Noisy demonstrations are held in the amphitheatre of the body; streets are blocked in the brain by rebels from the cane fields; dreams are invaded by spectral disturbances; affects riot and tear down the work of years.”

OK, OK, but what’s the answer? What does matter most? Don’t expect Hollis to tell you. “I will not rehearse the usual list of what matters most, namely: friends and family, love, honour, good work, reputation,” he writes. [To repeat the central point: Hollis argues, we should ask: “Does this path, this choice, make me larger or smaller?”] ~


I read several books by Hollis, including this one, around the time when my mother was close to dying. Told that she should do this or that, she began to say, more and more often, “That's not important.” She was obviously totally focused on what she regarded as important. There was no time to waste on the trivia.

“Does this choice make me larger or smaller?” is a fabulous question to guide us in making choices, often pointing to the need for taking that rather expensive and inconvenient trip or taking a chance on trying something new. Enlargement or diminishment? Of course we also need to ask about the price, the consequences for others, the stage of life . . . But on the whole, we don't regret choosing enlargement.


“I don’t regret a single ‘excess’ of my responsive youth,” Henry James wrote to fellow writer Hugh Walpole when James was 70. “I only regret, in my chilled age, certain occasions and possibilities I didn’t embrace.”
It’s wise to remember this as we are tempted to turn down an opportunity to experience something new for the sake of staying comfortably where we are (though once in a while doing nothing is the best choice — there are no absolute rules here). And, once it’s too late to undo one’s choice, there is no point spending time on regret — we need to concentrate on the present and the future.

But in the main I agree (reluctantly, the home-body that I am) with the idea that “life rewards action.”


“You will always be fond of me. I represent to you all the sins you never had the courage to commit.” ~ Oscar Wilde, The Picture of Dorian Gray

 The movie starred Hurd Hatfield

Sabina’s view on the nature of the “cure” in psychoanalysis was arguably an important influence on Jung’s interest in the future rather than the past — his notion of fate versus destiny (destiny being the daimon, the future pulling us onward).

It’s by having a vision of what s/he might become that the patient can afford to drop pathology and move toward what might be called a “purpose-driven life.” When life becomes focused and disciplined through an inspiring vision of a future self, pathologies are transcended. Useless now, they drop away.

Thus, in the Jungian system, it’s not insight into the past that is curative, but insight into the future. Health is restored when we grow more interested in what we are becoming rather than in brooding over our past wounds.

Sabina also apparently influenced Freud’s development of the Eros-Thanatos theory of the “death drive” – to me, the most provocative of Freud’s ideas. That emerged from the movie more clearly than the more central divide between Freud and Jung. Freud was fixated on the past and on the need to understand the past, to “make the unconscious conscious,” and restore the patient “from neurotic suffering to normal misery.” Jung became more interested in a person’s vision of the future. There is a larger personality that we are intended to become. That destiny, or vision of the future self, pulls us upward.

The text above is from my own 2012 blog, A Dangerous Method (part of it was a review of the movie by that title). 

Sabina Spielrein: My main goal is to cultivate and express all the wonderful things.


~ “Recently, I was in an all-day meeting in San Francisco with a pretty sophisticated group of international business experts. As the morning wore on, our host brought out treats. I quickly learned that sophistication does not dull one's response to M&Ms. As bowls of the brightly colored, candy-coated chocolates were distributed around the room, these hot shot financiers and venture capitalists perked up, wiggled with glee, and leaned forward to retrieve a handful of happiness.

When the host reached the couch on which I sat with another man, I heard the man mutter, “Oh no, here goes my diet.”

I turned to him and said, “Want some help?” He looked at me despairingly and said, “Yes!”

I leaned forward, picked up my note pad, and placed it over the M&M bowl that sat on the coffee table in front of us, offering its bounty like a candy shop window. The effect on my friend was immediate. It was as though the candy store proprietor had pulled a window blind. My friend relaxed. His breathing became more regular. And in spite of the fact that the M&Ms were no further from him than they had been seconds earlier, he endured the remaining hours of our meeting without even once succumbing to the bowl’s siren call.

Perhaps we don’t have as much free will as we think we do, but that doesn’t mean we can’t take an active role in shaping our own behavior. The journalist Michael Shermer suggests that the way to do so is by exercising our “free won’t.” While the impulses to act a certain way are inevitably tied to the various sources of influence that affect us—we can choose not only to not respond to them, but to blunt or change them.

Shermer is right to refocus our attention on “free won’t” rather than “free will.” My colleagues and I have come to the same conclusion. The vast evidence of the social sciences over the past decades suggests that human beings have remarkably little control over their own behavior. We are incredibly easy to manipulate. We spend, eat, talk, vote, work, and play in ways that are profoundly shaped by forces we grossly underestimate. But it doesn’t have to be this way. If we start accepting how little free will we have, we can refocus our attention on our free won’t by reshaping the sources of influence that shape us. In the end, we’re back in control—just a little less directly.” ~


It always seemed to me that there existed a tiny pause where you could choose (choose!!) to act differently. A “stop and think” pause. A one-syllable, yes-or-no pause. Sometimes a Wait! pause. It doesn't work under heavy stress, but in relatively low-stress situations, I had the impression that the pause, though brief and ephemeral, could not be denied. This article confirms the ability to inhibit customary behavior. Maybe we should stop talking about free will (which leads to judgmentalism) and concentrate on the “free won’t.”

Of course there are factors that cause us to inhibit (or not inhibit) customary behavior, so ultimately there is no escaping determinism. But since we can't spend our lifetime minutely analyzing the causes of everything, we might as well celebrate the existence of the “free won’t.” Yes, there is an area of the brain responsible for inhibition, and it can be better developed in some individuals due to both environmental and genetic factors, but knowing that doesn't particularly get us anywhere (except perhaps, again, away from judgmentalism — not everyone had the luck to have the kind of parents who provided training in inhibitory behavior). Instead, we can concentrate on the environmental factors and strengthen those.

Need I say that I'm the sort of person who especially enjoys the power of “won’t”? Ah, to walk out of a store without buying anything! It makes me feel like a stronger person. But I realize that, as with everything, inhibition too could be pushed too far — consider pathologies such as anorexia or mutism. Hence, as the ancients counseled, “moderation in all things.” And yet, and yet . . . the joy of that tiny pause when I can decide NOT to buy X, not to say anything, to turn off the computer (that’s the hardest of all).


When considering questions of determinism and free will, I am in complete agreement with your statement about the little "pause." And I think the idea of "I won't" is a much clearer way of seeing how choice and free will can act. Of course we are determined, influenced, manipulated in thousands of ways, from biology to history to the incidents of personal experience. The question is how much, to what degree, and how rigid and overwhelming the determining factors are, and how that may vary from individual to individual, in ways that are also determined by biology, history, experience. In alleging there is no space left for choice we also erase any sense of responsibility for any acts or decisions. In fact, there are not then any "decisions." All actions are put beyond individual choice and control, there is no room for judgement. This leaves us with a very strange, and I think obviously false, world of human activity.

For instance, I have been irritated by a series of ads run on tv for certain drug and alcohol rehabilitation programs. The ads state "addiction can happen to anyone, anywhere, anytime." What I think they are trying to get at is that addiction crosses social divisions such as race, class, gender, education etc, and can be found in every category, every segment of society. What I find unacceptable, and insidious, is the assertion that addiction can "happen to anyone" — with the emphasis on "happen." This makes addiction equivalent to a natural catastrophe, like getting caught in a hurricane, or even a rainstorm, or like catching an infectious disease, like measles or tuberculosis.

But even with these examples we know measures can be taken to avoid them — just as with addiction. Somewhere, way back at the beginning, there was that small space, that pause, that chance to say "I won't." In that space a choice is made. In that space is our responsibility for our lives. No matter how small the space, or how strong the pressures that have shaped us and our circumstance. And without acknowledging the existence of that space, we lose what it is to be human.


Addiction is a very complicated matter. Can it just “happen” to anyone? No. It happens to genetically susceptible individuals when the circumstances are just right for it, e.g. under the stress of combat. A lot of American soldiers started using hard drugs during the Vietnam war — but an extensive follow-up showed — surprise, surprise! — that most of those users discontinued drug use when they were out of danger, the hellish stress behind them. (I don’t mean to minimize the aftermath of war, only to say “the worst was over.”) Once these men were busy trying to succeed in civilian life, once they were once more with families and friends, wives and girlfriends rather than prostitutes, picnics and sports rather than slogging through the jungle being shot at by an enemy they couldn’t see, their circumstances militated against the use of hard drugs. The life of a junkie held no allure.

Am I arguing for strict determinism when it comes to addiction? Frankly, I don’t know. Those veterans who discontinued may well have had an “I won’t” moment. A conscious decision certainly plays along with all the factors influencing us.

I am very interested in the power of neural inhibition. We see it failing in the case of brain diseases and advanced aging (e.g. the so-called “senile garrulity” — for one thing, as aging progresses, the levels of inhibitory neurotransmitters wane, and the person becomes more impulsive and unfocused, “scattered”). We also see inhibition fail under severe stress.

We have drugs that can help. We know how to train people to be more focused. We know how to increase self-control. But all this is underutilized because the enormous power of inhibition, including the “free I won’t,” is not appreciated enough.

The Soul Nebula near Cassiopeia, in infrared


“Humans have an experiential mind, which allows us to experience the world via sensations and perceptions, to have urges and desires, and to feel emotions like fear or joy. We can bundle these various processes together and call them “primary processes”, in that they happen quickly and relatively automatically with no self-conscious, deliberative effort. We share our primary process system with creatures like dogs.

Adult humans also have a self-conscious, deliberative mind that allows us to talk and to be reflective about our feelings and actions. We can consider this slower, more effortful and reflective portion secondary processes. In keeping with the idea of what a person is, we will call it your “deliberative mind”.

 Addiction is a state of being whereby the Primary Process Mind is overpowering the Deliberative Mind.” ~ Gregg Henriques

Yet another formulation that I find applicable to my experience of deciding not to be depressed — perhaps the most interesting “decision experience” of my life. My central insight was about mortality: “It’s too late in life to be depressed.” But another influence was having come across the statement: “You can practice falling apart, or you can practice being strong.”

Here was the POWER OF THE “OR” STATEMENT — I saw that there was a choice, and knew instantly my preference for being strong. Coming from a family of strong women, I never doubted my strength (dubious logic, I know, but at the time it served wonderfully).

On the other hand, the human brain is also the most magnificent thing in the universe: adaptable, always learning. It doesn’t exist in isolation but in constant dialogue with the environment — including fictional characters and memories, many of them false!


 ~ “William James, who ducked service in the Civil War but who watched it wreck the lives of two of his brothers, concluded that the idealism which had led them to volunteer had been a destroying angel, and that it would be far better to regard ideas as instruments which help people adapt to their circumstances, rather than abstract truths which they allow to govern their actions. In his post-war career at Harvard, James formulated an entirely different way of understanding ideas, which he called pragmatism.

Beliefs had to be judged by their consequences, James insisted, by whether they had “cash value in experiential terms” and could be converted into useful practical conduct. Giving abstractions like abolition and freedom some absolute status as truth made them into the lethal and uncompromising tyrants which decimated James’s generation. But without the status of truth, religion degenerated into therapy—which, from James’s perspective, was not necessarily a bad thing.” ~


I didn't know that the Civil War played a role in the development of James's pragmatism: never mind if X is true; what are the “fruits” (results) of believing that X is true? His approach to religion simply blew my mind: never mind if god exists, or which religion is “true”; what kind of belief works for you?

There was an element of Nietzschean perspectivism here, but with a much greater emphasis on subjective experience. If a certain set of beliefs helps you live a happy and productive life, James argued, don't worry about an objective validation of those beliefs. Go by what works for you. As the article points out, James would not be upset by the argument that by divorcing religion from its claims to objective truth he was reducing it to therapy; that, for him, was not a reduction but rather an enhancement.

James saw that a strong belief in the truth of ideas (whether religious or secular) could easily lead to disaster. In the debate between the eternalists and those who perceive perishability, he was certainly not an eternalist.


~ “A Russian friend of mine likes to say, in the spirit of Voltaire’s famous joke about the Holy Roman Empire, that the Great October Revolution, as it was always known in Soviet days, was none of those things: not great (it was an economic and political disaster); not in October (according to the Gregorian calendar it was actually Nov. 7); and, above all, not a revolution. It was a Bolshevik coup d’etat. But it was not an accident, either. Lenin began plotting a violent seizure of power before he had even learned of the czar’s abdication. Immediately — “within a few hours,” according to Victor Sebestyen’s excellent new biography, “Lenin: The Man, the Dictator, and the Master of Terror” — he sent out a list of orders to his colleagues in Petrograd. They included “no trust or support for the new government,” “arm the proletariat” and “make no rapprochement of any kind with other parties.” More than a thousand miles away, in Switzerland, he could not possibly have had any idea what the new government stood for. But as a man who had spent much of the previous 20 years fighting against “bourgeois democracy,” and arguing virulently against elections and parties, he already knew that he wanted it smashed.

His extremism was precisely what persuaded the German government, then at war with Russia, to help Lenin carry out his plans. “We must now definitely try to create the utmost chaos in Russia,” one German official advised. “We must secretly do all that we can to aggravate the differences between the moderate and the extreme parties . . . since we are interested in the victory of the latter.” The kaiser personally approved of the idea; his generals hoped it would lead the Russian state to collapse and withdraw from the war. And so the German government promised Lenin funding, put him and 30 other Bolsheviks — among them his wife, Nadezhda Krupskaya, as well as his mistress, Inessa Armand — onto a train, and sent them to revolutionary Petrograd. They arrived at the Finland Station on April 16, where they were welcomed by a cheering crowd.

A few days later Lenin issued his famous April Theses, which echoed the orders that he had sent from Zurich. He treated the Bolsheviks’ minority status as temporary, the product of a misunderstanding: “It must be explained to the masses that the Soviet of Workers’ Deputies is the only possible form of revolutionary government.” He showed his scorn for democracy, dismissing the idea of a parliamentary republic as “a retrograde step.” He called for the abolition of the police, the army and the bureaucracy, as well as the nationalization of all land and all banks.

Plenty of people thought he was crazy. But in the weeks that followed, Lenin stuck to his extremist vision despite the objections of his more moderate colleagues, agitating for it all over the city. Using a formula that would be imitated and repeated by demagogues around the world for decades to come — up to and including the demagogues of the present, about which more in a moment — he and the other Bolsheviks offered poor people simplistic answers to complex questions. They called for “peace, land and bread.” They sketched out beautiful pictures of an impossible future. They promised not only wealth but also happiness, a better life in a better nation.

They certainly did not persuade all Russians, or even a majority of the Russians, to support them. They did not persuade the Petrograd Soviet or the other socialist parties. But they did persuade a fanatical and devoted minority, one that would kill for the cause. And in the political chaos that followed the czar’s abdication, in a city that was paralyzed by food shortages, distracted by rumors and haunted by an unpopular war, a fanatical and devoted minority proved sufficient. 

Lenin making a gramophone recording, 1919
Like Lenin, Stalin never accepted any form of legal opposition — indeed he never believed that there could be such a thing as constructive opposition at all. Truth was defined by the leader. The direction of state policy was defined by the leader. Everyone and everything that opposed the leader — parties, courts, media — was an “enemy of the people,” a phrase that Lenin stole from the French Revolution.

Within two decades of October 1917, the Revolution had devoured not only its children, but also its founders — the men and women who had been motivated by such passion for destruction. It created not a beautiful new civilization but an angry, unhappy, and embittered society, one that squandered its resources, built ugly, inhuman cities, and broke new ground in atrocity and mass murder. Even as the Soviet Union became less violent, in the years following Stalin’s death in 1953, it remained dishonest and intolerant, insisting on a facade of unity. As the philosopher Roger Scruton has observed, Bolshevism eventually became so cocooned in layers of dishonesty that it lost touch with reality: “Facts no longer made contact with the theory, which had risen above the facts on clouds of nonsense, rather like a theological system. The point was not to believe the theory, but to repeat it ritualistically and in such a way that both belief and doubt became irrelevant. . . . In this way the concept of truth disappeared from the intellectual landscape, and was replaced by that of power.” Once people were unable to distinguish truth from ideological fiction, however, they were also unable to solve or even describe the worsening social and economic problems of their society. Fear, hatred, cynicism and criminality were all around them, with no obvious solutions in sight.

So discredited was Bolshevism after the Soviet Union’s demise in 1991 that, for a quarter of a century, it seemed as if Bolshevik thinking was gone for good. But suddenly, now, in the year of the revolution's centenary, it's back.

 “October,” 1928 movie poster

Donald Trump, Viktor Orban, Nigel Farage, Marine Le Pen and Jaroslaw Kaczynski: although they are often described as “far-right” or “alt-right,” these neo-Bolsheviks have little to do with the right that has been part of Western politics since World War II, and they have no connection to existing conservative parties. In continental Europe, they scorn Christian Democracy, which had its political base in the church and sought to bring morality back to politics after the nightmare of the Second World War. Nor do they have anything to do with Anglo-Saxon conservatism, which promoted free markets, free speech and a Burkean small-c conservatism: skepticism of “progress,” suspicion of radicalism in all its forms, and a belief in the importance of conserving institutions and values. Whether German or Dutch Christian Democrats, British Tories, American Republicans, East European ex-dissidents or French Gaullists, post-war Western conservatives have all been dedicated to representative democracy, religious tolerance, economic integration and the Western alliance.

By contrast, the neo-Bolsheviks of the new right or alt-right do not want to conserve or to preserve what exists. They are not Burkeans but radicals who want to overthrow existing institutions. Instead of the false and misleading vision of the future offered by Lenin and Trotsky, they offer a false and misleading vision of the past. They conjure up worlds made up of ethnically or racially pure nations, old-fashioned factories, traditional male-female hierarchies and impenetrable borders. Their enemies are homosexuals, racial and religious minorities, advocates of human rights, the media, and the courts. They are often not real Christians but rather cynics who use “Christianity” as a tribal identifier, a way of distinguishing themselves from their enemies: they are “Christians” fighting against “Muslims” — or against “liberals” if there are no “Muslims” available.

To an extraordinary degree, they have adopted Lenin’s refusal to compromise, his anti-democratic elevation of some social groups over others and his hateful attacks on his “illegitimate” opponents. Law and Justice, the illiberal nationalist ruling party in Poland, has sorted its compatriots into “true Poles” and “Poles of the worst sort.” Trump speaks of “real” Americans, as opposed to the “elite.” Stephen Miller, a Trump acolyte and speechwriter, recently used the word “cosmopolitan,” an old Stalinist moniker for Jews (the full term was “rootless cosmopolitan”), to describe a reporter asking him tough questions. “Real” Americans are worth talking to; “cosmopolitans” need to be eliminated from public life.

Surprisingly, given its mild and pragmatic traditions, even British politics is now saturated with Leninist language. When British judges declared, in November 2016, that the Brexit referendum had to be confirmed by Parliament — a reasonable decision in a parliamentary democracy — the Daily Mail, a xenophobic pro-Brexit newspaper, ran a cover story with judges’ photographs and the phrase “Enemies of the People.” Later, the same paper called on the prime minister to “Crush the Saboteurs,” choosing a word that was also favored by Lenin to describe legitimate political opposition.

Famously, Trump has also used the expression “enemy of the American people” on Twitter. Though it is unlikely that the president himself understood the historical context, some of the people around him certainly did. Bannon, Miller and several others in Trump’s immediate orbit know perfectly well that the delegitimization of political opponents as “un-American” and “elitist,” and of the media as “fake news,” is the first step in a more ambitious direction. If some of what these extremists say is to be taken seriously, their endgame — the destruction of the existing political order, possibly including the U.S. Constitution — is one that the Bolsheviks would have understood. The historian Ronald Radosh has quoted Bannon’s comparison of himself to the Bolshevik leader. “Lenin,” Bannon told Radosh, “wanted to destroy the state, and that’s my goal too. I want to bring everything crashing down, and destroy all of today’s establishment.” At a conservative gathering in Washington in 2013, Bannon also called for a “virulently anti-establishment” and “insurgent” movement that will “hammer this city, both the progressive left and the institutional Republican Party.”


Like their predecessors, the neo-Bolsheviks are also liars. Trump lies with pathological intensity about matters small and large, and he lies so often and so obviously that it is not even necessary to cite his uncounted falsehoods again here. But he is not alone. Recently Le Pen was charged in an investigation into her anti-European party for cheating the European parliament out of money. The Law and Justice party pretends that its attacks on the Polish constitution are nothing more than “judicial reform.” Orban has hidden the probably corrupt details of Russian investment in a nuclear plant in Hungary. These are not coincidences. Nor is it a coincidence that the most successful neo-Bolsheviks have all created their own “alternative media,” starting online and moving into the mainstream, specializing in disinformation, hate campaigns, racist jokes and organized trolling of opponents. (The old Bolsheviks used to call this propaganda, and they were brilliant at it.) Both the politicians and the “journalists” lie out of conviction, because they believe that ordinary morality does not apply to them. In a rotten world, truth can be sacrificed in the name of “the People,” or as a means of targeting “Enemies of the People.” In the struggle for power, anything is permitted.
Finally, and most painfully, there is a hint, and sometimes more than a hint, of a reviving appreciation among the neo-Bolsheviks for the cleansing possibilities of violence. The violent poetry of 1917 has morphed into the violent memes of 2017, the “Ultra Violence” threads on Reddit, the white nationalist groups seeking “race war,” and the NRA videos urging Americans to arm themselves for the coming apocalyptic struggle to “save our country.” Some of this dangerous trash has been around for a long time: far-right and far-left extremists in Europe have always savored the idea of violence. But now some of that nihilistic desire for disaster has become mainstream, even reaching the White House. As long ago as 2014, Trump, after railing against Obamacare, fantasized: “You know what solves it? When the economy crashes, when the country goes to total hell and everything is a disaster. Then you’ll have a, you know, you’ll have riots to go back to where we used to be when we were great.”

Shocking though it is, that sentiment is mild by comparison with Bannon’s apocalyptic vision of a coming war — perhaps with Islam, perhaps with China — that will cleanse the Western world of weakness and restore Western greatness. This is how Bannon put it in 2010: “We’re gonna have to have some dark days before we get to the blue sky of morning again in America. We are going to have to take some massive pain. Anybody who thinks we don’t have to take pain is, I believe, fooling you.” A HuffPost article included similar Bannon statements. In 2011: “Against radical Islam, we’re in a 100-year war.” In 2014: “We are in an outright war against jihadist Islamic fascism. And this war is, I think, metastasizing far quicker than governments can handle it.” In 2016: “We’re going to war in the South China Seas in the next five to ten years, aren’t we?”

No excuse for complacency

Fortunately, we do not live in 1917 Petrograd. There are no bread shortages, or ragged barefoot soldiers, or aristocrats in thrall to mad monks. There will be few opportunities to surround the government in a palace, enter and take it over. Our states are not, yet, that weak.

We also have, as the Russians of 1917 did not have, the benefit of hindsight. In much of continental Europe, the demagogue who divides the nation into enemies and patriots creates bad connotations and triggers unpleasant memories. Over the past year, French, Dutch and Austrian voters rejected the nihilism and xenophobia of Le Pen, Geert Wilders and Norbert Hofer, not least because of what they resembled.

The French may even have taken the first necessary step in the longer battle against false revolutions by voting for Emmanuel Macron, the first major European politician to argue for a muscular revival of liberalism. Macron openly opposed the fear, the nostalgia and the nativism on the rise across the continent, and he won without offering impossible schemes or unattainable riches. Even if he fails in France, his formula hints at a way to fight back against modern false prophets. Offer a positive vision, both open and patriotic. Don’t let the nationalists appeal to “the People” over the heads of the voters. Don’t let extremists become mainstream.

But the Anglo-Saxon world was less lucky. It may not be an accident that neo-Bolshevik language has so far enjoyed unprecedented success in Britain and the United States, two countries that have never known the horror of occupation or of an undemocratic revolution that ended in dictatorship. They therefore lack the immunity of many Europeans. On the other hand, the Anglo-Saxon world has its own advantages: the bonds of old and long-standing constitutionalism, the habits created by decades of rule of law and relatively high standards of living. It may be that as Americans and Brits slowly learn to recognize lies, they will become less susceptible to the fake nostalgia on offer from their leaders.

But there is no excuse for complacency. That is the lesson of this ominous centennial. Remember: At the beginning of 1917, on the eve of the Russian revolution, most of the men who later became known to the world as the Bolsheviks were conspirators and fantasists on the margins of society. By the end of the year, they ran Russia. Fringe figures and eccentric movements cannot be counted out. If a system becomes weak enough and the opposition divided enough, if the ruling order is corrupt enough and people are angry enough, extremists can suddenly step into the center, where no one expects them. And after that it can take decades to undo the damage. We have been shocked too many times. Our imaginations need to expand to include the possibilities of such monsters and monstrosities. We were not adequately prepared.


When I first read that Bannon described himself as a “Leninist,” I was astonished — though the slogan of destroying the existing order did seem familiar. Still, the media usually called Bannon a Nazi, so his self-labeling as a “Leninist” was a shock. Then, when Trump first used the phrase “enemy of the people” to describe the media, it all fell into place — my recollection of the political rhetoric I grew up with became complete. Perhaps we should stop thinking in terms of the right and the left, and instead think in terms of people who prefer dictatorship and see no point in democracy.

The nihilistic thirst for WWIII is particularly frightening.

In a different vein, I think the article doesn’t quite do justice to the initial (and then decades-long) appeal of Bolshevism to countless millions, including a great many Western intellectuals. It wasn’t just an anti-democratic, nihilistic system based on destroying the old order (though it was that too). Such a system would not fire the imagination of the young. What the young crave is a heroic ideal.

Unfortunately the Nazis knew how to provide that also.

Here it’s good to remember William James and his warning that idealism is a destroying angel.

Also, both the Nazi and the Communist ideologies emphasized a radically different future. Never mind how false the promises were — it was still a vision of the future, a goal to work toward, a direction. The neo-Bolsheviks rely on nostalgia for the past — and the past just isn’t all that real to the young. Even older people’s memories of the past tend to be bittersweet rather than filled with alleged power and glory. That’s my chief hope: that the young will not find the neo-Bolsheviks sufficiently appealing.

(Of course some will, as is already evident. Let’s hope they remain a lunatic fringe.)


“The ones who are crazy enough to think they can change the world are the ones who do.” ~ Jack Kerouac


Time for a comic break:


“... We see how ambiguous its results were, how closely the negative and, we must acknowledge, the positive consequences of those events are intertwined,” Putin told a gathering of academics last month.

“Was it not possible to follow an evolutionary path rather than go through a revolution? Could we not have evolved by way of gradual and consistent forward movement rather than at the cost of destroying our statehood and the ruthless fracturing of millions of human lives?”

Putin chose his words carefully. The centenary may leave him with mixed feelings, but it remains a hallowed anniversary for the Russian Communist party and for many older Russians.

Although it is the second largest party in the lower house of parliament after the pro-Kremlin United Russia party, the Russian Communist Party wields little real influence today and votes with the Kremlin on most major issues.

But its supporters, who held a week-long series of celebratory events to mark the revolution’s centenary and were due to rally in Moscow later on Tuesday, believe their time will come again.

“Capitalism is stumbling from one crisis to another,” veteran Russian Communist Party leader Gennady Zyuganov wrote in a centenary congratulatory note to his supporters.

“We are convinced that the sun of socialism will once again rise over Russia and the whole world.”



While the Soviet nostalgia of older Russians has no future, it’s the anti-democratic, totalitarian trend that is worrisome.

Still, for me it’s quite something that Russia’s head of state would openly NOT celebrate the centenary of the Bolshevik Revolution (once the greatest state holiday), and even state that the Revolution was a bad thing and should not have happened. Now that’s revolutionary!


~ "And before the throne was something like a sea of glass, as clear as crystal. In the center, around the throne, were four living creatures, covered with eyes in front and back." ~ Revelation 4:6

Just discovered this trippy passage. For some reason this was never discussed in church! The priests' mouths were sealed as to what creatures (a new species of angel?) cavorted before the throne of god.


~ “Danish researchers from the University of Copenhagen and Aarhus University compared urine and fecal samples from 15 healthy male participants who either consumed a diet featuring cheese or milk or a diet that included butter but no other dairy items. What they discovered was that the cheese-eaters had a different composition of gut bacteria, higher in levels of the compound butyrate—an anti-inflammatory fatty acid produced by intestinal fermentation—and it showed in their stool. High butyrate levels have been shown to actually reduce cholesterol absorption, improve metabolism, and prevent obesity.

Because the sample size was so small and funding was partially provided by the Danish Dairy Research Foundation, it’s crucial to perform larger tests before drawing significant conclusions about how cheese consumption—and which kinds of cheeses—could be the most helpful in terms of weight management. And the nuances of our gut bacteria have long been linked with how our bodies gauge appetite and store (or don’t store) fat.

But don’t just load up a bowl of pasta or thick slice of pizza crust with grated Jack and expect great results. Previous studies have linked blue cheese consumption to gut health and anti-aging properties—keep it classy and have a sliver with some fruit and red wine, another French staple that has been shown to have numerous health benefits.” ~


Butter is also a good source of butyrate. Likewise, consuming a lot of fiber makes your gut bacteria produce butyrate. It’s an anti-inflammatory compound that also helps prevent autoimmune disease.


Many molds simply taste unpleasant, yet are not problematic to our bodies. Dangerous molds are those which produce mycotoxins and aflatoxins. These toxins may affect our respiratory system and in some cases even act as carcinogens. Not all molds produce these toxins.

Penicillium roqueforti and Penicillium glaucum, the blue molds used for cheese, cannot produce these toxins in cheese. The combination of acidity, salinity, moisture, density, temperature and oxygen flow creates an environment far outside the range in which these molds can produce toxins. In fact, this is true for almost all molds in cheese, which is why cheese has been considered a safe moldy food to eat for the past 9,000 years. Not only is it safe, but it can also be healthy (P. roqueforti and P. glaucum have natural antibacterial properties and the ability to outcompete pathogens; moreover, our bodies use a variety of wild flora for digestion, development and immune function).

Blue molds have a unique effect on cheese. They dramatically accelerate two processes: proteolysis (breakdown of proteins), which gives the cheese an extra-creamy texture (especially near the blue mold veins), and lipolysis (breakdown of fats), which creates the tangy, spicy, sharp and strong flavor. The creamy texture stands up to the sharp flavor, and together they produce an exciting flavor/texture/aroma profile, often further balanced against sweet, nutty milk and lots of salt (blue cheeses typically contain twice the salt of other cheeses). This combination is so unique — it is unlike any other food.

ending on beauty:

Spring found us:
all the mountains around
are stone weights
to weigh how much we love.

~ Yehuda Amichai