Saturday, November 25, 2017


photo: Frank Rumpenhurst



Thanksgiving, Mt. Abel

We rest on a mound of white quartz
inlaid with black lichen.
Sunset's russet glow 

kindles your beard and hair.
Suddenly you step
onto a thick pine branch
and stand, Superman Angel, 

in the vertical sky.

I watch your flight-
ready body, knowing I’d be afraid
to jump back to the rocks.
But you turn, tense up
like a cat, then fly
in one long leap.
The huge pine barely sways,
in velvet shroud of last light.

So again I can plan
to die in your arms.
You will live to be a wild
still straying, swaying to watch
sunset and moonrise —
unless, breakable angel,
you fall and die in my arms.

Strange: when we are happy
we are most ready to depart,
dissolve into the blue 

haze of ridges.
The valleys smoke with mist,
but now you’re safe beside me.
The sky is a flood of rose,
and I love you, I love everything.

~ Oriana

My poems tend to have a strange history. Sometimes I’d write one and instantly know it was an important piece. I’d keep polishing it (sometimes to the point of learning it by heart without trying to) and send it out to magazines or enter it in contests. Some of those poems did win awards, which made me remember them all the more. But perhaps just as often I’d write a poem and then forget I ever wrote it. Most of those pieces are lost forever, even if they happen to be on my hard drive — there is simply no time to sift through the old files, much less the ancient 3-ring notebooks. But now and then I rediscover such a poem, decide it’s quite good, and can hardly understand why it fell into oblivion.

The saddest cases are the few poems I remember having written but can no longer find. Those haunt me: they had enough magic and meaning to be memorable, but I wrote them so long ago that they got lost, like children who just disappear. That sadness is offset by the joy of the occasional rediscovery of a forgotten poem that deserves to be remembered after all.

“Thanksgiving, Mt. Abel” was restored to me by my long-term partner, who fortunately kept a copy — along with many other poems whose copies I gave to him over the years. I wrote it soon after the hike, handed a copy to the risk-taker who inspired it, and — I forgot all about it. But re-reading it brought it all back to me: the thick curved pine branch like the outer arm of a giant candelabra; the leap; the approaching dusk.

He, on the other hand, could no longer remember climbing onto that branch or his daring leap back to the rock. But he remembered something that only his prompting made me remember as well: there were a dozen or so amateur star-gazers setting up their bigger-than-yours telescopes on a meadow near the top of Mt. Abel, because a lunar eclipse was supposed to take place that night. Fascinating, the different ways people remember the same event.

(In case you’re wondering, there is no Mt. Cain nearby. I often wondered about the name — was it perhaps because the open area near the peak made someone think of a sacrificial altar?)

Regaining this poem was not merely a minor literary matter. It meant regaining a happy memory. “You must learn some of my philosophy,” Elizabeth Bennet says in Pride and Prejudice. “Think only of the past as its remembrance gives you pleasure.” It took me a lifetime of brooding on unhappiness to understand what a treasure happy memories are, what beauty.

The thanksgiving in the title does not refer to the holiday, but to my gratitude for having love in my life. It’s a love poem, with mortality in it because it’s a rare poem that isn’t, in some manner, a meditation on mortality. Love and death, the two great subjects of poetry.

Love in poetry is most often love lost — that’s one reason so many poems are dark. Here it’s love gained. The lovers are no longer young; knowing that this love will last, at least one of them is beginning to wonder which of them is going to die first — because of the romantic desire to die in the arms of the one you love (at least until you realize that the loving thing — not that there is any choice about it — would be to wish for the partner to die in your arms, to dissolve into your love, so to speak).

The last stanza comments on what I had found out some years before: that being deeply in love can so transform the world that we both love it as never before and are nevertheless ready to depart without resentment, since we don’t feel that life has cheated us out of anything. It’s a calm knowing that itself adds to the happiness.

This poem has a special angle: arguably, it’s not just love and death, but death and happiness:

Strange: when we are happy
we are most ready to depart,
dissolve into the blue haze of ridges.

Keats felt it would be “rich to die” when listening to a nightingale; I knew someone who wanted to die listening to the Ode to Joy. I’d like to be looking at the mountains, and to have “I love you” be my last words. Never mind that these are just fantasies. Fantasies are a vital part of life — even the paradoxical fantasies of a “happy death.”  


The kind of happiness you describe in your poem, that sense of completion, seems indeed to pull the sting of death — it becomes a transition you can consider without fear. On my recent plane flight I suddenly realized I was not afraid — I haven't flown much, and was always afraid when I did. When you come to the point where you are satisfied, where you have found love, and have done all the forgiving you need to do, when you have let go of bitterness, resentment, and anger, a space opens up around you, and there is no fear. You don't stop loving life; in fact, it seems more full, more exciting, more glorious than ever, and though you don't want to lose it, you are not afraid of losing it. You have experienced the best, and it will forever be enough.


You’ve put it perfectly. I only want to repeat after you:

“You don't stop loving life; in fact, it seems more full, more exciting, more glorious than ever, and though you don't want to lose it, you are not afraid of losing it. You have experienced the best, and it will forever be enough.”


The longer I live, the more I love life — in spite of the aches and pains that flesh is heir to, especially as the repair process gets less and less efficient with aging. Now I especially enjoy the so-called little pleasures — even food tastes better, though perhaps it’s that I pay more attention to the taste. I spend time looking at clouds, trees, flowers — I glory in any beauty around me.

On further thought, there’s always some this or that you wish you’d experienced — but that’s where the wisdom of proverbs comes in: YOU CAN’T HAVE IT ALL. And when it comes to Prince Charming or that Great Teacher I was expecting to come, since I was ready (LOL): NOBODY’S PERFECT. Fortunately it’s enough to know that you’ve experienced plenty of wonderful things. If I can’t readily summon gratitude, I remember the Pacific Ocean: it’s the largest in the world, a first-rate ocean — and it’s within “easy commute” of where I live. A first-rate ocean! How could I complain of bad luck when this holds true . . .


The first, happy year with M (not the person in the Mt. Abel poem), he said to me, “If I had to die right now, I wouldn't mind. I could just go anytime.” I knew what he meant: life had finally granted him the fulfillment he wanted. He was so sated with happiness that he felt calm and accepting — and, if need be, willing to let go of life with gratitude.

I knew, because even at a very unhappy time in my youth I experienced a similar serenity and a similar perception of being ready to die, even though I was only 28. Just before my most serious surgery, I realized (an unforgettable minute when it all flowed to me) that, for all the misery I’d also experienced, life had given me great gifts and blessings. I had known great love; I didn’t know motherhood, but I didn’t resent it because now I didn’t have to worry about leaving an orphan. I had had the best of literature, art, and music; I’d seen gorgeous scenery; my Polish summers were a paradise of nature, even the time I got chased by hissing geese that nipped my shins.

I felt reconciled to the possibility of dying, even though I hadn’t yet “done” anything to speak of. That was irrelevant somehow. I felt peaceful and accepting: life had been generous to me; I didn’t feel cheated.

Occasionally this theme appears in poetry: in Keats’s “Ode to a Nightingale,” Sexton’s “The Starry Night,” Hölderlin’s “To the Fates.” Hölderlin says he’ll enter the world of shadows content after he’s had his fill of singing: “Once I lived as the gods; more is not needed.” Keats and Sexton want to die sated with beauty: “Now more than ever seems it rich to die”; “Oh starry, starry night! This is how I want to die.”

And there is Jack Gilbert’s wonderful title: “We Have Already Lived In the Real Paradise.” It’s all in the title; more is not needed.

It’s not dying we dread, but not having lived.

Francis Picabia: I see again in memory my dear Udnie, 1914 (allegedly inspired by a Polish dancer). The title is very important, I think. It moves us. Ultimately, that's what remains: the memory of love, of tenderness, of being accepted and valued in that incredible way.

The amateur star-gazers on Mt. Abel were quite a sight, each with a “bigger-than-yours” telescope. But lunar eclipses and meteor showers are minor things compared to the overwhelming discovery that the universe is expanding.



~ “The Hubble Space Telescope is named for this astronomer. Why? It’s because Hubble’s work was pivotal in changing our cosmology: our idea of the universe as a whole.

Most astronomers 100 years ago believed that the whole universe consisted of just one galaxy, our own Milky Way. In the 1920s, Hubble was among the first to recognize that there is a universe of galaxies located beyond the boundaries of our Milky Way.

He also showed that our universe of galaxies is expanding.

During the 1920s, Edwin Hubble observed stars that vary in brightness in a patch of light known at the time as the Andromeda nebula. He knew that these stars changed in brightness in a way that depended on their true brightness. He then saw how bright they looked to find the distance to the Andromeda nebula.

At the time, many astronomers believed that the Andromeda nebula was a forming solar system, located within the Milky Way’s boundaries. Hubble showed that this patch of light was really a separate galaxy – what we know today as the Andromeda Galaxy – the nearest large spiral galaxy beyond our Milky Way.

As soon as other nebulae were revealed as separate galaxies, the known universe got much bigger!

But was this huge universe stationary? Or was it expanding, or contracting?

The answer involved the light of galaxies as a whole. Astronomers observed that the light of distant galaxies was shifted toward the red end of the light spectrum. This red shift was interpreted as a sign that the galaxies are moving away from us. Hubble and his colleagues compared the distance estimates to galaxies with their red shifts. And – on March 15, 1929 – Hubble published his observation that the farthest galaxies are moving away faster than the closest ones.

This insight became known as Hubble’s Law. It was the first recognition that the galaxies are moving away from each other – that our universe is expanding.

It’s said that Albert Einstein was elated to hear of Hubble’s work. Einstein’s Theory of Relativity implied that the universe must either be expanding or contracting. But Einstein himself rejected this notion in favor of the accepted idea that the universe was stationary and had always existed. When Hubble presented his evidence of the expansion of the universe, Einstein embraced the idea. He called his adherence to the old idea “my greatest blunder.”

But the story of Hubble’s great insights begins earlier. In 1908, an astronomer named Henrietta Leavitt had discovered a relationship between the period and luminosity of a class of pulsating stars called Cepheid variables. By timing its period, astronomers could work out the true luminosity of a Cepheid – and by comparing the true luminosity with the observed brightness, they could work out its distance.

This worked fine for judging distances inside the Milky Way, but it wasn’t until the 1920s that telescopes existed that were powerful enough to observe Cepheids in other galaxies. Hubble spotted his first Cepheid in the Andromeda ‘spiral nebula’ in 1924.

The pulsation of the Cepheid variables let them estimate true distances to these objects. That’s how they showed that the objects are really separate galaxies, located extremely far away.

The nearest large galaxy, the Andromeda galaxy, is about 2.5 million light-years beyond our Milky Way. But other galaxies extend around us in space for many billions of light-years.” ~
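The chain of reasoning in the quoted passage (a Cepheid’s period gives its true brightness, true versus apparent brightness gives its distance, and redshift velocity against distance gives the expansion) can be sketched numerically. This is a minimal illustration only; the period-luminosity coefficients and the Hubble-constant value below are round modern-style numbers I am assuming for the example, not Leavitt’s or Hubble’s historical fits:

```python
import math

def cepheid_absolute_magnitude(period_days):
    # An illustrative Leavitt (period-luminosity) relation for classical
    # Cepheids: M_V ~ -2.43 * (log10(P) - 1) - 4.05. Coefficients are
    # round example values, not the historical calibration.
    return -2.43 * (math.log10(period_days) - 1.0) - 4.05

def distance_parsecs(apparent_mag, absolute_mag):
    # Distance modulus: m - M = 5 * log10(d / 10 pc), solved for d.
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

def recession_velocity_km_s(distance_mpc, hubble_constant=70.0):
    # Hubble's law: v = H0 * d, with H0 in km/s per megaparsec
    # (70 is a round modern value, far from Hubble's 1929 estimate).
    return hubble_constant * distance_mpc

# A hypothetical Cepheid with a 30-day period, seen at apparent magnitude 19:
M = cepheid_absolute_magnitude(30.0)
d_pc = distance_parsecs(19.0, M)
print(f"absolute magnitude ~ {M:.2f}, distance ~ {d_pc / 1e6:.2f} million parsecs")

# A galaxy 20 megaparsecs away would recede at roughly:
print(f"v ~ {recession_velocity_km_s(20.0):.0f} km/s")
```

Even with these toy numbers, the 30-day Cepheid comes out hundreds of thousands of parsecs away, far outside the Milky Way’s bounds: the same style of calculation that let Hubble place the Andromeda “nebula” beyond our galaxy.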


The last time I browsed in a New Age bookstore (which already seems like a lifetime ago; the bookstore is closed now, like so many bookstores), I happened on a book that announced that the whole UNIVERSE is supposed to be consumed by fire, not just the earth. Those billions of galaxies, who needs them? Well, sometime by 2050 they will all be blotted from existence.

Stars, if you remember, are to fall down to the earth as the trumpets sound and the moon turns to blood — but for some reason I was oblivious to the fact that the belief in the end times includes the whole universe, scheduled for an imminent extinction. The Horsehead Nebula in Orion that people love so much? Blow a good-bye kiss toward Orion, the celestial hunter. No more Sirius, his bright dog star, or Capella, the little she-goat near Polaris. In fact, kiss the whole universe goodbye, since the end could come anytime. Haven't you seen the bumper sticker that says, “In the event of Rapture, this vehicle will be unmanned”?

The book made no reference to astrophysics and the expanding universe, the galaxies speeding away from one another — a different kind of apocalypse, a vanishing that perhaps will be constrained by dark matter and dark energy — perhaps. With so many immediate problems, it’s a bit difficult to ponder the universe a billion years from now. No, the book was “spiritual.”

New Age is mainly crypto-Christianity, I’ve decided. Instead of Jesus, they talk about the Holy Spirit. And New Age is in decline these days. No one seems interested in Lemuria anymore, or in how “two entities of light” appeared in anyone’s living room. Empires rise and fall, Yahweh is dead, Jesus has grown pale and blond to look like a Norwegian; the Holy Spirit still keeps hovering, but in fewer and fewer bookstores and “centers for creative living.” Yes, this is the end of the old world order, but isn’t it always the case?

Strange, those Lemurian warriors don't appear so ferocious. So why has the Vatican been suppressing the evidence of their existence?


End of the world? What we do have is events that change the world forever. Even if The Economist can say, “As the world marks the centenary of the October revolution, Russia is once again under the rule of a tsar,” it’s not as before. The revolution raised questions to which we are still seeking answers. 



~ “In Russia, Putin’s state knows that the revolution matters, which puts it in an odd position. Committed to capitalism (gangster capitalism is still capitalism), it can hardly pitch itself as an inheritor of an uprising against that system: at the same time, official and semi-official nostalgia for the symbolic bric-a-brac of Great Russia, including that of Stalinist vintage, precludes banishing the memory. It risks being, as historian Boris Kolonitsky has put it, “a very unpredictable past”.

In one sense it’s uncontroversial that 1917 matters. After all, it is recent history, and there’s no arena of the modern world not touched by its shadow. Not only in the social democratic parties, shaped in opposition to revolutionary approaches, and their opponents of course, but at the grand scale of geopolitics, where the world’s patterns of allegiance and rivalry and the states that make up the system bear the clear traces of the revolution, its degeneration and decades of standoff. Equally, a long way from the austere realms of statecraft, the Russian avant-garde artists Malevich, Popova, Rodchenko and others remain inextricable from the revolution that so many of them embraced.

Their influence is incalculable: the cultural critic Owen Hatherley calls constructivism “probably the most intensive and creative art and architectural movement of the 20th century”, which influenced or anticipated “abstraction, pop art, op art, minimalism, abstract expressionism, the graphic style of punk and post-punk … brutalism, postmodernism, hi-tech and deconstructivism”. We can trace the revolution in cinema and sociology, theatre and theology, realpolitik and fashion. So of course the revolution matters. As Lenin may or may not have said: “Everything is related to everything else.”

. . . So to go back to the question: why does the revolution matter? Because of what was right about it, and what went wrong. It matters because it shows the necessity not only of hope but of appropriate pessimism, and the interrelation of the two. Without hope, that millennial drive, there’s no drive to overturn an ugly world. Without pessimism, a frank evaluation of the scale of difficulties, necessities can all too easily be recast as virtues.

Thus after Lenin’s death the party’s adoption of Stalin’s 1924 theory of “socialism in one country”. This overturned a long commitment to internationalism, the certainty that the Russian revolution could not survive in isolation. The failure of the European revolutions provoked this – it was a shift born of despair. But announcing, ultimately celebrating an autarchic socialism was a catastrophe. A hard-headed pessimism would have been less damaging than this bad hope.

Almost 60 years before the revolution, the radical writer Nikolay Chernyshevsky published What Is to Be Done?, a long political novel with an immense impact on the socialist movement, especially on Lenin, who, in 1902, named his own seminal tract on organization after the book. Chernyshevsky’s depiction of the hinge point, a fulcrum from history to future possibility, comprises in its entirety two rows of dots. Informed readers would understand that behind the extended ellipsis was revolution. Thus Chernyshevsky evaded the censor, but there’s something religious, too, eschatological, in this unwriting, from this atheist son of a priest. Apophatic theology is that which focuses on what cannot be said of God: an apophatic revolutionism, unashamed to go beyond words.” ~


For me, the most important sentence here is: “ultimately celebrating an autarchic socialism was a catastrophe.” Autarchic means self-sufficient, sealed off from the world; in practice that isolation quickly became autocratic and dictatorial, not allowing any questioning or dissent. When Lenin tried to dismiss the importance of freedom by asking, “Freedom for whom? To do what?” Rosa Luxemburg replied, “Freedom is always and exclusively freedom for the one who thinks differently.”


from the Guardian review of Miéville’s October:

~ “The worst aspect of Stalinism – the unpredictability and arbitrariness of terror – ended after the dictator’s death. There followed 35 years of what western analysts disparage as stagnation but which for most Russian families was their first experience of economic sufficiency and political stability. This massive post-Stalinist change was deliberately obscured in the west during the cold war so as to provide one more justification for the argument that communism cannot be reformed but must be destroyed. As a result, most western analysts and politicians treated and still treat the historiography of the Soviet Union as a single block of time rather than dividing it into two periods, equal in their number of years but with radically different contents, one of turbulence, war and invasion, the other of order, peace and security. Because of this misinterpretation, outsiders fail to understand why many middle-aged and elderly Russians look back on the USSR with nostalgia. Its collapse was followed by a new wave of upheaval, which Putin is thanked for ending.

China Miéville’s contribution in October is to get away from ideological battles and go back to the dazzling reality of events. There is no schadenfreude here about the revolution’s bloody aftermath, nor patronizing talk of experiments that failed because they were doomed to fail. Known as a left-wing activist and author of fantasy or what he himself calls weird fiction, Miéville writes with the brio and excitement of an enthusiast who would have wanted the revolution to succeed. But he is primarily interested in the dramatic narrative — the weird facts — of the most turbulent year in Russia’s history: strikes, protests, riots, looting, mass desertions from the army, land occupations by hungry peasants and pitched battles between workers and Cossacks, not just in Petrograd but along the length and breadth of a vast country.

He is equally fascinated by the verbal fisticuffs, the debates and arguments at the epicenter between Socialist Revolutionaries, Mensheviks, Kadets, Kerenskyites and Bolsheviks. Miéville brings to life the democratic practices that continued to be observed to an astonishing degree even as law and order crumbled – struggles over the wording of Pravda editorials, votes (for, against, abstentions) taken at meetings of the Duma and the All-Russian Congress of Soviets, a rash of municipal elections. This was not a contest of warlords like those that mark many other revolutionary struggles but a battle of pamphlets and verbal jousting between men in suits endowed with huge oratorical talent.

There is wonderful detail on small points too. On the sealed train that brought him and his comrades from Switzerland back to Russia, Lenin was the man who organized the queuing system for the loo. In July, Trotsky acted as a moderate, telling the hard-left advocates of “all power to the Soviets” to stay calm and stick to sentry duty even as Cossack forces in Petrograd were killing workers on the streets. In August, as general Kornilov mounted his counter-revolutionary putsch, prime minister Kerensky bellowed operatic arias in his bedroom to try to steady his nerves.

Miéville does not neglect the Muslim issue, overlooked until recently – some contemporary scholars such as Jonathan Smele now see the anti-Russian uprisings in central Asia in the summer of 1916 as the true start of what turned out to be several overlapping civil wars – but records how the All-Russian Muslim Conference in May 1917 passed 10 principles, including women’s right to vote, the equality of the sexes and the non-compulsory nature of hijab.” ~


There is no question that the Revolution matters. It has raised questions to which we are still seeking answers — about an optimal economic system for various parts of the world, about the feasibility and fragility of democracy given the growing influence of the very rich. I’ve always had the feeling that “someone had to do it” — someone had to perform the huge experiment of trying to revamp the fundamentals, or else it would be dreamed of forever and ever. I also think that the abolition of democracy and the suppression of dissent doomed that experiment.

Regulated capitalism with a strong social security net seems to work. Of course nothing is ideal, without drawbacks. But non-violent solutions seem best.

One of the many bad things about a violent revolution is that in order to perpetuate its non-democratic rule it has to rely on heavy propaganda, i.e. lies. 


This week's blog has a theme running through it: the dynamic, in history, culture, and thought, between change and resistance to change; revolution and its transformation into repressive autocracy; the ever-expanding universe and the static, steady-state, unchanging universe we believed in until less than 100 years ago. Violent revolution, the overthrow of a stagnant and oppressive system, seems a daring new revelation in thinking and action, followed by the formation of yet another monolithic repressive system, disallowing further change, new thoughts, any challenge to the new powers that be. France, Russia, China. Not to mention Christianity, another bit of radical thinking quickly subsumed by stultifying orthodoxy.

Perhaps we simply can't tolerate perpetual revolution, endless change. Maybe that's why tribalism is so prevalent, so difficult to challenge or abandon. Perhaps thinking 'outside the group' is just too terrifying, feels too much like apocalypse, like losing everything, all safety, all meaning, all connection. Nothing rational in any of this, so of course rational argument, careful exposition, changes no one's mind in these circumstances. The whole idea here is 'no thinking allowed’: no exegesis, no argument, no shaking the boat. Memorize the party line, and recite it. Loudly. So in times of change some grow nostalgic, longing for that old, narrow, predictable world that has somehow become a Paradise lost.


Part of it is the way our brain works. Whatever we grow up with becomes “normal,” and the deeper brain structures abhor the loss of the familiar. That’s one reason immigrants almost invariably suffer from homesickness (and that’s a real sickness, with crying fits and what feels like a tightness around your heart — the first two years are the worst), and it doesn’t matter that the change has been mostly for the better. Their homeland now becomes paradise lost.

Tsarist Russia became paradise lost to many, and not just to those who managed to leave the country and thus experienced the common variety of immigrant trauma. Now the Soviet era is a source of nostalgia for millions. To many of those who grew up during that time, those were the glory days. Presenting statistics is useless.

It’s only human to hate sudden big change — but small incremental changes generally don’t provoke vehement emotions. Of course it all depends — sometimes there can be no half-way measures. You either abolish slavery, or you don’t.

In the US there is a special roadblock to change — the Evangelical voters. Since they expect the End of the World any time now, but certainly by 2050, it’s obviously pointless to go to college or to try to protect the environment. Religion has typically been an obstacle to progress, but in this country the reactionary nature of religion is particularly acute.

Still, that’s an excellent insight about early Christianity and how soon it fossilized into a reactionary, dissent-suppressing orthodoxy. For one thing, it allied itself with the rich and powerful — as religion usually does.


~ “Manson ordered Atkins, Krenwinkel, Watson, and Linda Kasabian to begin “Helter Skelter,” his term for an end-of-days battle he hoped to start between whites and blacks (loosely inspired by The Beatles song of the same name). They would murder whites and frame blacks to start the race war, which blacks would win at first. After black people took over the world, Manson believed, his family would hide out in the desert and eventually overtake them to rule the Earth.

The first killings took place at Polanski’s home in the Los Angeles neighborhood Benedict Canyon while he was away on the evening of Aug. 9, 1969.

The first victim was teenager Steven Parent, who was killed in his car while trying to leave the property. Parent was stabbed and also shot in the face.

The assassins then made their way into the rented home, where they repeatedly bludgeoned and stabbed the guests inside. Victims included hairstylist-to-the-stars Jay Sebring, writer Wojciech Frykowski and his girlfriend Abigail Folger (an heiress to the Folgers coffee fortune).

The most gruesome slaying was that of Tate, who was in the third trimester of her pregnancy. According to a Family member’s testimony, Tate begged the killers to spare her unborn child.

“Look, bitch, I have no mercy for you. You’re going to die, and you’d better get used to it,” Susan Atkins barked to Tate, before she and Watson repeatedly stabbed her to death. They then scrawled “pig” on the front door with Tate’s blood.

The next evening, around midnight, Manson led the same four killers plus Van Houten and Grogan to the home of grocery-store executive Leno LaBianca and his wife, Rosemary, to “show them how to do it.” Manson tied up the LaBiancas before leaving his minions to finish them off.

Leslie Van Houten held Rosemary LaBianca down and covered her face with a pillowcase while another Family member carved “War” into her husband’s stomach after stabbing him in the couple’s home. (Then they helped themselves to chocolate milk in the fridge.) Van Houten was also the one who scribbled missives on the house walls using their victims’ blood.

In a 1971 trial, Manson was convicted and sentenced to life for the 1969 murders of Donald “Shorty” Shea and Gary Hinman. When Shea, who was a ranch hand and stuntman on Wild West films, returned to Spahn Ranch with a black wife, it allegedly set Manson off.

The murders failed to incite the prophesied race war Manson predicted, but they signified a violent end to the ’60s dreams of the hippies that the Family seemed to emerge out of.

“[Manson] has no redeeming values,” Kay said. “And wanting to commit these murders and blame them on blacks to start a race war, I mean, that’s one of the worst motives that I ever came across in all my 37 years as an L.A. prosecutor.”

At 73, and now retired, Kay said he can still hear the sinister threats on his life made by Manson and his disciples.

“Squeaky [Fromme] and Sandy Good snuck up behind me and said they’re going to do to my house what was done at the Tate house,” Kay said. Fromme and Good faithfully appeared at court every day to support the Manson Family.

All of the Family members who were sentenced to death, including Manson, were spared when the California Supreme Court overturned the death penalty back in 1972 and commuted their sentences to life in prison. The state would later bring back the death penalty, but the life sentences for Manson and his killer kin stuck.

Kay isn’t blind to the irony that had the sentence gone forward Manson wouldn’t have become quite the diabolical deity that has haunted popular culture for decades.” ~


I don't think much was made of the swastika on the foreheads of the members of the “Family.” I don't remember anyone mentioning that it was meant as a symbol of white supremacy. The racist angle, if mentioned at all, was buried and lost amidst all the lurid details. Manson was written off as a psychopathic cult leader. He certainly was that, and deluded enough to think he and his Family would be the only white survivors of the race war, during which he’d hide underground in Death Valley.

Delusional, yes. A charismatic psychopath, yes — a phenomenon familiar to criminologists. But the racist aspect, the swastika tattoo — only now does it all seem to cohere. Only now do we see that Manson was not a product of the hippie counterculture — he merely learned how to exploit it to manipulate his followers. He was instead a blatant white supremacist and a forerunner of today’s alt-right.

from an earlier article in Newsweek:

~ “He referred to African Americans as “blackies” and was petrified of black Muslims and Black Panthers. He reportedly refused to associate with black inmates during his time in prison.

“Charles Manson was one of the most virulent racists that ever walked the planet,” Jeff Guinn, author of Manson: The Life and Times of Charles Manson, told Newsweek.

Manson grew up in the 1940s in West Virginia, where racism was rife. After moving to California and serving two stints in prison, he started “The Family,” indoctrinating its members with the idea that blacks would rise up and start a war with whites, an idea known as “Helter Skelter.” He believed that blacks would eventually win because they were essentially savages.

He regularly spewed racist remarks to the bikers who were supplying his cult with drugs, but told his hippie apostles he just said those things because that’s what they wanted to hear, his followers told Guinn when he interviewed them in prison.

To incite a race war, Manson ordered his devotees to carry out gruesome murders that they would try to pin on blacks by leaving behind clues, such as words used by black power groups like the Black Panthers scrawled in victims’ blood.

The charismatic leader got his followers to go along with him by threatening that they would be either butchered or enslaved by the remaining blacks if they didn’t do what he said.

“I keep being reminded of Charlie Manson when we see white supremacist groups. It’s almost like they’re copying the Charles Manson playbook,” Guinn said. “He’s certainly acting as a role model for people today.”

And while his name will forever be associated with a cult of LSD-fueled hippies and gruesome murders, some have started to recognize him for what he really is.” ~


~ “I can think of no better way to keep a culture from changing too much, or too fast, than ascribing divine authority to it. When you think about it this way, a whole lot more things start to make sense. For starters, here are three things we learn from thinking this way.

1) Suddenly it makes sense why it’s so hard to change a religious person’s mind.

The direct approach—critiquing the beliefs themselves—often produces little change in the thinking of the believer because the real strength of the belief system comes from something external to the beliefs themselves. The real strength of our beliefs lies in their ability to hold together the tribal identity.

Have you ever tried changing the mind of someone who believes things that are irrational or lacking in evidential support? The mental gymnastics they perform in front of you will leave you dizzy, especially if they are relatively intelligent (and yes, intelligent people believe irrational things, too). If they are less articulate, they will just dig their heels in and keep restating their belief, now in ALL CAPS, as if you didn’t just expertly disassemble the entire narrative undergirding their belief. It’s like talking to a brick wall.

But why? Why the backfire effect? Why is it so hard to change their minds about things that are so easily deconstructed?

I recall an article a few months back wondering aloud why Trump supporters seem convinced of everything the man ever says even after showing them he contradicts his own positions three times in a single week. They will defend anything he says or does, not because the actions or words themselves are rationally defensible, but because at some point the mantle for a particular group identity was placed on him and from that point forward it became about the tribal identity, not the man himself.

Would you go back and reread that last sentence? The reason Trump remains popular with his base no matter how dangerous or irresponsible (or demonstrably false) his tweets and off-script public statements become is that he’s become a symbol for a group identity, like a team mascot strutting the sidelines during a football game. People will root for their team no matter how consistently poor their performance because it’s not about the performance. It’s about the group identity.

2) This also helps to explain why people take it so personally when they learn you no longer believe the same things they believe.

How many of you had to break it to your parents that you no longer believe the central tenets of their religion? Did they react charitably, with sympathy, understanding, and grace? Or did they explode in anger, remorse, attempts at coercion, or possibly even a verbal assault because “How could you do this to us?!”

Wait, what? What do you mean “do this to us?” From your perspective, this wasn’t some kind of personal slight to them. It was an individual matter, an unavoidable consequence of following your own thought processes, your own search for truth, wherever it leads you. But that’s not how they experience it at all. To them, this was a personal slap in the face.

That doesn’t make any sense until you realize that religious beliefs are social constructs — they are the scaffolding around which communities organize themselves such that your departure from their belief system means you are undermining the social fabric through which their entire identity is woven. What will everyone think of them now?

There is virtually no unoffensive way to tell friends and family you no longer believe in their religion. To do so automatically takes something out from under their social edifice and makes the whole thing feel like it’s wobbling a little. That’s why they get so angry. That’s why they take it so personally. In their moments of greatest insecurity the nicest people in the world will say the meanest, most careless things because your departure fundamentally threatens their tribal identity. They almost can’t help it.

3) It also explains how positions on issues that are non-essential to a religion (like same-sex attraction) can become the hill they are ready to die on.

This one keeps surprising me. I’ve personally taken part in quite a number of discussions through the years about which beliefs are truly essential to the historic Christian faith — what C.S. Lewis called “mere Christianity” — and yet I cannot recall a single one of those discussions including “fighting the gays” as a key component to the gospel.

And yet. Disapproving of same-sex attraction has become a litmus test for evangelical and fundamentalist Christians the world over. Scrolling through your newsfeed, you could be forgiven for concluding that this is why Jesus came to earth—to rid the world of homosexuality—despite the fact that the man never said a single word about the subject. I guess it never came up. But still, you would think someone with a direct line to Heaven would have included at least one quick mention for future reference.

For the life of me, I cannot explain theologically how disagreeing on this single issue could equal a betrayal of the entire Christian faith. It doesn’t really add up in my mind. Except that it does once you realize that at some point in the recent past it was decided that this would be an identity marker for the tribe itself, and that was the end of the discussion. Once that association was made, the battle lines were drawn and now they’re willing to go down fighting over this.

One could argue that the key issue with this particular point is really family structure itself. Modern American churches are built around meeting the needs of the traditional American family, which means one man married to one woman with at least two or three kids needing entertainment, character formation, and good friends to play with. That’s the target audience for the evangelical and fundamentalist church (too bad if you’re single and way worse if you’re gay). That is the family structure they will fight to the death in order to preserve. Their survival depends on it.

Incidentally this would also explain the church’s over-the-top obsession with sex in general, or rather controlling how people do it. If you let people have sex outside of wedlock they may never get around to marrying and having those kids you need them to have so that they’ll start coming to church again (because who will teach the children morals?). If you allow the family to start looking like something other than the template around which their subculture is built, what will happen to the tribe as a whole? It would likely dissolve into the surrounding world and the identity would be lost forever.” ~

~ “In deep-red white America, the white Christian God is king, figuratively and literally. Religious fundamentalism is what has shaped most of their belief systems. Systems built on a fundamentalist framework are not conducive to introspection, questioning, learning, change. When you have a belief system that is built on fundamentalism, it isn’t open to outside criticism, especially by anyone not a member of your tribe.” ~ Forsetti’s Justice

As the caption in Esquire said, “Wake up and smell the white supremacist theocracy.”


Another aspect of rural fundamentalism is practically no knowledge of the larger world. In the rare instances of foreign travel, these are religious-themed tours, e.g. In the Footsteps of Paul. Fundamentalists travel in a bubble, carefully insulated from any contact with non-Christians and the riches of non-Christian cultures. It’s amazing that it's possible to live in the modern world yet not be in true contact with it.

But the basic problem is absolutism. God is assumed to be unchanging, so the very idea of change is heresy. That religions and other systems of belief evolve is a horrible, inadmissible idea to anyone in the absolutist-eternalist camp. Even clothes should be those of a previous era — especially of course women’s clothes should harken back to the “purity” of an unreal, idealized past. (What does the future hold? The end of the world.)

For me the critical factor in outgrowing religion was learning about other mythologies. This came before I learned, for instance, that Elohim was a plural, “the gods,” implying an evolution from polytheism to monotheism. And before I learned how freely made-up the stories were: according to the historical record, there was no “slaughter of the innocents” — the story was invented to create a motive for the escape to Egypt so that there could be a return from Egypt, a parallel with Exodus (the slaughter of the Egyptian first-born being another parallel); the gospel writers were concerned with aligning Jesus with the great events and figures of the Hebrew tradition. His bad fit as the Messiah required much distortion and outright fictional support.

But, first of all, at a certain age I could not help but think for myself, in spite of trying for two years to suppress my thinking. One day, and literally in one instant, the dam burst; thinking happened and could not be reversed. But I realize that this is far from a universal experience (nevertheless, it may be more common than we are aware of: according to a Catholic source, 80% of Catholics leave the church by the age of 23).

Forsetti states that evangelical safeguards against thinking are formidable indeed. This reminds me of Hannah Arendt’s statement about the refusal to be a person and preference for “following orders” that stem from the refusal to think for yourself. But is this refusal a conscious choice? There is the early-childhood indoctrination and the threat of hell for incorrect belief. Any spark of independent thinking (“the whisper of Satan”) is quickly extinguished with punishment, whether external or (more likely) internalized.

Somewhat on a tangent, I love the folk etymology that “Israel” supposedly means “struggling with god.” While scholars think the name is probably pre-Semitic and only the “El” part, the name of the chief Canaanite god, is clear, I love the notion of the kind of deity you can argue with. This leaves room for dissent and wildly contradictory interpretations — rather the opposite of “submission.”

(I realize that reality is no idyll of free inquiry. Any orthodoxy finds ways to suppress real thinking, real questioning. Spinoza got excommunicated from the Jewish community because he dared to think too differently, along the lines of pantheism. Today’s liberal Judaism is much more tolerant — and some say that’s exactly why it’s doomed; once you start thinking, you become an agnostic or maybe a pantheist; in any case, you’re no longer scared and obedient.)

(A shameless digression, thanks to Neil Carter: here is Jesus kicking away the whole notion of kosher food. “Are you so dull?” he asked. “Don’t you see that nothing which enters a person from the outside can defile them? It doesn’t go into their heart but into their stomach, and then out of the body.”
[In saying this, Jesus declared all foods clean.] ~ Mark 7:18-19)

I distinctly remember a part of this from my church-going years. The gospels were read over and over, so it was easy to memorize certain sayings without even trying. I do remember the part that says, “What goes into your mouth cannot defile you.” But at the time I had no real understanding of kosher versus non-kosher food, even though the catechism-teaching nun did explain about hooves and how animals with divided hooves were regarded as “unclean.” It sounded bizarre and made no sense to us children.

What was never discussed was the terrific courage it took for Jesus to say that no food defiled us. It was a heresy for which he could have been stoned! Maybe the apostles were strong working-class men because they had to function as bodyguards — that wouldn’t surprise me one bit.

Imagine if all people were familiar with other cultures, other traditions, other mythologies.

ending on beauty

Do not move
Let the wind speak
that is paradise

~ Ezra Pound, Notes for Canto CXXX

Saturday, November 18, 2017


Methuselah tree: a 4,845-year-old Great Basin bristlecone pine (Pinus longaeva) growing high in the White Mountains of Inyo County in eastern California. I have visited the White Mountains and got to see Methuselah and its ancient siblings, amazing sculptures carved by wind and the scarcity of water and nutrients.


For a long time we have been in this car,
His hands on the wheel, the sun
Finishing behind the building

And a couple walks by, tucked into one coat
As if against a wind. I am not sure
If he has seen them, but he goes on

This talk of his and I do not watch
His body anymore, the light being
What it is, already going. He repeats

He wants to do this alone
But will do everything they tell him
And will do nothing more than that

While now and then traffic comes
From up behind then veers around
As we sit into dark, streetlight not yet

Started, his head against the window
Like a bird, waiting; we have been at this
For hours as the light changes, trying to love

What has not yet been written and then
We are still here when the couple turns back;
They were only walking around the block

Or maybe they return because the snow
Has begun and from this sudden world
The unbroken comes, and nothing is wasted.

~ Sophie Cabot Black, “The Exchange”


Why have I chosen this poem just before Thanksgiving? Because it praises life. Bear with me.

Other poems in the book make clear that the speaker’s partner is dying of cancer. The poems are a long death watch — though the surprise here is a sprinkling of poems about Abraham and Isaac, with the absence of “the animal that was supposed to save us” and other twists. Still, the main story is the long dying of a man still young, just settling into his career, the couple buying their first house. And then comes the diagnosis, and the whole world changes as the husband and wife (whether they are officially married doesn’t matter) begin to see what will no longer happen:

The meadow you meant to walk all year;
That part of the woods you’ve never been. 


Note how in this poem the setting of the sun and the growing dusk become symbolic, but not too conveniently so. When the speaker says

I do not watch
His body anymore, the light being
What it is, already going

we realize that it’s not so much the literal waning of the light as the waning of the body that is meant here; it’s painful to look at someone in the final stage of cancer. They tend to be emaciated like the victims of concentration camps; suddenly, as if overnight, they also look extremely aged. But Sophie Cabot Black is no Sharon Olds, who tends to wallow in physical detail (note all the critics who accuse Olds of “oversharing”). Black spares us the physical description of terminal cancer; when I mention “emaciated” and “suddenly extremely aged,” I speak on the basis of eyewitness experience. But the author of this poem collection about the dying of her partner avoids the physicality of disease. The reader is left to imagine the dying man as relatively young and probably attractive — healthy and active before the diagnosis.

Another interesting and unexpected detail here is the couple walking by, “tucked into one coat.” This is something somewhat childlike and “fun” that new couples may do, usually very young couples, at the beginning of their relationship. The shared coat symbolizes their unity and their sharing of whatever they have, their mutual nurturing. This is a brilliant detail — one relationship is being born while another is ending, not through anyone’s fault or lack of courage or kindness (“He wants to do this alone” implies that he doesn’t want to increase anyone’s suffering by having them watch his actual last moments; that's his last gift to others).

The poem ends on the continuity of life: the young lovers tucked into one coat return, and it begins to snow, making the world look “unbroken.” 

Note that here snow doesn’t function as a shroud. But snow as shroud is such a frequent literary use that the metaphor is probably in the reader’s consciousness. The speaker sets up the opposite: snow as unbrokenness, as the continuity of life. But the most potent symbol of that is the couple tucked into one coat.

Can we believe that “nothing is wasted”? We want to. At the very least, we imagine that not everything is wasted. Some important traces and memories will remain for a while. But maybe that is not all that important. What is important is that life will indeed go on, with others continuing to fall in love, so love will go on. As Borges observed, others will be our immortality.   


Yes, we want to believe “nothing is wasted,” and while I find the idea of reincarnation less than appealing, and somewhat absurd, I find great comfort in things like Einstein’s famous equation, the conservation of matter and energy, the long chain of connection and change in our genetic history, and that of all forms of life. I love that something of the dinosaurs is remembered in birds, and something of Neanderthals in our own DNA.

It seems that most religious systems present a universe already completed, static, all things already ordered and accounted for, all the rules and meanings written down — everything already finished, the end known, we simply have to follow the steps set out for us. The world of science is so much more of an adventure — a place of endless challenge and discovery, with room for play and hope, creation and freedom. A place where maybe nothing much is lost, and nothing wasted.


Sister when I look at you
I see our Mother’s face
When I speak
I hear her voice
And I know we have more of her
Than any of the things she left behind
Nothing is absolutely lost
Like a thrifty housewife
Time keeps every scrap
To use and reuse
Each atom danced out
Again and again
Cards sorted and resorted
Through a million hands
Each slap and shuffle
A new chance
In the old and endless game
Whose rules we only faintly


I too love the conservation of matter and energy and all the things that you mention. But I mourn the fact that it takes so long to learn how to live, to acquire some wisdom, to learn patience and other supremely useful and important skills . . .  and to learn how to really enjoy life . . .  and after so much time spent learning the hard way — I feel I'm finally ready to begin! — there isn’t that much time left. And all that personal wisdom will end with the individual, since others have to make their own mistakes.

Each death is the end of the world — that person’s rich and unique universe. Yes, of course the atoms will be recycled — but the inner life ceases like a flame once the fuel is exhausted. And “there’ll never, never be another you” — the refrain of the song from a justly forgotten movie, The World According to Garp.

I am slightly consoled only when I think that sometimes something we read affects us deeply and becomes part of our psyche. Perhaps something that I said and/or wrote will touch others in a positive way. But as for continuing in the sense of a literary afterlife, for instance, I stopped kidding myself a long time ago. My liberation was the insight that we are of the moment, and belong entirely to that moment. Yes, YOLO and Carpe Diem, but tempered with whatever wisdom (which is mostly the wisdom of kindness) we’ve managed to acquire.  

Pierre Paulus de Châtelet (Belgian): November in Auderghem, 1905


An example of how a painter can make a very mundane scene look beautiful. 


Just as a poet can transform an ordinary event into something transcendent. One secret is choosing just the right details — “less is more.”


~ “And reincarnation? Really? If that were real, wouldn’t there be some proof by now? A raccoon spelling out in acorns, ‘My name is Herb Zoller and I’m an accountant.’” ~ Bill Maher


A wild raccoon spelling out ANYTHING would make me reconsider my atheism.

I'm also reminded of a Kabbalist rabbi who said that because of the population increase we now have only one-eighth of a soul, compared to the good old days.


~ “For many patients with terminal diseases, Coyle has observed, this awareness [of imminent death] precipitates a personal crisis. Researchers have given it other names: the crisis of knowledge of death; an existential turning point, or existential plight; ego chill. It usually happens as it did with my mother, close to when doctors break the news. Doctors focus on events in the body: You have an incurable disease; your heart has weakened; your lungs are giving out. But the immediate effect is psychological. Gary Rodin, a palliative-care specialist who was trained in both internal medicine and psychiatry, calls this the “first trauma”: the emotional and social effects of the disease.

The roots of this trauma may be, in part, cultural. Most people recognize at an intellectual level that death is inevitable, says Virginia Lee, a psychologist who works with cancer patients. But “at least in Western culture, we think we’re going to live forever.” Lee’s advanced-cancer patients often tell her they had thought of death as something that happened to other people—until they received their diagnosis. “I’ve heard from cancer patients that your life changes instantly, the moment the doctor or the oncologist says it’s confirmed that it is cancer,” she says.

The shock of confronting your own mortality need not happen at that instant, Coyle notes. Maybe you look at yourself in the mirror and suddenly realize how skinny you are, or notice your clothes no longer fit well. “It’s not necessarily verbal; it’s not necessarily what other people are telling you,” Coyle says. “Your soul may be telling you, or other people’s eyes may be telling you.”

E. Mansell Pattison, one of the early psychiatrists to write about the emotions and reactions of dying people, explains in The Experience of Dying why this realization marks a radical change in how people think about themselves: “All of us live with the potential for death at any moment. All of us project ahead a trajectory of our life. That is, we anticipate a certain life span within which we arrange our activities and plan our lives. And then abruptly we may be confronted with a crisis ... Whether by illness or accident, our potential trajectory is suddenly changed.”

In this crisis, some people feel depression or despair or anger, or all three. They grieve. They grapple with a loss of meaning. A person’s whole belief system may be called into question because “virtually every aspect of their life will be threatened by changes imposed by the [disease] and its management,” Lee has written. In a small 2011 Danish study, patients with an incurable esophageal cancer reported that after their diagnosis, their lives seemed to spin out of control. Some wondered why they had received a fatal diagnosis, and fell into despair and hopelessness. “I didn’t care about anything,” one patient said. “I had just about given up.”

In the 1970s, two Harvard researchers, Avery Weisman and J. William Worden, did a foundational study on this existential plight. Newly diagnosed cancer patients who had a prognosis of at least three months were interviewed at several different points. At first, for almost all the patients in the study, existential concerns were more important than dealing with the physical impacts of disease. The researchers found that the reckoning was jarring, but still relatively brief and uncomplicated, lasting about two to three months. For a few patients, the crisis triggered or created lasting psychological problems. A few others seemed to face the crisis, then return to a state of denial, and then double back to the crisis—perhaps more than once. In the study, the researchers describe a patient who was told her diagnosis, only to report to interviewers that she didn’t know what it was—and then make it clear she wasn’t interested in receiving a diagnosis in the near future.

Palliative-care doctors used to think that a patient was either in a state of denial or a state of acceptance, period, Rodin says. But now he and his colleagues believe people are more likely to move back and forth. “You have to live with awareness of dying, and at the same time balance it against staying engaged in life,” he says. “It’s being able to hold that duality—which we call double awareness—that we think is a fundamental task.”

Whether or not people are able to find that balance, the existential crisis doesn’t last; patients can’t remain long in a state of acute anxiety. Coyle has found in her work that later peaks of distress are not usually as severe as that first wave. “Once you’ve faced [death] like that once, it’s not new knowledge in your consciousness anymore,” she says.

For most, figuring out how to adapt to living with a life-threatening disease is a difficult but necessary cognitive process, according to Lee. When patients do emerge on the other side of the existential crisis, she finds that many are better off because of it. These patients are more likely to have a deeper compassion for others and a greater appreciation for the life that remains.

To arrive there, they have to squarely face the fact that they’re going to die. “If you’re an avoidant person, and you don’t like to think about these things, that works better when life is going well,” Rodin says. “It just doesn’t work well in this situation because reality doesn’t allow it. It’s like trying to pretend you don’t need an umbrella or something, or it’s not raining, when it’s pouring. You can do that when it’s drizzling, but eventually, you have to live with the rain.”


I thought this would be an interesting follow-up on Sophie Black's poem. After the shock and the crisis, people adjust to the thought and re-engage with what life remains.

For me Christopher Hitchens remains a model of how to die: he kept writing up to the very end. In spite of the pain and the horrible intrusion of chemotherapy and other torturous medical procedures, he kept on working, contributing. 


The discussion about death is interesting from the point of view of someone 91 years old. I have known for some time that death is imminent and have resolved many issues. I guess because so far I’ve had no terrible illnesses like cancer I have accepted what has come and what will be. I have been in the ICU once and near death. You live “as if we think we will never die,” a quote from my villanelle.


Coming from you, this line certainly has special authority: “we live as if we think we will never die” — ultimately we have to, for the sake of sanity, even after a terminal diagnosis, or at an age when it could happen any time. At the same time, of course, at one level we are perfectly aware that there comes a point when you don’t begin any long-term projects, or buy a huge “lifetime” supply of anything. And whatever day we still wake up to becomes infinitely precious: that’s our “eternal moment” (Milosz’s phrase).


Dying, except in sudden traumatic situations, is not passive, it is an act. My parents each died in ways singularly their own. My mom was suffering from end stage emphysema, my dad in the last stages of Parkinson’s. Mom spent her last few days in Hospice, dad had been in a nursing home for several years. Mom was to all appearances in a coma, or state of unconsciousness, that came suddenly, and we were called to her bedside. For the last  two days of her life we sat around her bed, telling old family stories, remembering, sharing memories. My nephew, the last “baby” mom had taken care of was coming in from out of state. When he arrived, he held her hand and talked to her, and expressions flitted over her face. In a few hours, she died, surrounded by all her children.

My dad was a very private and non-demonstrative person. During his time in the nursing home we were vigilant, someone there almost every day, particularly one of my younger brothers, who was extremely faithful in his attendance. He left after dad had dinner one day, and while he was alone, before anyone else entered, dad died. Both of these deaths reflected the personalities and preferences of each as individuals. I have seen patients in hospitals, seemingly nowhere near death, say they were going to die, and then die, surprising everyone but themselves. Some die in fury —“raging against the dying of the light.” Others die with a sweet patience and endurance that seem almost impossible.

Of course I think our experience of death has been so changed that it is easy to believe we won’t die. Most now die in hospitals, not at home, bodies are not “laid out” as they used to be, at home, but in funeral “parlors” where “viewing time” is limited, and no one sits up through the night with the body, as used to be common. Many grow to adulthood never having seen a dead body. And the accomplishments of medicine, along with the separation of sickness and death from ordinary life, have almost resulted in a feeling that death is somehow “optional”—that medicine’s extreme measures can prolong and prolong life indefinitely, beyond all sense and reason. This distancing is disappearing lately in particularly horrific ways, however, with mass murders, wars that aim to kill civilians in great numbers, the huge losses suffered by refugees fleeing these terrors, the active genocides in progress now — bodies of children washed up on beaches, numbers beyond counting…horrific death has almost become ordinary.


Mary, thank you for your generous sharing.

I would like my last words to be “I love you” — even if spoken to a stranger, or no one in particular. But I also know that usually the dying person is unconscious, or at least mentally confused (by our standards) — in a world of their own.

I'm glad that I’ve seen dead bodies, both in childhood and in adulthood — not many, but enough to give me a strong sense that “this is no longer that person.” The change in appearance can be startling. It’s when I was looking at my dead mother’s body that I thought, “We are of the moment. We belong to the moment.” It wasn’t the first time those words surfaced in my mind, but this time the insight solidified.


Oriana: In the long run, nothing matters. But we don't belong to the “long run,” much less to the “cosmic perspective.” We belong to the moment. And then it matters how we live and how we die. It matters because we are not isolated individuals: we touch the lives of others.

Aside from continuing productivity, even in the face of death there is also a desire for continued deep connection:


Jacobsen’s apprehension of his own mortality would manifest itself in perhaps his greatest work, the novel Niels Lyhne (originally titled The Atheist), which Jensen calls “the most death-haunted novel in European literature.” In its bizarre alloy of detached detail and dreamy, horrific awe, it is a novel “in which the strands of both realism and modernism are greedily imbricated.” Niels, the titular protagonist, loses his faith at the age of 12 following the death of a beloved aunt. Over the course of the novel he runs a harrowing existential gauntlet, accruing a series of other terrible losses: his friend, his wife, even his young child. At his son’s deathbed, Niels breaks down and prays to the God he has rejected; when the boy dies, Niels is left with his failure of spirit and the understanding, as Kierkegaard wrote, of “the agonizing self-contradiction of not being able to do without a confidant and not being able to have one.” It is an uncannily full and nuanced account of atheism, “not simply as an idea,” Jensen says, “but as a living, fluctuating belief.” The paralysis Niels feels at the novel’s end is the apostate’s natural condition, one Jacobsen knew intimately: that of an unwilling conversant with a deposed divinity.


Sometimes I do feel that yearning for a very wise, non-judgmental, empathetic person with whom I could discuss what my life has been about. And no, I wouldn't want to pay a therapist for listening. A loving friend. An all-understanding god, god as the supreme confidant? Sure, that’s a wonderful fantasy — a Protestant one, someone recently pointed out to me, since it’s Protestantism that emphasizes having a personal relationship with the deity.

But for me it's far from an all-consuming longing. What we do with whatever time is left is more important. And besides, as a writer I can always confide in writing — though I'm aware of the limitations and inevitable distortions. But any use of words, even talking to a loving friend, can't avoid limitations and distortions.

God as a lover, then? The typical god of the mystics? Just how explicitly erotic are we willing to get?


Thirty years ago, the art scholar Leo Steinberg published “The Sexuality of Christ in Renaissance Art and in Modern Oblivion,” a book that does much to explain the connection between Pope Francis’s passionate devotion to the poor and afflicted and his seeming openness to gay Catholics. In “The Sexuality of Christ,” Steinberg argues that as a result of the rise of the Franciscan order, around 1260, an emphasis on Christ’s nakedness, and, thus, on his humanity, joined compassion to an acceptance of the role of sexuality in human life.

A credo of the Franciscan order was nudus nudum Christum sequi (“follow naked the naked Christ”). It was a radical call to cast aside worldly wealth and belongings and acknowledge the fragile, fallen nature of all men and women. But in casting aside Christ’s garments, the Franciscans made Christ’s nude body a focal point. As a result, according to Steinberg, from about the middle of the thirteenth century until the sixteenth century artists lavished particular care on Christ’s penis, the part of Christ’s body that made him most mortal, and which proved his union with humankind. “One must recognize,” wrote Steinberg, “an ostentatio genitalium comparable to the canonic ostentatio vulnerum, the showing forth of the wounds.”

Trinity by Lucas Cranach; when I posted the painting in last week’s blog, I certainly noticed the upward slant of the loincloth, but dared not think the obvious.

“The Sexuality of Christ” has changed the way we look at certain works of art. The “modern oblivion” of Steinberg’s subtitle was just that: centuries during which the central fact of Christ’s phallus in hundreds of Renaissance paintings was overlooked, denied, and, sometimes, bowdlerized. Steinberg adduces several examples of Christ’s genitalia being painted over or touched up to make them look like a mere blur. In one case, probably in the mid- to late nineteenth century, the Alinari brothers, famous for their photographic reproductions of paintings, blackened out the Christ child’s penis in their photograph of a fifteenth-century “Madonna and Child” by Giovanni Bellini. Such censorship, Steinberg believes, was meant as distraction from an uncomfortable theological premise: “A disturbing connection of godhead with sexuality.”

To bring to the surface this suppressed artistic trend, Steinberg reproduced dozens of paintings and drawings in which Christ’s genitalia are indisputably a central thematic concern. There are paintings of the Christ child touching his penis, and of the Virgin handling the infant Christ’s penis. In some pictures, the Christ child exhibits his genitals in a style similar to Venus displaying her sex. “Again and again,” Steinberg writes, “we see the young God-man parading his nakedness, or even flaunting his sex in ways normally reserved for female enticements.”

Many representations of the Three Magi show one of the foreign kings closely inspecting the infant Christ’s genitalia. Depictions of Christ on the cross and of the dead Christ lying in the Virgin’s arms clearly portray Christ with an erection. In some images, which Steinberg calls “psychologically troubling,” the divine Father touches his Son’s penis, “a conciliation,” Steinberg writes, “which stands for the atonement, the being-at-one, of man and God. For this atonement, on which hinges the Christian hope of salvation, Northern Renaissance art found the painfully intimate metaphor of the Father’s hand on the groin of the Son, breaching a universal taboo as the fittest symbol of reconcilement.”

“The Sexuality of Christ” takes up, to put it mildly, an ultra-sensitive subject. For that reason, Steinberg stresses that Renaissance artists’ emphasis on Christ’s penis is an esthetic choice guided by deep religious belief, though he occasionally hints that Renaissance artists could at the same time have been having sly fun with the subject. And it is hard to believe that in, say, quattrocento Florence, an epoch so liberated in its sexual mores—Fra Filippo Lippi, for example, lived openly with a defrocked nun, whom he used as a model for his Madonnas—artists could resist being simultaneously worldly and pious.

For Steinberg, however, theological motives were preëminent. He held that artists used the evidence of Christ’s genitals to prove that Christ submitted to becoming human before returning to the godhead. The revelation of his penis demonstrates, as Steinberg puts it, Christ’s “humanation,” that moment of incarnation which proved Christ’s love for humankind. And the many representations of the Christ child’s circumcision are important as foretellings of his crucifixion—the blood of Christ’s penis is fulfilled in the blood from Christ’s wounds.

Entering with obvious relish the realm of Christian sexual hermeneutics, Steinberg relies on St. Augustine, who emphasized his surrender to and then escape from the “fleshpots of Carthage,” to argue that Christ’s erection was a singular way to demonstrate Christ’s chastity. Without the capacity to yield to lust, Christ’s triumph over carnal desire would have no human meaning. Unlike men after the fall of Adam, who fell victim to lust, Christ willed his erection; it was not an involuntary physiological event. By both willing and resisting it, he declared his victory over the stain of sin bequeathed to humanity by Adam and Eve, and over the death that their carnal weakness brought into the world. That, after all, is the significance of the resurrection.

To drive this point home, Steinberg had to prove that during the late Middle Ages and the Renaissance the word “resurrection” could be used as a double entendre, connoting both the divine event and the humble mortal fact of an erection. Steinberg quotes from one of Boccaccio’s fourteenth-century tales in the Decameron, in which a pious young girl inflames the desire of a monk named Rustico, causing in the latter a “resurrection of the flesh.” Steinberg notes that it was not until modern times that the original phrase was accurately translated from Italian, a censoring that he sees as analogous to the later bowdlerization of Christ’s penis in Renaissance paintings.

The vulnerable component of Steinberg’s perspective was that it was almost entirely speculative. Steinberg does quote from some sermons of the time in support of his argument concerning the centrality of the circumcision, but he builds his case mostly on logic and on physical evidence. Christ’s penis is a prominent element of countless paintings in the Renaissance. That is undeniable, and a theological explanation is the only one that made sense to him.

Michelangelo: Cristo della Minerva, in the church of Santa Maria sopra Minerva, 1519-21 (the bronze loincloth was added during the Baroque period)

The skeptical response to Steinberg’s thesis was that the attention paid to Christ’s penis was merely the consequence of Renaissance naturalism. Steinberg had a convincing set of rejoinders: No children in actual life have been known to receive powerful kings shortly after their birth while smiling benignly and proudly displaying their genitals. It is not a medical fact that dying men experience an erection in the moment of their decease. And even if the emphasis on Christ’s penis in Renaissance painting were the product of fidelity to real life, Christ was no ordinary man.

The most cogent criticism of Steinberg’s book came from Caroline Bynum, a feminist scholar. Bynum pointed out that in medieval texts Christ was often portrayed in feminine terms, and she gave as evidence paintings in which a feminized Christ appears. Steinberg conceded that Christ was sometimes portrayed as both male and female—“In one category of metaphors, the wound [in his side] is said to lactate and give birth”—but responded that this did not diminish the universal resonance of phallic imagery, nor did it lessen the impact of the other paintings he offered as evidence. Steinberg’s and Bynum’s arguments do not appear to be mutually exclusive. An androgynous Christ with a highly symbolic phallus does not seem out of the question.

Steinberg also argued that the artists he was using as examples were not illustrating preëxisting texts. They were confronted by the entirely new artistic problem, made possible by the Franciscan emphasis on Christ’s nakedness, of how to portray Christ’s naked body. In response, they created their own theology, embedded in their representations of Christ. “Renaissance art,” wrote Steinberg, “harnessed the theological impulse and developed the requisite stylistic means to attest the utter carnality of God’s humanation in Christ.” Byzantine art had to prove the divinity of Christ in the face of schisms and iconoclasms; Byzantine artists had no special use for Christ’s naked body. But the more confidently situated Catholic artists of the Renaissance celebrated Christ’s carnal humanity.

Particularly striking now is the original book’s postscript, written by a Jesuit scholar named John W. O’Malley. In the course of defending Steinberg’s thesis, O’Malley writes that the “ ‘Renaissance theology’ ” of Christ’s penis that was put forward by the artists Steinberg discusses “was severely damaged, perhaps in large part destroyed, by the bitter controversies sparked by the Reformation and Counter-Reformation.”

The Jesuit O’Malley is talking about a time when Catholicism was under such siege that the freedom of embodying Christ’s love for humanity in his naked body, a freedom fueled by Franciscan piety, vanished, giving way to polemics and proselytizing. As James Carroll vividly demonstrates in his Profile of Jorge Mario Bergoglio, it is this very lapse into militancy that the present Jesuit Pope, inspired by Franciscan piety, is determined to correct. Pope Francis could well agree with Steinberg, who lamented that the human Christ disappeared “as modern Christianity distanced itself from its mythic roots; as the person of Jesus was refined into all doctrine and message, the kerygma of a Christianity without Christ.” That, Steinberg says, was when “the exposure of Christ’s genitalia became merely impudent.”

One might add that in our own epoch the Catholic Church’s denial of Christ’s sexuality runs parallel to its denial of human sexuality, taboos that resurface in one scandal after another.

In modern times, the Catholic Church has been under siege to an unprecedented degree, as much by internal rifts and abuses as by unbelief and competing Protestant sects. In response, its doctrine and its message have become all the more abstract and inflexible; all the more a Christianity without Christ. The current Pope, by heeding the call to “follow naked the naked Christ,” seems determined to make inseparable the alliance between the naked body that lives, works, suffers, and dies, and the naked body that was created with the capacity to experience physical love. If this is so, then Pope Francis has an ally in Leo Steinberg, the displaced Russian Jew whose modernist, heretical instincts led him to the grave, beautiful, profound, and, at times, playful depiction of Christ’s sexuality.


I have long been interested in the bodily aspect of divinity. Greek gods certainly had bodies and sexuality; Yahweh did have a body, at least in the earlier books, though his sexuality remains unclear (except for the Kabbalists). But the insistence that Jesus was fully human leaves us little choice but to assume some stance about it.

As for Christ’s sexuality, it’s strange and incomplete at best: sexuality without sex. It’s telling that Jesus is shown without a partner — except for those who draw an unsurprising conclusion from the phrase “John, the Beloved Disciple,” or those who cling to the idea that Jesus was married to Mary Magdalene. In the main, there is no denying that the four gospels avoid the subject of Jesus’ sexuality, and the general impression was that Jesus lived and died a virgin. 

Insofar as Jesus is regarded as the supreme role model to be imitated, this is not a feasible model, but one bound to produce pathology. But the pathology goes back to Yahweh himself, a god without a mate, an angry father for whom no loving arms wait wherever “home” might be.

Somewhat on a tangent: Someone on Facebook argued that Jesus’ DNA was identical with Mary’s. That would make Jesus Mary’s clone, necessarily female. The Y chromosome comes only from the father. If Yahweh was the biological father of Jesus, then he somehow produced sperm that carried the penis-producing Y chromosome. The most logical solution is to suggest that Yahweh himself had genitals.


Favorite painting, and the most obvious, is Jesus with the upward-slanted loincloth. It was the first thing I noticed in the painting. How can anybody see anything else in the painting on first look?

It’s really the Quaternity because the globe is the fourth main symbol. Interesting juxtaposition of the loincloth and the cherubim.

Favorite line in blog is “…sixteenth century artists lavished particular care on Christ’s penis, the part of Christ’s body that made him most mortal, and which proved his union with humankind.”

So Michelangelo’s god’s butt fits right into that theme.


As a friend said, “Michelangelo wanted to show us the human face of God.” She meant “side” — but we had a good laugh over this “face.” Now, I still think that Michelangelo’s god is mooning Pope Sixtus, but that’s minor next to the bigger point: like the Greek gods, Yahweh has a human body. 

It’s one of the great themes of the Renaissance: the human body is good. There is no absolute chasm between the human and the divine: if we were made in god’s image, then the body is glorious.

The “globe” (globus cruciger) is officially called the “royal orb.” It’s a symbol of the king’s authority, and a Christian symbol of highest authority in general. Since the Christian god is supposed to be the “king of kings,” the orb stands for the whole world.

Normally one hand of the king would hold the orb, while the other one holds the scepter. Here instead of the scepter we get the cross with the body of Jesus.

The royal regalia are very important in this painting. Religions were born during the era of kings and emperors. They seem to thrive under monarchy (the absolute kind), and decline under democracy. Democracy has found a way to allow dissent, something that religion generally cannot tolerate.


~ “The diminutive Yezhov, who was nicknamed the Bloody Dwarf and was a true sadist, had a curved leg and a limp, while suffering from “myasthenia and neurasthenia, anemia, angina, sciatica, psoriasis, and even malnutrition” and other ailments. During the Terror, his teeth began to fall out. He drank until he lost consciousness. One of his buddies (later arrested as a Polish spy) would bring him prostitutes, while another (whom Yezhov’s NKVD also arrested) joined him in farting competitions. In one report, Yezhov claimed to have discovered numerous interlocking conspiracies: one fascist plot in the NKVD, another in the Kremlin, a Polish espionage group, several Trotskyite groups—the list of plots goes on and on until Yezhov concluded: “I have enumerated only the most important.” But the specific charges really didn’t matter, since Stalin set quotas for arrests.

Interrogation virtually always involved torture, followed either by execution or a sentence in the Gulag. Those who knew they were about to be arrested—like Politburo members—often committed suicide to avoid the interrogation. Despite the purges in the NKVD, by 1938 it grew to over a million men.

Of course, Yezhov himself was eventually arrested and replaced by the still more sadistic Lavrenty Beria. When Yezhov’s apartment was searched, it turned out he had preserved as souvenirs the bullet casings with which Zinoviev and Kamenev, two of the original Bolshevik leaders, had been shot. His one regret was that he hadn’t killed more people. He promised he would die with Stalin’s name on his lips.

The event that was used as an excuse for the Great Terror was the assassination of a popular party figure, Sergei Kirov, who was shot by one Leonid Nikolaev. Mysteriously, the guards had been withdrawn from each floor of the building where Kirov worked and his personal bodyguard was absent. Even stranger, Nikolaev had already been caught trying to sneak into the building with a revolver, and had been released! Over the years, one story after another was promulgated, each involving an ever-growing network of spies. Especially in 1937–38, literally millions of people were accused of belonging to branches of a vast conspiracy whose main purpose was killing Kirov. Conquest’s classic book on the topic, Stalin and the Kirov Murder, cautiously concludes that circumstantial evidence points to Stalin as the instigator of Kirov’s killing.

Yezhov with Stalin before his fall from grace

One of Stalin’s decrees ordered the arrest of wives of traitors, just for being their wives, and in one famous toast he promised to destroy every enemy and also “all his kin, his family.” Other decrees made being late to work punishable by a term in the Gulag and the theft of even a minute amount of grain a capital crime. Any attempt to call such punishments excessive was denounced as “rotten liberalism.”

For Stephen Kotkin, this aspect of the regime—its destruction of its own most loyal members—constitutes something unprecedented in world history, and he gropes for reasons. Even if Stalin was afraid of other officials challenging him, he could sack or transfer anyone at will. But not only did he murder them or send them to slave labor camps, but, “in a huge expenditure of state resources, had them tortured to confess . . . not to being corrupt or incompetent, but to plotting to assassinate him and restore capitalism on behalf of foreign powers.”

Kotkin rightly dismisses explanations based on Stalin’s childhood or Georgian upbringing. Stalin’s character surely made a difference, but his character was itself shaped by his experience as a revolutionary and a dictator. As Kotkin notes, absolute power not only corrupts absolutely, it also shapes absolutely. Above all, Bolshevik ideology was crucial. It taught the inevitability of maximal brutality in class warfare and treated anything less—such as refraining from torture—as an impermissible, liberal lapse. For a Bolshevik, there is no such thing as “human values,” only “class values.” Killing millions not only posed no moral dilemma for Stalin; “on the contrary, to pity class enemies would be to indulge sentiment over the laws of objective historical development.”

The best proof that terror inhered in Bolshevism itself, Kotkin observes, “was the relative ease with which Stalin could foist the bloodbath upon the political police, army, party-state, cultural elites, and indeed the entire country.” He could count on the widespread acceptance of Marxist-Leninist ideology. “It was no accident . . . that a single leader had emerged atop a single-party system that, on the basis of class analysis, denied legitimacy to political opposition.”

Yezhov after his fall from grace


~ “From his perch as a linguist eavesdropping on Soviet-backed forces in Eastern Europe, [Jeffrey Carney] knew that Washington’s portrayal of the other side was a lie. The enemy wasn’t an unstoppable juggernaut preparing to invade the West. Its combat units were barely functional. And it was the U.S. that was trying to provoke the Soviets into an incident that could lead to war.

Depressed and looking for an escape, Carney bolted for Checkpoint Charlie, the gateway to Communist East Berlin, near midnight on April 22, 1983, and asked for political asylum. It didn’t work out as planned; within hours, East German intelligence agents blackmailed him into returning to his unit as their spy. If he refused, they made clear, they’d leak his planned “defection” to his bosses.

[The “Able Archer” military exercise] “This situation could have been extremely dangerous if during the exercise—perhaps through a series of ill-timed coincidences or because of faulty intelligence—the Soviets had mis­perceived U.S. actions as preparations for a real attack.”

That was exactly what worried Carney—that one shot would lead to another, and maybe even a nuclear war. “We underestimated the Russian psyche,” Carney says. “They were institutionally paranoid. The average American would not launch a rocket and shoot a plane out of the air. But they don’t think like we do.”

As Able Archer unfolded in the summer of 1983, Soviet state-controlled radio started making announcements “several times a day” suggesting a U.S. attack was imminent, the study notes. New street signs went up in Moscow and other cities showing the locations of air raid shelters. A Soviet air force unit in Poland began carrying out drills to speed up the transfer of nuclear weapons from storage to aircraft. Some in the Ronald Reagan administration worried that the Soviets were preparing for an invasion of Europe. In response to a Western attack, Moscow’s war doctrine called for the destruction of most European cities and ports using nuclear weapons, followed by a massive ground invasion that would put Soviet troops on the Atlantic in 14 days.

“One misstep,” Reagan recalled years later, “could trigger a great war.”

Carney had no idea what he was getting into when he crossed into East Berlin in the spring of 1983. His access to some of the Pentagon’s most sensitive electronic-spying operations had driven him to reconsider his initial enthusiasm for the election of Reagan, who had dubbed the Soviet Union “an evil empire” bent on crushing the West. Newspaper reports at the time described the Russians as unstoppable. “Perhaps the first moment I realized there was a problem, a big discrepancy, was while I was waiting for the bus to go to work one day,” Carney recalls. “Stars and Stripes, the military newspaper, had an article about Soviet superiority in the European theater. I remember laughing with a friend, a Russian linguist, about the numbers and technical information cited in the report. It stood at complete odds with what we saw in our intel reports every day.”

The truth, he says, was that Communist-allied units were hampered by fuel and food shortages, alcoholism and even cholera, picked up by soldiers rotating into East Germany from the Soviet Far East. Soldiers were siphoning off brake fluid to get high. He doubted many were battle-ready. “Ronald Reagan,” Carney began to think, “was intent on making Russia an evil empire, whether it was evil enough on its own or not.”

Beginning in May 1983, Carney started looking for “important” documents to steal. The more he read, the more he was concerned about Washington’s electronic warfare programs and weapons, which could fry the Soviets’ command-and-control telecommunications. “[They] were mind-boggling in their reach and ability,” he says. “Many of them were purely offensive, and...would have only found use in a first-strike scenario.”

Later that year, Carney learned that U.S. warplanes were about to fly into Soviet airspace to simulate an attack on a sensitive military site and measure how the enemy responded. War jitters were already high with the impending deployment of U.S. Pershing ballistic missiles in West Germany. In September, the Russians shot down a Korean airliner that wandered over its missile testing area on the Kamchatka Peninsula, in the Soviet Far East. Fearing a similar result, Carney rushed to tell his Stasi East German handler what was coming.

He says another incident in particular, in the fall of 1983, drove him from an “unwilling to a very willing spy.” Since it’s still classified, he refuses to divulge it further, for fear it could land him back in prison. “It was an intentional, aggressive provocation of the Soviet Union in a very sensitive area,” he says, “that would have made [Russian radar monitors] flip out.”

He adds, “When it was explained to me, I said, ‘You’ve got to be kidding. You are going to push their buttons. People are going to be shot down.’”

But Carney has few regrets. “I regret the pain I caused people, I regret the fact that I was in a position where I didn't have the whole picture and I made decisions where I ended up hurting people,” he says. “Unintentionally, though, I think what I did—and there are hundreds and hundreds of people who did what I did, on both sides: American spies, Russian spies, German spies—all of us together made it basically impossible for a war to break out. And I think that's where the focus should be.” ~


In Poland we knew very well that the US had total military superiority. But in the US the fear-mongering went on, and the reckless militarism kept bringing the world to the brink of nuclear war.


The 1918 pandemic was unusual in that it killed many healthy 20- to 40-year-olds, including millions of World War I soldiers. In contrast, people who die of the flu are usually under five years old or over 75.

The factors underlying the virulence of the 1918 flu are still unclear. Modern-day scientists sequenced the genome of the 1918 virus from lung samples preserved from victims. However, this did not solve the mystery of why so many healthy young adults were killed.

The 1918 flu and World War I

While a mild flu circulated during the spring of 1918, the deadly strain appeared on U.S. soil on Tuesday, Aug. 27, when three Navy dockworkers at Commonwealth Pier in Boston fell ill. Within 48 hours, dozens more men were infected. Ten days later, the flu was decimating Camp Devens. A renowned pathologist from Johns Hopkins, William Welch, was brought in. He realized that “this must be some new kind of infection or plague.” Viruses, minuscule agents that can pass through fine filters, were poorly understood.

With men mobilizing for World War I, the flu spread to military installations throughout the U.S. and to the general population. 

The quest to understand the 1918 flu fueled many scientific advances, including the discovery of the influenza virus. However, the virus itself did not cause most of the deaths. Instead, a fraction of individuals infected by the virus were susceptible to pneumonia due to secondary infection by bacteria. In an era before antibiotics, pneumonia could be fatal.

Recent analyses revealed that deaths in 1918 were highest among individuals born in the years around 1889. An earlier flu pandemic emerged then, and involved a virus that was likely of a different subtype than the 1918 strain. These analyses engendered a novel hypothesis, discussed below, about the susceptibility of healthy young adults in 1918.

Exposure to an influenza virus at a young age increases resistance to a subsequent infection with the same or a similar virus. On the flip side, a person who is a child around the time of a pandemic may not be resistant to other, dissimilar viruses. Flu viruses fall into groups that are related evolutionarily. The virus that circulated when Adolfo was a baby was likely in what is called “Group 2,” whereas the 1918 virus was in “Group 1.” In fact, exposure to the “Group 2” virus as a young child may have resulted in a dysfunctional response to the “Group 1” virus in 1918, exacerbating his condition.

Support for this hypothesis was seen with the emergence of the Hong Kong flu virus in 1968. It was in “Group 2” and had severe effects on people who had been children around the time of the 1918 “Group 1” flu.

To 2018 and beyond

What causes a common recurring illness to convert to a pandemic that is massively lethal to healthy individuals? Could it happen again? Until the reason for the death of young adults in 1918 is better understood, a similar scenario could recur. Experts fear that a new pandemic, of influenza or another infectious agent, could kill millions. Bill Gates is leading the funding effort to prevent this.

Flu vaccines are generated each year by monitoring the strains circulating months before flu season. A time lag of months allows for vaccine production. Unfortunately, because the influenza virus mutates rapidly, the lag also allows for the appearance of virus variants that are poorly targeted by the vaccine. In addition, flu pandemics often arise upon virus gene reassortment. This involves the joining together of genetic material from different viruses, which can occur suddenly and unpredictably.

An influenza virus is currently killing chickens in Asia, and has recently killed humans who had contact with chickens. This virus is of a subtype that has not been known to cause pandemics. It has not yet demonstrated the ability to be transmitted from person to person. However, whether this ability will arise during ongoing virus evolution cannot be predicted.

The chicken virus is in “Group 2.” Therefore, if it went pandemic, people who were children around the time of the 1968 “Group 2” Hong Kong flu might have some protection. I was born much earlier, and “Group 1” viruses were circulating when I was a child. If the next pandemic virus is in “Group 2,” I would probably not be resistant.

It’s early days for understanding how prior exposure affects flu susceptibility, especially for people born in the last three to four decades. Since 1977, viruses of both “Group 1” and “Group 2” have been in circulation. People born since then probably developed resistance to one or the other based on their initial virus exposures. This is good news for the near future since, if either a “Group 1” or a “Group 2” virus develops pandemic potential, some people should be protected. At the same time, if you are under 40 and another pandemic is identified, more information would be needed to hazard a guess as to whether you might be susceptible or resistant.

ending on beauty:

Among twenty snowy mountains,
The only moving thing
Was the eye of the blackbird.

I was of three minds,
Like a tree
In which there are three blackbirds.

~ Wallace Stevens

Not having found any image of blackbirds that pleases me, I'm posting instead this fireball meteor seen on November 14 over Italy’s Dolomite Alps (Ollie Taylor)