Sunday, October 23, 2016


Warsaw: Lazienki Castle, First Snow


Other grandmothers knitted.
Mine only crocheted.
And exclusively hairnets.
Ever since I was a toddler,

I remember her that way,
with a little silver hook,
spiraling around and around
the nothing at the top.

Endless hairnets! 
She kept her hair short.
Even after eighty,
it was only beginning to gray.

Her hairnets were brown
or black, the yarn so fine
the hairnet hardly showed.
It was not about need.

Only now I see
it was about that spiraling
around empty space,
the eye of wisdom that opens

when you come to know
how in one moment
you can lose all, except
your own soul.

Everything else
is a ball of yarn.
It’s about the flight
of the hook.

~ Oriana © 2016

Quickly: when I use the word “soul,” I don’t mean the detachable little ghost that’s supposed to leave the body at the moment of death, a brain-free consciousness or self that continues to live on for eternity. Rather, I mean the core values, the innermost essence. But once published, the poem belongs to the reader who may read his or her meaning into this undefinable word. And that’s fine with me.

“In one moment you can lose all” — I had in mind specifically the fact of being taken to Auschwitz, losing not only your possessions (how minor that really is) but your whole former life — your profession, your social identity, your human rights — everything but that very core of yourself that you could still preserve.

For me the moment of a similar overwhelming multiple loss was leaving Poland and coming to America. Of course at the moment of leaving I didn’t yet comprehend the loss. That came later, when I saw that indeed “in the morning I had a homeland; / in the evening I had two suitcases.” What came even later, after the loss of home and family, the language and the culture, was the perception that I still had my “homeland of the mind.” Eventually I also made a home in poetry, but my first refuge was simply my intellect.

Kraków; photo: Ania Maria


~ “Last night for the third time in as many months I found myself explaining to someone raised outside of a devoutly religious environment that religious people are not stupid simply because they believe nonsensical things.

Very often they flatly disagree and insist that anyone who believes in things like demons and angels and Young Earth Creationism must be a moron. But then, like last night, they get a puzzled expression as they sit across from me and finally admit, “The thing is, you don’t seem stupid to me. So how on earth did you ever believe such things?”


The first thing you have to realize is that intelligence is compartmental. By that I mean that people who employ sharp wit and critical thinking about one area of life (or even multiple areas) can still remain almost juvenile about a number of others. One need only look at how adept many of history’s greatest thinkers were at parsing ideas related to their own fields of expertise, yet what complete disasters they were in their personal lives, because they could never wrap their heads around the intricacies of human social interaction.

To see what I mean by compartmental intelligence, look no further than Ben Carson, who distinguished himself as a pioneering brain surgeon but who displays the political acumen of a remedial third grader. Or consider another less-well-known medical example whom I’ve mentioned here before: The last Sunday School teacher I had before leaving the church is a world-class oncologist who chairs an international committee on research protocols in his medical field, but he also studies “creation science” as a hobby. He uses up-to-date, state-of-the-art treatments for fighting cancer but gets all of his geological theories from the Institute for Creation Research, which quit putting out new theories in the early 1970’s (or as some would argue, in the late Bronze Age).


Another thing you must realize is that very intelligent people will believe very nonsensical things if you get to them young enough. When you grow up in an environment which takes for granted that a system of belief is sacred, your knowledge base and your critical thinking skills grow up around that belief structure in such a way as to leave it undisturbed. In fact, an argument could be made that without the checks and balances of the scientific method, human reasoning only serves to rationalize and validate the emotional content already in place in our psyches from our earliest years. We think in order to rationalize what we already believe.


Another thing which is almost impossible to grasp if you were never devout is how deeply we were taught to distrust ourselves. The notion of sin and human brokenness is bedrock to the Christian message, and the church drove this home to us before we even learned to read and write. We learned at an early age that human reasoning cannot be trusted.
“For as the heavens are higher than the earth, so are my ways higher than your ways, and my thoughts than your thoughts.”

With a narrative like that, is it any wonder that Christians grow up suspicious of the life of the mind? We were taught to distrust our own intellects even within those subcultures which otherwise valued science, education, and exploration (I know that’s inconsistent but see point #1). We learned early on that when our powers of logic and reasoning conflict with the teachings of our faith, we should privilege “what God says” over what anyone else thinks makes sense. Who can disagree with God himself, amirite?


And finally, people who did not grow up thoroughly enveloped by a community of faith will find it difficult to appreciate how heavily the social pressure to remain faithful keeps us from freely embracing our own cognitive dissonance. I recall clearly how apprehensive I became each time I collided with my own inner skeptic, realizing how costly it would be for me if my pursuit of reality ever led me outside the Christian fold. I knew long before I finally became honest with myself that I could lose everything, and for the most part I was right. When your whole life is built around an idea, challenging that idea shakes you to the core of who you are, both psychologically and socially. For some of us, this threatens to demolish our entire world.

And that’s why we hold on to irrational beliefs long after our own critical thinking skills seem like they should have outgrown these inferior ideas. Those ideas were always privileged for us, and it’s not as easy as it sounds to shake them when they are the very house in which you live.” ~



I freak out when I remember what bizarre things I used to believe when still a Catholic. A devil perched on my left shoulder, whispering temptations to sin. Behind me, or slightly to the right, my Guardian Angel. The world full of angels and saints and demons, of course, a sky filled with ghosts (with vastly more ghosts right underfoot, in hell). And this vast world, with billions of people, was ruled by the Invisible Man in the Sky who could (and did) read every thought in everyone’s head.

I was told those things at the age of eight, and at the age of ten I still believed them. Serious doubt didn’t begin until the age of twelve or so. And only at fourteen did doubt finally win.

The only thing I could never believe was the idea that god was good. In my eyes, god was evil. He was cruel. He out-Hitlered Hitler. There was no way I could love an evil god, so I knew I was doomed to eternal damnation. I did believe in that, and to believe it was a sin against the Holy Ghost, the one sin that would not be forgiven. There seemed to be no way out.

“Suffering is good for you”; “Human reason is very weak” (i.e. “you’re too dumb to understand, so shut up”); “You are a sinner who deserves eternal damnation” — this and other harmful twaddle was the constant fare. The power of repetition. And, above all, a child’s trust that adults know better and are telling the truth. When I feel astonished that I truly believed all this nonsense and more, I have to remind myself that I was indeed a child, even if an intelligent child.

And besides, what good was intelligence? It was held to be completely inadequate — “Of course this doesn’t make sense to you; it’s a divine mystery.” Any atrocious bunch of nonsense can be defended as “divine mystery.”

A child is easily intimidated by adult “authority.” Many thoughts were forbidden, the penalty being eternal hellfire. It was an Orwellian culture obsessed with sinning “in deed, in word, and in thought.” I was especially worried about sinning in thought — Orwell’s “thought crime.”

I was also told that god chooses who will believe in him and who won’t — “Faith is a gift.” It is a gift he gives to some and not to others (who are doomed to hell). Oddly, no group seemed as likely to possess the gift of faith as old, uneducated women. But now it strikes me that it wasn’t their belief in god that was deep and impervious to doubt. It was their belief in the devil.

Alfred Stieglitz: “Going to Prayer,” 1895


But then religion is so out of kilter with reality that it can be shed more easily than more subtle kinds of indoctrination and social pressures. We may not even be aware that we harbor certain views as absolute truth.

Coming to another culture showed me this — certain things I took absolutely for granted were regarded with horror in the US. I had no idea the US public was so conservative. In my teens, if someone had told me that America is a very conservative and religious country, I would have burst out laughing. I naively thought technological progress = progressive social ideas, so the more technologically advanced a society is, the more we can expect things like paid maternity leave and free medical care for everyone (remember, I grew up with those). What an eye-opener it was.

  Some things are of course universal, like nationalism. And since the mystery we’re discussing here is how people can believe all kinds of nonsense, I remember how my mother used to remark that Hitler was the greatest buffoon in modern history, perhaps in all history. “How could people fall for this buffoon?” my mother would ask for the thousandth time, and again not even try to answer. Sometimes we just don’t have a convincing answer.

Well, he was very skilled at whipping up a purely visceral nationalistic frenzy of wanting to make Germany great again. Watch his body language:

It’s still an undeniably buffoonish performance, so the mystery remains.


The essence of Buddha’s great wisdom was pointing out that much suffering comes from delusional thinking. Now, “delusional” is a strong term, and it may be difficult for some to accept. But it’s time we understood that thought disorders are extremely common — just as one need not be certifiably insane to experience hallucinations or false memories. All it takes is the right circumstances.

Let’s say that as a child you experienced some degree of emotional insecurity — and it’s hard to meet someone who had a mostly happy and secure childhood — a “good-enough” childhood (I truly hope such people exist, and it’s just my strange luck that I don’t meet them). A school where you were never teased or bullied (or practically never — remember, we are talking about the “good-enough” childhood). Teachers who’d never stoop to demeaning and shaming you and making you feel stupid. Clergy who praised you for being a good boy or girl rather than a sinner who deserved eternal damnation. I realize there has been enormous progress toward less child abuse, but it’s still awfully common to have grown up in the “I’m not OK, you’re not OK” mode.

Sooner or later something bad is bound to happen — “shit happens” is the most succinct translation of the First Noble Truth — and we are required to cope with adversity, aka finding ourselves deep in doo-doo. It’s rarely our own fault, pure and simple. There are circumstances. There is other people’s doo-doo. But cope we must.

One way, alas, is by falling into delusional thinking that builds on the early patterns of self-loathing and a sense of abandonment. Now, both “I am a total failure in life” and “I had to do it all by myself; no one ever helped me” are outrageously false beliefs easily contradicted if you only stop and think and remember — astounding, all it really takes is remembering — the gazillion times when you did succeed and the innumerable instances when you did receive help from someone or from numerous others — from the whole society, in fact — but oddly enough, those memories are blocked. Anger, hate, resentment, depression — it’s incredibly easy to start riding the automatic spiral, and not see the thought disorders and memory disorders underlying the suffering. Life-changing insight may come only when half or more of our life is over.

Or it may never come. But if it does, it should teach us patience with those who aren’t there yet — and also the humility of knowing that though we’re now more enlightened about X or Y, we still harbor all kinds of false beliefs, despite being intelligent and educated. It’s simply the human condition.

The Devil and a Woman, stained glass, before 1248, from Sainte Chapelle, now at Cluny


~ “A new paper by philosopher Neil Van Leeuwen [suggests] that factual belief isn't the same as religious belief.

Behind the common word "belief" is something like this:

Devon (factually) believes that humans evolved from earlier primates over 100,000 years ago.

Devon (religiously) believes that humans were created less than 10,000 years ago.

Factual beliefs seem to influence the way we act and think in pretty much all contexts, whereas religious beliefs have a more circumscribed scope. Even when engaged in pretend play, for example, children know that they shouldn't really bite the Play-Doh "cookie"; the factual belief that Play-Doh isn't food infiltrates imaginative play. And even when imagining an improbable scenario, like having a pet cat on Pluto, factual beliefs will typically guide the inferences one draws — for instance, that the Plutonian cat needs food to survive. These findings suggest that factual beliefs have a wide-ranging influence on cognition and behavior.

Not so when it comes to religious beliefs. One study, for example, found that members of the Vezo tribe in Madagascar endorsed some aspects of life after death in a ritual context but not in a naturalistic context. Another study found that even people who explicitly endorsed an omnipotent, omnipresent and omniscient God didn't think about God in these terms (for instance, as capable of being in more than one place at once) when engaged in imaginative storytelling. These findings suggest that religious beliefs govern how we think and act in appropriately "religious" contexts but not necessarily beyond.

A second reason to differentiate factual and religious belief comes from how these beliefs respond (or don't respond!) to evidence. Van Leeuwen provides a nice example: At the end of the last century, many people (factually) believed there was a "Y2K problem." Due to the way dates were handled by digital computers, people worried that computer systems would go wonky on and after Jan. 1, 2000. However, nothing much happened and, in the face of this evidence, people stopped believing there was a serious Y2K problem.
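
[A side note from me, not part of the quoted article: the Y2K worry came from programs that stored years as two digits to save memory. Below is a minimal sketch, in Python, of how that convention breaks at the 1999-to-2000 rollover; the function name is hypothetical, purely for illustration.]

    # Many legacy systems stored only the last two digits of a year,
    # so simple date arithmetic fails once the century turns over.
    def age_from_two_digit_years(birth_yy, current_yy):
        """Compute an age the way a two-digit-year system would."""
        return current_yy - birth_yy

    # Someone born in 1960, checked in 1999: 99 - 60 = 39. Correct.
    print(age_from_two_digit_years(60, 99))   # 39

    # The same check in 2000, with the year stored as "00": 0 - 60 = -60. Wrong.
    print(age_from_two_digit_years(60, 0))    # -60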

Now consider a superficially similar religious belief: A doomsday cult's prediction that the world will end on some particular date. Many such dates have come and gone, without an ensuing rejection of the beliefs that generated the prediction. These doomsday beliefs were held religiously, not factually; they were — as a result — relatively immune to evidence.

In these respects (and others that Van Leeuwen describes), religious beliefs are more like fictional imaginings than like factual beliefs. We can imagine that the Play-Doh is a cookie without having this imagining infiltrate our thoughts and actions in all contexts — and we can imagine that the Play-Doh is a cookie in the face of evidence to the contrary.

Like fiction or imaginative play, religious beliefs may persist alongside factual beliefs precisely because they operate within restricted contexts and aren't firmly tethered to evidence. An important difference, however, is in the contexts that fictions and religion typically govern.

"How can something so serious as religion," asks Van Leeuwen, "be rooted in the same capacity that yields something as frivolous as fiction?"

His answer, of course, is that fiction needn’t be frivolous: "Humans, in fact, take many fictions incredibly seriously." Still, it doesn't follow that it's rational to entertain any religious beliefs, even if human psychology provides a suite of mechanisms for doing so.

Van Leeuwen's paper can help us make sense of how people hold seemingly contradictory factual and religious beliefs — a very real phenomenon that's been of interest to psychologists.

"I think there are two main messages. The first is an encouragement in the direction of self-knowledge. What psychological state is actually going on in your mind when you say (for example) 'I believe in God, the Father almighty ... '? If it's credence as opposed to factual belief, as I think and as the word 'creed' suggests, then perhaps you have no business pushing it on someone else as if it were a factual belief — no matter how much it may do for you personally. So I think that self-knowledge can yield a certain amount of humility and restraint. This paper, I hope, can facilitate self-knowledge."

"Second, I think another important message is that people with different religions from your own (if you have a religion) may not be as crazy as you think. Having a credence that (say) the ancestors are alive and watching is very different from having a factual belief that the ancestors are alive and watching. It could be that the former isn't crazy, even if the latter would be. So I think that grasping this psychological distinction could foster a healthier level of understanding and curiosity toward others.”

The Maoris believed this was the entrance to the Underworld: Cape Reinga, New Zealand


This reminded me of the famous poem by Thomas Hardy about the belief that on Christmas Eve cows and sheep kneel at midnight (this belief is also expressed in one of the Polish Christmas carols) — and the poet’s refusal to actually go to the barn and check — because of “hoping it might be so” while knowing deep down that it isn’t. Van Leeuwen proposed that religious beliefs are not literal but rather “literary” — closer to fiction and imagination. Karen Armstrong also suggested that religion is not literal but mythological and metaphorical.


The problem, however, seems to be a lot of confusion as to which beliefs are factual and which are “merely” (if that’s the word) religious. Yes, many people are able to compartmentalize religion and hold their beliefs only loosely and chiefly for one hour on Sunday. But there are those who seem genuinely convinced that angels and devils exist and can help or hurt us, that miracles violating the laws of nature happen all the time, that the dead continue to exist in the sky, and so on. There are those who at least seem to believe all this as firmly as they believe that the earth is round.

But at least in the West we are past the point of burning alive those who doubt those various archaic beliefs, and it does appear that religious beliefs are increasingly more loosely held and more confined to ritual occasions.

Perhaps the most interesting part of the article is the point about “credence” not being bound to evidence — thus doomsday dates come and go, but those who were preparing to be “raptured” just shrug off the non-fulfillment of prophecy and stand ready for next time.

I suspect we need to study in more detail how the brain functions in terms of imagination, fiction, false memories, and acceptance of various degrees of “reality.” There is no denying that children only pretend to eat Play-Doh cookies. But Catholics are supposed to believe that the wafer (or a piece of cracker) becomes literally the body of Jesus. In past centuries, people killed and died for that belief. It was “factual” then — is it merely “religious” now?

The human brain seeks survival, not truth, so it's easy to see the hand of evolution here. Myths can serve survival, especially the collective survival. And then there is wishful thinking, so hard to resist! Sometimes I wonder how science ever emerged, given our bias to believe whatever makes us happy.

“My mother was watching me from heaven!” someone who has just narrowly escaped an accident may exclaim. Yet the same person may claim to have left religion a long time ago, and may in fact not be a churchgoer. But are you going to needle him, “So, does your mother really watch over you from heaven?” That would be unkind. We understand that he adopts the belief about his mother in heaven in times of emotional need.

Even more interesting is to look at "religious professionals": how much do ministers, rabbis, priests, monks and nuns REALLY believe? Already in my early teens I strongly suspected that some priests were non-believers. Not that they were jolly about it; they looked tortured, depressed. There was an occasional jolly fat priest, but most priests looked seriously unhappy. In part it may have been celibacy. I remember a sad monk in a TV documentary; he said that every day he thinks what it would have been like if he'd gotten married and had a family life. "I hope god is pleased with my sacrifice," he finished. I felt so sorry for him: he sacrificed sexual and emotional/family fulfillment to worship a fictional character.

And those doubt-filled letters of Mother Teresa, what an eye-opener! Apparently as a young nun she really did expect Jesus to come to her cell as a bridegroom . . . and later was forever bitter “because I don't have Him, so it all means nothing.” How revealing that it wasn’t quite to help people that she did her good work, but to have a special relationship with the imaginary Beloved . . . She (now officially a saint in spite of those letters) admitted that she never sensed the presence of god.

And then there is the fact that occasional hallucinations are a perfectly natural phenomenon among people who are not mentally ill. It just takes special circumstances — prolonged fasting, for instance, or extreme danger. It seems that Mother Teresa heard a voice telling her to “serve the poorest of the poor” during an illness when she was running a high fever. Apparently she craved more such “mystical experiences” — but that opens up another huge chapter.

Can we make a general claim that people understand the difference between religious beliefs and factual beliefs? Not with any confidence. But it’s probably a step in the right direction to suggest that beliefs fall into those two (or several) categories.



“We must here make a clear distinction between belief and faith, because, in general practice, belief has come to mean a state of mind which is almost the opposite of faith. Belief, as I use the word here, is the insistence that the truth is what one would “lief” or wish it to be. The believer will open his mind to the truth on the condition that it fits in with his preconceived ideas and wishes. Faith, on the other hand, is an unreserved opening of the mind to the truth, whatever it may turn out to be. Faith has no preconceptions; it is a plunge into the unknown. Belief clings, but faith lets go. In this sense of the word, faith is the essential virtue of science, and likewise of any religion that is not self-deception.” ~ Alan Watts

I’ve just rediscovered this thought-provoking statement. It’s interesting that Watts sees faith and belief as almost opposites. Belief is akin to having a closed mind. Faith, according to Watts, is open-mindedness.

In common usage there is no such opposition between faith and belief. In fact there isn’t even a “clear distinction” between the two words. Yet obviously there are different shades of meaning, and those differences can be significant. When a person says “I believe in kindness” it’s not a factual belief like “the earth is round,” nor a religious belief like “Jesus died for our sins.” (I find it fascinating that Watts traced the etymology of “belief” to “lief,” related to wishing or desiring; to him a belief [I think he means mainly religious beliefs] is a type of wishful thinking).

Faith seems to be a broader term, and is closer to “trust.” I was raised in large part by an Auschwitz survivor (my grandmother), and yet, like Anne Frank, I have faith that most people are good at heart. I have been mocked for it, called naive, overly optimistic, and “rather silly.” But in spite of having experienced my share of cruelty and deception, and in spite of having, through my grandmother’s eyes, stared into an abyss of enormous evil, I still find that *most* people are good and even altruistic, glad to help others if they can. Likewise I have faith in some other conceptions about reality that I have reached over the years, though I realize that they are not absolute and keep on evolving.

My special challenge has been developing the faith (trust) that no matter what happens, I will be able to cope with it somehow. It has taken me a long time and many life experiences to come to trust in my ability to cope. Still trembling a bit, I think that I have enough intelligence, emotional strength, accumulated wisdom (“This too shall pass” is priceless), and other resources to be able to cope rather than fall apart under stress, come what may.

This kind of “faith in oneself” may sound pretty obvious, even trivial, to someone who’s always had high self-esteem. But many women know what it’s like to be put down and devalued, to be made to feel incompetent and inadequate; those women (and some men, but women in particular) will understand that gaining faith in your ability to cope can be an achievement.

I believe in hard work; I believe in studying things in depth; I believe that “you get what you pay for” in more ways than one. I believe in treating others as I myself would like to be treated. I believe in forgiving rather than trying to take vengeance. I believe in moving on rather than holding grudges.

I also have faith in “negative capability.” I believe in waiting for clarity to arrive “in its own sweet time” (i.e. “ripeness is all”) rather than rushing for an answer; I believe the cognitive unconscious has the capacity to produce amazing solutions. Perhaps “I have faith in” would be a more accurate expression. My long experience with the creative process has taught me to trust my unconscious.

I also have faith in the collective human genius and the collective human goodness, a dominant tendency to cooperate rather than take pleasure in inflicting harm. When Bernie Sanders defined his spirituality as acting from the knowledge that “we’re all in this together,” that was an example of this faith (trust) in human solidarity — also called humanism. Once we fully grasp the fact that “we’re all in this together,” we see the need to work together, to help one another.

But perhaps we’re getting too caught up in words here. What matters is not how precisely we define the difference between “belief” and “faith,” or even what we believe and/or have faith in, but how we act.

William Blake: Behemoth


    Strange clouds forming above the Bermuda Triangle could explain why dozens of ships and planes have mysteriously vanished in the notorious patch of sea.
    Using radar satellite imagery, [meteorologists] discovered bizarre “hexagonal”-shaped clouds between 20 and 50 miles wide forming over the dodgy patch of water.
     The blasts of air are so powerful, they can reach 170 mph — a hurricane-like force easily capable of sinking ships and downing planes.

Will it convince the conspiracy nuts that something supernatural isn’t at play here? I doubt it. They’re impervious to evidence. As the saying goes, you can’t reason someone out of something they were never reasoned into in the first place.


~ “The habit of always saying “please” and “thank you” first began to take hold during the commercial revolution of the sixteenth and seventeenth centuries — among those very middle classes who were largely responsible for it. It is the language of bureaus, shops, and offices, and over the course of the last five hundred years it has spread across the world along with them. It is also merely one token of a much larger philosophy, a set of assumptions of what humans are and what they owe one another, that have by now become so deeply ingrained that we cannot see them.

The English “please” is short for “if you please,” “if it pleases you to do this” — it is the same in most European languages (French s’il vous plait, Spanish por favor). Its literal meaning is “you are under no obligation to do this.” “Hand me the salt. Not that I am saying that you have to!” This is not true; there is a social obligation, and it would be almost impossible not to comply. But etiquette largely consists of the exchange of polite fictions (to use less polite language, lies). When you ask someone to pass the salt, you are also giving them an order; by attaching the word “please,” you are saying that it is not an order. But, in fact, it is.

In English, “thank you” derives from “think”; it originally meant “I will remember what you did for me” — which is usually not true either — but in other languages (the Portuguese obrigado is a good example) the standard term follows the form of the English “much obliged” — it actually does mean “I am in your debt.” The French merci is even more graphic: it derives from “mercy,” as in begging for mercy; by saying it you are symbolically placing yourself in your benefactor’s power — since a debtor is, after all, a criminal. Saying “you’re welcome,” or “it’s nothing” (French de rien, Spanish de nada) — the latter has at least the advantage of often being literally true — is a way of reassuring the one to whom one has passed the salt that you are not actually inscribing a debit in your imaginary moral account book. So is saying “my pleasure” — you are saying, “No, actually, it’s a credit, not a debit — you did me a favor because in asking me to pass the salt, you gave me the opportunity to do something I found rewarding in itself!” ~

Debbie Milma: Please, 1993


Fascinating. Saying “please” and “thank you” is something we take for granted, unaware that such “good manners” didn’t exist until relatively recently in human history. You didn't need to thank a slave. Relatively speaking, we live in an era of emphasis on human dignity.

Someone pointed out to me that Southerners cultivated exquisite manners toward their white peers. And Hitler was known for “beautiful manners” towards women — his secretaries, for instance, who were notoriously in love with him. This almost makes me want to say, “Beware of people with beautiful manners — they may be compensating for being complete bastards toward SOME human beings.” Beautiful manners and rank prejudice — not uncommon. And in my unfortunate experience I’ve found that great charm can go together with utter cruelty. Of course in most cases this need not be true. Ideally, we should have beautiful manners when dealing with anyone.

ending on beauty:

“She smelled the way the Taj Mahal looks by moonlight.” ~ Raymond Chandler, The Little Sister

Sunday, October 16, 2016


Mt. Whitney near the summit


Near an exit to Death Valley,
next to a rusty two-pump gas station,
there used to stand a shack
with a faded sign:
Still Life Café.
Only clumps of sage brush,
a Joshua tree like a broken candelabra.
We passed it every summer
on the way to Whitney Portal.

I could imagine only too well
the still life inside:
a beer-sticky formica counter,
the sticky plastic tablecloth
smeared with a sticky rag.
A fan frantically whirrs,
moving the hot air around;
one sluggish fly, a few locals
sticky with beer and sweat
under a half-gone neon
of Miller’s Highlife,
and non-stop country-western
songs all whining into one —
“God may forgive you, but I won’t.”

The last time, in dead August heat,
we were going to Whitney Portal
to celebrate my mother’s
eighty-seventh birthday.
She could walk only slowly;
it’d been ten years since she hiked
to the top of Mt. Whitney.
Yet she insisted on hiking on her birthday,
fragile and lovely like a dying orchid.

She went with us part of the steep trail.
Around, sheer walls of granite,
pale beige or rosy with streaks of greenish gray,
or burning gold in the setting sun.
And the cascades, shivers of white
against the shiny skirt of rock;
the huge coins of eroded stone
above the deep green of fir and pine.
Near the streams, the tender monkey flowers,
wild rose and Indian paintbrush,
blue borage and tall purple candles
of lupine, the regal wolf flower.

She had to stop and rest many times.
Told me again about the thunderstorm
at the summit that could have killed everyone.
Exclaimed more than once,
“The high mountains. Just smell the air!
The high mountains make me feel alive.”


A New Age friend told me of her near-death
trip out of the body, floating among the planets
and the stars. “There were colored lights
and music, and galaxies like swirling neon.”
God was like the sun, she said,
only brighter. “He told me, ‘Go
back!’ I felt angry, so very angry.
Who’d want to go back?”

I asked her, “Are there trees there?”
She glanced at me as though rudely
interrupted in her ecstasy. “Oh no.
Nothing like trees.” I thought,
if there are no trees, I’m not interested.

And if my mother had had a choice:
an afterlife floating around
the galaxies, admiring the colored lights,
hearing the music of the spheres,
or waiting for a long time,
centuries perhaps — in August,
at the Still Life Café,
temperature one-hundred-and-five,
hoping for a campsite at Whitney Portal,
I had no doubt what she’d choose —
knowing the granite that rises there,
immense and nearly vertical,
a cascade of light.

~ Oriana © 2016

(another view from Mt Whitney trail near the summit; both photos were taken by my mother)

How strange it feels to be looking at a poem in which my mother is still alive and the Still Life Cafe is in its original forlorn location rather than in Lone Pine, gentrified and meaningless.

My mother would indeed choose to wait for a campsite at Whitney Portal because she knew what was important, and it wasn’t an imaginary afterlife amid the swirling galaxies. The galaxies are of course fascinating to ponder, but to us humans at this stage, only one planet is important. It’s interesting that there is a book by Nancy Abrams, A God That Could Be Real, which argues that the only god that could be of interest to us humans is not cosmic (so much for “cosmic consciousness” as the ultimate in spiritual chic), but planetary.

We don’t want heaven, we want life, this life, just more beautiful and more loving.

(Abrams considers god real as an “emergent phenomenon.” Here is one explanation of emergence: “Cells have individual life, but when billions are gathered together in a certain form, what emerges is greater than the sum of the parts: it is (or can be) a human being. Humans themselves have individual life, but when millions focus their efforts in certain ways, other realities emerge. One might be called ‘the stock market,’ which exists and has definite rules and characteristics. Another is ‘the media,’ and so on.” ~ from Amazon

Bird migration is an example of an emergent phenomenon. In terms of religion, god didn’t create us; we created god as part of our collective brain function. According to Abrams, that man-created god could be just as real as bird migration.

This is my first shameless digression in a long time.)
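
To linger on that digression a moment: since emergence carries so much weight in Abrams’s argument, here is a minimal sketch of the idea in code (my illustration, not anything from her book). In Conway’s Game of Life, every cell follows one purely local rule, yet a “glider,” a shape that travels across the grid, emerges; nothing in the rule mentions gliders or movement:

    from collections import Counter

    def step(live):
        """One Life generation: a dead cell with exactly 3 live
        neighbors is born; a live cell with 2 or 3 survives."""
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {cell for cell, n in counts.items()
                if n == 3 or (n == 2 and cell in live)}

    # The classic glider: after 4 generations the same five-cell shape
    # reappears, shifted one cell diagonally; movement that no single
    # cell "knows" about.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    cells = glider
    for _ in range(4):
        cells = step(cells)
    print(cells == {(x + 1, y + 1) for (x, y) in glider})  # True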

Back to what is important: to make wise use of what little time remains. “That’s not important,” my mother would say countless times during her last years. She wanted “what wakefulness remains” reserved for the essence. That included the daily walks where she could look at trees, dogs, children, squirrels. A bird hopping on the pavement was important. The sale at Sears was not important. Neither was meeting the tax deadline, even if the IRS seemed to differ.

 My mother on her 75th birthday.

What, then, IS important? The answer depends on the person and on the stage of life. Right now, amid medical difficulties, holding on to the bliss of slow reading and slow writing has become primary. “Harvesting” my poems and bringing them to perfection is important, building on my strengths rather than striking out in new directions as I did in my twenties and early thirties.

What else is important? Beauty and tenderness, but much has been written about those. So let me repeat: slow reading and slow writing. That’s how I become more my central self. Though this is not yet old age (but will it ever be? doesn’t it start only at ninety?), I identify with what May Sarton (Journal of a Solitude) says . . .


~ “When Adolf Hitler turned 30, in 1919, his life was more than half over, yet he had made not the slightest mark on the world. He had no close friends and was probably still a virgin. As a young man, he had dreamed of being a painter or an architect, but he was rejected twice from Vienna’s Academy of Fine Arts. He had never held a job; during his years in the Austrian capital before World War I, he survived by peddling his paintings and postcards, and was sometimes homeless. When war broke out in 1914, he entered the German Army as a private, and when the war ended four years later, he was still a private. He was never promoted, the regimental adjutant explained, because he “lacked leadership qualities.”

Yet within a few years, large crowds of Nazi supporters would be hailing this anonymous failure as their Führer. At 43, Hitler became the chancellor of Germany, and by 52 he could claim to be the most powerful man in the history of Europe, with an empire that spanned the continent. In the sheer unlikely speed of his rise — and then of his catastrophic fall — Hitler was a phenomenon with few precedents in world history.

Hitler cries out for explanation, and perhaps always will, because even when we know all the facts, his story remains incredible, unacceptable. How could so insignificant a man have become so potent a force for evil? How could the world have allowed it to happen? And always, the unspoken fear: Could it happen again?

Historian and journalist Volker Ullrich sees his subject as a consummate political tactician, and still more important, as a gifted actor, able to show each of his audiences — from the rowdies at mass meetings in beer halls to the elites in the salons of rich industrialists — the leader it wanted to see.

Like most biographers of Hitler, Ullrich passes quickly over his subject’s early years, which are little documented, in part because one of his last orders before his suicide in 1945 was for all his private papers to be burned. The story of Hitler’s public life doesn’t really begin until 1919, when he emerged in Munich as a far-right agitator, one of many who capitalized on the chaos in Germany created by the world war and a short-lived leftist revolution in Bavaria.

By 1923, his National Socialist German Workers’ Party had grown bold enough to try to overthrow the provincial government, in what became known as the Beer Hall Putsch. The coup failed, however, and after a short stint in jail, Hitler decided it would be easier to destroy the deeply unpopular Weimar Republic by legal means. He maneuvered ruthlessly toward this goal, aided by widespread despair over hyperinflation and then the Great Depression, until his triumphant elevation to the chancellorship. Notably, the Nazis never won a majority of the vote in any free election. Hitler came to power because other, more respectable politicians thought they would be able to control him.

Once in office, Hitler quickly proved them wrong. With dizzying speed, he banned and imprisoned political opponents, had his party rivals murdered, overrode the constitution and made himself the center of a cult of personality to rival Stalin’s. These moves did not dent Hitler’s popularity. On the contrary, after years of internecine ideological warfare, the German people went wild with enthusiasm for a man who claimed to be above politics. The fact that he hated Jews with a demented passion only added to his popularity in a deeply anti-Semitic society.

Hitler was a man who evacuated his inner self, as much as possible, in order to become a vessel for history and what he believed to be the people’s will. On a podium, he could mesmerize huge crowds with his rhetoric about Germany’s destiny. But everything we learn from Ullrich about Hitler’s personal life — what he ate for breakfast (cookies and chocolate), how he bored his guests with endless monologues, even his clandestine love affair with Eva Braun — is commonplace. He was himself conscious, on some level, that he was a thoroughly undistinguished person. When in the company of intellectuals or aristocrats, what Ullrich calls his “inferiority complex” was inflamed, and he grew fidgety and irritable.

Hitler’s mediocrity is all the more noticeable in this book because Ullrich strives not to mythologize his subject, knowing how many myths are already in circulation. There is a tendency, in stories about Hitler, to try to locate the magic key that explains him. Thus people sometimes say that he hated Jews because a Jewish doctor failed to save his mother from cancer, or that he was sexually neurotic because he was missing part of his genitals. Ullrich summarily dismisses both of these legends, noting that Hitler actually had a good relationship with his mother’s doctor, and that records of his medical examinations reveal no physical abnormality.

More important, Ullrich is consistently skeptical of the myths Hitler tried to create about himself. Much of the evidence we possess about the early life comes from the stories he told, and from the tendentious propaganda of “Mein Kampf.” These were designed to further Hitler’s image as a man of destiny, which meant that they were highly melodramatic. For instance, in 1939, while visiting the Bayreuth Festival, Hitler remarked that it was seeing a performance of Wagner’s opera “Rienzi” as a teenager that first gave him a sense of his heroic destiny: “That was the hour everything started.” Ullrich chalks this story up to “Hitler’s need for exaggerated self-importance.”

Yet he doesn’t deny that Wagnerian opera had a profound influence on the young Hitler’s view of the world. In fact, the strange thing about Hitler is not that he imagined himself as the leading figure in a historic drama — many people have such grandiose fantasies — but that life ended up vindicating him. It might have taken a world war, the Great Depression and other calamities to prepare the way, but in the end Germany decided to see Hitler just as he saw himself; the country matched his psychosis with its own. What is truly frightening, and monitory, in Ullrich’s book is not that a Hitler could exist, but that so many people seemed to be secretly waiting for him.” ~

Oriana: In lieu of a commentary, let me simply quote the last sentence: “What is truly frightening is not that a Hitler could exist, but that so many people seemed to be secretly waiting for him.”


Part of the charisma of Communism was the idea of its own historical necessity. Its victory was assured and only a matter of time; the progress of history was the writing on the wall. Milosz derives the concept of historical necessity from Christianity, but I think it started with ancient Judaism (and Milosz too actually starts with the conquest of Canaan). The world had a beginning, and the world will have an end, followed by the Last Judgment. History is the unfolding of god’s justice.

Tiger (Tygrys) was the nickname of Tadeusz Kroński, who in a 1949 letter to Milosz wrote this notorious passage: “Just because the majority [of the Polish population] is against us, we are to give up a great historical opportunity? . . . With the butts of Soviet rifles we’ll teach the people of this country how to think rationally, without alienation.” And thus he later became infamous as the man who wanted to use Soviet rifles to teach people how to think rationally.

However, he also said, “Anyone who crosses himself in public crucifies Christ. I also cross myself, but only when no one can see me. (Please keep this a secret. I can say it only to my closest friends.)” (But I don’t know whether he ever held the view that Christ was the first Communist.)

In “Native Realm,” Milosz writes: ~ “Powerless Europe in 1948 had already been described in the Book of Joshua. The inhabitants of Canaan trembled when the Israelites arrived on the Jordan because they knew that the Lord had delivered Canaan over to the newcomers and that nothing could resist His will. At the sound of the Israelite trumpets, dismay filled the hearts of Jericho’s defenders. Now the trumpet of Communism resounded so loudly in Paris that the more discerning were convinced that to resist the verdict of historical Providence would be futile.

The citizens of the declining Roman Empire . . . felt weak in the face of Christian fanatics announcing the good news of the Last Judgment. Thus when Tiger spoke of “Christians,” it was understood he meant Communists. The allegory is justified insofar as the idea of inevitable progress or of a hidden force behind the scenes — implacable toward all who disobeyed the Teacher’s commands — took its origins from Christianity: without Christianity, after all, there would have been no Hegel or Marx.

. . . Tiger, of course, adored Hypatia, the last pagan philosopher of Alexandria, not the dirty, terrifying mob of Christians who tore her apart. And yet, he said, the future did not belong to Hypatia but to the Christians.” ~


Milosz, however, saw a countervailing force. Aside from the rise of the Soviet Empire, he’d also witnessed something that he considered equally important: the “Americanization” of Europe and, more slowly, the world. While fanatical Communism kept losing its charisma, Americanization was marching on. It’s been widely equated with “modernization.” It’s only now that we see another charismatic movement make gains against Americanization, and that is of course militant Islam. Countries such as Iran, once quite Westernized, have become theocracies. The triumphalist mood after the collapse of the Soviet Union has given way to a wide perception of vulnerability and decline. I haven’t heard the phrase “historical necessity” for a very long time. Let’s hope that the phantom of historical necessity will never again haunt the world.

To be sure, apocalyptic gurus still abound . . . Or simply those who are a variation on “Gott mit uns” of the German belt buckles.

Georg Wilhelm Friedrich Hegel


Odd: Communism lasted such a short time, while the waiting for the First and Second Coming, that ultimate historical necessity, continues. The chronic failure of the prophecy of the end of the world does not seem to inspire much skepticism about its veracity. It’s easily explained away: so, there’s been a delay . . .  As for all the New Testament statements about how imminent this end was supposed to be, surely we can find a metaphorical meaning: the world will end for each of us, won’t it?

Wait, but what about the graves opening and so on? That was surely understood in literal terms? Some apologists like Karen Armstrong have put forth a bizarre notion that during the Middle Ages, for instance, people understood religion as pure allegory, and only we moderns have become literal — that’s why we find religious dogma bizarre. I say that, on the contrary, it’s mostly we moderns who have the sophistication to engage in metaphorical interpretation. Centuries ago, it took an incredibly exceptional mind to be radical enough to see mere metaphor. Even a great mind like St. Augustine took the story of Adam and Eve absolutely literally.

Communism versus Christianity is an interesting contrast between the fizzle of reality and the power of fiction. For a moment I was tempted to conclude that the fall of communism presages the fall of Christianity, but obviously if something is fiction, then nothing needs to be delivered, and the promise of the inevitable — the historical necessity of trumpets, skeletons stepping out and putting on flesh, the rest — can continue for another thousand years (though some don’t expect humanity to survive more than a hundred more years).

Last Judgment, stained glass, Cluny


~ “During World War I, Zurich, the largest city in neutral Switzerland, was a refuge for artists, writers, intellectuals, pacifists, and dodgers of military service from various countries. A handful of these decided in 1916 to create a new kind of evening entertainment. They called it Cabaret Voltaire and established it at Spiegelgasse 1, not far from the room that was occupied by an occasional visitor to the cabaret, Vladimir Ilyich Lenin.

The initiator of the group appears to have been Hugo Ball. He was, like most Dadaists, a writer but had also worked in the theater and performed in cabarets. After having to leave Germany as a pacifist, he settled with Emmy Hennings in Zurich where, pale, tall, gaunt, and near starving, he was regarded as a dangerous foreigner. At the Voltaire, he declaimed his groundbreaking phonetic poem “Karawane” (Caravan)—written in nonsensical sounds—to the bewilderment of the public. After a few intense months of Dada activity he left the group, turned to a gnostic Catholicism, and died in the Swiss countryside, regarded as a kind of saint. His diary Die Flucht aus der Zeit (The Flight from Time) remains one of the principal accounts of Dadaism.

Among the artists of stature who emerged from Dada, Hans Arp was perhaps the steadiest and most consistent. A friend of Max Ernst, Kurt Schwitters, and Wassily Kandinsky, and a gifted poet, he was devoid of malice and envy, and had a superior sense of humor. His later spouse Sophie Taeuber, a notable artist herself, taught at the Applied Arts School in Zurich. She created marionettes and was a member of Rudolf von Laban’s dancing school, which had introduced a new expressive style of dance. During her Dada appearances as a dancer she wore a mask to disguise her identity.

In Tristan Tzara, calm and self-assured yet with a thunderous voice, Dadaism had its most passionate advocate and most tireless propagandist. André Breton called him an impostor avid for publicity but reconciled with him in 1929. Tzara’s poems influenced Allen Ginsberg and William Burroughs, and a few of them were translated by Samuel Beckett. Like Arp, he subsequently became a Surrealist.

Emmy Hennings, before living with Hugo Ball, had been an alluring drifter. Diseuse, actress, barmaid, and model, she became a femme fatale for more than a few German poets. She was a gifted cabaret performer who sang “Hab keinen Charakter, hab nur Hunger” (Devoid of Character, I’m Just Hungry). An important presence at the Dada events, “her couplets,” according to Huelsenbeck, “saved our lives.”

Soon, there was also Walter Serner, a cynic and anarchist who, as a writer, would become notorious for his thrillers and scandalous novels. Tristan Tzara called him “a megalomaniac outsider.” This was a time when dandies wore monocles. Serner wore one, and so did some of his Dadaist colleagues. He rebelled against society by being a high-class confidence man, producing a juridical thesis of which 80 percent later turned out to be plagiarized. Writing under the name of his painter friend Christian Schad, he reviewed a collection of his own stories. He also enjoyed feeding the press false information. His essay “Letzte Lockerung” (Ultimate Loosening) is for some a Dada classic.

From the near improvisation of the first events at the Cabaret Voltaire, one of the most influential avant-garde movements of the century emerged. The word “Dada” was introduced only a couple of months later. There are several explanations for it: the babble of a child, the word for a toy, the double “yes” in Slavic languages and Romanian, and Dada lily milk soap and hair tonic, which was first produced in 1912.

Dada was a joint achievement of the group. Its soirées were multimedia events: they combined words and literature, singing, music (with Ball at the piano), dance, art, farce, and a fair amount of noise. “Repelled by the butcheries of the world war of 1914, we surrendered to the arts,” said Hans Arp. “We looked for an elemental art that would free the people from the insanity of the times, and for a new order that might establish a balance between heaven and hell.” “What we celebrated was a buffonade and a requiem mass at the same time.”

In Paris, Tzara created a stir with his Dada manifesto of 1918 as well as with his electrifying presence. There, Erik Satie, another major composer, was a Dada sympathizer, and the literary ground for Dada had been prepared by the poet Guillaume Apollinaire. In 1920, Breton and a number of writers and artists who later became Surrealists joined Tzara, but in 1922 the Dadaists officially fell out with one another—according to Theo van Doesburg, over the question of whether a locomotive was more modern than a bowler hat.

The profound difference between Dada and Surrealism was that the Surrealists had a program and a dogmatic leader (Breton) while Dada was freewheeling and steeped in ambiguity. It was everything as well as nothing. Nevertheless, each of its branches had a different character. Berlin Dada, with Huelsenbeck, Raoul Hausmann, and Johannes Baader—an eccentric who intruded into the National Assembly to distribute Dada leaflets—was the most aggressive and political. The virtuoso draftsman George Grosz, another member of the Berlin group, despised bourgeois culture as well as modern art.

Picabia, The Lovers, After the Rain, 1925

Hausmann, a Dadaist with philosophical ambitions, and his companion Hannah Höch became champions of photomontage and collage, techniques central to Dadaism. The surpassing master of collage was, however, Kurt Schwitters, an artist of genius with a very different temperament from most Dadaists; he was apolitical and totally devoted to “Merz,” his own brand of Dada. Extremely tall, he used his booming voice to declaim, shout, hiss, and scream his mighty poem “Ursonate,” to this day the most striking specimen of phonetic poetry. His recitations were said to be so impressive that audiences were seized first by laughter, then by awe. Schwitters was also part of the Amsterdam Dada scene that was connected to Theo van Doesburg and the constructivist movement De Stijl.

In Cologne, Max Ernst produced some of the most exquisite Dada drawings and photomontages of the early 1920s. Together with the son of a banker who called himself Johannes Baargeld (cash), Ernst shocked the public with a Dada exhibition that was promptly closed by the police. A Dadaist sentence by Ernst reads, “Thanks to an ancient, closely guarded monastic secret, even the aged can learn to play the piano with no trouble at all.”

According to Schwitters:

    Dada subsumes all big tensions of our time under the biggest common denominator: nonsense…. Dada is the moral gravity of our time while the public collapses with laughter. As do the Dadaists.

Traditionalists see Dadaists as silly people. To a degree, they are right. Silliness was a liberation from the constraints of reason. Silliness has the potential to be funny, to provoke laughter, and to make people realize that laughter is liberating. Raoul Hausmann mentioned the sanctity of nonsense and “the jubilation of orphic absurdity.” To Dadaists, Charlie Chaplin was the greatest artist in the world.

There seems to me more than a little resemblance between the world a hundred years ago and much of what we observe today. This is no all-out war, but there is a sense of a deep crisis and an overbearing feeling of menace, of being faced with enormous threats. Karl Kraus, the Viennese moralist, satirist, and critic, wrote, “As order has failed, let chaos be welcome.” The buzz that Dada has recently generated in Zurich was best illustrated last February, when the Kunsthaus invited the people of Zurich to attend a fancy dress ball coinciding with the Dadaglobe exposition. No fewer than nine hundred masked neo-Dadaists turned up.” ~


Paintings regarded as Dada seem like typical modern art to me. In that sense, in the visual arts Dada is still going strong. In poetry . . . well, language poetry can be seen as the direct descendant, but there aren’t that many admirers. Surrealism likewise proved to be more vital in the visual arts than in poetry.

Kurt Schwitters, The Psychiatrist, 1919 ("This has a bit of Steam Punk about it" ~ Gwyn Henry)


I love to imagine Lenin — who did live in Zurich on Spiegelgasse (Mirror Alley) — attending a Dada performance. By contrast, Putin is totally inartistic.

But, seriously, Lenin was not a secret Dadaist (that theory, along with the notion that the Dadaists found Lenin’s idea “the greatest Dada,” is strictly tongue in cheek). The avant-garde Russian art of the first years after the revolution had more of a futurist feel. The main movement was called “Constructivism.” And it is true that Lenin at first did support avant-garde art, championed by his Commissar for Education, Anatoly Lunacharsky. Lenin’s great dream was to make Russia a leading modern country, like the United States, a country he openly admired.

Still, the only part of Dada that Lenin would have appreciated was its desire to destroy the old order — and certain elements of industrialism that pop up in the imagery. But Lenin was basically too “bourgeois” to approve of the bohemian and anarchic spirit of Dada. (I realize that he would have hated to be called a bourgeois, a term he used to denounce almost anything he disliked.)

Klee, The Little Jester

Power and violence are always based on psychopathy and fantasy. Violence denies the basics on which authentic human interaction must be based: empathy, trust, and compassion. ~ Ralf Klinger


Though I could easily call myself a gnostic atheist (one who knows that god doesn’t exist), I prefer the label “literary atheist.” This is a person who understands the power of fiction. Any atheist can say that god is fictional, but a literary atheist knows that fictional characters are a part of our psyche and can have an amazing power to influence our behavior.

When a character like Superman is invented, he enters the collective psyche of humanity. “Star Wars” is an even better and more positive example, with Yoda as a spiritual guide and the master of “The Force.” Or even Harry Potter. Having supernatural powers is a big part of the appeal of those characters that become cultural icons.

A literary atheist regards a god, including any Abrahamic god (Yahweh, Allah, the Christian Trinity), as a fictional character. Perhaps “mythological” would be a more precise label, but “fictional” covers more ground. Just because a character is fictional doesn’t mean that he or she doesn’t “exist.” A fictional character can have a vivid neural existence, having become an indelible part of our psyche, along with the main narratives.

“Stories that never happened can have infinitely more power than stories that did.” Of course. You can have all kinds of impossible things take place to convey moral lessons and create strong emotions.

The story of the woman taken in adultery is regarded by many as illustrating the very essence of Christianity. It’s almost a foundational story. Yet it doesn’t appear in the earliest Greek manuscripts. Scholars have established that it’s a later addition. Even those who know the story is made up can still treasure it as a story, and still use the expression “to cast the first stone.”

Ultimately, so what if the story is made up? It’s all made up. It’s possible that Jesus never even existed — or if he did, we can never excavate the “historical Jesus” from the layers and layers of legend. A powerful story is not based on “it really happened.” Factual accuracy is beside the point. A story enters our psyche and exerts its influence. Even something clearly unbelievable, e.g. the resurrection, can contain an element that the psyche may find valuable — the symbolism of rolling back the stone, or the very idea that we can survive something devastating.

A literary atheist is a gnostic atheist, but with a subtle difference: she recognizes that a fictional character can be a very powerful part of our lives, often more powerful than an actual person. The human brain doesn’t strictly separate reality from imagination. It’s not just young children who confuse “imaginary” and “real” characters and events; adults show the same tendency, as demonstrated by the phenomenon of false memory. And “false memory” is the rule, not the exception. In a way, all characters (including ourselves and our friends) are “imaginary.”

So in a way it’s fine that Yahweh is a fictional character. The distressing part is that he’s not a well-written one. This is not surprising, given that the bible was written and edited over a long time by many men. He’s not the creation of a single literary genius; he’s a collective creation.

And then there is the question of selective reading and shifting interpretation over the centuries that followed. Given that, it’s remarkable how, for all the efforts to soften him, he remains an obnoxious character, definitely “not a swell dude,” as someone recently put it. But a character doesn’t have to be likable to be powerful — literature is full of villains and good guys, as well as more complex villains who now and then have a gracious moment.

But let’s face it, Yoda is wiser and more endearing by far.


~ “One of the central embarrassments of Christianity arises from one of the most central errors of its founding figurehead. Jesus Christ was convinced that the next world — a radically different world from the observable reality of Roman Judea in which he found himself — was, as he continuously put it, “at hand.”

He was the prophet of this change in the exact same way John the Baptist had been the prophet of his own coming — that is, as a roadside herald, trumpet in hand, declaring the coming of something extremely imminent. Jesus repeatedly tells his listeners that he is a divisive figure, an enemy of complacency — he repeatedly tells people they must choose sides, this dusty live-a-day world all around them, or the next world, which is just about to dawn and change everything.

The problem with this particular mistake (the world didn’t change, the kingdom of Heaven didn’t arrive, the Romans kept nailing troublemakers to scaffolding) is that it elicits some of Jesus’ most straightforward comments – none more so than Matthew 19:21, when the Master is confronted by a rich young man who is righteous and God-abiding (when he’s given a list of commandments, he comments that he’s been following them his whole life – in other words, crucially, he’s not a sinner). The young man asks what he must do to gain eternal life, and Jesus’ answer hits him right between the eyes: “If thou wilt be perfect, go and sell that thou hast, and give to the poor, and thou shalt have treasure in heaven: and come and follow me.”

The young man refuses and goes away disappointed, and that’s when Jesus utters his famous imprecation that it’s easier for a camel to pass through the eye of a needle than for a rich man to enter the kingdom of Heaven.

Hardly any rich Christians have wanted to do what their Savior explicitly commands them to do. The text from Matthew provides the title of Peter Brown’s dense, magnificent new book (with its gigantic sub-title), Through the Eye of a Needle: Wealth, the Fall of Rome, and the Making of Christianity in the West, 350-550 AD, and the subject – the way early Christians got around the embarrassment of not wanting to be poor – is explored in 500 pages of fascinating, engaging prose and 100 pages of close-packed and amazingly comprehensive notes. The conflict between the sacred calling of Christianity and the more mundane concerns of spes saeculi, the hope of advancement in this world, is here given an examination like it’s never had before, with money at the heart of it all.

Also at the heart of it all is that pivotal figure, St. Augustine, and readers who’ve already encountered Brown’s justly revered Augustine of Hippo will know to expect fine writing and fine insight into the figure who, more than anybody, tried to work out a theological framework that would allow his congregation to be wealthy if only they avoided avarice. Blatant double-talk like that would come in very handy to Christians of every subsequent century.

~ Augustine’s justification of wealth came at the right time. In a world that had been unexpectedly shaken by renewed civil war and by barbarian invasion, there was no point in denouncing the rich for the manner in which they had gained their wealth. Those whose wealth had survived the shocks of this new crisis were unlikely to feel guilty about what little of it was left to them. The radical critiques of wealth and the wealthy associated with the preachings of Ambrose and with the Pelagian De divitiis were out-of-date. Such radicalism had been the product of an age of affluence. It had played on the disquiet of the comfortable rich of the fourth-century age of gold. It had less effect on persons who now faced the prospect of losing everything.” ~

Reliquary of the Holy Umbilical Cord, Cluny, 1407

Oriana: One aspect of the veneration of relics and images is very striking to me: it was the old paintings and statues that some centuries later (usually during the late Middle Ages) started weeping, bleeding, or even walking about. This is fascinating because those old paintings were much less likely to be naturalistic, but were stiff and stylized. I found that true of the “miraculous icons” in Poland — mostly Byzantine in style. Those are not the beautiful Madonnas that were painted later, with Mary as a lovely young woman — but the severe, awkward images from earlier centuries.


I checked it online, and the quote seems authentic. Religion mostly works to anoint the ruler (the “divine right of kings”) and protect the ruling class. There have been exceptions, peasant uprisings in which the poor did murder the rich, but these were usually quickly suppressed and strongly condemned by religious leaders (Luther, for example, thundered against the peasant uprising of 1525).


~ “Of the seven deadly sins, the one with perhaps the most diverse menu of antivenins is the sin of pride. Need a quick infusion of humility? Climb to a scenic overlook in the mountain range of your choice and gaze out over the vast cashmere accordion of earthscape, the repeating pleats swelling and dipping silently into the far horizon without even deigning to disdain you. Or try the star-spangled bowl of a desert sky at night and consider that, as teeming as the proscenium above may seem to your naked gape, you are seeing only about 2,500 of the 300 billion stars in our Milky Way — and that there are maybe 100 billion other star-studded galaxies in our universe besides, beyond your unaided view.” ~ Natalie Angier

Oriana: And now astronomers think there may be almost ten times more galaxies than previously thought; the estimate has been raised to some two trillion.

The better to startle you: a chicken embryo
ending on beauty


My childhood theory about why prayers weren’t answered was that Yahweh didn’t speak Polish. So what did it matter if we politely called him Mr. God in a language he didn’t understand. The gods who knew Polish were hiding in the woods like the partisans. I wondered how they survived the winter. Their drinking songs could sometimes be heard.

~ Oriana Ivy

Sunday, October 9, 2016


human heart without muscle or fat, with only the arteries and capillaries exposed


Unendurable, later, this necropolis of love.
I recognize even the raised spot
in the pavement on Fourth Avenue 

where he stumbled once.

Oh city within the city, eternal city of time. 

That stairway of dead echoes. A half-
figure peers into my old window,
his shadow broken on the railing.

Must I walk forever
down these stumbling streets,
read again the warped,
rusted warning signs?

Yes, says the theater marquee, 

in red neon blood 
announcing a silent classic,

“The Death of Orpheus” —

as if it didn’t happen
long ago, when he turned
away from his art. He said,
“Maybe the meaning of my life

will be through you — you write,”
and I reeled, a dazzled moth —
though we both preferred
to turn off the light.

~ Oriana Ivy © 2016

How strange it is to come across poems about old loves, so intense once, now essentially meaningless, almost puzzling. The greatest love of my youth was a narcissist. He once made a statement that startled me: “Perhaps the meaning of my life will be through you — you write.” To give him some credit, he also said “You have talent.” Thus he delivered the antidote to the poison of having been told, three years earlier, that I had no talent.

And later he almost ruined that memory by trying to justify his cruelty: “If I hadn’t caused you pain, this” — pointing to a poem of mine — “wouldn’t be here.” He wanted to believe that he created me as a writer.

His own art happened to be acting. He had an obvious talent for acting and took drama classes in college. But a teacher happened to make an unfortunate remark: “You are not handsome enough to be a leading man.” And that verdict was the end of this talented man’s acting ambition. If he couldn’t be a leading man and have the adulation that goes with it, then there was no point being an actor.

“Hamlet doesn’t have to be handsome,” I remarked when he told me this story. But even if he heard me, it was too late (except perhaps for amateur theater). He preferred to mourn his lost glory as a star.

In spite of the grandiosity, he had no secure sense of his own accomplishments. They were always only for show anyway, only a means to earn admiration. He took fencing lessons and cello lessons; when the recorder became popular, he tried that for a while. He’d go for the intellectually chic (this included food — he wouldn’t be caught dead eating iceberg lettuce; it had to be red-leaf or butter lettuce, in those pre-radicchio years); he drove a stick-shift VW, professed love for Beethoven’s Late Quartets, and played the abstruse game of Go rather than chess. But he never stayed with anything past its peak popularity.

He seemed to realize his own lack of substance. For a narcissist, he could be surprisingly self-aware. He said, more than once, “I know I am shallow” — and, in a moment of despair (he was no stranger to despair): “Deep down, I am a piece of shit.”

I realize that there is no single explanation for the various sub-traits of narcissism, of which self-loathing may be one. Did he have a terrible childhood? Yes. His father was an abusive alcoholic, and he grew up in the poverty and brutality of the “mean streets.” It was about survival, not about learning empathy or discovering your true talents and vocation.

Whatever causes narcissistic personality disorder, it must be terrible not to have a genuine center, a seriousness about something for which you feel reverence. For me it's both beauty and the intellect, the collective genius of humanity; at a more specific level, it's still mostly poetry. That, and the ideal of kindness.

Finally, though, I agree that most of us have a need to worship something. For me that’s beauty. And that’s my center. Creativity is strongly connected to it. And yet, much as I hope that I will be able to work until almost the very end, I can also imagine no longer being able to write, and yet still having beauty at the center of my life — in a receptive way.

(A note on the title: the poem is part of my Eurydice series. In my personal “revision,” Eurydice, disappointed in Orpheus, becomes a singer herself.)

Photo: Alexey Menschikov


It’s not that suffering made me strong — on the contrary: I think I would be stronger and healthier now if I’d received a lot of affection back then and found creative work sooner. I believe that happiness, not suffering, makes us strong — the happiness of being loved, and the happiness of doing the work we love. Emotional support makes us strong, and the fulfillment and self-forgetfulness that come with paying full attention to whatever we are doing. As Freud said, “Love and work.”

But the suffering in my youth gave me the special scale of comparison that makes almost everything minor now, while elevating a sunset, say, to miraculous abundance and enchantment. Why, the astonishing fact that I am still alive! I don't have to remind myself to count my blessings.

This morning I woke up to a tiny miracle: a flock of mourning doves wandering in the grass in my backyard, pecking at some invisible seeds. A whole five minutes of watching them and listening to their cooing! I was flooded with happiness. O tiny gods! Would I be caressing such ecstasies if not for the years of squalor and degraded love, the years of “unrequited soul,” as Hafiz puts it?

Perhaps I would. Perhaps even more so. But offhand it’s plausible to think that my daily appreciation of small beauties is enhanced because they feel like such a gift after the torments.

But in the main, my youth wasn't completely wasted because, among other things, it created a different definition of pain, catastrophe, defeat, degradation. And people are surprised that I don't take novocaine for minor dental procedures. That’s not even real pain! Real pain is so obliterating that you either pass out or are totally filled with the desire to die. Not to die and go to heaven, but the desire for oblivion: not to be. That is pain. Other kinds, that's “discomfort.”

As I get older, I see that fewer and fewer things fit into the category of “important.” Interesting how often I remember my mother in her last years — I knew they were her last not only because she was getting close to 90, but also because she began to say, “That's not important,” about a lot of things.

earth photographed from space



“Almost everything — all external expectations, all pride, all fear of embarrassment or failure — these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.” ~ Steve Jobs

“Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life.” ~ Steve Jobs


The older I get, the less interested I become in wasting time fussing over my clothes and similar trivia. Life is just too short for that. We are indeed “already naked.” And yet . . . the day is most beautiful at sunset, just as life seems ever more beautiful as we foresee its close. What my heart desires is the fullest possible communion with that beauty.

I only wish Steve Jobs had added that full acceptance of death is also likely to make us kinder toward others — again, there is simply no time for petty arguments. So much nonsense falls away, and ideally more empathy and kindness make themselves manifest — that's part of the so-called “mellowing with age.”

Van Gogh, Sunflowers, 1887


I stole this image from an ad for solar energy, and why not. I am a cloud-watcher. Years ago I thought my main identity was “immigrant,” and after that “writer,” and after that “woman.” Now that I feel wonderfully posthumous, i.e. post-poetry, I am a cloud-watcher again. Call this a second childhood, but no, it's not the same. I never had such thoughts as now. When I was a child, I thought only I was real, while others were programmed robots. No more. Now I see that the clouds are more real than I am.


This is the most useful commencement address I’ve ever heard. Relax: you don’t have to have a dream. Imagine: in this culture that nags you, starting in childhood, to “think big,” to know exactly what you want to do with the rest of your life already at the age of nine, but certainly by nineteen, here comes a “success story” from the entertainment business, and he tells you it’s OK not to have a dream.

In fact, he tells you to beware of long-term goals; you’re likely setting yourself up for despair. Instead, Tim Minchin says, be passionate about short-term goals. Wow! Almost my own “doctrine of tiny steps.” Instead of worrying about not having a big dream, or, later in life, about having just run out of your big dream, concentrate on the task right ahead, no matter how small. “Whatsoever thy hand findeth to do, do it with thy might,” as the preacher advises in the wonderfully secular Book of Ecclesiastes.

Then another task will present itself. To paraphrase Kafka, it has to; it will roll in ecstasy at your feet.

You’ll see that next task out of the proverbial corner of your eye. But if you are preoccupied with the “big dream,” trying to consciously control the course of your life, you are blinded and might miss something important.

Tim Minchin doesn’t explain this, but let me quickly explain why it’s so hard to have a coherent “dream.” We are not a single self; the brain has many competing neural pathways. Call them multiple selves. And even those shift and evolve depending on the stage of life. So “go with the flow” is the best solution. Trust life. Trust your unconscious. But when you do work, especially the sort that really calls to you at the moment, throw yourself into it. Be micro-ambitious.

Tim does urge us to remember that it’s mostly luck, and to be grateful for whatever luck we’ve had — the luck of even existing. It’s illogical to take pride in one’s achievements or blame oneself (or others) for one’s failures — too much of it is sheer luck. The understanding of luck — of the power of circumstances — is the key to humility and non-judgment.

“Searching for meaning in life is like looking for a rhyme scheme in a cookbook” ~ Tim Minchin


When Franz Wright first sent a few of his early poems to his famous father, James Wright wrote back: “So you are a poet. Welcome to hell.” Dante’s Canto III comes to mind, the inscription on the gate:

Through me the way into the suffering city,
through me the way into eternal pain . . .
Abandon hope, you who enter here.

The poets’ hell was also mentioned by Milosz. It was part of the hell of artists: those who put the love of art ahead of human love. Milosz said that Anna Kamieńska was not an eminent poet; she was too good a human being “to learn the wiles of the craft.” Her life was rich with human joys and suffering rather than creative agony and ecstasy.

There is no circle of poets in Dante’s hell. Virgil is one of the noble pagans who dwell in Limbo. Brunetto Latini, Dante’s mentor, runs on the burning sand under a rain of fire as punishment for homosexuality, not poetry. Most unforgettable is the troubadour Bertran de Born, who holds his severed head like a lantern. But no one is in hell for idolatrous dedication to his art rather than to god.

Agony and ecstasy, the cross and the delight: the agony of poetry’s difficulty, the capriciousness of inspiration, waiting ten years for the right ending (now and then it’s precisely what happens), the impossibility of writing good work every time. And this before we even begin to lament the wounds in the struggle for recognition, the constant rejection and humiliation. “You die not knowing” if your work was any good, as Berryman says in Merwin’s poem.

For Franz, there was also the problem of being regarded as “the wrong Wright,” the son not half the lyricist that his father was. “No magic,” I kept thinking when I read Franz’s poems. But all poets have less personal but even more demanding mothers and fathers — the great poets whose best work set the standard.

It took me years of despair to come to see that the last words written on the gate also pointed to the paradoxical way out of hell, especially the hell of trying to get published. “Abandon hope” — stop striving for instant perfection and struggling for recognition, and enjoy the peaceful pleasure of concentrating on the work itself, on the beautiful unfolding of the creative process.

This is Buddhist and Taoist wisdom, but not exclusively so. Some Western thinkers have also discovered the bliss of dropping the striving, of dropping the self-flagellation with the whip of “Achieve! Achieve!” They advise dropping the dream, the great ambition, and concentrating on “micro-ambition”: the task at hand, without thinking of the results. “Don’t have a dream!” Focus totally on what’s in front of you.

What goes together with hope is its dark twin, fear. “Hope and fear — why we cannot fly.” I forget who said it (a poet, I think), but it sounds true. These are irrelevant, distracting emotions.

It’s also a matter of trust, of relinquishing conscious control. The best writing flows from the unconscious when it is ready, in its own time. Once writing ceased to be overwhelmingly important, I began to watch with pleasure how it emerges, one image leading to the next, one idea opening an infinity of ideas. That’s where the inner critic must awake and choose only the best — again, with as little struggle as possible, since choice too is part of the inspiration, and will come when it is ripe.


In Dante’s hell I’d probably find myself in the circle of the heretics. For Dante this meant those who denied the immortality of the soul, i.e. the afterlife. Those who dared to think for themselves and concluded that consciousness dies when the body dies (which also seems to be the Old Testament view) are doomed to lie in open tombs filled with flame. After Judgment Day in the Valley of Josaphat near Jerusalem, the heretics, their bodies restored, will return to lie down in their tombs — but now the stone lid of each tomb will be shut.

One might point out that the suffering would be greater if the heretics had some hope of getting out of the tomb and seeing “the sweet light” of earth. Then they’d be trying and trying, only to fail again and again. But without hope, they will not engage in useless struggle. Strange as it may sound, they’ll be at peace while being everlastingly consumed by the eternal flame.


Jerusalem, the Valley of Josaphat, assumed to be the site of the Last Judgment. In the foreground, a cemetery (I think): a treeless, flowerless, stony place that makes old European cemeteries look luxurious and sweet in a melancholy way — I almost want to say “cozy.” Cremation is now the way — we aren’t yet ripe for eco-burial. Otherwise, who wouldn’t want to “rest” in a cozy cemetery?

ONLY 58% OF AMERICANS STILL BELIEVE IN HELL (so much for Pascal's Wager?)

It doesn’t surprise me that belief in hell is waning: there is less and less tolerance for cruelty, and besides, as soon as I arrived, I noticed that Americans don’t see themselves as sinners deserving any kind of punishment, much less an eternal one (a huge change after Polish Catholics, who seemed fifty years behind). The most recent (2013) Harris poll found belief in Satan and hell down to 58%. I expect it to slide below 50% soon.

~ “It is increasingly difficult to convince educated people that they and their friends and children deserve infinite suffering for finite failings—or that a god who acts like an Iron Age tyrant (or domestic abuser) is the model of perfect love. A group called Child Evangelism Fellowship aroused intense opposition in Portland last summer in part because outsiders to biblical Christianity were appalled that insiders would try to convert small children by threatening them with torture.

The appeal of hell as a part of the faith package appears to be in decline, even among Evangelicals. According to a 2011 survey, while 92% of Americans claimed some sort of belief in God, only 75% believed in hell. A 2013 Harris poll put belief in the devil and hell at 58 percent. As one theology professor, Mike Wittmer, put it: “In a pluralistic, post-modern world, students are having a more difficult time with (the idea of) people going to hell forever because they didn't believe the right thing.”

The decline of hell-belief may be due to the same factors that may be causing the decline in bible belief more broadly — globalization and the internet. It gets harder to imagine oneself blissfully indifferent to the eternal torture of Muslims, Buddhists, Jews, and atheists when those people have names and faces and are Facebook friends.” ~

Gian Lorenzo Bernini, Anima dannata (damned soul), 1619

It’s said that in order to capture the agony, the 20-year-old Bernini held his hand over a flame while watching the reflection of his face in a mirror.


Those who still believe in hell seem to be particularly ardent supporters of it. One man on FB said, “But without hell, who'd ever want to follow Jesus?” Sad.

Pascal lived in an era when the clash with competing religions hardly existed. Now of course it’s the Muslims who claim that all non-Muslims go to hell; we are aware of that, and the whole game seems more and more ridiculous.


One world at a time. ~ Henry David Thoreau, on being asked about the afterlife


In surgery, anesthesiology and urology, around two-thirds of doctors who have registered a political affiliation are Republicans. In infectious disease medicine, psychiatry and pediatrics, more than two-thirds are Democrats.

It’s possible that the experience of being, say, an infectious disease physician, who treats a lot of drug addicts with hepatitis C, might make a young physician more likely to align herself with Democratic candidates who support a social safety net. But it’s also possible that the differences resulted from some initial sorting by medical students as they were choosing their fields.

Dr. Ron Ackermann, the director of the institute for public health and medicine at Northwestern University, says he remembers his experience rotating through the specialties when he was in medical school. “You’ll be on a team that’s psychiatry, and a month later you’re on general surgery, and the culture is extraordinarily different,” he said. “It’s just sort of a feeling of whether you’re comfortable or not. At the end, most students have a strong feeling of where they want to gravitate.”

One explanation could be money. Doctors tend to earn very high salaries compared with average Americans, but the highest-paid doctors earn many times as much as those in the lower-paying specialties. The fields with higher average salaries tended to contain more doctors who were Republican, while the comparatively lower-paying fields were more popular among Democrats. That matches with national data, which show that, for people with a given level of education, richer ones are more likely to lean Republican (possibly because of a concern over the liberal policy goal of taxing the wealthiest at a higher rate).

The sorting may also reflect the changing demographics of medicine. As more women have become doctors in recent years, they have tended to cluster in certain specialties more than others. The data showed that female physicians were more likely to be Democrats than their male peers, mirroring another trend in the larger American population. So as women enter fields like pediatrics, obstetrics/gynecology and psychiatry, they may be making those fields more liberal.

Over all, the partisanship of doctors looks very different from a generation ago, when most physicians identified as Republicans. The influx of women may help explain that change, too. The researchers Adam Bonica, Howard Rosenthal and David Rothman compared political donations by doctors in 1991 with those in 2011 and 2012. The study found that doctors had become substantially more likely to give to Democrats.

New doctors can’t explain all of the change, though. Even older doctors in the new data look close to evenly split between the parties. It’s likely that many older doctors have switched parties over the years. That’s true broadly for well-educated professionals in the United States, who have become increasingly Democratic in recent years.

The shift reflects how the practice of medicine has been changing, too. Doctors used to essentially be small-business owners. As such, they may have been more attracted to Republican aims of low taxes and limited regulation. These days, more and more doctors are employees of large companies or hospitals.


~ “Ever make a decision and then your brain finally feels at rest? That's no random occurrence.

Brain science shows that making decisions reduces worry and anxiety — as well as helping you solve problems.

Upward Spiral (US): Making decisions includes creating intentions and setting goals — all three are part of the same neural circuitry and engage the prefrontal cortex in a positive way, reducing worry and anxiety. Making decisions also helps overcome striatum activity, which usually pulls you toward negative impulses and routines. Finally, making decisions changes your perception of the world — finding solutions to your problems and calming the limbic system.

But deciding can be hard. I agree. So what kind of decisions should you make? Neuroscience has an answer.

Make a "good enough" decision. Don't sweat making the absolute 100% best decision. We all know being a perfectionist can be stressful. And brain studies back this up.

Trying to be perfect overwhelms your brain with emotions and makes you feel out of control.

US: Trying for the best, instead of good enough, brings too much emotional ventromedial prefrontal activity into the decision-making process. In contrast, recognizing that good enough is good enough activates more dorsolateral prefrontal areas, which helps you feel more in control.

As Swarthmore professor Barry Schwartz said in my interview with him: “Good enough is almost always good enough.”

So when you make a decision, your brain feels you have control. And, as I’ve talked about before, a feeling of control reduces stress. But here’s what’s really fascinating: Deciding also boosts pleasure.

US: Actively choosing caused changes in attention circuits and in how the participants felt about the action, and it increased rewarding dopamine activity.

Want proof? No problem. Let's talk about cocaine.

You give two rats injections of cocaine. Rat A had to pull a lever first. Rat B didn't have to do anything. Any difference? Yup: Rat A gets a bigger boost of dopamine.

US: So they both got the same injections of cocaine at the same time, but rat A had to actively press the lever, and rat B didn’t have to do anything. And you guessed it — rat A released more dopamine in its nucleus accumbens.

So what's the lesson here? Next time you buy cocaine … whoops, wrong lesson. Point is, when you make a decision on a goal and then achieve it, you feel better than when good stuff just happens by chance.

And this answers the eternal mystery of why dragging your butt to the gym can be so hard.

If you go because you feel you have to or you should, well, it's not really a voluntary decision. Your brain doesn't get the pleasure boost. It just feels stress. And that's no way to build a good exercise habit.

US: Interestingly, if they are forced to exercise, they don't get the same benefits, because without choice, the exercise itself is a source of stress.

So make more decisions. Neuroscience researcher Alex Korb sums it up nicely:

We don’t just choose the things we like; we also like the things we choose.” ~


Choice is also a burden, a source of stress, but only if you agonize over it. The beauty of this article is that it points out we don’t have to agonize — that we can make a “good-enough” decision. Once we realize that the decision doesn’t have to be perfect, we can decide quickly — and then we’ll quickly feel better.

ending on beauty

Whose one white note was feast enough
for all the throats of dusk.

~ Cecilia Woloch

Photo: David Whyte