Sunday, June 26, 2016


Zayed Mosque, Abu Dhabi


Not metaphor, but outside
the window, the muezzin
calling from town, his voice faint
as a phantom arrived in the room.
How many times have I felt
shame at those words—Allahu
akbar—felt it twist
like a knife inside me?
Once, lonely, I listened
for hours to recordings
of the adhan through
Macbook speakers,
wept and knew myself
in the presence of God.
My God, during this all holy
month, when I am so far
everything back home
seems like a dream, its violence
only a wakeable sleep—
where are you to refuse those
who call out to you, who undo
what you’ve made in your name?
I am not asleep and they are
not waking. Again there is blood
on the floor in your name
and there is no god
but you, so answer.

~ Leila Chatti, Poets Respond, June 19, 2016


Leila Chatti: “I am currently in my second home, Tunisia, where I return each summer to visit my family. It is Ramadan and I was preparing for the evening meal (iftar), during which one breaks their fast, when I first read the news about Orlando. And as I was reading, the adhan—the call to prayer–came drifting in from town. I was so startled by the juxtaposition I had to sit for a long time. I thought of the line by Naomi Shihab Nye: ‘What does a true Arab do now?’ And what does a true Muslim do, too? I wrote this poem.” (website)


Again there is blood
on the floor in your name
and there is no god
but you, so answer.

Here is an example of how the personal and the political can be flawlessly married. Rare, but possible. And it feels absolutely authentic. I suspect the personal element has to be dominant for that to happen — it's the intimate tone. It allows the reader to understand the speaker’s suffering.

How many times have I felt
shame at those words—Allahu
akbar—felt it twist
like a knife inside me?

Of course it’s humans who create a god in whose name they then proceed to oppress and kill. But it may take another century before that reality is acknowledged. Here the speaker may be on the verge of facing the truth. At the very least, she feels deeply wounded that the people of her religious tribe are causing mayhem again and again. 

I am not asleep and they are
not waking. 

~ beautifully said.

She is not denying that religion can have positive aspects. The call to prayer is beautiful and can even lead to a mystical experience. That’s why she feels all the more betrayed by the misuse of religion by those who want the power to impose their will and oppress others. The poet’s voice is a crying in the wilderness, a desperate plea to an absent god to answer, explain, perhaps even do something to stop future atrocities. But only humans can find answers — if they try hard and are lucky. 

When something horrible happens, the best answer is not an answer at all. It’s presence. The only true comfort is another human being who will be there with you in it. ~ R.J. Twain


~ “One way that biblical literalism screws with people’s heads is this: Children are taught from a young age that God is perfect—the essence of Love and Truth. But when you look a little closer at the stories in the Bible, it turns out that he’s an awful lot like Trump.

“He is powerful, and He wants us all to know it…

“I am the Lord, and there is no other; besides me there is no god. I arm you, though you do not know me, so that they may know—from the rising of the sun and from the west—that there is no one besides me; I am the Lord, and there is no other. I form light and create darkness, I make weal and create woe; I the Lord do all these things . . . To me every knee shall bow, every tongue shall swear. . . ” (Isaiah 45:5-7, NRSV).

Me, me, me, I, I, I, I, I.

He’s an insatiable attention seeker. From Genesis through Revelation, the Bible lays out precisely how people should grovel and sing God’s praises and otherwise kiss up. God wants his adoring followers to beg for things that he already knows they need. He loves the smell of burnt offerings and dictates just what should be burnt and when. He demands proof of loyalty, like cutting off the cover of your penis, or whacking relatives who don’t think he’s awesome, or being willing to turn your child into a human sacrifice…

He’s racist and able-ist. God may claim credit for making us all, but that doesn’t prevent him from picking favorites or finding some people repugnant simply by accident of birth. The Old Testament narratives are about favored blood lines, whites—I mean Hebrews—who get the right to claim land already occupied by other ethnic groups. According to God’s rules, even slaves must be treated better if they are Hebrew slaves.

But being Hebrew won’t help if you’re handicapped. Jehovah, like Trump, thinks that arthrogryposis is just gross. Stay away! “No one of your offspring throughout their generations who has a blemish may approach to offer the food of his God. For no one who has a blemish shall draw near, one who is blind or lame, or one who has a mutilated face or a limb too long, or one who has a broken foot or a broken hand, or a hunchback, or a dwarf, or a man with a blemish in his eyes or an itching disease or scabs or crushed testicles.” (Leviticus 21:17-21)

He demeans women. If a guy with crushed balls might contaminate Jehovah’s inner sanctum, a menstruating woman would be far worse. Whatever you do, don’t let Megyn Kelly sit on the furniture! And by the way, a woman who gives birth to a girl baby is nasty for twice as long as one who gives birth to a boy. But don’t get too insulted. Women can be saved through childbearing.”

There’s much more; please read the whole thing. And he fulfills the psychological need for a big daddy/protector as well. Just as the people of the Bible cried out to [Yahweh] to protect them and destroy their enemies, so today’s conservatives cry out to Trump for the same thing.” ~


We shrug and say that’s just the archaic tribal mentality — boosting “us” — the chosen people, the greatest country in the world, the exceptional nation among inferior humanity — against the “other.” We say that god is dead — meaning specifically the vengeful, wrathful, nationalist, sexist, narcissistic, petty, jealous, infantile, and altogether distasteful Yahweh. But when this mentality suddenly crops up in the 21st century, there is reason to wake up — and lament.


The tome in [Solomon’s {the names have been changed}] hands was Alan Dershowitz’s The Genesis of Justice (2000), which used Talmudic and Hasidic interpretations of the Bible to argue that stories in the book of Genesis, from Adam and Eve eating the apple to Noah and his ark, constituted God’s learning curve — a means of establishing a moral code and the rules of justice that prevail today.

What struck him about the book was its depth, and a complexity of thought that he had been raised to believe was the exclusive domain of the rabbis whose authority commanded his community of ultra-Orthodox Jews. The book’s brilliance, coupled with its unabashed heresy, created the first of many cracks in Solomon’s faith. Seeing the scriptures interpreted in methods so compelling and yet entirely inconsistent with the dogmas of his youth caused Solomon to question everything he believed to be true.

. . . Yanky’s study partner took Yanky to a presentation by the British scientist and New Atheist Richard Dawkins, author of The God Delusion (2006). ‘It wasn’t so much that Dawkins was so convincing, or interesting even,’ Yanky told me between short sips of beer. ‘It was just, I was sitting there with this whole group of people who were having this one viewpoint.’ He experienced for the first time what religion looked like from the outside, a series of often ridiculous and always questionable ideas shattering its absolute hold on his psyche.

And as Dawkins spoke, Yanky realized that there was one answer that took care of all of his questions — God did not write the Torah because He does not exist.

Yanky was devastated by his realization that there is no God. ‘It was very upsetting,’ he said, talking quickly. ‘I remember laying in bed and feeling like the world had come to an end. It wasn’t a relief. It was very painful.’

He was so upset that his first move after this realization was to search out the smartest and most learned rabbis, hoping that they would have answers for him and be able to convince him that he was wrong — that there is a God, that the Torah is true. He wrote anonymous letters to a few respected rabbis, and posted them snail-mail (though this was 2000, he had little to no contact with the internet, as the most pious Jews don’t). The letters contained his questions, mostly culled from the contradictions between the first chapters of the Old Testament and evolutionary theory.

The explanations he got from rabbinic scholars were weak and obscure. One rabbi sent him a bizarre note, including a story about sitting in a boat, ‘an elaborate allegory intending to describe how we only coast along over the deep waters of the Torah,’ Yanky recalled. ‘It was cool, but it didn’t help me. Thanks Rabbi.’ With nowhere left to turn, he was finally forced to admit what he was: an atheist leading a double life, forced to stay under wraps lest his boss fire him, his wife divorce him, and his children get thrown out of school.

Wherever there is an insular Jewish enclave, there are individuals who have come to the conclusion that God does not exist, and yet they maintain their religious cover for social, familial and economic reasons. Many are well-established in their communities, even leaders. Many are financially successful, family men and women, moral people. ‘I am your neighbor with kids in your children’s class,’ wrote one undercover atheist anonymously on a blog. ‘I am one of the weekly sponsors of the Kiddush club… I was your counselor in camp… I do not believe in God.’

Moishe’s journey from believer to atheist happened in a matter of weeks, after a few passages from Maimonides convinced him that the greatest Jewish scholar was, like himself, an undercover atheist.

‘I’m desperate to tell my kids the truth,’ Moishe confessed. And yet, he doesn’t dare. Moishe is not alone. Many I spoke to stay inside the confines of their Orthodox lives for fear of harming their children, opting instead to let them continue to believe what they themselves now consider to be fairy tales.

‘To me, lying to my children was the worst part,’ said another undercover atheist – I’ll call him Yisroel. Yisroel has a very good job – he makes in the high six figures – and is very attached to his wife and children, the opposite of the stereotype that prevails in religious communities surrounding those who lose the faith, namely that they are ‘liars who want to do drugs, cheat on their wives and eat cheeseburgers’, as he put it. Yisroel’s greatest wish is that his children will learn to think critically and figure things out for themselves. But he has no plans to accelerate that process. ‘I take it one day at a time; I don’t have any long-term goal about that,’ he told me when we met in a Manhattan deli on a rainy afternoon.

A few lucky men convinced their wives of their new-found convictions, giving them a partner in crime. One man I spoke to — Yechiel — who lives in Jerusalem told me it was not as painful for his wife when he convinced her. ‘Women are in a much more minor role in the community,’ he said. ‘Women are expected to express religious devotion by raising the kids, by much more physical things – getting a job, supporting their husband’s learning. Much less a direct spiritual experience, so for her to give it up wasn’t giving up much.’

But it was for him. He remembered the direct aftermath of his loss of faith. ‘I was praying to Hashem [God]: Give me back my belief, prove to me that it’s true, begging and begging. At some point, I realized it’s just plain stupid.’ Still, he said: ‘If you would see me in the street, my white shirt and black yarmulke, you wouldn’t know anything at all.’ His wife is now pushing for more changes to their lifestyle, but fear of hurting his parents keeps Yechiel in line.

As long as ultra-Orthodox communities continue to marry people off at such young ages, doubters will remain stuck, Solomon contends. ‘Religion has survived a lot of major challenges,’ he said, and the recent turn towards fundamentalism within ultra-Orthodox Jewish communities is just that – a coping mechanism to weed out the non-conformists. ‘The radicalization of ultra-Orthodox Judaism is a sign of its success, not its failure.’

But Moishe believes that the phenomenon of atheism is deeply entrenched in the Orthodox way of life. ‘Everybody’s faking,’ he insisted. ‘I think it’s all going to come crashing down. I say 20 years.’


As humans acquire knowledge and culture evolves, they change their concept of god — sometimes going through a stage of claiming, like Dershowitz, that god too learns through his mistakes — or that it’s our consciousness that’s actually building divine consciousness (Rilke: “We are building god”). From there it’s only a short step to perceiving that we might as well jettison the concept of an external god altogether — it’s humans who are evolving to be more just and compassionate.

But it’s sobering to consider just how recent the idea of human rights is, or the still radical notion of not seeking revenge, or that punishment should not be torture — perhaps we shouldn’t even have punishment, just rehabilitation?

The ability to openly discuss such topics is also very recent. The freedom of thought and speech exists side by side with the remnants of archaic religions. The harm they cause is especially blatant in the case of radical Islam.

But — there are underground atheists in Islam and in other “extreme religions.” The conflict between orthodox teachings and the modern world is only growing. It is not too far-fetched to ponder that perhaps “Moishe” is right: ‘Everybody’s faking,’ he insisted. ‘I think it’s all going to come crashing down. I say 20 years.’

Scholars are divided: some say that at least in the affluent countries religion will disappear by 2050 (or sooner), while others lament that fundamentalist Islam and the other fundamentalist movements (e.g. the African kind of Catholicism, with its emphasis on Satan and exorcism) will take over.

If atheism wins, it will be what Nigel Barber calls “the triumph of earthly pleasures over pie-in-the-sky.” For me, the foremost earthly pleasure is beauty. In Warsaw I had lilacs in May and chestnut-tree blossoms in June — and that was enough to make me perceive the earth as the real paradise. Later came the Pacific Ocean, the Eastern Sierra and the Rockies — the splendor of the American wilderness.

An imperfect paradise, but a real one. As for “holy scriptures,” there is great literature. Now if only we could figure out how to significantly slow aging, so as to gain more time for the richness of life. The project is doable — if only we didn’t have to waste resources struggling against what remains of the archaic tribal mentality.



I see it with more and more clarity: this is my last chance for unbridled hedonism. Recently I was reminded of that again: there was an opportunity to see the sea lions in La Jolla. It meant the ache of walking a longer distance than I usually do (though finally there is hope of improvement), but both the sea lions and the beauty of the ocean — all the layers of aquamarine and indigo — were totally worth it. What a privilege to live near such beauty!

What initially touched off the insight was far from poetic. I was looking for the all-fat diet in the first book that Atkins wrote on the ketogenic regimen for weight loss. I did not find it, but found this question: “Are you eating luxuriously enough?” A rhetorical question — Atkins knew that the reader, terrorized by decades of anti-fat propaganda, thought that weight loss would come from subsisting on non-fat cottage cheese.

Of course unbridled hedonism is pretty individual. I am not into orgies. I'm not into “luxury cruises.” Maybe my horizons will expand if I turn out to be a candidate for knee replacement. But for now, unbridled hedonism is whipped cream and purple dahlias.

And something interesting to read, of course. Intellectual and esthetic debauchery combined.

It’s also taking the time to lie down. THE SACREDNESS OF LYING DOWN.

Time to read slowly, for pleasure.

Slow swimming.

Slow everything. Savoring, knowing there will be an end to it. Only so many mornings, only so many amber afternoons.

I am just beginning to explore my kind of unbridled hedonism. I expect other items will be added to the list as I remember them. My memory of pleasure has become dull over the years of struggle — the kind of effort that Buddhism and Taoism warn against almost as their most important principle: don’t struggle! And all the well-meaning friends who were telling me not to give up. More than anything else, I needed to give up.

I know exactly what I’d have predicted twenty years ago: that I’d now be saying, “This is my last chance for significant achievement.” Now I know that “significant” is pointless — I am not in control of that. What I write, what I’ve written — ephemera that must give me and my readers pleasure in the now. We are of the moment and we live for the moment. Rather than lament our “momentariness,” let us give ourselves fully to the moment.

Digital art, Catrin Welz-Stein

No famous person from the past, no matter how great, would make sense now. Lincoln wouldn’t. Queen Victoria wouldn’t. Julius Caesar or Lenin, none of them would make sense. To be resurrected would be the most terrible punishment — so much for people who pay incredible sums to have themselves frozen, in the crazy hope that a future humanity would be interested in re-animating them.

We are of the moment and live in the moment. And besides, can I complain about over 190,000 page-views of my blog, and an international audience? One of life’s ironic but ultimately pleasant surprises: it’s my prose that has gained me readers. But not the posts that I regard as my best. How little we control.

I love the phrase: “We have to redefine victory.” I’ve redefined achievement so that the inner slave-driver is now pretty irrelevant.

“God is not dead. He is alive and working on a much less ambitious project.” ~ graffito

Yes, even god can finally come to his senses. Nietzsche lost his before he could grow older and wiser. This is what the world lost: a sane Nietzsche at sixty — working on much less ambitious projects. Imagine: Nietzsche — happy.

And this by Nietzsche reminds me of my youth: “What do you love in others? — My hopes” (The Joyful Wisdom, III, 272). Yes, my own hopes and ambitions were what I used to project on men — who, with one final exception, weren’t even a fraction as ambitious as I used to be.

Speaking of others, what about service? Yes (this is offered as the abridged version of Molly Bloom’s “yes” soliloquy). Unbridled hedonism would be insufferable without feeling a part of a network. I hope my doing the Poetry Salon and my blog are my current way of serving, providing the mental nourishment of beauty and poetry and whatever wisdom I can muster. Above all, beauty. I try to nourish others with beauty.

Ultimately, life is meaningful only through touching the lives of others. That’s why I never saw god as providing any kind of meaning. Only doing something for other humans has ever given me a sense of meaning and fulfillment. Even just chatting with a neighbor provides meaningful pleasure the way prayer never could. My hope is that I can continue to be of use.

If god existed, what kind of meaning could he find in his own existence, with no equals to talk to, and with his inability to answer prayers, since that would violate the laws of nature? He’d die of boredom and uselessness.

What a journey it’s been: from great expectations to a very modest goal of being of some small use and enjoying life. Enjoying life! — how intensely I used to despise that idea.


~ so are you still denying yourself the goodness of sour cream, butter, full-fat milk and yogurt (not the kind with added sugar)?

First we had the marvelous news that full-fat milk and other dairy products lower blood pressure and reduce the risk of heart disease, stroke, and cancer. How can this be? It turns out that saturated fats don’t increase the small, dense LDL cholesterol particles, which are easily oxidized and thus dangerous. They increase the big, “fluffy” cholesterol particles, considered benign and possibly even protective.

“In addition to raising LDL cholesterol, saturated fat also increases high-density lipoprotein, or HDL, the so-called good cholesterol. And the LDL that it raises is a subtype of big, fluffy particles that are generally benign. Doctors refer to a preponderance of these particles as LDL pattern A.

The smaller, more artery-clogging particles are increased not by saturated fat, but by sugary foods and an excess of carbohydrates, Dr. Chowdhury said. “It’s the high carbohydrate or sugary diet that should be the focus of dietary guidelines,” he said. “If anything is driving your low-density lipoproteins in a more adverse way, it’s carbohydrates.”

When the researchers looked at fatty acids in the bloodstream, for example, they found that margaric acid, a saturated fat in milk and dairy products, was associated with lower cardiovascular risk. Two types of omega-3 fatty acids, the polyunsaturated fats found in fish, were also protective. But a number of the omega-6 polyunsaturated fatty acids, commonly found in vegetable oils and processed foods, may pose risks, the findings suggested.

“My take on this would be that it’s not saturated fat that we should worry about” in our diets, said Dr. Rajiv Chowdhury, the lead author of the new study and a cardiovascular epidemiologist in the department of public health and primary care at Cambridge University.

In the new research, Dr. Chowdhury and his colleagues sought to evaluate the best evidence to date, drawing on nearly 80 studies involving more than a half million people. They looked not only at what people reportedly ate, but at more objective measures such as the composition of fatty acids in their bloodstreams and in their fat tissue. The scientists also reviewed evidence from 27 randomized controlled trials – the gold standard in scientific research – that assessed whether taking polyunsaturated fat supplements like fish oil promoted heart health.

The researchers did find a link between trans fats, the now widely maligned partially hydrogenated oils that had long been added to processed foods, and heart disease. But they found no evidence of dangers from saturated fat, or benefits from other kinds of fats.

ending on beauty

now that, more nearest than your fate
and mine (or any truth beyond perceive)
quivers this miracle of summer night
her trillion secrets touchably alive

~ ee cummings, #37

Julian Alden Weir: Nocturne, Queensboro Bridge

Saturday, June 18, 2016


At one time I don’t know when
at one time I thought I had the right the duty
to shout at the plowman
look look listen you blockhead
Icarus is falling
Icarus the son of vision is drowning
leave your plow
leave your field
open your eyes
there Icarus
is drowning
or that shepherd
with his back turned to the drama
of wings sun flight
of fall

I said oh blind ones

But now I don’t know just when now
I know the plowman should till the field
the shepherd watch over his flock
the venture of Icarus is not their venture
this is how it must end
And there is nothing
about a beautiful ship sailing on
to its port of call

~ Tadeusz Różewicz, from “A Didactic Tale”

Jaroslaw Anders:

~ In “A Didactic Tale,” Rozewicz meditates on the painting “The Fall of Icarus,” traditionally attributed to Brueghel. In the painting, which has inspired several poems, most notably Auden’s “Musée des Beaux Arts,” Icarus’s fatal fall is virtually unnoticed by the central figure of the painting, the plowman preoccupied with his mundane task. The earthbound gravity wins over the reckless upward striving of the human spirit. But unlike Auden’s stoical, detached observation (“everything turns away/Quite leisurely from the disaster”), Rozewicz unambiguously takes the side of the plowman.

At one time, he says, he might have thought he had the right to shout “look look listen you blockhead/Icarus is falling/Icarus the son of vision is drowning.” But now he understands that “the plowman ought to till the land/the shepherd watch over his flock/the venture of Icarus is not their venture.” It is the unremarkable folk, the common people, who keep the world from self-destructing. Like Brueghel’s plowman, they move around with their eyes fixed on the ground, preoccupied with the necessary tasks of caring and feeding. ~ “Against Color,” The New Republic, November 8, 2011


I think Auden is already more on the side of the plowman. Auden saw the historical failure of lofty ideologies and great visions and ambitions. The plowman, the shepherd, the sailors -- these are the people who are of use to others, and who ultimately advance progress in small but essential ways. The myth itself was already meant as a cautionary tale. Our admiration was supposed to be for sensible, moderate Daedalus, the brilliant engineer, not for the foolish youngster with his hubris.


Cherub and Lupe Velez, a Mexican-American movie star. Her suicide in 1944 is described in the book “From Bananas to Buttocks.”


We now know, from dearly bought experience, much more about post-traumatic stress experience than we used to. Apparently, one of the symptoms by which it is made known is that a tough veteran will say, seeking to make light of his experience, that “what didn’t kill me made me stronger.” This is one of the manifestations that “denial” takes. ~ Christopher Hitchens, “Mortality,” 2012

Oriana: “About suffering they were never wrong, the Old Masters.” So begins Auden’s great poem based on Brueghel’s painting, Landscape with the Fall of Icarus. The Old Masters knew that suffering goes on in untidy corners, is ignored and soon forgotten. Nietzsche, on the other hand, was wrong about suffering. “That which does not kill us makes us stronger” is perhaps his most famous aphorism. This notion lives on — which is ironic, Nietzsche’s life having been rather short and miserable and all the worse for his suffering (the newest thinking is that it was brain cancer) — making it pretty obvious that the great philosopher was in denial.

Worse, the saying continues to resonate — and not just with Catholics, lapsed or otherwise, who were brainwashed to believe that “suffering is good for you.” We all got brainwashed.

Noam Shpancer: “One reason is that suffering, as Freud famously recognized, is an inevitable part of life. Thus we have developed many ways to try to ease it--one of which is bestowing upon it transformative powers (another is by believing in an afterlife, of which Freud disapproved; still another is cocaine, of which he was, for a time, a fan).

Another reason is that American culture, born of trauma and imbued with a hopeful can-do ethos, wants to believe this idea, finding it self-affirming. Once we have acquired a certain belief we tend to see, remember, and report mostly instances and events that support it. This is called confirmation bias.

Yet another reason we think trauma may be transformative is that we see variants of this process around us. Bacteria that are not killed entirely by an antibiotic will mutate and become resistant to it. People who go through the hardship of training tend to improve their performance. But human beings are not bacteria, and good training is not a traumatic event.

Now it is true that, in an evolutionary sense, those who survive a calamity are by definition the fittest. But it is not the calamity that made them so. For our minds, however, the leap is short between seeing the strong emerge from a calamity and concluding that they are strong because of the calamity.

Years ago, during my mandatory army service in Israel, I took part in anti-terrorist training that involved working with the K9 unit. I asked the unit commander where he found those vicious attack dogs of his. Most people, he said, believe that wild street dogs make the best anti-terrorist dogs, having survived the, well, dog-eat-dog world of the mean streets. But the truth is just the opposite. Street dogs are useless for this--or any other--work because they are unpredictable and not trainable. Dogs that have been well cared for, loved, and protected all their lives--those are the best anti-terrorist dog candidates.

And this is true for humans as well. Mayhem and chaos don't toughen you up, and they don't prepare you well to deal with the terror of this world. Tender love and care toughen you up, because they nurture and strengthen your capacity to learn and adapt, including learning how to fight, and adapting to later hardship.

Nietzschean — and country song — wisdom notwithstanding, we are not stronger in the broken places. What doesn't kill us in fact makes us weaker. Developmental research has shown convincingly that traumatized children are more, not less, likely to be traumatized again. Kids who grow up in a tough neighborhood become weaker, not stronger. They are more, not less likely to struggle in the world.”


It’s being loved that makes us stronger. Connect, connect, connect! When hardship strikes, we need empathy. And that’s just “hardship.” In case of real trauma, we need a great deal of empathy and other help too, or we may never recover.

Chuck Lorre: "That which does not kill me makes me bitter.” I used to be a classic example of this. In the end I came to my senses, not wanting to waste any more life on bitterness. But some people stay bitter to the end, in depression in their seventies and eighties, getting more and more bitter. So sad.

People resist the idea that Nietzsche was wrong because they want to justify suffering as something that "toughens us up." Unfortunately this serves as an excuse for all kinds of cruelty. It’s interesting that Nietzsche saw religions as being at bottom systems of cruelty, yet didn’t see that the “suffering is good for you” argument is also in service to cruelty.

Some people even imagine that the more diseases we survive, the better for the body — that's part of the argument against vaccines. No, diseases harm us. Some of the damage is irreversible.

True — there are examples of people who in response to tragedy were transformed into heroes and/or activists. But we are eager to forget the countless victims whose lives have been destroyed by suffering. We don’t want to contemplate what drives people to suicide. Yet it bears repeating: suffering does not make us stronger. Being loved is what makes us stronger.


Many people believe that suffering makes us stronger because it makes a more interesting story — the overcoming of suffering.


I’m constantly aware of how much more I could do without my handicap — I’d be a more interesting person, actually, having had more experiences — by now having traveled to Italy, Ireland, and Lithuania, and having met who knows how many interesting people who’d enrich my psyche.

And instead of reading about inflammation and trying to find out which remedies are the least harmful, I could be reading wonderful essays and poems.

So much energy goes into dealing with suffering that could go into better avenues. Aren’t children better off when not crippled by polio? Should we bring it back because it “builds character”?

But it’s perfectly human not to want to acknowledge how destructive suffering is, and to go into denial about that obvious aspect. We latch on to the stories of exceptional individuals. Even they weren’t exactly made stronger by suffering — their life expectancy, for instance, got permanently shortened, and that “worn out” feeling and various other “souvenirs” of the illness or another catastrophe may never go away — but they found a creative adaptation to adversity.

Yes, that’s a more interesting story than the story of being slowly destroyed by suffering, which is what life is, but in different ways, at a different pace, and with different degrees of intensity — and the degree and speed of destruction matter a lot. It’s marvelous to contemplate the stories of survival and some kind of accomplishment against all odds. The once-per-history Stephen Hawking. But when I think of people I’ve known who did succumb to cancer after long months of terrible suffering — perhaps they died with stoic heroism, without complaining, but no, they did not gain anything from the cancer that they would not gladly give back for even one more month of normal life. Perhaps even just one more week. Who knows, maybe even one more day.

The price of suffering is well-known to those who suffer, but they know their stories, unless crowned by a miraculous recovery, are not welcome. It’s similar to the stories of immigrants — unless they conform to what people want to hear, those stories are not welcome, and the enormity of suffering involved in being an immigrant remains unknown.

Let’s face it, the stories of suffering as suffering are totally unwelcome. You’re supposed to be destroyed quietly, almost secretly. Your obituary may mention how brave you were, how you could even manage to crack a joke. It omits the contributions you might otherwise have made during that “long battle with X.”


I’ll devote more space to this in an upcoming blog — one of the posts will be devoted to the life-long damage resulting from early trauma (including being bullied at school — it’s by no means trivial).

For now, let me briefly contemplate what happens when someone has a serious but non-fatal stroke. So, something bad happened that didn’t kill — but did it make the person stronger?

Consider also the resources spent on rehabilitation — and we are talking the best case here, where such resources are available. Perhaps in addition to the experts there is also a devoted spouse or parent available to continue the intensive training, trying to make the person relearn speech and basic skills and some knowledge about the world. Imagine several years of this intensive effort and expenditure — all of it just to make up for the loss, not to advance to higher ground than before. And no “resistance to a future stroke” is being built.

You may object that stroke is too extreme an example. Let’s take stammering, then. The King’s Speech was an excellent movie that showed how the future King George VI struggled to overcome his stammer. We are shown the childhood abuse that led to his “losing his voice.” And we are shown the heroic persistence of his speech therapist and his pupil, crowned with success — of sorts.

The king manages — but only barely. He does not become the inspiring speaker he perhaps would have become had the suffering not taken place, had the stammer never developed, and had the time spent trying to overcome the handicap been freed for developing public speaking (or another skill) to the level of excellence.

Did suffering make King George stronger? While the overcoming of his handicap is certainly in the “inspiring” category, my final verdict, when I ponder the suffering, is “What a waste.” Unfortunately, most suffering is that. Chronic stress doesn’t make us more resilient. On the contrary, even minor stress can be very damaging if it comes on top of chronic stress.

A stray thought: our denial about the ravages and sheer waste of suffering may in part be due to the example of athletic training. But that’s not suffering in the real sense — and besides, the philosophy of “no pain, no gain” is now being seriously questioned. No, we don’t want too much inflammation, and we most definitely don’t want muscle or tendon damage!


In an ideal world, we wouldn’t be perceived as soldiers. We would be singers, dancers, lovers; travelers and explorers; loved children and loving parents. It’s not good to be walking with a pebble in your shoe and that constant irritation — even if it’s just a small pebble! — eclipsing the more worthwhile aspects of life. Do not go into denial and praise the pebble. If it’s possible to remove the pebble, by all means remove it.

Cardinalis cardinalis in flight


“In early 2012, a neuropathologist named Daniel Perl was examining a slide of human brain tissue when he saw something odd and unfamiliar in the wormlike squiggles and folds. It looked like brown dust; a distinctive pattern of tiny scars. Perl was intrigued. At 69, he had examined 20,000 brains over a four-decade career, focusing mostly on Alzheimer’s and other degenerative disorders. He had peered through his microscope at countless malformed proteins and twisted axons. He knew as much about the biology of brain disease as just about anyone on earth. But he had never seen anything like this.

The brain under Perl’s microscope belonged to an American soldier who had been five feet away when a suicide bomber detonated his belt of explosives in 2009. The soldier survived the blast, thanks to his body armor, but died two years later of an apparent drug overdose after suffering symptoms that have become the hallmark of the recent wars in Iraq and Afghanistan: memory loss, cognitive problems, inability to sleep and profound, often suicidal depression. Nearly 350,000 service members have been given a diagnosis of traumatic brain injury over the past 15 years, many of them from blast exposure. The real number is likely to be much higher, because so many who have enlisted are too proud to report a wound that remains invisible.

Perl and his lab colleagues recognized that the injury that they were looking at was nothing like concussion. The hallmark of C.T.E. is an abnormal protein called tau, which builds up, usually over years, throughout the cerebral cortex but especially in the temporal lobes, visible across the stained tissue like brown mold. What they found in these traumatic-brain-injury cases was totally different: a dustlike scarring, often at the border between gray matter (where synapses reside) and the white matter that interconnects it. Over the following months, Perl and his team examined several more brains of service members who died well after their blast exposure, including a highly decorated Special Operations Forces soldier who committed suicide. All of them had the same pattern of scarring in the same places, which appeared to correspond to the brain’s centers for sleep, cognition and other classic brain-injury trouble spots.

Then came an even more surprising discovery. They examined the brains of two veterans who died just days after their blast exposure and found embryonic versions of the same injury, in the same areas, and the development of the injuries seemed to match the time elapsed since the blast event. Perl and his team then compared the damaged brains with those of people who suffered ordinary concussions and others who had drug addictions (which can also cause visible brain changes) and a final group with no injuries at all. No one in these post-mortem control groups had the brown-dust pattern.

Perl’s findings, published in the scientific journal The Lancet Neurology, may represent the key to a medical mystery first glimpsed a century ago in the trenches of World War I. It was first known as shell shock, then combat fatigue and finally PTSD, and in each case, it was almost universally understood as a psychic rather than a physical affliction. Only in the past decade or so did an elite group of neurologists, physicists and senior officers begin pushing back at a military leadership that had long told recruits with these wounds to “deal with it,” fed them pills and sent them back into battle.

Trinitrotoluene, or TNT, was first used in artillery shells by the German Army in 1902. Soon after the First World War started in 1914, a rain of these devices was falling on the hapless men on each side of the front. It was a level of violence and horror far beyond the cavalry charges of earlier wars. Very quickly, soldiers began emerging with bizarre symptoms; they shuddered and gibbered or became unable to speak at all. Many observers were struck by the apparent capacity of these blasts to kill and maim without leaving any visible trace. The British journalist Ellis Ashmead-Bartlett famously described the sight of seven Turks at Gallipoli in 1915, sitting together with their rifles across their knees: “One man has his arm across the neck of his friend and a smile on his face as if they had been cracking a joke when death overwhelmed them. All now have the appearance of being merely asleep; for of the several I can only see one who shows any outward injury.”

One British doctor, Frederick Mott, believed the shock was caused by a physical wound and proposed dissecting the brains of men who suffered from it. He even had some prescient hunches about the mechanism of blast’s effects: the compression wave, the concussion and the toxic gases. In a paper published in The Lancet in February 1916, he posited a “physical or chemical change and a break in the links of the chain of neurons which subserve a particular function.” Mott might not have seen anything abnormal in the soldiers’ brains, even if he had examined them under a microscope; neuropathology was still in its infancy. But his prophetic intuitions made him something of a hero to Perl.

Mott’s views were soon eclipsed by those of other doctors who saw shell shock more as a matter of emotional trauma. This was partly a function of the intellectual climate; Freud and other early psychologists had recently begun sketching provocative new ideas about how the mind responds to stress. Soldiers suffering from shell shock were often described as possessing “a neuropathic tendency or inheritance” or even a lack of manly vigor and patriotic spirit. Many shell-shock victims were derided as shirkers; some were even sentenced to death by firing squad after fleeing the field in a state of mental confusion.

This consensus held sway for decades, even as the terminology shifted, settling in 1980 on “post-traumatic stress disorder,” a coinage tailored to the unique social and emotional strain of returning veterans of the war in Vietnam. No one doubted that blasts had powerful and mysterious effects on the body, and starting in 1951, the U.S. government established the Blast Overpressure Program to observe the effects of large explosions, including atomic bombs, on living tissue. One of my uncles recalls standing in the Nevada desert as an Army private in 1955, taking photographs of a nuclear blast amid a weird landscape of test objects: cars, houses and mannequins in Chinese and Soviet military uniforms. At the time, scientists believed blasts would mainly affect air pockets in the body like the lungs, the digestive system and the ears. Few asked what it would mean for the body’s most complex and vulnerable organ.

Daniel Perl is continuing to examine the brains of blast-injured soldiers. After five years of working with the military, he feels sure, he told me, that many blast injuries have not been identified. “We could be talking many thousands,” he said. “And what scares me is that what we’re seeing now might just be the first round. If they survive the initial injuries, many of them may develop C.T.E. years or decades later.”

Perl takes some solace from the past. He has read a great deal about the men who suffered from shell shock during World War I and the doctors who struggled to treat them. He mentioned a monument in central England called “Shot at Dawn,” dedicated to British and Commonwealth soldiers who were executed by a firing squad after being convicted of cowardice or desertion. It is a stone figure of a blindfolded man in a military storm coat, his hands bound behind him. At his back is a field of thin stakes, each of them bearing a name, rank, age and date of execution. Some of these men, Perl believes, probably had traumatic brain injuries from blasts and should not have been held responsible for their actions. He has begun looking into the possibility of obtaining brain samples of shellshocked soldiers from that war. He hopes to examine them under the microscope, and perhaps, a century later, grant them and their descendants the diagnoses they deserve.”

INTROVERSION OR INTELLECT? (a more subtle understanding of introversion)

“What many people ascribe to introversion really belongs in the intellect/imagination domain. Intellect/imagination represents a drive for cognitive engagement of inner mental experience, and encompasses a wide range of related (but partially separate) traits, including intellectual engagement, intellectual curiosity, intellectual depth, ingenuity, reflection, introspection, imagination, emotional richness, artistic engagement, and aesthetic interests.

Traits such as sensitivity and social anxiety are also not part of the Big Five introversion-extraversion domain. To be sure, many people may think of themselves as introverted because they are highly sensitive. But research shows that sensory processing sensitivity is independent of introversion. The various manifestations of being a highly sensitive person — inhibition of behavior, sensitivity to environmental stimuli, depth of information processing, and physiological reactivity — are linked to neuroticism and intellect/imagination, not introversion.

Finally, there's a common misconception that all introverts enjoy solitary activities. However, that isn't a defining feature of introverts. Responses such as "Enjoy spending time by myself" and "Live in a world of my own" involve an equal blend of introversion and intellect/imagination. Contrary to popular conceptualizations of introversion, preferring to be alone is not the main indicator of introversion.

The desire for positive social attention seems to be a particularly strong indicator of extraversion [4]. For example, Jacob Hirsh and colleagues found that taking into account the rest of the Big Five personality traits (agreeableness, neuroticism, conscientiousness, and intellect/imagination), the following 10 behaviors were most uniquely predictive of extraversion (from a list of 400 activities):

1. Told a dirty joke.

2. Planned a party.

3. Entertained six or more people.

4. Told a joke.

5. Volunteered for a club or organization.

6. Tried to get a tan.

7. Attended a city council meeting.

8. Colored my hair.

9. Went to a night club.

10. Drank in a bar.

Why might the drive for social attention be so strongly linked to extraversion? One possibility is that many human rewards are social in nature. Our complex social lives are probably the dominant force in human evolution, driving the evolution of intelligence, creativity, language, and even consciousness. The human reward system, therefore, most likely evolved to be particularly responsive to social rewards.

There are costs to extraverted behavior, however. This includes time and energy that could be invested in other activities, such as accomplishing a goal (conscientiousness) or engaging with ideas and imagination (intellect/imagination). There is also the risk that inappropriate attention-seeking behavior can fall flat, leading to reduced attention-holding power. Finally, high levels of exploration of the environment can expose extraverted individuals to increased physical risks. For instance, extraverts are more likely to be hospitalized due to accident or illness, and are more likely to become involved in criminal or antisocial behaviors and get arrested.

It's important to distinguish, however, between the most prominent behavioral manifestation of extraversion (desire for social attention) and the core underlying mechanism of extraversion (reward sensitivity). Even though reward sensitivity need not be limited exclusively to social situations, high reward sensitivity likely motivates extraverts to seek out potentially rewarding positive social interactions, and fuels them to display behaviors that will increase social attention (e.g., friendliness, smiling, high energy, loudness, exhibitionism, positive emotions).

From a biological perspective, reward sensitivity is likely governed by dopamine. While dopamine is involved in a variety of cognitive and motivational processes, the unifying function of dopamine is exploration. According to Colin DeYoung, "the release of dopamine, anywhere in the dopaminergic system, increases motivation to explore and facilitates cognitive and behavioral processes useful in exploration."

A lot of introverts notice that they often need to be alone to recharge their batteries after vigorous social interactions, whereas extraverts appear to gain energy from social interactions. This can be explained by dopamine's function in energizing potentially rewarding social interactions, as well as its role in overcoming the cost of effort. For introverts, such interactions are more effortful and tiring due to their less active reward system.”


Funny that a big indicator of extraversion is telling dirty jokes.

 I thought wanting to spend time alone so I can process experience was the very definition of introversion, but it does make more sense to speak of being high on the intellect/imagination dimension. By the way, this dimension is traditionally designated as “openness to experience” — which doesn’t seem to be an accurate equivalent, though openness to INNER experience would be part of “intellect/imagination.”

I think I have an openness to ideas. Experiences — I need to think about the possible cost, including unpleasant memories and impact on health.

I can even imagine myself becoming a lot more sociable — if I lived around interesting, educated people, for instance. The whole dimension of introversion is not terribly clear. So much depends on the context. When I visited Boston and met a lot of educated people I became so sociable I could hardly shut up.

So perhaps it's more about the quality of people introverts meet. With the right people, I am sociable; with those who are into small talk or women who talk exclusively about their children, I find excuses to leave.

I'd still need lots of solitude in order to process the social interactions. A little goes a long way because I need to relive anything significant and think about it — to let my mind loose on it. If I don't have enough time to process, then life seems to just flee like sand through the fingers and becomes pretty meaningless. But the processing of experience is part of the intellect/imagination dimension rather than introversion per se.

I still think there is something to the augmenter/reducer dimension introduced in the sixties by the psychologist Asenath Petrie. Reducers have weaker cortical arousal and need strong stimulation (e.g. noisy music); augmenters tend to magnify stimulation, so they prefer the quiet and the subtle. Reducers, who tend to be chronically understimulated, are more likely to be smokers and to rely on coffee and other stimulants to raise their arousal level. They are easily bored. Augmenters seek out silence or soothing stimulation — doesn’t that sound like the classic description of an introvert?

“Neither Jesus nor any writer of the bible says anything about the soul going anywhere when they describe death. Nor do they identify the soul with the mind, or with the whole human being, as Christians began doing in the fourth century. Jesus certainly taught that there will be life after death — the resurrection — but he didn’t teach that there will be life right after death, which is what most Christians now believe.

Jesus talked about souls, but he didn’t think of them in the way that most Christians do. Like the other first-century Jews, he thought of a person’s soul as what made him or her be alive [“the animating principle” — oddly enough, that’s how the Catholic Encyclopedia defines the soul]. Common words for soul were nefesh and ruach in Hebrew, and psychē and pneuma in Greek. Like the words for soul in most languages, these came from words for breath, wind, and what’s moving. The reason words meaning air that’s moving were used for what makes us alive is probably that people noticed that as long as we are alive, we breathe, and when we “breathe our last breath,” we die. So they thought that our breath is what keeps us alive. Nothing was thought of as immortal here, as the soul is immortal in Greek dualism. The soul was understood to be mortal just like the rest of the person, and at death, both were destroyed.

If Jesus thought of the soul as what makes a person alive, and not as the person’s nonphysical, immortal mind, where did the now popular idea that the soul is the nonphysical, immortal mind come from? That idea came not from the bible but from Greek philosophy. Greek-influenced Christians tended to be dualists, thinking of each person as two things: a mortal body being controlled by an immortal soul.

The most influential of those dualists was Augustine, who defined a human being as “a rational soul using a mortal and earthly body.” That definition would have puzzled Jesus because he thought of a human being as one thing — a living body, not two things — a soul, plus a body that it “uses.”

In switching to Platonic ideas about death liberating the immortal soul, Christian thinkers quietly put aside Jesus’ ideas, which he shared with the writers of the bible, that death destroys us. What Jesus added was that the destruction of death is not permanent because at the end of the world God will intervene in the natural order and resurrect everyone [in flesh], judge them, and reward or punish them.

In Jesus’ day, this idea of the resurrection was less than two centuries old and was not accepted by all Jews. The Sadducees rejected it because it was not well-grounded in the scriptures. If you read through the whole Old Testament — over one thousand pages — God says nothing at all about anyone living after they die. And just before he drives Adam and Eve out of the garden, he scolds Adam, saying, “You are dust, and to dust you shall return.”

There are just two sketchy prophetic passages in the OT that suggest a future resurrection, and it is not a resurrection of the human race. These passages were written at a time when Jews were being persecuted, and in both of them only Jews — maybe only some Jews — will be resurrected.

Any Jew who believed in the resurrection of the dead at the time of Jesus, then, had very little to base it on. Jesus is vague about what it will involve, except to suggest that everyone, not just some Jews, will be resurrected, and that there will be judgment after resurrection, followed by happiness for the good people and suffering for the bad. But whatever he said about the resurrection of the dead, it is clear that he did not say that people’s souls go to heaven or hell when they die.” ~ John Morreall, “Questions for Christians,” 2014. (John Morreall is professor and chair of the Department of Religious Studies at the College of William and Mary.)

(by that logic there are no human souls in heaven or hell right now, as the author explains in the chapter that follows; heaven or hell were to follow the bodily resurrection of the whole person)


Master of the Parrot, Madonna and Child. Note that Baby J looks like Dennis the Menace.


I am amazed how much stuff I was indoctrinated with is wrong from the point of view of the first-century Jewish beliefs. Such fantastic fabrications! The Jewish beliefs were fabrications as well, but I was taught heresies of those original fabrications.

And surely at least SOME clergy knew there was no scriptural support for the idea that there are any human souls in heaven or hell? That nobody is in a "better place" until the resurrection? (never mind that the resurrection seems awfully delayed).

I was so heavily indoctrinated — or call it having had an excellent memory even as a child — that later I discovered that absurdities still cling to my psyche, e.g. the soul being somehow separate from the body and going somewhere after the body dies. I'd never say that I believe that, but some of this nonsense still clings like a stubborn weed and has to be uprooted from the psyche. So it helps that even Jesus — if historical Jesus ever existed — did not believe in a soul separate from the body.

But what helps most is simply the view of “soul” as consciousness, an emergent phenomenon that stems from brain function. Once brain function ceases, consciousness ceases the way flame is gone when the fuel is exhausted. Consciousness doesn’t “go anywhere.” It ceases. 

Monet: Antibes seen from Salis Gardens

ending on beauty:

And when the Heavens — disband —
And Deity conclude —
Then — look for me. Be sure you say —
Least Figure — on the Road —

~ Dickinson, 401

This being the Southwest, "no figure on the road" is the typical experience. After the biblical “end of the world” (eagerly looked forward to by many), for a while at least, things here would look pretty much the same . . . or in Nebraska, say.

I’m glad for the companionship of clouds.

By the way, “heavens disband” because “the kingdom of heaven” did not mean a place in the clouds. It meant the future paradise here on earth. “Thy kingdom come” — not that we “go to heaven” — the Jews at the time of Jesus had no concept of “going to heaven” — but that heaven comes to earth.


Sunday, June 12, 2016



I let him brush against me,
let his face muss my hair.
I turn to the half-glimmer

of dawn in his eyes —
hold his hand and whisper,
“No, no, it’s impossible.”

I wake up and wonder,
is it all behind me,
that alphabet of glances,

silences — is it all behind us,
that fire and shiver,
lost to us like lilacs,

like the scent of rain —
gone from us forever
because we’re not young —

No, it’s the sacred
shyness of the soul — the heart’s
double truth, an eternal flame.

The flame says nothing is wasted,
not even youth on the young.
How high that highest lantern

shines above our fear of the dark.
He said nothing and I said no,
but in silence everything was said.

~ Oriana © 2016

Upon awakening, I was also keenly aware that the man in the dream was only a “mere suggestion” of a man. It wasn’t anyone I knew — in fact that was the whole point. He was a generic figure standing for the situation when two people meet and within a short time know that under different circumstances they’d likely become lovers.

But do we really know it? A poem like this needs to be kept short, so I didn’t go into “the heart’s double truth.” Infatuation is intoxicating — nature’s trap to get the woman pregnant. But thanks to our complex brain with its competing pathways, there is also usually a shyness about these matters . . . and there is hardly anything more erotic than those initial silences between potential lovers.

I’ll say no more.


After an interminable wait (“We thought you became a missing person in that office!” the formidable stem-cell coordinator joked — not funny), the X-rays were uploaded to the computer — another delay, because the temp technician didn’t know how to do it. Then I heard the PA’s voice (PA stands for “physician assistant” — they do some pretty sophisticated stuff these days) — I heard his loud, powerful voice from the adjacent room (I was in the tiny sausage-like treatment room). And the voice exclaimed:

“But that’s an incredible improvement!”

And the PA, Patrick, rushed in to talk with me. “I’d love to see the X-rays,” I said. “Just a moment, let me take a picture.” Patrick rushed out. Soon he was back, his iPhone in his outstretched hand. “See, this is your previous X-ray: it’s bone on bone. Now look here: see all the space?”

~ “Does that mean that new cartilage is growing?”

~ “It’s growing.”

In a nutshell: cartilage is very difficult (many sources say “impossible”) to regenerate. Medical advice websites will tell you it simply doesn’t happen (so stop wasting your money on unproven pills and procedures, you fool). And the MRI made it clear that my left knee is extremely damaged. “Severe degeneration.” A damaged ligament, possibly torn. Bones near the knee becoming cystic rubble. The blackness of inflammation surrounding everything. A war zone. Think Syria.

~ “Yes, it’s the new cartilage that’s created the space.”

Patrick gently patted the area right above my knee on the left side. “No swelling,” he said. “Last time you were swollen.” I was amazed that he’d remember. I was still too taken aback by it all to tell him, in regard to swelling, that there were the good days and the bad days — the main factor seemed to be the amount of walking or trying out some recommended exercise (now I know better; more about this later).

And the stem cells aren’t yet done with their good work, Patrick said. I’ll probably have another treatment in December — at half price! Patrick got a big, big hug after that announcement.

But here is the bad news: there’s been very little functional improvement. So we discussed some options for that. Actually there is an adjuvant treatment that helps the stem cells: “Hyaluronic acid helps the stem cells,” Patrick said. He meant injections — hyaluronic acid is not absorbed orally.

As I’ve indicated, the mainstream medical view is that cartilage cannot heal. Stem-cell treatment is still very new and untested — most MDs would warn against it. No insurance covers the cost. I knew from the start that I might be wasting my money, depleting my savings chasing a false hope. I took that risk. I just “had to do it.”


There are also new ways to fight inflammation. I told Patrick that my experience has taught me that the key is fighting inflammation. One cause of inflammation is too much exercise, of the wrong sort (and almost all typical exercise is the wrong sort). A damaged joint, already inflamed, can get horribly inflamed after wrong, stressful exercise. In fact, in extreme cases, only the passive motion machine may do good rather than harm.

I realize that exercise is precisely what every website recommends for arthritis. It’s like the pressure to eat “whole grains” and stick to a low-fat, high-carbohydrate diet — except that Atkins was a powerful voice who showed otherwise — and all nutritionists condemned him. Some of them are still vocal, though study after study has validated Atkins.

Now, we all know the benefits of exercise for healthy people with undamaged joints, but when it comes to severe damage, painful inflammation can follow. (My joint damage began with a torn meniscus when I shattered my knee in a fall down a slippery staircase. I seemed to recover from that, but months later too much walking on hard pavement led to chronic pain, which led me to seek medical attention — and to a disastrous meniscus-removal surgery that predictably caused terrible arthritis, as it always does. That wasn’t known back then; the meniscus, a crucial part of the knee’s shock-absorption system, was regarded as a useless clump of tissue, its function unknown, so surgeons removed it with the same gusto with which they used to remove tonsils.)

Even too much walking — and “too much” isn’t much by normal standards — can lead to awful pain. Walking, especially on a hard surface, is an impact exercise — not as much as running, but it’s still impact. So much for the countless websites saying that nothing is so good for arthritis as walking! Worse, even “gentle stretching” is not at all gentle — there comes a point when any active movement comes with a risk of pain.

When I reflect on my worst episodes of pain over the years, I now see it with terrible clarity: with only one exception they were all caused by overexercise. 

It took me decades to realize this. Talk about being a slow learner!

Riding a bike, trying to learn to play tennis, a tai-chi class, a yoga class, climbing a steep hill, trying to keep up with the rest of the group — one by one I discovered what I mustn’t do. I discovered it the hard way. As if my awful migraines weren’t enough . . . “Perhaps my life really is a punishment for something terrible I must have done in my past life,” I even said to a couple of friends in what now seems a fit of metaphysical insanity.

Just as “health food” can be the very worst diet for a particular individual, so “healthy exercise” can be the worst thing for a person with an injured joint.


I don’t know enough to comment on a special “light workout” recommended by a certain book — say 60 steps once every hour, and “micro-pushes” against a ball — but even with such a minimal workout I would suggest proceeding with caution. If the pain worsens — and it may take a few days to find out — immediately stop any new activity. Stretching, trying to strengthen the quads — all risky, all may aggravate the pain. And pain indicates inflammation and more cartilage loss.

At the same time, we know that lack of motion is bad. I’d like to learn more about the continuous passive motion machines that are being introduced not only after knee replacement, but also after cartilage injuries — I think the light is beginning to dawn that cartilage CAN regenerate — but it takes time and just the right treatment. Passive motion machines show great promise.

The usual treatment is anti-inflammatories. The downside — and this will sound like a cruel joke — is an actual acceleration of joint damage. That, and if the doses are high and taken for a long time, a high risk of kidney failure, and an increased risk of a heart attack and stroke.

But a short-term use of high-dose NSAIDs can make a terrific difference — for a while.

The new anti-inflammatory developments, Patrick said, include microdoses of NSAIDs and the injections of ibuprofen directly into the knee.


I told him I’d just discovered something much simpler: using a compression bandage to wrap the knee. I couldn’t tolerate various knee sleeves and braces, especially the dreadfully expensive ones meant to compensate for the misalignment of the knee that develops after meniscus removal. But the day before my X-ray appointment I happened to see an elderly man at the library, adjusting the elastic bandage around his knee. At home, one of the first posts I read was by a man who’d happened to discover that wrapping his knee brought great relief from pain. In a dresser drawer I found a generous length of elastic bandage — left from my first stem-cell treatment, I later realized.

On went the bandage, snap went the velcro-like sticky part that holds it, and — instant relief.

I’d been relying on a pain lotion, and having to constantly reapply it was quite disruptive. The bandage turned out to be more effective.


During the consultation, the word “surgery” was never mentioned.

Oh, by the way, my left shoulder also got some stem cells for bursitis and a possible tear. X-rays showed no bursitis or any other abnormality. I expected as much: the recovery was very quick.


Actually, there IS a bit of functional improvement, but definitely not the sort that many people desire: being able to do pretty much everything, even certain sports (running is out, but one retired nurse claims that after her knee replacement and lots of physical therapy she can sit in the lotus position). For me, more improvement is likely to come later, but given the absence of the meniscus, I'll never have full function — thanks to the original ignoramus surgeons who crippled not just me, but many UCLA football players. As for knee replacement, the trauma of the surgery was too much to contemplate, and so were the possible complications later. Infections and resulting amputations are rare, but they do happen to some patients.

For someone still at a relatively early to intermediate stage of joint deterioration, I wouldn't hesitate to recommend stem cells. At my advanced stage, it’s up to the person, after considering the many factors involved: age, state of fitness, presence or absence of cardiovascular disease, ability to tolerate some disability, and so on.

The interesting thing is that I was told back in 1992 that I needed TKR or within two years I wouldn't be able to walk. The orthopedic surgeon showed me the devastating X-rays. But because I remembered the horror of my first knee surgery, when my meniscus was removed, I decided to research alternative treatments. Glucosamine was just gaining publicity then, but the recommended doses were pathetically insufficient. Being a wild spirit, I started taking more and more of it; two years passed, then three, then four, and I was walking better than before. And not just walking — I was hiking in the mountains. Thank goodness I didn't have knee replacement back in the early nineties, when the artificial joints were of terrible quality and the procedure extremely invasive. I still haven't 100% ruled out TKR, but I'm very happy that a biological solution exists.


The big promise of stem cells: heart disease, brain diseases, and autoimmune diseases. The future is not yet, but you can read online about the first steps being made.

One last detail: as I was driving home, my brain started playing Rachmaninoff’s Second Piano Concerto, the grand, sweeping orchestral part. It was involuntary — it’s just the sort of thing my brain does now and then instead of commenting in words. I love it when it happens.


You can read about my stem cell treatment (the step-by-step procedure) here:

OUR INNER MONOLOGUE (Julian Jaynes: When the gods stopped speaking — how his theory fares today)

“In the beginning of the book, Jaynes asks, “This consciousness that is myself of selves, that is everything, and yet nothing at all—what is it? And where did it come from? And why?” Jaynes answers by unfurling a version of history in which humans were not fully conscious until about 3,000 years ago, instead relying on a two-part, or bicameral, mind, with one half speaking to the other in the voice of the gods with guidance whenever a difficult situation presented itself. The bicameral mind eventually collapsed as human societies became more complex, and our forebears awoke with modern self-awareness, complete with an internal narrative, which Jaynes believes has its roots in language.

The kind of search that Jaynes was on—a quest to describe and account for an inner voice, an inner world we seem to inhabit—continues to resonate. The study of consciousness is on the rise in neuroscience labs around the world, but the science isn’t yet close to capturing subjective experience. That’s something Jaynes did beautifully, opening a door on what it feels like to be alive, and be aware of it.

He writes that the characters in The Iliad do not look inward, and they take no independent initiative. They only do what is suggested by the gods. When something needs to happen, a god appears and speaks. Without these voices, the heroes would stand frozen on the beaches of Troy, like puppets.

The combination of instinct and voices—that is, the bicameral mind—would have allowed humans to manage for quite some time, as long as their societies were rigidly hierarchical, Jaynes writes. But about 3,000 years ago, stress from overpopulation, natural disasters, and wars overwhelmed the voices’ rather limited capabilities. At that point, in the breakdown of the bicameral mind, bits and pieces of the conscious mind would have come to awareness, as the voices mostly died away. That led to a more flexible, though more existentially daunting, way of coping with the decisions of everyday life—one better suited to the chaos that ensued when the gods went silent. By The Odyssey, the characters are capable of something like interior thought, he says. The modern mind, with its internal narrative and longing for direction from a higher power, appears.

He cites a carving of an Assyrian king kneeling before a god’s empty throne, circa 1230 B.C. Frequent, successive migrations around the same time in what is now Greece, he takes to be a tumult caused by the breakdown. And Jaynes reflects on how this transition might be reverberating today. “We, at the end of the second millennium A.D., are still in a sense deep in this transition to a new mentality. And all about us lie the remnants of our recent bicameral past,” he writes, in awe of the reach of this idea, and seized with the pathos of the situation. “Our kings, presidents, judges, and officers begin their tenures with oaths to the now-silent deities, taken upon the writings of those who have last heard them.”

It’s easy to find cracks in the logic: Just for starters, there are moments in The Iliad when the characters introspect, though Jaynes decides they are later additions or mistranslations. But those cracks don’t necessarily diminish the book’s power. Particularly, [Dennett] thinks Jaynes’ insistence on a difference between what goes on in the minds of animals and the minds of humans, and the idea that the difference has its origins in language, is deeply compelling.

“There is such a difference between the consciousness of a chimpanzee and human consciousness that it requires a special explanation, an explanation that heavily invokes the human distinction of natural language,” though that’s far from all of it, he notes. “It’s an eccentric position,” he admits wryly. “I have not managed to sway the mainstream over to this.”

It’s a credit to Jaynes’ wild ideas that, every now and then, they are mentioned by neuroscientists who study consciousness. In his 2010 book, Self Comes to Mind, Antonio Damasio, a professor of neuroscience, and the director of the Brain and Creativity Institute at the University of Southern California, sympathizes with Jaynes’ idea that something happened in the human mind in the relatively recent past. “As knowledge accumulated about humans and about the universe, continued reflection could well have altered the structure of the autobiographical self and led to a closer stitching together of relatively disparate aspects of mind processing; coordination of brain activity, driven first by value and then by reason, was working to our advantage,” he writes.

In 2009 [Kujisten] highlighted brain-imaging studies suggesting that auditory hallucinations begin with activity in the right side of the brain, followed by activation on the left, which sounds similar to Jaynes’ mechanism for the bicameral mind. He hopes that as time goes on, people will revisit some of Jaynes’ ideas in light of new science.

Ultimately, the broader questions that Jaynes’ book raised are the same ones that continue to vex neuroscientists and lay people. When and why did we start having this internal narrative? How much of our day-to-day experience occurs unconsciously? What is the line between a conscious and unconscious process? These questions are still open. Perhaps Jaynes’ strange hypotheses will never play a role in answering them. But many people—readers, scientists, and philosophers alike—are grateful he tried.


I continue to be fascinated by the “inner chatter,” as well as by the question that first arose during religion lessons: why did multiple people hear god speak in biblical times, but now there is only silence? Jaynes may be onto something — and he came up with his “bicameral” theory well before neuroimaging and all we’ve learned about hallucinations and dreams.

Emil Cioran said, “The Greeks awakened to philosophy the moment their gods were no longer adequate; ideas begin where Olympus leaves off. To think is to stop venerating.” But when and why did the gods become inadequate? Here Jaynes's theory may be relevant.

To me the lingering death of religion, along with the shift toward naturalistic explanations and humanism as opposed to the worship of an imaginary Superman in the Sky, is the greatest story and adventure of humanity, the most dramatic development in cultural evolution. First, granted, came the development of language, and nothing compares to the quantum leap of that — but that was so long ago . . . The development of modern consciousness is relatively recent.

“Cultural” doesn’t mean that the biology of brain function is not involved. On the contrary, we know that speech rewires the brain. Perceiving our inner voice as simply our thoughts rather than a god speaking to us (also, think of a schizophrenic hearing voices) — what a breakthrough! Amazingly, there are those — certain Jungians in particular — who prefer the archaic — or maybe “schizophrenic” is a more accurate term — idea that our thoughts are being broadcast to us from the astral realm, and the brain is only a radio.

Detail of "Dormition of the Virgin" at Cloisters, New York. Photo: Leonard Kress

“The real cause of depression, and all the rest of psychiatric symptoms, follows from the way one’s unique consciousness is formed in the brain all through development from embryonic life to age twenty. Our developmental experience is mapped in the limbic system and the cortex as incredibly complex circuits of neuronal maps that reflect the impacts of love, respect, deprivation, and abuse as digested by one’s unique temperament. These brain maps generate human consciousness — which is organized as a drama in the theater of the brain with a cast of personas, feeling relationships between them, scenarios, plots, set designs and landscapes. The internal play is the consummate creation of the human genome. Once established, beginning at age three, the representational play operates via top-down cortical processing, and is the invisible prism through which we live our lives.

Serotonin and the other neurotransmitters operate in the synapses of our limbic cortical brain maps connecting the trillions of neurons that create the mappings that form our plays. Serotonin has no life of its own. It is merely a brain mechanism that serves the neuronal organization of consciousness, the play itself. The way the limbic-cortical brain maps our experience reflects the actuality of our experience. IF OUR CHARACTER PLAY IS TOO DAMAGED BY DEPRIVATION AND ABUSE, IT GENERATES AN INVISIBLE SADOMASOCHISTIC PLAY THAT IS FILLED WITH ATTACK AND HUMILIATION, endless war. Consequently the activated internal play is one of continuous internal fighting between personas. As such it feeds on the serotonin supply on an ongoing basis. It is inevitable that the supply will be overtaxed. This is not the result of a serotonin problem. It is built in from a damaged characterological play. It is not a question of ‘if’, but only ‘when’ serotonin will be overused and depression will appear.”


I feel uneasy about anyone who states, “The REAL cause of X is Y.” But the two paragraphs I chose do speak to my own experience. I used to have an acute sense of an “anti-self” within. The anti-self wanted me to fail just to prove that I was a total failure — especially in poetry (alas, I did pay for becoming a poet with a life-long sense of failure) and in love, but also in professional life — in everything that mattered.

I realize that the play of sub-personalities is a simplified conceptualization — but the anti-self felt real and kept producing images of my dead body or my body falling in an act of suicide. So “depression as an internal theater” makes some sense to me, as does the sadomasochistic tone of it, “attack and humiliation.”

As a result of insight, or perception shift, my own “anti-self” appears to have vanished, while the “wise one” keeps pointing out that having “failed” (in quotation marks because success is a matter of definition) at this or that doesn’t mean that I don’t have a lot to give. “You can fail at everything you do, and still be a success as a person.” But it was interesting to read this, and to find myself nodding my head.

The rest of the article is controversial, I know. I step away from the drug issue. I simply don’t know. “I did it my way.”

But I have also experienced a mood elevating effect from anti-inflammatories, so the issue of inflammation and depression is also of great interest to me. And the fact that anti-inflammatories produce mood elevation shows that physiological factors do have an impact and must be taken into consideration — without denying that the sadomasochistic automatic thinking is critical in deepening and perpetuating depression.

With billions of people intersecting, interacting globally, continuously, we can figure that billions of things bounce off each other globally, continuously. Good things, bad things, unpredictable things, kind and terrible things, strange and unexplainable things.

 But the Law of Large Numbers degrades to nonsense when we try connecting dots for personal meaning and comfort. We want the dots to spell a message that we, among all people on earth, are special to God. That's when Jesus shows up on toast. That's when a hurricane kills dozens in New Orleans — because — as Pat Robertson put it — God hates gays.

People connect dots for personal reasons. A family survives a tornado in Kansas that blows their neighbors off the map, and they declare on national TV that God answered their prayers. Too bad about the neighbors.

Since time began we have been storytelling, pattern-seeking, meaning-making animals. But religious dots, like history, are connected only by survivors.

Chris Kammal, a Florida paramedic:

 “I work as a medic fireman. I see death and mayhem routinely. I have run thousands of calls over the last 23 years and so many of them were people in extreme crisis, or already dead when we got there.

I was asked the other day if I've ever seen miracles or things that can't be explained. My response was that I see many things beyond explanation but never anything miraculous. The miraculous implies that forces outside this world intervened, defying the basic principles of nature. But there's no evidence that the laws of physics have ever been suspended to save someone who was standing in the way.

A tree falling on a child doesn't reverse course because a mom cries out to God. There's no evidence that the universe has ever been manipulated by outside forces — in spite of ancient mythology or miraculous bible stories.

I've seen cars recognizable only by their tires, yet everyone came out alive with light injuries. I've seen cars with only moderate damage, with everyone dead.

I've seen a college freshman waiting at school for his mom to pick him up, but before she got there another driver had a heart attack, jumped the curb and killed the kid. Random is the rule.

The universe does not care who you are, where you come from, how religious you are, or how much money you have. We are all potential victims. Of course, good information and alert thinking help avoid problems before they happen. But you can't always avoid a drunk barreling the wrong way in your lane, or a tsunami that washes 430,000 innocent people to their deaths.

When you are at wits' end, or your life is on the line, or you're down in the foxhole of war, you might or might not pray. It helps people transcend the circumstances they're trapped in. When people pray, it can ease the stress in their minds. They need hope, even if it's fantastical. Personally, I think it's no different than doing a line of coke, or smoking a joint.

It's my belief that if more of the world embraced the truth of randomness, we would spend less time being afraid of imaginary, omnipotent gods in the sky, and spend more time helping our fellow human beings.”


“A tree falling on a child doesn’t reverse course because a mom cries out to God” — I don’t think there is a single adult in the world who believes otherwise. Gravity rules, not god — the laws of nature in general. The best discussion of this that I’ve come across happens to be the chapter on “Signs and Wonders” in Jesse Bering’s “The Belief Instinct” — one of the most important books I’ve ever read. Its intellectual clarity is breathtaking.

We are wired for seeking patterns. For instance, we tend to see faces even in inanimate objects: trees, cars. Our pattern-seeking cognitive bias, the practically irresistible tendency to connect the dots, can easily lead us astray. 

Speaking of cognitive errors, my father often pointed out that the basic error was the assumption of divine omnipotence. Once we remove that, more interesting concepts of something like a deity can be developed (though I'm not sure we need to cling to “deity”; when Rabbi Kushner speaks about the “power of ideals,” it’s a lot clearer to keep calling it the “power of ideals” rather than hanging the god baggage on it). The healing power of empathy and of our own unconscious is of special interest to me — how certain neural pathways are capable of taking over, and the brain can rewire itself in an instant.

Paul Klee, The Forgetful Angel, 1939

ending on beauty:

The truth is, we are nearer to heaven
Each time we lie down.
Take a look at the cat
Rolled over with its feet in the air.

~ Charles Simic, from “Midsummer”