Saturday, August 26, 2017


Nathan Oliveira (American, 1928-2010): Couple with Light, 2003


And you wait, keep waiting for that one thing
which would infinitely enrich your life:
the powerful, the unique and uncommon,
the awakening of sleeping stones —
depths that would reveal you to yourself.

In the dusk you notice the book shelves
with their volumes in brown and gold;
and you think of places where you traveled,
of paintings, and shimmering gowns
worn by women found and lost.

And suddenly you know: that was it.
You rise, and before you stands
the shape of a vanished year,
its fears and hopes and prayer.

~ Rainer Maria Rilke, first two stanzas translated by A. E. Flemming, the third one by Oriana Ivy


The German word for remembrance is “Erinnerung” — loosely speaking, sinking into the inner self, making something part of the psyche again.

This is a great poem about the shift to the second half of life. In youth, we are always waiting for the great thing: great love, first marriage, first “real job,” or the answer to our most urgent question — whatever it is that will change our life.

Let’s ponder the wonderful first stanza, which I see as the “first half of life”:

And you wait, keep waiting for that one thing
which would infinitely enrich your life:
the powerful, the unique and uncommon,
the awakening of sleeping stones —
depths that would reveal you to yourself.

Yes, even the stones will wake. The trumpets will sound and wake up even the stones. The meaning of your life will be revealed.

And suddenly, Rilke says, it’s not the stones that wake up. It’s we who wake up: “the great thing” has already happened.

No, we don’t get all we want from life, but at least some of what we wanted to happen does happen — has already happened — and we didn’t know it at the time.

For me one insight was that what I wanted from America I actually had in Warsaw, and lost it forever: for one thing, living among educated people, in an intellectual community; for another, having access to a rich cultural life. Now, Warsaw was a lot more vibrant then than later — one of the paradoxes of the fall of communism; and of course there are wonderful things about living in California. Still, what an irony that certain things I wanted to happen had already happened —  in a country I left, expecting that everything would be better, more progressive, more advanced (. . . insert the sound of bitter laughter here).


Don't we often fail to appreciate what we have in youth, dreaming of something so much better? And take steps to chase dreams that we will eventually find lacking, not what we hoped they would be?


Of course, of course . . . That's one reason why youth is said to be wasted on the young. It takes the perspective of years to appreciate what we once had. The danger is that we may come to idealize the past. But the heirloom is always a bit chipped somewhere; another lesson, if we live long enough. The point is to live in reality (some of it quite enjoyable!), rather than in a futile dream, in spite of all those silly self-help books that counsel otherwise.


“And suddenly you know: that was it.”

The first and the second half of life: living in the future, for the future, versus living in reality, with both more resignation and more contentment. It’s not quite so symmetrical, but it holds overall. And that is perhaps the greatest difference between these two stages of life: waiting and no longer waiting. True: you can perhaps never entirely “not wait.” But life is no longer filled with waiting for life.

For a while you may proceed along a path: the fulfillment of a vocation, the deepening of a commitment. But especially in regard to vocation and what might be called one’s “career in life,” in the end there is the awareness that nothing is any longer a stepping stone to anything else. Now you can at last savor the miracle of simply existing — against all odds, and after so much anxiety, fear, loss, hope and prayer.


What an essential truth in these lines! The tedium and anxiety of waiting replaced by remembrance and the appreciation of living in the present. A sense of completion without the grief of loss. Living in the present, savoring the world in the richness we have learned to see.

Eclipse crescents downtown Portland, OR; Andrew Caldwell


No longer “waiting for the great thing” means that we can at last become comfortable looking at the events and high points of the past. Yes, it has already happened: that was great love. That was the year of great creativity. That was the most fulfilling job. That was the best trip ever, richest in beauty and experience.

Remembrance is one of my favorites among Rilke’s poems. It redeems the past; it even affirms its greatness. It also confirms the insight that we can’t properly grasp experience until later, when it has become the past. Of course the past also contains a lot of pain and disappointment, and that used to prevent me from wanting to go near “reminiscing” — thus missing the good parts, the events quite worth acknowledging and savoring again in memory.

Interestingly, the relatively neutral good memories have come first: the beauty of hiking in the mountains, once hiking has become impossible. Then came the beauty of creativity — interestingly, in the context of my pathetic medical history: the multiple health problems, the surgeries, the pain, the pain, the pain. “Nevertheless, I managed to write some fine poems.” It seemed like a radiant statement on a banner. The power of “nevertheless” to save us.

Reviewing love relationships has turned out to be the most difficult. I generally don’t do that — I don’t look at photographs, I don’t voluntarily summon up memories. And yet spontaneous memories can recur — yes, there were beautiful moments, and they live on.

Life is such a mix of the good and the bad, the beautiful and the ugly, pain and pleasure, that sometimes we wonder how anyone manages at all, much less gets to claim that on the whole they are happy, they feel blessed. One answer is that we practice selective recall. As one famous poet, Octavio Paz, pointed out, what counts is not what “really happened,” but what we remember — and how we choose to tell it.

But another answer is the one provided by Rilke: it really did happen. The great, life-changing thing we were waiting for has already happened. Love has happened. Marriage has happened, the good and the bad of it. Finding and fulfilling one’s vocation has happened. Amazing adventures and surprises have happened. All kinds of solemn and comic and interesting events have happened. We may have missed their importance at the time, preoccupied as we were with waiting for the future. But there comes a time when we can — and need to — acknowledge those events, and take what pleasure and meaning we may get from them. 


Another note on remembrance. Comparing memories with my sisters was interesting. One sister said “It’s like we didn't live in the same house with the same people!” The three of us are each only one year apart in age, so that is not the determining factor — and yet the very atmosphere, the mood and tenor of experience is recalled differently by each of us. To such a degree that what for one is remembered as tragic and painful, made little or no impression on another. Some basic facts are agreed on, edited this way or that, but the emotional tone, emphasis and meanings are different and particular to each.

So there is history . . . flavored and reshaped even in this small and personal arena. Imagine that on the scale of nations and the histories of nations . . . The stories are told according to the beliefs and biases of the storytellers, who won't accept that their story is just that, a story, and not truth graven in stone. And so we have our current arguments on confederate monuments, morals, social values, justice and infamy.


Certainly. History is written by the victors, but in the US we have the peculiar situation where those who lost the Civil War have for decades been trying to win it on the sly: not in the sense that they can restore slavery, but they think they can restore white supremacy. And now of course the neo-Nazis delude themselves that they can complete Hitler’s project of eradicating anyone different, again refusing to accept that Hitler lost the war, trying to reverse history, though this time Germany is a minor player and Russia a major one, and Islam the wild card that no one expected. Seventy-two virgins! But only in the afterlife. It would be so funny if it weren’t tragic.

So the past isn’t even the past. And the present keeps changing the past, since the past — our own, or collective — is indeed a story, and it’s all in the interpretation. Insight can be a redeeming feature, though — even if again it’s just a perspective, seeing what we poorly remember with different eyes. I remember reading Rilke’s Remembrance for the first time in my late twenties, and it barely registered; then years later the poem was a revelation, and it helped increase my awareness of how certain major milestones in life (e.g. not just first love, but great love) were already behind me. And now I found it again, after yet more milestone experiences, and the insight that the great thing I was waiting for has already happened is all the more powerful.  


I remember this teachers’ mantra, especially in high school: “Your whole life lies ahead of you.” It vaguely irritated me then, as if what lay ahead would be wonderful, while already in our teens we knew better: it would be a mix of the good and the bad. Above all, it would be difficult. Still, we were indeed filled with waiting for “real life.” Would it start at graduation, when we received the strangely named “certificate of maturity” (provided we passed the maturity exam)? Given the many jokes about the certificate of maturity, that couldn’t be it. Would it be the college degree, then, and the first full-time job? Would it be marriage? Childbirth? No one dared ask.

Then came the mantra that “life begins at forty.” That definitely had the aura of consolation. Women in particular seemed to dread turning forty (at that point I was in the US; in Poland, it seems forty was when women ceased to celebrate their birthdays and chose to receive flowers on their name day instead, as if the patron saint suddenly gained importance). Then “fifty is the new forty.” The most daring would even say, “Sixty is the new forty.” Baby boomers are supposedly “re-inventing aging.”

My personal solution, at one point, was to call myself “posthumous.” Since I was no longer an active poet, which had been my most “real” (or at least most intense) life, I could now enjoy a more Taoist-like life of not struggling. Now I could take it easy and just “be” rather than constantly do something.

True, I no longer needed to “prove myself.” But life is never devoid of challenges and surprises. It’s never exactly easy — though easier in many ways than during the desperate years that were also the years of high creativity. The greatest surprise has been my realization that it was too late for despair. I could brood over all the disappointments — or I could concentrate on making what modest contribution I could still make — and on enjoying the years that still remained.

And a new mantra emerged: “I can cope with whatever life throws at me.” And another one: “Eat good food now because there are no restaurants in heaven.” That’s a metaphor. I know my readers will get it. 

The only heaven is right here: the heaven you perceive around you, in the beauty of the world, and the heaven you manage to create for yourself.

Finally, yes, the surprises. If I’ve learned anything, it’s this: whatever you imagine lies ahead, that won’t be quite it. In fact it may be something entirely different.

We should stay open to the odd, the unexpected, the unscheduled. It’s not so much a lowering of standards as a questioning of their purpose, and the various uses of love. ~ Barbara Holland 

Charles Sherman: The Heart of Infinite Love, 2016 (ceramic with copper and iron oxide patinas)

a darker view:

Life — the way it really is — is a battle not between Bad and Good but between Bad and Worse. ~ Joseph Brodsky

Along similar lines:

Tragedy is the clash of right and right. ~ Amos Oz


Here is an interesting image of "remembrance" that has recently come my way: In the archives of the Prague Castle; M. Peterka, 1958

And here is a companion image, by Sebastian Bianek, to whatever is carefully preserved in those archives:

“In some ways suffering ceases to be suffering at the moment it finds a meaning, such as the meaning of a sacrifice.” ~ Viktor Frankl

In my late teens I asked my mother, a scientist, how come all the animals sent into space died, in spite of extensive training, while humans routinely survive. My mother said, “That's because a human being knows the meaning of what he is doing.”

PEOPLE USED TO PLAY THE ROLE OF THE GODS (an interesting theory)

~ “Just as different actors play the same character — Hamlet, Sherlock Holmes, Dracula — so in ancient times different people performed the role of gods such as Ishtar, Jehovah, Baal, Zeus, Apollo, Dionysus, and so on.

An original genius performs a dynamic leadership role that congregates people into a thriving nation, and once the originating person dies the society chooses a person to continue that role. The spirit of the Tribal God becomes immortal as each subsequent person assumes the role.

Buddha is played by the Dalai Lamas, Jesus is played by the Popes and the Patriarchs, and Mohammed was played by the Caliphs.
At first the new actor takes the same name as the original “god,” pretending to actually be them reincarnated, like all the men who played Zeus; then, when people realize the actual first god-man is dead, the actor acts as a substitute, like the Pope as vicar of Christ; and now leaders playing god use a title of authority like King and President.

Always it seems we humans must appoint a leader for each group in the hierarchy of societies.

Over the past 10,000 years our Tribal Leader has evolved from God to Priest to King to President to reflect our growing awareness of the mortality of God, a fictional character we invent which is played by a person who leads the pageantry of power that sustains social order whose chief purpose is to organize how we breed children and grow food.” ~ Surazeus Seamount, posted on Facebook 

Velázquez, Pope Innocent X, 1650


A compelling presentation. It wouldn't surprise me if the "historical Jesus" was a fusion of two or three different men, one of them an apocalyptic nut (probably a genuine schizophrenic), and another more of the school of Rabbi Hillel. He wasn't seen as a god right away, but later there was no stopping it. Popes have turned out to be a very poor substitute as a “representative of Christ on earth,” though Francis has shocked us in a good way by trying to actually follow Christ’s teachings, thereby disgusting the conservatives.


~ “In the most recent study, Brick Johnstone studied 20 people with traumatic brain injuries affecting the right parietal lobe, the area of the brain situated a few inches above the right ear. He surveyed participants on characteristics of spirituality, such as how close they felt to a higher power and if they felt their lives were part of a divine plan. He found that the participants with more significant injury to their right parietal lobe showed an increased feeling of closeness to a higher power.

“Neuropsychology researchers consistently have shown that impairment on the right side of the brain decreases one’s focus on the self,” Johnstone said. “Since our research shows that people with this impairment are more spiritual, this suggests spiritual experiences are associated with a decreased focus on the self. This is consistent with many religious texts that suggest people should concentrate on the well-being of others rather than on themselves.”

Johnstone says the right side of the brain is associated with self-orientation, whereas the left side is associated with how individuals relate to others. Although Johnstone studied people with brain injury, previous studies of Buddhist meditators and Franciscan nuns with normal brain function have shown that people can learn to minimize the functioning of the right side of their brains to increase their spiritual connections during meditation and prayer.

Johnstone makes the comparison to other kinds of disciplines: “It is like playing the piano; the more you train your brain, the more the brain becomes predisposed to piano playing. Practice makes perfect.”

While researchers have been focused on finding a ‘God spot’ in the brain, the new research suggests that it might be better to focus on the neuropsychological questions of self-focus vs. selfless focus. As Prof. Johnstone explains: “when the brain focuses less on the self (by decreased activity in the right lobe) it is by definition a moment of self-transcendence and can be understood as being connected to God or Nirvana. It is the sensation of feeling like you are part of a bigger thing.”


This finding has been replicated: “Spiritual transcendence (i.e., emotional connection with the numinous/mystical) is a specific spiritual dimension that appears to be primarily related to increased selflessness associated with decreased RPL functioning.”

The work on meditating Buddhist monks and nuns at prayer showed a similar decrease in the activity of the right parietal lobe.

Why would this be of interest to a secularist like myself? Because as a poet acquainted with the creative process, and as someone who has often experienced awe before the beauty of nature, I have known for quite a while that the key is an intense focus on something other than the self. Singing, dancing, acting, gardening — all kinds of activities provide the entrance to that special state where daily cares are lifted and we often speak of the delight of “losing ourselves” in something. 


The murderer kills because he seeks “justice." ~ Steven Pinker (in a lecture on the culture of honor versus the culture of dignity)

Humans are moralizing animals, with a curious need to pass judgment on others and see that they get punished. Pinker studied the causes of violence, including the most common motive for homicide. According to police records, it’s not material gain; it’s “justice.” The killer is carrying out capital punishment; his victim deserves to die for this or that reason (cf. “I want justice” in “The Godfather”).

Thus, the perpetrator sees himself as the real victim: the victim of injustice, which needs to be corrected. Society is also to blame, since it’s stacked against the perpetrator, who is thus forced to mete out justice himself. This is of course perverse logic, but it’s fascinating that criminals have a psychological need to see their actions as morally justified.

Likewise, wars tend to be justified using the language of moral principles. Pinker suggests we need to think of morality less in terms of blame and punishment, and more in terms of minimizing harm and maximizing flourishing.

It seems to me that when progressives speak of justice, it’s likely to mean human rights, equal opportunity, equal pay, etc. When conservatives speak of justice, they mean retributive justice: punishment, vengeance. Not in 100% of the cases, but it’s a tendency.

~ “God appears, in a whirlwind. Throughout the Old Testament, as Freud claimed, God takes the part of the angry father. Here he surpasses himself, by pointing out to the four men what he is and they are not: the creator of all things. “Where wast thou when I laid the foundations of the earth? . . . When the morning stars sang together, and all the sons of God shouted for joy?” He proudly inventories the wonders he fashioned. Most thrilling, perhaps, is his portrait of the warhorse:

    Hast thou given the horse strength? hast thou clothed his neck with thunder?

    Canst thou make him afraid as a grasshopper? the glory of his nostrils is terrible.

    He paweth in the valley, and rejoiceth in his strength: he goeth on to meet the armed men. . . .

    He swalloweth the ground with fierceness and rage: neither believeth he that it is the sound of the trumpet.

    He saith among the trumpets, Ha, ha! and he smelleth the battle afar off, the thunder of the captains, and the shouting.

“Ha, ha!” That is the spirit of God’s answer to Job. I am power itself, he says. How dare you question me?

Job immediately apologizes for challenging his maker: “I abhor myself, and repent in dust and ashes.” Now God addresses the three friends, who told Job that God is just. He punishes them for presuming to say that they understand his ways. Then he turns to Job and tells him that he alone has spoken the truth—apparently, that God is not understandable. For this, God rewards him.

The story is bewildering, from beginning to end. How could God, being God, allow Satan to seduce him into destroying a good man? More important is the moral: that we have no right to question him for doing such things. (God, for all that he says from the whirlwind, never answers Job’s questions.) Furthermore, the Book of Job seems to claim that all wrongs can be righted by property. If everything was taken away from Job, the problem is settled by God’s giving it all back, mostly twofold—fourteen thousand sheep for his seven thousand, etc. As for the ten dead children, in this case Job gets only ten back, but the new daughters are more beautiful than any other women in the land.

For people who take the Bible seriously as an explanation of life and as a guide to right conduct, all this is mysterious. It is certainly not the first instance in which God inflicts appalling misery on his people. In Genesis, he killed everyone on Earth except those on Noah’s ark. But Job is highly individualized—a person like us. He is probably the character in the Old Testament we sympathize with most closely. (David is his only competition.) Therefore, his struggle to go on believing in God is something that theologians and moralists have had to think about. Their conclusions are the subject of Mark Larrimore’s book.

Larrimore quotes a passage from Voltaire’s “Candide” (1759): “ ‘What difference does it make,’ said the dervish, ‘if there is good or evil? When His Highness sends a ship to Egypt, does he worry about whether or not the mice are comfortable on board?’ “ Voltaire said that Candide was “Job brought up to date.”

Many philosophers, probably without meaning to, inched their way toward the same position. Kant said that all we could do with doubts about God was admit them. For Kant, Larrimore writes, “the book of Job shows that the problem of evil must remain an open wound.” Larrimore thinks that’s still true: that the dispute between Job and his friends epitomizes modern thought. There are no answers, only riddles.

In the face of that impasse, the discussion often shifts from content to style. In the eighteenth and nineteenth centuries, a number of people who wrote on Job—the German theorist Johann Gottfried von Herder, the Anglican bishop Robert Lowth—stopped trying to figure out God’s plan, and instead focussed on his poetry, whose sublimity, they felt, was meaning enough. Indeed, the ambiguity boosted the sublimity. This position was undoubtedly reassuring, but the new aestheticism could also be seen as a failure of moral seriousness. Furthermore, it placed God at a very far remove from humankind.

One of the reasons that Job complains so bitterly is that he thought that he and God had a relationship. Now it is sundered: “I cry unto thee and thou dost not hear me.”

His sense of abandonment is a great part of the poignance of the Book. But as the Enlightenment, whose efficient universe had little place for a punishing God, yielded to Romanticism, with its worship of passion, many thinkers had less need for a pleasant, companionable God. An excellent example is William Blake, who between 1805 and 1810 produced a series of twenty-one watercolor illustrations for the Book of Job. Blake did not need God to make sense. He wanted him to be a figure of pure energy, like the “Tyger, burning bright.”

In the twentieth century, the most pressing new influence on the interpretation of Job’s story was the Shoah, after which, Larrimore writes, Job “became Jewish.” The person most responsible for his conversion was Elie Wiesel, an Auschwitz survivor. Wiesel began lecturing about Job as early as 1946. He regards the Book as a great text, and a great torture. For many, Job epitomized the suffering of the Jews during the Second World War and also their perceived response to it, which, in the nineteen-sixties, Hannah Arendt described as going like lambs to the slaughter. As God played dice with his life, Job grieved and protested, but he didn’t take any action. This interpretation anguished Wiesel. An alter ego in one of his novels “never ceased resenting Job.” He says, “that biblical rebel should never have given in.” Eventually, Wiesel decided that Job hadn’t given in.

If, for many Westerners, the question of why God allows good people to be tortured is no longer a pressing issue, why is it that Job appears to be the most fascinating book of the Old Testament? I can’t think of a single character in the Bible, apart from Jesus or David, who is quoted more often than the dramatis personae of the Book of Job are.

This is without doubt due, in part, to the Book’s amorality. I believe that if you woke a lot of people in the middle of the night, and asked them why they cared about the Book of Job, they would name the most troubling, least sympathetic character in that document: God. He, not Job, is the star of the Book, and though he is not loving or fair, that seems to be part of the attraction. Once God appears and speaks, you are almost blown to the ground. “Hast thou an arm like God?” he demands. Then, in a rolling magnificat, he names the things that he has created: the earth, the sea, the night, the light, the constellations, the clouds, the winds, the dew, the rain, the snow, the hail, the frost, the thunder and lightning. He goes on to the animals: the goats, the asses, the hinds, the peacocks, the ostriches, the grasshoppers. In two celebrated passages, he describes with pride the monsters he created: Behemoth and Leviathan, Behemoth’s counterpart in the sea: “His breath kindleth coals, and a flame goeth out of his mouth.” God’s description of the warhorse is even more exalting, because this creature is unquestionably real, not fantastic. Likewise the eagle: “She dwelleth and abideth on the rock, upon the crag of the rock, and the strong place. From thence she seeketh the prey, and her eyes behold afar off.” She brings pieces of flesh back to her children. They feed on the blood.

God’s speech slaughters the moral, the what-should-be, nature of the rest of the Book of Job. It is the knife flash, the leap, the teeth. And despite, or because of, its remorselessness, it is electrifying. It is like an action movie, or a horror movie. Of course, Job is important in the story, but today he seems the pretext, the one who is like us, and makes the argument that we would make. As for God, he makes the argument that, at least as far as nature is concerned, is true." ~


Again I return to this rich article, this time pondering what Milosz said: that the modern tendency has been more and more to equate god with nature, and nature is amoral: “Everything devours everything.” There is no justice, not even some veiled sort we are too inferior to grasp. It’s pure, amoral power: the warhorse, the Leviathan, and the Behemoth.

The Book of Job could be understood to present god as nature. That’s just how things are — it has nothing to do with what you deserve. Justice is a human concept; the sense of fairness does not exist in nature — except, to some extent, in all social species (wolves, elephants). Once you have a group, cooperation becomes important, and a certain degree of altruism (or call it simply “caring”) serves the survival of the group.

But that’s not good enough for humans. We want bad things to happen only to bad people. But that’s just wishful thinking. The world is not just, and humans must live with that reality. There is no just ruler of the universe who will “wipe out every tear.” Fortunately there are many others who understand the frequently random nature of suffering, those on whom we can count to extend not blame but consolation. 


That tremendous bully of a God in the book of Job does represent something I see as an uncomfortable truth. We are not the apple of any God’s eye. Complaints of unfairness and undeserved misfortunes refuse to see the randomness of things, instead demanding an infantile world of “fairness” — of rewards to the deserving, good and obedient, and punishment for the undeserving, disobedient and unruly. This is a childish vision that in no way mirrors nature. Clinging to this false idea is a comfort . . . daddy’s  watching over me, I am his beloved child, I will be protected and rewarded for my devotion and good behavior. All untrue, but a sheltered hiding place for those unable to see or accept that we are merely incidental, and not the core and reason for the universe.


Pagan cultures had the notion that even the gods were subject to Fate . . .  which was different from the Hebrew idea that god is just and rewards the righteous and punishes the wicked, so if you got sick, or your crops failed, you must have sinned and were now being punished. Note that Job’s “comforters” try to convince him that either he must have sinned, or one of his children. Science has been liberating us from this habit of blaming the victim, but the idea returned with a vengeance with the popular New Age belief that we attract misfortune with negative thinking. Cancer? You must have thought you don’t deserve perfect health. Lost your job? It’s not the economy, no; it’s that you were afraid you’d lose your job. Thought crime.

One way or another, people have a hard time accepting randomness. Doing so would make them kinder, and being kind is actually a pleasure . . .  but it has to be discovered, and someone has to set an example. Some months ago, on a weekend, a neighbor of mine who happens to be an electrician repaired my garage door opener for free in about ten minutes, saving me hundreds of dollars — but what I most remember is how happy he was to be performing this kind deed, how radiant. He must have felt so good about himself . . . if only more of us had the skills and the opportunities . . . we wouldn’t think of worshiping a Nazi-like deity. 

But the Book of Job at least acknowledges the uncomfortable truth that bad things happen to good people — in that sense it’s a book of wisdom. Another wise book is Ecclesiastes, which is shockingly secular and ends by telling us to work with dedication (“whatever thy hand finds to do, do it with all thy might”), but also to “put on a clean garment” and enjoy life.

~ “With billions of people intersecting, interacting globally, continuously, we can figure that billions of things bounce off each other globally, continuously. Good things, bad things, unpredictable things, kind and terrible things, strange and unexplainable things.

But the Law of Large Numbers degrades to nonsense when we try connecting dots for personal meaning and comfort. We want the dots to spell a message that we, among all people on earth, are special to God. That's when Jesus shows up on toast. That's when a hurricane kills dozens in New Orleans — because — as Pat Robertson put it — God hates gays.

People connect dots for personal reasons. A family survives a tornado in Kansas that blows their neighbors off the map, and they declare on national TV that God answered their prayers. Too bad about the neighbors.

Since time began we have been storytelling, pattern-seeking, meaning-making animals. But religious dots, like history, are connected only by survivors.


Chris Kammal, a Florida paramedic:

“I work as a medic fireman. I see death and mayhem routinely. I have run thousands of calls over the last 23 years and so many of them were people in extreme crisis, or already dead when we got there.

I was asked the other day if I've ever seen miracles or things that can't be explained. My response was that I see many things beyond explanation but never anything miraculous. The miraculous implies that forces outside this world intervened, defying the basic principles of nature. But there's no evidence that the laws of physics have ever been suspended to save someone who was standing in the way.

A tree falling on a child doesn't reverse course because a mom cries out to God. There's no evidence that the universe has ever been manipulated by outside forces — in spite of ancient mythology or miraculous bible stories.

I've seen cars recognizable only by their tires, yet everyone came out alive with light injuries. I've seen cars with only moderate damage, with everyone dead.

I've seen a college freshman waiting at school for his mom to pick him up, but before she got there another driver had a heart attack, jumped the curb and killed the kid. Random is the rule.

The universe does not care who you are, where you come from, how religious you are, or how much money you have.

We are all potential victims. Of course, good information and alert thinking help avoid problems before they happen. But you can't always avoid a drunk barreling the wrong way in your lane, or a tsunami that washes 430,000 innocent people to their deaths.

When you are at wits’ end, or your life is on the line, or you're down in the foxhole of war, you might or might not pray. Prayer helps people transcend the circumstances they're trapped in; it can ease the stress in their minds. They need hope, even if it's fantastical. Personally, I think it's no different than doing a line of coke, or smoking a joint.

It's my belief that if more of the world embraced the truth of randomness, we would spend less time being afraid of imaginary, omnipotent gods in the sky, and spend more time helping our fellow human beings.” ~


No, there is no cosmic justice. But now and then we get to see “poetic justice” — as when a preacher who’s been blaming natural disasters on gay rights gets his own house destroyed in a natural disaster.


“Heaven goes by favor. If it went by merit, you would stay out and your dog would go in.” ~ Mark Twain

ending on beauty:

My heart is moved by all I cannot save:
so much has been destroyed
I have to cast my lot with those
who age after age, perversely,
with no extraordinary power,
reconstitute the world.

~ Adrienne Rich

~ not the philosopher kings (there aren’t any, and those who may have come close were a disaster), but ordinary shoppers at Home Depot, buying tomato seedlings and baby peach trees. Or by those who share their beautiful photographs, like Haley Hyatt.

Saturday, August 19, 2017


White light image of the solar corona during totality of a solar eclipse (NASA)

~ “Pnin slowly walked under the solemn pines. The sky was dying. He did not believe in an autocratic God. He did believe, dimly, in a democracy of ghosts. The souls of the dead, perhaps, formed committees, and these, in continuous session, attended to the destinies of the quick.” ~ Nabokov, Pnin

I read this paragraph one day after waking at midnight from a nightmare that had the vividness of a hallucination rather than the vagueness of a typical dream. In the dream (or vision) I was lying in my bed. I looked up and saw about eight to ten people, mostly but not exclusively men, mostly fortyish — mature but not elderly, “in their prime.” Their faces, each quite distinct and individual, had a serious look. Though none wore a white coat or sported a stethoscope, I assumed that these were medical professionals who had come to bring me bad news. And I also suspected that they were ghosts. “Who are you?” I asked. They vanished without answering, and I woke up (“never end a poem with ‘and I woke up’” is one of the various taboos you hear at a poetry workshop).

Those solemn people at my bedside — that’s what I imagine a committee of ghosts might look like.

Not that I believe in ghosts. To me the “soul” is a loose equivalent of the mind (including feelings and unconscious neural pathways). It’s a complex function of the brain that ceases when the brain dies. It’s certainly not a little person that leaves the body and roams around the universe (or the “astral realm”), preserving one’s identity, remembering everything that happened.

And yet . . . when you grow up in a culture where the belief in souls and ghosts is still commonplace, and haunts (so to speak) books and movies, it is not in the least surprising to dream about ghosts and to respond to their literary or cinematic depictions. And it’s rather pleasant to ponder that the ghosts we conjure up do seem to care about us.

(OK, my “ghosts” were wearing clothes — nice casual clothes that are typical of what people wear when they go shopping, for instance. Yet why would a soul need clothes? Souls in art are mostly nude, just without genitals. In cartoons, they wear nightgowns or simple shapeless robes. But dreams tend to show the dead just as we remember them, i.e. usually in clothes.)

Now, the idea of committees of ghosts attending to the living would solve the pesky problem of giving the dead something to do, and be of some use. It seems that earthly life is what matters anyway, even to the ghosts. Since physical pleasures — eating, sex, napping, playing with pets — don’t exist in the Christian heaven, we can’t imagine anything worth doing “up there.” Well, I personally can imagine strolling for some portion of eternity around the Garden of Eden and never getting tired of the plants and their doings (assuming change happens, which some would deny in the name of perfection), but I know that most people aren’t as fond of botanical displays.

But never mind me. “Pnin” is a terrific novel about a Russian émigré in America — and the protagonist isn’t a pedophile, and the portrait of American culture is relatively benign. (By the way, my spell check always corrects Pnin to “pain” — and there is a certain insight to that, as far as being an immigrant goes.)

~ “[Nabokov’s] American novels are distinct from what he’d done before; more than that, Roper writes, they’re an apotheosis – “the claim to greatness rests most solidly on the American books”. Indeed, from the outset, Roper makes Nabokov himself into a peculiarly American figure, as much immigrant hustler as aristocratic European, someone who’d acted out scenarios from Mayne Reid’s Wild West books as a child, and who was scrappily resourceful in making his name, despite racking up some 60 rejections from US publishers before he even arrived.

Nabokov liked all sorts of things about his adopted country, its trashy cultural ephemera as well as its natural beauty, its openness but also its odd conservatism, in which he perhaps sensed a different kind of opportunity (“what charms me personally about American civilization,” he wrote to his agent before the move, “is exactly that old-world touch, that old-fashioned something which clings to it despite the hard glitter, and hectic nightlife, and up-to-date bathrooms”).

His delight in it is beguiling, as is the image Roper offers of him as a particular kind of immigrant. Not an émigré in the mould of Thomas Mann or Bertolt Brecht, Nabokov immersed himself in the new place, not least via his work as a lepidopterist, through which he made all kinds of friends. Far from keeping to a rarefied enclave, Roper’s Nabokov is a figure more like Ayn Rand, who came as he did from St Petersburg – although she was, as Roper tactfully notes, “a writer of different attainments”, she also made a “wholesale embrace of what she took for Americanism” – or Billy Wilder, who’d made movies in German and French before his American classics.

Like Wilder, Nabokov did a good line in American comedy. “Reality was vital and vulgar here,” Roper suggests, providing “‘exhilarating’ opportunities for burlesque”. Part of the joke in Pnin is that the bumbling foreigner, with his futile eagerness to fit in, can be more American than the Americans by virtue of sheer desirous optimism. Think of Pnin’s set of false teeth grinning to itself in its container – “It was a revelation, it was a sunrise, it was a firm mouthful of efficient, alabastrine, humane America” – or of his passion for the washing machine: “Casting aside all decorum and caution, he would feed it anything that happened to be at hand, his handkerchief, kitchen towels … just for the joy of watching through that porthole what looked like an endless tumble of dolphins with the staggers.” Pnin is far cosier than Lolita, but they do share an expansiveness: one a social comedy swollen with feeling; the other perhaps the most boisterous, spirited parody-tragedy you could conceive.

Had Nabokov succeeded in his attempts to move to England instead in the late 30s, there would have been no Lolita – and not just because of the open road or the gum-chewing teen: the whole shape of the narrative, the language and energy of it, is unimaginable without the American landscape and culture. We even have a European version to compare it to, The Enchanter, written in 1939, which has the pedophile fleeing with the child, but shares few of Lolita’s other qualities. “Like the author of a story about bulls and capes who changes the setting to Spain,” Roper writes, in bringing the theme to America, “Nabokov inherited a stage.”

In place of the snobbery, the famous superiority complex, Roper finds someone who “immersed himself in the demos”, both in theory and in practice. And more important, Roper gently rejects Nabokov’s claim that “inventing America” meant simply collecting some local color to “inject a modicum of average ‘reality’ … into the brew of individual fancy”. Instead of a hermetically sealed genius, roaming around but never changing, letting nothing in, Roper finds a writer who is open, flexible, susceptible to influence; who doesn’t know everything in advance, but rather makes discoveries and passes them on.” ~


Heaven — the dream of having landed permanently in the winners’ circle — speaks volumes about the quest to escape reality’s dilemmas. ~ Jeremy Sherman
But what would the disembodied soul do? This is where Nabokov’s fantasy has its charm — the ghosts would in some manner nurture the living. This reminds me of a Lebanese woman I met in college — she said it’s good to have many dead: they pray for you, they take care of you. And it’s by no means a rare idea. Even people who are not overtly religious may conclude a tale of a near-accident with the remark, “My mother in heaven must have been watching out for me,” and no one raises an eyebrow.

If a misfortune DOES happen, oddly enough no one says, “My mother in heaven must have forgotten to pray for me.”


But speaking of America, I think this is something that Nabokov would gladly include in a novel or memoir:

~ Jonathan Stickland, a member of the Freedom Caucus, generously supported by Empower Texas, made news in the 2015 session by posting a sign outside his office:

Jonathan Stickland
District 92

~ The New Yorker, July 10 & 17, 2017, p. 56

As someone commented, “When did he stop being a fetus?”


Speaking of soul: the Catholic Encyclopedia does a very poor job of defining it, while snarling that those who reject the body/soul dualism are closet atheists. But by beginning with the “animating principle present in all living things,” these theologians aren’t very far from the teachings of the Bhagavad Gita, which extends the meaning to “unchanging, indestructible, indivisible presence within everything.” (So, you thought that a rock doesn’t have a soul? Or, if it does, that it's a different, lower-quality soul than yours?)

As opposed to such universal soul, most Western believers in the soul see it as exclusively human and unique to each person. On the other hand, the Jungian writer James Hillman evades the universal versus individual dilemma by saying the soul is “the poetic basis of mind . . . the imaginative possibility in our natures, the experiencing through reflective speculation, dream, image, fantasy.” This seems to equate the soul with the inner life or “inner world.” Few would argue with that, though maybe it would be simpler to use the term “inner life.”

Another Jungian, Thomas Moore, wisely refuses to define the soul, but says we know it when we see it; when something has the qualities of genuineness and depth, we call it soulful. Again, this seems accurate enough, but why not speak of “depth” instead? Then we can ponder what “depth” entails. When I think of Cate Blanchett’s extraordinary performance in “Blue Jasmine” as having depth, I realize that more than anything I mean multidimensional. She can’t be summarized with a single label, e.g. “narcissistic” or “traumatized.” She carries the mystery of what a human being is as a verb, a process, shifting from moment to moment.

For the neuroscientist, there is of course only brain function that ceases when the brain completes all the stages of dying. NDEs, too, are brain function under extreme circumstances, as are all mystical visions and experiences. But when we say something is “only” brain function, we need to be aware that we are talking about a magnificence we have barely begun to explore — for instance, mirror neurons that appear to underlie empathy were discovered only in 1996.

Can we ever understand the brain fully? Since it’s the human brain studying the human brain, philosophers claim that a full understanding is not possible. And that’s fine — mystery is more thrilling than answers.


. . . now, weak, short of breath, my once-firm muscles melted away by cancer, I find my thoughts, increasingly, not on the supernatural or spiritual, but on what is meant by living a good and worthwhile life — achieving a sense of peace within oneself. I find my thoughts drifting to the Sabbath, the day of rest, the seventh day of the week, and perhaps the seventh day of one’s life as well, when one can feel that one’s work is done, and one may, in good conscience, rest.

~ Oliver Sacks, “Sabbath”

Milosz too loved his old age. It seemed to be the happiest time of his life, which amazed him. He wrote that the older he grew, the more he loved life and the beauty of the world. He used to think that with age we are supposed to distance ourselves from life in preparation for departure — that it would be natural to withdraw, to become more aloof to earthly delights. And yet the opposite was happening.

I was departing, the first star ran to greet me,
and the glow was so beautiful, and life was so good
that I said, I will return, though there is no returning.

~ Czeslaw Milosz

“the first star” — this is partly a reference to the symbolism of the first star on Christmas Eve. It’s a Polish custom to begin the Christmas Eve supper when the first star is sighted (presumably the first star, like the Star of Bethlehem, signals that the divine child is about to be born; there may also be a pagan reference).

I wonder if perhaps the best translation of the first phrase is the literal one: “I was descending,” or even “I was setting” (like the sun or the moon).

And the wonderful thing was that both Milosz and Sacks continued to write until the very end. Their main focus was peace and receptivity, but they continued to contribute out of the richness of their minds. They didn’t have to strain; the writing skill, honed with hard work, just kept on giving.


I have found aging to be a wonderful thing in many ways. The body, yes, wants to disappoint us, with knobs and wrinkles, waning stamina and crepey skin. But there is also a delicious freedom — to say what you think, wear what you want, pursue your own inclinations without first making sure you’re not offending, disappointing or defying all the rules and expectations. It is more tempting to shake things up, rock the boat, go your own way — you care less now about acceptance, popularity, scoring points, getting ahead. There’s just so much nonsense you’re done with, and you can concentrate on what you love, whether that is art or ideas, adventures or explorations, in your thoughts or in the world at large.

I find myself like Sacks and Milosz, in love with the world in all its splendor. For myself, the grief and troubles of the past have been resolved, or lost the worst of their power, and the treasure house of memory is full, and rich, and infinitely useful. The years of learning and practice have resulted in ease working with the things I love the most — words, ideas, images. The world is endlessly fascinating, entertaining and astounding. I want to live long and go out singing! — Like the poets and artists writing and painting on into their nineties, making new things. Living.


Yes, living at last! I’ve been wondering which factor is primary in making these years so much better than most of the past. I think it’s the freedom from “trying to get ahead,” to succeed in life. What unholy pressure is put upon us, women too, to meet the standards of worldly success. Sure, we hear about stopping to smell the roses, but what are roses next to all that pressure, the whip of guilt and shame forever hanging over us when we are young? Your fault, your fault, your fault, society seems to be saying when something doesn’t work out — and something always fails to work out.

Once this “getting ahead” is out of the way, savoring the beauty of the world is at last fully possible. And the beauty of self-acceptance. I find both the world and the self to be such a discovery awaiting me each day.

I know that health may fail in a serious way at some point, and make daily living more difficult. We never know how many GOOD years are still left. But that makes every day precious as never before — before that awareness was born. Talk about finally, finally “living in the now.” About taking a deep breath and realizing, as if for the first time, how delicious it feels. 

~ “As viewed from Earth, the pattern of bright highlands and dark maria on the moon’s surface never seems to change. This leads us to wonder: What does the far side of the moon look like?

Some speculate that this side beyond our view is a “dark side,” a frozen and desolate surface devoid of sunlight, possibly haunted by malevolent forces. In fact, this myth has permeated popular culture. In the modern era, the dark side of the moon has continuously captured human imagination, sparking a 1990 thriller of the same name and inspiring the title of Pink Floyd’s popular “Dark Side of the Moon” album. However, in understanding the science behind the moon’s orbit, we can prove that there is no dark side after all.


No matter where we are on Earth, we see and always have seen only one face of the moon. Since the moon rotates on its axis in the same amount of time that it takes the body to orbit our planet, the same half face of the moon is consistently exposed to viewers on Earth. This timing is caused by a phenomenon called tidal locking, which occurs when a larger astronomical body (Earth) exerts a strong gravitational pull on a smaller body (the moon), forcing one side of the smaller body to always face the larger one. Due to tidal locking and other astronomical variables, only 59 percent of the moon’s surface can ever be seen from our planet. The remaining 41 percent, then, remains a mystery, and a subject of creative musings and astronomical research.

The fact that we earthlings cannot see the far side of the moon does not mean that this face is never exposed to sunlight. In fact, the far side of the moon is no more and no less dark than the hemisphere we do see. Since the moon is a sphere and light shines radially outward from the sun, one hemisphere of the moon is illuminated at all times, except in the case of a lunar eclipse.

However, the sunlit hemisphere coincides with the side we see from Earth only during a full moon. During the moon’s other phases, its apparent shape depends on how much of the sunlit hemisphere happens to be turned toward us. For example, when we see a quarter moon from Earth, half of the near side is illuminated and half is in shadow. If we could set our sights on the far side at that same moment, we would likewise see it half illuminated by the sun and half in shadow.
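The phase geometry in the article reduces to a simple formula: if θ is the Sun-Moon-Earth phase angle (0° at full moon, 180° at new moon), the illuminated fraction of the near side is (1 + cos θ)/2, and the far side receives the complementary share. A minimal sketch under idealized spherical geometry, ignoring libration and eclipses:

```python
import math

def lit_fraction_near(phase_deg: float) -> float:
    """Illuminated fraction of the Earth-facing lunar hemisphere.
    phase_deg: Sun-Moon-Earth angle; 0 = full moon, 180 = new moon."""
    return (1 + math.cos(math.radians(phase_deg))) / 2

for phase, name in [(0, "full"), (90, "quarter"), (180, "new")]:
    near = lit_fraction_near(phase)
    far = 1 - near  # the far side always receives the rest of the sunlight
    print(f"{name:>7} moon: near side {near:.2f} lit, far side {far:.2f} lit")
```

The two fractions always sum to 1: whenever the near side is dark, the so-called dark side is in full sunlight.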

Photographic evidence now also confirms that the moon has no dark side. Photographs of the far side of the moon did not exist until 1959, when images were transmitted from the Soviet spacecraft Luna 3. Recently, NASA confirmed the existence of a well-lit far side of the moon by using images from the Wide Angle Camera onboard the Lunar Reconnaissance Orbiter, which fully orbits the moon to construct a full map of its surface. Since we have never seen the far side of the moon from Earth, this 360-degree view may look foreign to us. In fact, we see that the far side of the moon, paradoxically, is lighter in color than the near side since it has fewer dark maria.

Each night we look up at the sky and see the moon in all its glory, we should remember that there is a whole other side to this celestial body that no human has ever seen with the naked eye.



We got a wonderful present from Hitler and Stalin that they never meant to give us. We were immune for sixty years or so to aggression, racism, and militarism. They made us partly immune to those things. Now it appears this Stalin-Hitler gift has reached its expiration date. We were spoiled by this. So maybe we are just emerging from a relatively golden age. ~ Amos Oz



It's been 152 years since the Union Army defeated the Confederate States of America, and 72 years since the Allies defeated the Third Reich. Why, despite decades of social progress for ethnic minorities, do people still embrace fascist and neo-Confederate ideologies?

A model developed in the early 1990s might help explain the persistence of ideologies that promote social inequality. Social dominance theory postulates that societies seek to minimize class conflict by promoting ideologies that legitimize the superiority of one group over others.

Social dominance theory seeks to explain how hierarchy-enhancing ideologies do not just drive social inequality, but are also a result of it. It suggests that a single personality trait, called social dominance orientation (SDO), strongly predicts a person’s political and social views, from foreign policy and criminal justice to civil rights and the environment. What's more, it offers insight into how ideologies such as racism, sexism, and xenophobia tend to arise from the unequal distribution of a society's resources.

“Social dominance theory provides a yardstick for measuring social and political ideologies,” says Felicia Pratto, a psychologist at the University of Connecticut who helped create the theory. “[Social dominance orientation] is one way – not the only one – to try to figure out what those ideologies are ‘about.’ ”

A person’s SDO can be measured with as few as eight survey items that gauge how strongly a person believes in hierarchical social relations. Respondents are asked to say how much they agree or disagree with statements. At one end of the spectrum are statements suggesting that “An ideal society requires some groups to be on top and others to be on the bottom,” and “It is unjust to try to make groups equal.” Statements at the other end suggest that “Groups at the bottom are just as deserving as groups at the top,” and “No one group should dominate in society.”
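Mechanically, a scale like this is scored by averaging the agreement ratings after flipping the egalitarian-worded (“reverse-keyed”) items. The sketch below illustrates that logic with four of the statements quoted above; the 1-7 rating range and the flip formula (max + 1 - rating) are common survey conventions assumed here, not details taken from the article:

```python
# Hypothetical illustration of Likert scoring with reverse-keyed items.
# Ratings run from 1 (strongly disagree) to 7 (strongly agree).
ITEMS = [
    ("An ideal society requires some groups on top, others on the bottom", False),
    ("It is unjust to try to make groups equal", False),
    ("Groups at the bottom are just as deserving as groups at the top", True),
    ("No one group should dominate in society", True),
]

def sdo_score(ratings, items=ITEMS, scale_max=7):
    """Average rating after flipping reverse-keyed items;
    higher scores mean stronger support for group hierarchy."""
    adjusted = [
        (scale_max + 1 - r) if reverse else r
        for r, (_, reverse) in zip(ratings, items)
    ]
    return sum(adjusted) / len(adjusted)

# Endorsing the hierarchy items while rejecting the equality items
# yields the maximum score:
print(sdo_score([7, 7, 1, 1]))  # 7.0
```

Reverse-keying is what keeps a respondent from gaming the scale by simply agreeing with everything: uniform agreement with all items averages out to the midpoint.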

People with high SDO scores are more likely to believe that women and men are naturally different and should have different workplace roles. They are more likely to accept theories of racial superiority and to believe that their country is inherently better than other countries. They tend to oppose lesbian, gay, bisexual, and transgender rights; affirmative action; interracial marriage; and social welfare programs. They tend not to call themselves environmentalists. They tend to support military action overseas and the death penalty at home. They tend to believe in capitalism and that the world is basically just. And they are more likely to choose “hierarchy enhancing” careers such as law enforcement, military, business, and politics.

People with low SDO scores, by contrast, tend to hold social attitudes associated with egalitarianism. They tend to work in “hierarchy attenuating” careers such as social work and counseling, special education, or journalism.

“One thing that I think most people find surprising about the theory is its argument that many phenomena that we think of as different or unique – say, racism, sexism, and homophobia – may have a common root,” says Christopher Federico, a professor in the University of Minnesota’s psychology and political science departments. “That is, all of them may be similarly rooted in a desire for intergroup hierarchy, despite having different targets and being enforced in somewhat different ways.”

Men, on average, tend to have higher SDO scores than women, an observation that has led researchers to suggest that SDO may be partly rooted in biology, although research indicates that SDO is not a genetically heritable trait.

“There is a strong tendency for countries that have more equality for women, such as higher education levels, less unequal pay between men and women, and more women in political office, to have lower SDO scores,” says Pratto.

Pratto notes in an email interview that SDO is not a binary phenomenon, but a gradient. “Most people are a bit to the low side of the middle in egalitarian countries, and the mean tends to be higher in more hierarchical countries,” she says.

Social dominance theory starts with the observation that in every society that has moved past the hunter-gatherer stage and produces economic surpluses, social hierarchies emerge. Those at the top of the pecking order develop and promote social beliefs – for example, the idea that poor people remain so because they are lazy – that legitimize the hierarchy.

“We know that people high in SDO are more likely to support conservative social policies,” Professor Federico says in an email to the Monitor. “However, this relationship is more pronounced among those in high-status groups. Among members of low-status groups, individuals low and high in SDO do not differ as much in their political attitudes.”

Federico says that SDO is thought to be a highly stable trait, but that doesn’t mean it’s impossible for an individual to change his or her social attitudes.

“There are people who mentally practice being egalitarian, so that what they do habitually when confronted with a stimulus that they know might provoke prejudice is to associate a good feeling with it, or bring to bear their egalitarian values,” says Pratto. “People can do this so much that they eventually become automatic at doing it.”

Counter-right-wing demonstrators in Boston, August 19, 2017


On Nazis and that desire for dominance. I can’t help but think much of it depends on anger and resentment, and above all, fear. Fear of being displaced or of having no place. Of being at the bottom, with no one lower and less powerful. Some of this is like the chicken and the egg — did surplus wealth lead to hierarchies and dominance, or is it something deep in the genes that seeks expression in social hierarchies? We know change is possible, as the article states, with some reduction in the desire for dominance. Is it an expression or a perversion of nature? There are those troubling, murderous chimpanzees. Are we seeing the spasms of a dying thing, or the revival of a monster? These are troubling and challenging times, and there is no room for complacency, no comfortable hiding place, no acceptable neutrality.


Toxic masculinity seems to be involved. The biological part of it — the male struggle for dominance — will be the hardest to root out (if ever). Thank goodness at least we have sports as one outlet for that. My greatest hope, however remote, is that humanity will eventually achieve an arrangement where fulfilling work (not necessarily a “job”) will be available for everyone, as well as community and affection.


~ “I came to Washington to work for God, FDR, and the millions of forgotten, plain common workingmen.”

Frances Perkins was born on April 10, 1882 in Boston, Massachusetts. She graduated from Mount Holyoke College in 1902, and from Columbia University in 1910 with a master's degree in sociology. In 1910 she became head of the New York Consumer's League, lobbying for better working hours and conditions. In 1933, Franklin Roosevelt appointed Perkins as his Secretary of Labor, a position she held for twelve years, longer than any other Secretary of Labor; she was the first woman to hold a cabinet position in the United States.

Historian Arthur Schlesinger Jr. has described Frances Perkins in vivid terms: “Brisk and articulate, with vivid dark eyes, a broad forehead and a pointed chin, usually wearing a felt tricorn hat, she remained a Brahmin reformer, proud of her New England background . . . and intent on beating sense into the heads of those foolish people who resisted progress. She had pungency of character, a dry wit, an inner gaiety, an instinct for practicality, a profound vein of religious feeling, and a compulsion to instruct . . .”

As Secretary of Labor she played a key role in writing New Deal legislation, including minimum wage laws. However, her most important contribution came in 1934 as chairwoman of the President's Committee on Economic Security. In this position she was involved in all aspects of the reports and hearings that ultimately resulted in the Social Security Act of 1935.

Prior to going to Washington, Perkins held positions in New York State government, first as an aide to Governor Al Smith and then to Franklin Roosevelt when he became governor. Smith, a machine politician from the old school, was an early social reformer with whom Frances Perkins made common cause. At Smith's funeral in 1944, two of his former Tammany Hall political cronies were overheard speculating on why Smith had become a social crusader. One of them summed the matter up this way: "I'll tell you. Al Smith read a book. That book was a person, and her name was Frances Perkins. She told him all these things and he believed her."

Following her tenure as Secretary of Labor in 1945, Ms. Perkins was asked by President Truman to serve on the U.S. Civil Service Commission, which she did until 1952 when her husband died, and she resigned from Federal service. Following her government service career, Ms. Perkins continued to be active as a teacher and lecturer until her death on May 14, 1965.

The Social Security Act was signed by FDR on 8/14/35. Taxes were collected for the first time in January 1937 and the first one-time, lump-sum payments were made that same month. Regular ongoing monthly benefits started in January 1940.

~ “Frances Perkins' husband, Paul Wilson, suffered from chronic mental illness and spent most of their married life confined to mental institutions. On the day of the signing of the Social Security Act, as she was leaving her office to go to the signing ceremony, she received a phone call breaking the news that her husband had wandered away from his hospital and was lost somewhere in New York City. She went to the White House for the signing and took her place immediately behind FDR for the photographers and newsreel cameramen. As soon as the ceremony ended she rushed to Union Station where she boarded the first train to New York City. There, several hours later, she finally located her confused and disoriented husband wandering the streets of the city.” ~

New York, 1937
~ “In a study published online Nov. 6, 2011 in Nature Medicine, investigators at the Stanford University School of Medicine have shown that the development of osteoarthritis is in great part driven by low-grade inflammatory processes. This is at odds with the prevailing view attributing the condition to a lifetime of wear and tear on long-suffering joints.

“It’s a paradigm change,” said William Robinson, MD, PhD, the study’s senior author, of the implication of the findings. “People in the field predominantly view osteoarthritis as a matter of simple wear and tear, like tires gradually wearing out on a car.” It also is commonly associated with blow-outs, he added, such as a tear in the meniscus — a cartilage-rich, crescent-shaped pad that serves as a shock-absorber in joints — or some other traumatic damage to a joint.

Osteoarthritis is the most common joint disease, afflicting some 27 million people in the United States alone. It is characterized by breakdown of cartilage, most often in the knees, hips, fingers and spine.

It has long been known that osteoarthritic joint tissues host a heightened number of migratory inflammatory cells and of some of the substances these cells secrete — “not nearly as much as in the case of rheumatoid arthritis, which is clearly an autoimmune disease, but enough to make us wonder if inflammation is also a major player in osteoarthritis,” Robinson said. His team’s observation of increased numbers of certain specialized inflammatory proteins early in the progress of osteoarthritis, before it becomes symptomatic, suggested that inflammation might be a driver, rather than a secondary consequence, of the disease.

The study showed that, indeed, initial damage to the joint sets in motion a chain of molecular events that escalates into an attack upon the damaged joint by one of the body’s key defense systems against bacterial and viral infections, the so-called complement system. This sequence of events begins early in the development of osteoarthritis.

The complement system consists of an orchestra of proteins present in blood. Upon activation of the complement cascade — typically, in response to the presence of bacterial or viral infection — these proteins engage in a complex interplay, variously enhancing or inhibiting one another’s actions at certain points and culminating in the activation of a protein cluster called the MAC (for “membrane attack complex”). By punching holes in the membranes of bacterial or virally infected human cells, the MAC helps to clear the body of infections.

An early clue regarding the complement system’s key role in osteoarthritis came when Robinson and his colleagues, employing advanced lab techniques, compared the levels of large numbers of proteins present in the joint fluid taken from osteoarthritis patients with levels present in fluid from healthy individuals. They found that the patients’ tissues had a relative overabundance of proteins that act as accelerators in the complement cascade, along with a dearth of proteins that act as brakes.

Further experiments in mice and with human tissue showed that the MAC, the heavy artillery of the complement system, was damaging joint-tissue cells, but not by punching holes in them. Instead, it was binding to cartilage-producing cells in these tissues and causing them to secrete, on their own, still more complement-component proteins as well as other inflammatory chemicals, and specialized proteins, or enzymes, that chew up the matrix of cartilage occupying the spaces between cells.

They demonstrated that breakdown products of cartilage destruction, including one called fibromodulin, can directly activate the complement system, fostering a continuing cycle of joint-tissue damage.

Finally, the investigators showed that all these insults inflicted by the complement system — measured by microscopic examination of mouse joints — were mirrored by functional impairment. Bioengineered mice lacking a key complement-component protein, without which the complement system fails to activate, maintained their ability to walk normally, while normal mice developed a hindered gait due to severe osteoarthritis following meniscal injury.

“Recent findings suggest that low-grade complement activation contributes to the development of degenerative diseases including Alzheimer’s disease and macular degeneration. Our results suggest that osteoarthritis can be added to this list of diseases,” said Robinson.

Drugs that target the complement system may someday prove useful in preventing the onset of osteoarthritis in people who have suffered joint injuries, Robinson said, though he cautioned that this system is so crucial to our defense against microbial infection that systemic delivery of complement inhibitors would likely not be safe. But it is possible that a brief period of local administration of a complement inhibitor might provide benefit to patients developing osteoarthritis, while minimizing their risk for the development of infections.

“Right now we don’t have anything to offer osteoarthritis patients to treat their underlying disease,” Robinson said. “It would be incredible, for the one-third of humans over 60 who have it, to find a way to slow it down.” ~


Alas, it is our own immune system that tears down the cartilage. In rheumatoid arthritis, the inflammation is more severe and the destruction more rapid, but basically the old distinction doesn't hold: both rheumatoid arthritis and osteoarthritis are autoimmune diseases.

Dali: Endless Enigma


~ “American doctors have been noticing an increase in osteoarthritis of the knee. Even correcting for body mass index and age, osteoarthritis of the knee is twice as common now as it was before the 1950s.

"That's an incredible difference," says Daniel Lieberman, a professor of human evolutionary biology at Harvard University and co-author of the study.

Conventional wisdom is that osteoarthritis of the knee results mostly from wear and tear, which is why, these days, it's more common among older people and those whose excess body weight puts extra stress on those joints.

"So, going into it, I suppose my expectation was that people in the past, especially early hunter-gatherers and early farmers, would have had a much higher prevalence of osteoarthritis than people do today," Wallace says. Surely all that running around, squatting, twisting and other activity in the days before cars and couches would have worn out joints quickly.

But that's not what the evidence showed.

"I was actually extremely surprised to find that [osteoarthritis] is much more common today" than it was in Americans long ago, says Wallace.

"Your joints aren't just like your automobile tires that wear out as you use them," he says. In fact, exercise helps nutrients diffuse into cartilage in the knee and keep it strong and healthy.

That's not to say that [less] exercise fully explains the trend that the Harvard researchers have noted.

“There may be dietary factors that may be important,” Loeser suggests. And sports injuries, which he says “have become more and more common,” may be contributing to arthritis, too. ~


So there it is: if you were born after WW2, your risk of knee arthritis is double what it would have been in the past. We aren’t sure why, but we can conclude that the “wear-and-tear” explanation is incorrect. Sports and other injuries (e.g., accident-related trauma) are more than mere “wear and tear.” Otherwise, being physically active actually seems to be preventive, perhaps because it is ultimately anti-inflammatory. (As we age, however, the short-term inflammation that follows intense exercise becomes more pronounced and lasts longer.)


ending on beauty:

And then I rose
in the dazzle of light, to the pine trees
plunging and righting themselves in a furious wind.

To have died and come back
raw, crackling,
and the numbness

That clumsy
pushing and wheeling inside my chest, that ferocious
upturn —
I give myself to it. Why else
be in a body?

~ Chana Bloch, “Afterlife”

photo: Susan Rogers