Saturday, August 19, 2017


White light image of the solar corona during totality of a solar eclipse (NASA)

~ “Pnin slowly walked under the solemn pines. The sky was dying. He did not believe in an autocratic God. He did believe, dimly, in a democracy of ghosts. The souls of the dead, perhaps, formed committees, and these, in continuous session, attended to the destinies of the quick.” ~ Nabokov, Pnin

I read this paragraph one day after waking at midnight from a nightmare that had the vividness of a hallucination rather than the vagueness of a typical dream. In the dream (or vision) I was lying in my bed. I looked up and saw eight to ten people, mostly but not exclusively men, mostly fortyish — mature but not elderly, “in their prime.” Their faces, each quite distinct and individual, had a serious look. Though none wore a white coat or sported a stethoscope, I assumed that these were medical professionals who had come to bring me bad news. And I also suspected that they were ghosts. “Who are you?” I asked. They vanished without answering, and I woke up (“never end a poem with ‘and I woke up’” is one of the various taboos you hear at a poetry workshop).

Those solemn people at my bedside — that’s what I imagine a committee of ghosts might look like.

Not that I believe in ghosts. To me the “soul” is a loose equivalent of the mind (including feelings and unconscious neural pathways). It’s a complex function of the brain that ceases when the brain dies. It’s certainly not a little person that leaves the body and roams around the universe (or the “astral realm”), preserving one’s identity, remembering everything that happened.

And yet . . . when you grow up in a culture where the belief in souls and ghosts is still commonplace, and haunts (so to speak) books and movies, it is not in the least surprising to dream about ghosts and to respond to their literary or cinematic depictions. And it’s rather pleasant to ponder that the ghosts we conjure up do seem to care about us.

(OK, my “ghosts” were wearing clothes — nice casual clothes that are typical of what people wear when they go shopping, for instance. Yet why would a soul need clothes? Souls in art are mostly nude, just without genitals. In cartoons, they wear nightgowns or simple shapeless robes. But dreams tend to show the dead just as we remember them, i.e. usually in clothes.)

Now, the idea of committees of ghosts attending to the living would solve the pesky problem of giving the dead something to do, some way to be of use. It seems that earthly life is what matters anyway, even to the ghosts. Since physical pleasures — eating, sex, napping, playing with pets — don’t exist in the Christian heaven, we can’t imagine anything worth doing “up there.” Well, I personally can imagine strolling for some portion of eternity around the Garden of Eden and never getting tired of the plants and their doings (assuming change happens, which some would deny in the name of perfection), but I know that most people aren’t as fond of botanical displays.

But never mind me. “Pnin” is a terrific novel about a Russian émigré in America — and the protagonist isn’t a pedophile, and the portrait of American culture is relatively benign. (By the way, my spell check always corrects Pnin to “pain” — and there is a certain insight in that, as far as being an immigrant goes.)

~ “[Nabokov’s] American novels are distinct from what he’d done before; more than that, Roper writes, they’re an apotheosis – “the claim to greatness rests most solidly on the American books”. Indeed, from the outset, Roper makes Nabokov himself into a peculiarly American figure, as much immigrant hustler as aristocratic European, someone who’d acted out scenarios from Mayne Reid’s Wild West books as a child, and who was scrappily resourceful in making his name, despite racking up some 60 rejections from US publishers before he even arrived.

Nabokov liked all sorts of things about his adopted country, its trashy cultural ephemera as well as its natural beauty, its openness but also its odd conservatism, in which he perhaps sensed a different kind of opportunity (“what charms me personally about American civilization,” he wrote to his agent before the move, “is exactly that old-world touch, that old-fashioned something which clings to it despite the hard glitter, and hectic nightlife, and up-to-date bathrooms”).

His delight in it is beguiling, as is the image Roper offers of him as a particular kind of immigrant. Not an émigré in the mould of Thomas Mann or Bertolt Brecht, Nabokov immersed himself in the new place, not least via his work as a lepidopterist, through which he made all kinds of friends. Far from keeping to a rarefied enclave, Roper’s Nabokov is a figure more like Ayn Rand, who came as he did from St Petersburg – although she was, as Roper tactfully notes, “a writer of different attainments”, she also made a “wholesale embrace of what she took for Americanism” – or Billy Wilder, who’d made movies in German and French before his American classics.

Like Wilder, Nabokov did a good line in American comedy. “Reality was vital and vulgar here,” Roper suggests, providing “‘exhilarating’ opportunities for burlesque”. Part of the joke in Pnin is that the bumbling foreigner, with his futile eagerness to fit in, can be more American than the Americans by virtue of sheer desirous optimism. Think of Pnin’s set of false teeth grinning to itself in its container – “It was a revelation, it was a sunrise, it was a firm mouthful of efficient, alabastrine, humane America” – or of his passion for the washing machine: “Casting aside all decorum and caution, he would feed it anything that happened to be at hand, his handkerchief, kitchen towels … just for the joy of watching through that porthole what looked like an endless tumble of dolphins with the staggers.” Pnin is far cosier than Lolita, but they do share an expansiveness: one a social comedy swollen with feeling; the other perhaps the most boisterous, spirited parody-tragedy you could conceive.

Had Nabokov succeeded in his attempts to move to England instead in the late 30s, there would have been no Lolita – and not just because of the open road or the gum-chewing teen: the whole shape of the narrative, the language and energy of it, is unimaginable without the American landscape and culture. We even have a European version to compare it to, The Enchanter, written in 1939, which has the pedophile fleeing with the child, but shares few of Lolita’s other qualities. “Like the author of a story about bulls and capes who changes the setting to Spain,” Roper writes, in bringing the theme to America, “Nabokov inherited a stage.”

In place of the snobbery, the famous superiority complex, Roper finds someone who “immersed himself in the demos”, both in theory and in practice. And more important, Roper gently rejects Nabokov’s claim that “inventing America” meant simply collecting some local color to “inject a modicum of average ‘reality’ … into the brew of individual fancy”. Instead of a hermetically sealed genius, roaming around but never changing, letting nothing in, Roper finds a writer who is open, flexible, susceptible to influence; who doesn’t know everything in advance, but rather makes discoveries and passes them on.” ~


Heaven — the dream of having landed permanently in the winners’ circle — speaks volumes about the quest to escape reality’s dilemmas. ~ Jeremy Sherman
But what would the disembodied soul do? This is where Nabokov’s fantasy has its charm — the ghosts would in some manner nurture the living. This reminds me of a Lebanese woman I met in college — she said it’s good to have many dead: they pray for you, they take care of you. And it’s by no means a rare idea. Even people who are not overtly religious may conclude a tale of a near-accident with the remark, “My mother in heaven must have been watching out for me,” and no one raises an eyebrow.

If a misfortune DOES happen, oddly enough no one says, “My mother in heaven must have forgotten to pray for me.”


But speaking of America, I think this is something that Nabokov would gladly include in a novel or memoir:

~ Jonathan Stickland, a member of the Freedom Caucus, generously supported by Empower Texans, made news in the 2015 session by posting a sign outside his office:

Jonathan Stickland
District 92
FORMER FETUS

~ The New Yorker, July 10 & 17, 2017, p. 56

As someone commented, “When did he stop being a fetus?”


Speaking of soul: the Catholic Encyclopedia does a very poor job of defining it, while snarling that those who reject the body/soul dualism are closet atheists. But by beginning with the “animating principle present in all living things,” these theologians aren’t very far from the teachings of the Bhagavad Gita, which extends the meaning to “unchanging, indestructible, indivisible presence within everything.” (So, you thought that a rock doesn’t have a soul? Or, if it does, that it's a different, lower-quality soul than yours?)

As opposed to such universal soul, most Western believers in the soul see it as exclusively human and unique to each person. On the other hand, the Jungian writer James Hillman evades the universal versus individual dilemma by saying the soul is “the poetic basis of mind . . . the imaginative possibility in our natures, the experiencing through reflective speculation, dream, image, fantasy.” This seems to equate the soul with the inner life or “inner world.” Few would argue with that, though maybe it would be simpler to use the term “inner life.”

Another Jungian, Thomas Moore, wisely refuses to define the soul, but says we know it when we see it; when something has the qualities of genuineness and depth, we call it soulful. Again, this seems accurate enough, but why not speak of “depth” instead? Then we can ponder what “depth” entails. When I think of Cate Blanchett’s extraordinary performance in “Blue Jasmine” as having depth, I realize that more than anything I mean that it is multidimensional. She can’t be summarized with a single label, e.g. “narcissistic” or “traumatized.” She carries the mystery of what a human being is as a verb, a process, shifting from moment to moment.

For the neuroscientist, there is of course only brain function that ceases when the brain completes all the stages of dying. NDEs, too, are brain function under extreme circumstances, as are all mystical visions and experiences. But when we say something is “only” brain function, we need to be aware that we are talking about a magnificence we have barely begun to explore — for instance, mirror neurons that appear to underlie empathy were discovered only in 1996.

Can we ever understand the brain fully? Since it’s the human brain studying the human brain, philosophers claim that a full understanding is not possible. And that’s fine — mystery is more thrilling than answers.


. . . now, weak, short of breath, my once-firm muscles melted away by cancer, I find my thoughts, increasingly, not on the supernatural or spiritual, but on what is meant by living a good and worthwhile life — achieving a sense of peace within oneself. I find my thoughts drifting to the Sabbath, the day of rest, the seventh day of the week, and perhaps the seventh day of one’s life as well, when one can feel that one’s work is done, and one may, in good conscience, rest.

~ Oliver Sacks, “Sabbath”

Milosz too loved his old age. It seemed to be the happiest time of his life, which amazed him. He wrote that the older he grew, the more he loved life and the beauty of the world. He used to think that with age we are supposed to distance ourselves from life in preparation for departure — that it would be natural to withdraw, to become more aloof to earthly delights. And yet the opposite was happening.

I was departing, the first star ran to greet me,
and the glow was so beautiful, and life was so good
that I said, I will return, though there is no returning.

~ Czeslaw Milosz

“the first star” — this is partly a reference to the symbolism of the first star on Christmas Eve. It’s a Polish custom to begin the Christmas Eve supper when the first star is sighted (presumably the first star, like the Star of Bethlehem, signals that the divine child is about to be born; there may also be a pagan reference).

I wonder if perhaps the best translation of the first phrase is the literal one: “I was descending,” or even “I was setting” (like the sun or the moon).

And the wonderful thing was that both Milosz and Sacks continued to write until the very end. Their main focus was peace and receptivity, but they continued to contribute out of the richness of their minds. They didn’t have to strain; the writing skill, honed with hard work, just kept on giving.


~ “As viewed from Earth, the pattern of bright highlands and dark maria on the moon’s surface never seems to change. This leads us to wonder: What does the far side of the moon look like?

Some speculate that this side beyond our view is a “dark side,” a frozen and desolate surface devoid of sunlight, possibly haunted by malevolent forces. In fact, this myth has permeated popular culture. In the modern era, the dark side of the moon has continuously captured human imagination, sparking a 1990 thriller of the same name and inspiring the title of Pink Floyd’s popular “Dark Side of the Moon” album. However, in understanding the science behind the moon’s orbit, we can prove that there is no dark side after all.


No matter where we are on Earth, we see and always have seen only one face of the moon. Since the moon rotates on its axis in the same amount of time that it takes the body to orbit our planet, the same half face of the moon is consistently exposed to viewers on Earth. This timing is caused by a phenomenon called tidal locking, which occurs when a larger astronomical body (Earth) exerts a strong gravitational pull on a smaller body (the moon), forcing one side of the smaller body to always face the larger one. Due to tidal locking and other astronomical variables, only 59 percent of the moon’s surface can ever be seen from our planet. The remaining 41 percent, then, remains a mystery, and a subject of creative musings and astronomical research.

The fact that we earthlings cannot see the far side of the moon does not mean that this face is never exposed to sunlight. In fact, the far side of the moon is no more and no less dark than the hemisphere we do see. Since the moon is a sphere and light shines radially outward from the sun, one hemisphere of the moon is illuminated at all times, except in the case of a lunar eclipse.

However, the hemisphere fully lit is the side of the moon we see from Earth only during a full moon. During the moon’s other phases, its apparent shape depends on how much of the sunlit hemisphere we can see from Earth. For example, when we see a quarter moon from Earth, half of the near side is sunlit and half is in shadow; that is, a quarter of the moon’s total surface is both sunlit and visible to us. If we could set our sights on the far side at that moment, we would see the complementary view: the other half of the sunlit hemisphere and the other half of the shadowed one.
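The phase arithmetic above reduces to one line of trigonometry: for a spherical moon lit radially by the sun, the sunlit fraction of the disk an observer sees is (1 + cos i)/2, where i is the phase angle (0° at full moon, 90° at quarter, 180° at new moon). A minimal sketch (the function name is mine, not from the article; it uses this standard illuminated-fraction formula):

```python
import math

def lit_fraction_of_visible_disk(phase_angle_deg):
    """Fraction of the Earth-facing lunar disk that is sunlit.

    phase_angle_deg: Sun-Moon-Earth angle in degrees
    (0 = full moon, 90 = quarter moon, 180 = new moon).
    """
    i = math.radians(phase_angle_deg)
    return (1 + math.cos(i)) / 2

# At quarter moon, half of the near-side disk is sunlit ...
near = lit_fraction_of_visible_disk(90)   # ~0.5
# ... and the far side shows the complementary half of the
# sunlit hemisphere: its lit fraction is simply 1 - near.
far = 1 - near
print(near, far)
```

The complementarity (near + far lit fractions always summing to 1) is just the statement that exactly one hemisphere is sunlit at any moment, split between the two faces.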

Photographic evidence now also confirms that the moon has no dark side. Photographs of the far side of the moon did not exist until 1959, when images were transmitted from the Soviet spacecraft Luna 3. Recently, NASA confirmed the existence of a well-lit far side of the moon by using images from the Wide Angle Camera onboard the Lunar Reconnaissance Orbiter, which fully orbits the moon to construct a full map of its surface. Since we have never seen the far side of the moon from Earth, this 360 degree view may look foreign to us. In fact, we see that the far side of the moon, paradoxically, is lighter in color than the near side since it has fewer dark maria.

Each night we look up at the sky and see the moon in all its glory, we should remember that there is a whole other side to this celestial body that no human has ever seen with the naked eye.



We got a wonderful present from Hitler and Stalin that they never meant to give us. We were immune for sixty years or so to aggression, racism, and militarism. They made us partly immune to those things. Now it appears this Stalin-Hitler gift has reached its expiration date. We were spoiled by this. So maybe we are just emerging from a relatively golden age. ~ Amos Oz



It's been 152 years since the Union Army defeated the Confederate States of America, and 72 years since the Allies defeated the Third Reich. Why, despite decades of social progress for ethnic minorities, do people still embrace fascist and neo-Confederate ideologies?

A model developed in the early 1990s might help explain the persistence of ideologies that promote social inequality. Social dominance theory postulates that societies seek to minimize class conflict by promoting ideologies that assert the superiority of one group. The eight-item social dominance orientation scale measures how strongly a person supports hierarchical social relations.

Social dominance theory seeks to explain how hierarchy-enhancing ideologies do not just drive social inequality, but are also a result of it. It suggests that a single personality trait, called social dominance orientation (SDO), strongly predicts a person’s political and social views, from foreign policy and criminal justice to civil rights and the environment. What's more, it offers insight into how ideologies such as racism, sexism, and xenophobia tend to arise from the unequal distribution of a society's resources.

“Social dominance theory provides a yardstick for measuring social and political ideologies,” says Felicia Pratto, a psychologist at the University of Connecticut who helped create the theory. “[Social dominance orientation] is one way – not the only one – to try to figure out what those ideologies are ‘about.’ ”

A person’s SDO can be measured with as few as eight survey items that gauge how strongly a person believes in hierarchical social relations. Respondents are asked to say how much they agree or disagree with statements. At one end of the spectrum are statements suggesting that “An ideal society requires some groups to be on top and others to be on the bottom,” and “It is unjust to try to make groups equal.” Statements at the other end suggest that “Groups at the bottom are just as deserving as groups at the top,” and “No one group should dominate in society.”
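To make the scoring mechanics concrete, here is an illustrative sketch of how such a scale could be scored. This is my own toy code, not the official SDO instrument: the item wordings are the paraphrases quoted above, and the 7-point response format and reverse-coding of the egalitarian items are standard survey practice rather than details from the article.

```python
# Pro-hierarchy statements: agreement raises the score directly.
PRO_DOMINANCE = [
    "An ideal society requires some groups to be on top and others to be on the bottom.",
    "It is unjust to try to make groups equal.",
]
# Pro-equality statements: reverse-scored, so agreement lowers the score.
PRO_EQUALITY = [
    "Groups at the bottom are just as deserving as groups at the top.",
    "No one group should dominate in society.",
]

def sdo_score(responses, scale_max=7):
    """Average the ratings (1 = strongly disagree ... scale_max = strongly
    agree), reverse-scoring the pro-equality items so that a high score
    always means stronger support for group-based hierarchy."""
    direct = [responses[item] for item in PRO_DOMINANCE]
    reversed_ = [scale_max + 1 - responses[item] for item in PRO_EQUALITY]
    items = direct + reversed_
    return sum(items) / len(items)

# An egalitarian respondent: rejects hierarchy, endorses equality.
answers = {PRO_DOMINANCE[0]: 2, PRO_DOMINANCE[1]: 1,
           PRO_EQUALITY[0]: 7, PRO_EQUALITY[1]: 6}
print(sdo_score(answers))  # prints 1.5, near the low (egalitarian) end
```

The reverse-coding is the essential design choice: without it, agreeing with everything would wash out to a meaningless middle score instead of revealing where a respondent actually sits on the hierarchy-equality spectrum.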

People with high SDO scores are more likely to believe that women and men are naturally different and should have different workplace roles. They are more likely to accept theories of racial superiority and to believe that their country is inherently better than other countries. They tend to oppose lesbian, gay, bisexual, and transgender rights; affirmative action; interracial marriage; and social welfare programs. They tend not to call themselves environmentalists. They tend to support military action overseas and the death penalty at home. They tend to believe in capitalism and that the world is basically just. And they are more likely to choose “hierarchy enhancing” careers such as law enforcement, military, business, and politics.

People with low SDO scores, by contrast, tend to hold social attitudes associated with egalitarianism. They tend to work in “hierarchy attenuating” careers such as social work and counseling, special education, or journalism.

“One thing that I think most people find surprising about the theory is its argument that many phenomena that we think of as different or unique – say, racism, sexism, and homophobia – may have a common root,” says Christopher Federico, a professor in the University of Minnesota’s psychology and political science departments. “That is, all of them may be similarly rooted in a desire for intergroup hierarchy, despite having different targets and being enforced in somewhat different ways.”

Men, on average, tend to have higher SDO scores than women, an observation that has led researchers to suggest that SDO may be partly rooted in biology, although research indicates that SDO is not a genetically heritable trait.

“There is a strong tendency for countries that have more equality for women, such as higher education levels, less unequal pay between men and women, and more women in political office, to have lower SDO scores,” says Professor Pratto, who now teaches psychology at the University of Connecticut.

Pratto notes in an email interview that SDO is not a binary phenomenon, but a gradient. “Most people are a bit to the low side of the middle in egalitarian countries, and the mean tends to be higher in more hierarchical countries,” she says.

Social dominance theory starts with the observation that in every society that has moved past the hunter-gatherer stage and produces economic surpluses, social hierarchies emerge. Those at the top of the pecking order develop and promote social beliefs – for example, the idea that poor people remain so because they are lazy – that legitimize the hierarchy.

“We know that people high in SDO are more likely to support conservative social policies,” Professor Federico says in an email to the Monitor. “However, this relationship is more pronounced among those in high-status groups. Among members of low-status groups, individuals low and high in SDO do not differ as much in their political attitudes.”

Federico says that SDO is thought to be a highly stable trait, but that doesn’t mean it’s impossible for an individual to change his or her social attitudes.

“There are people who mentally practice being egalitarian, so that what they do habitually when confronted with a stimulus that they know might provoke prejudice is to associate a good feeling with it, or bring to bear their egalitarian values,” says Pratto. “People can do this so much that they eventually become automatic at doing it.”

Counter-right-wing demonstrators in Boston, August 19, 2017

~ “I came to Washington to work for God, FDR, and the millions of forgotten, plain common workingmen.”

Frances Perkins was born on April 10, 1882 in Boston, Massachusetts. She graduated from Mount Holyoke College in 1902, and from Columbia University in 1910 with a master's degree in sociology. In 1910 she became head of the New York Consumers League, lobbying for better working hours and conditions. In 1933, Franklin Roosevelt appointed Ms. Perkins as his Secretary of Labor, a position she held for twelve years, longer than any other Secretary of Labor; she was also the first woman to hold a cabinet position in the United States.

Historian Arthur Schlesinger Jr. has described Frances Perkins in vivid terms: “Brisk and articulate, with vivid dark eyes, a broad forehead and a pointed chin, usually wearing a felt tricorn hat, she remained a Brahmin reformer, proud of her New England background . . . and intent on beating sense into the heads of those foolish people who resisted progress. She had pungency of character, a dry wit, an inner gaiety, an instinct for practicality, a profound vein of religious feeling, and a compulsion to instruct . . .”

As Secretary of Labor she played a key role writing New Deal legislation, including minimum wage laws. However, her most important contribution came in 1934 as chairwoman of the President's Committee on Economic Security. In this position she was involved in all aspects of the reports and hearings that ultimately resulted in the Social Security Act of 1935.

Prior to going to Washington, Perkins held positions in state government in New York, first as an aide to Governor Al Smith and then to Franklin Roosevelt when he became governor. Smith, a machine politician from the old school, was an early social reformer with whom Frances Perkins made many a common cause. At Smith's funeral in 1944, two of his former Tammany Hall political cronies were overheard speculating on why Smith had become a social crusader. One of them summed the matter up this way: "I'll tell you. Al Smith read a book. That book was a person, and her name was Frances Perkins. She told him all these things and he believed her."

Following her tenure as Secretary of Labor in 1945, Ms. Perkins was asked by President Truman to serve on the U.S. Civil Service Commission, which she did until 1952, when her husband died and she resigned from federal service. After her government career, Ms. Perkins continued to be active as a teacher and lecturer until her death on May 14, 1965.

The Social Security Act was signed by FDR on 8/14/35. Taxes were collected for the first time in January 1937 and the first one-time, lump-sum payments were made that same month. Regular ongoing monthly benefits started in January 1940.

~ “Frances Perkins' husband, Paul Wilson, suffered from chronic mental illness and spent most of their married life confined to mental institutions. On the day of the signing of the Social Security Act, as she was leaving her office to go to the signing ceremony, she received a phone call breaking the news that her husband had wandered away from his hospital and was lost somewhere in New York City. She went to the White House for the signing and took her place immediately behind FDR for the photographers and newsreel cameramen. As soon as the ceremony ended she rushed to Union Station where she boarded the first train to New York City. There, several hours later, she finally located her confused and disoriented husband wandering the streets of the city.” ~

New York, 1937

~ “In a study published online Nov. 6, 2011 in Nature Medicine, investigators at the Stanford University School of Medicine have shown that the development of osteoarthritis is in great part driven by low-grade inflammatory processes. This is at odds with the prevailing view attributing the condition to a lifetime of wear and tear on long-suffering joints.

“It’s a paradigm change,” said William Robinson, MD, PhD, the study’s senior author, of the implication of the findings. “People in the field predominantly view osteoarthritis as a matter of simple wear and tear, like tires gradually wearing out on a car.” It also is commonly associated with blow-outs, he added, such as a tear in the meniscus — a cartilage-rich, crescent-shaped pad that serves as a shock-absorber in joints — or some other traumatic damage to a joint.

Osteoarthritis is the most common joint disease, afflicting some 27 million people in the United States alone. It is characterized by breakdown of cartilage, most often in the knees, hips, fingers and spine.

It has long been known that osteoarthritic joint tissues host a heightened number of migratory inflammatory cells and of some of the substances these cells secrete — “not nearly as much as in the case of rheumatoid arthritis, which is clearly an autoimmune disease, but enough to make us wonder if inflammation is also a major player in osteoarthritis as well,” Robinson said. His team’s observation of increased numbers of certain specialized inflammatory proteins early in the progress of osteoarthritis, before it becomes symptomatic, suggested that inflammation might be a driver, rather than a secondary consequence, of the disease.

The study showed that, indeed, initial damage to the joint sets in motion a chain of molecular events that escalates into an attack upon the damaged joint by one of the body’s key defense systems against bacterial and viral infections, the so-called complement system. This sequence of events begins early in the development of osteoarthritis.

The complement system consists of an orchestra of proteins present in blood. Upon activation of the complement cascade — typically, in response to the presence of bacterial or viral infection — these proteins engage in a complex interplay, variously enhancing or inhibiting one another’s actions at certain points and culminating in the activation of a protein cluster called the MAC (for “membrane attack complex”). By punching holes in the membranes of bacterial or virally infected human cells, the MAC helps to clear the body of infections.

An early clue regarding the complement system’s key role in osteoarthritis came when Robinson and his colleagues, employing advanced lab techniques, compared the levels of large numbers of proteins present in the joint fluid taken from osteoarthritis patients with levels present in fluid from healthy individuals. They found that the patients’ tissues had a relative overabundance of proteins that act as accelerators in the complement cascade, along with a dearth of proteins that act as brakes.

Further experiments in mice and with human tissue showed that the MAC, the heavy artillery of the complement system, was damaging joint-tissue cells, but not by punching holes in them. Instead, it was binding to cartilage-producing cells in these tissues and causing them to secrete, on their own, still more complement-component proteins as well as other inflammatory chemicals, and other specialized proteins, or enzymes, that chew up the matrix of cartilage occupying the spaces between cells.

They demonstrated that breakdown products of cartilage destruction, including one called fibromodulin, can directly activate the complement system, fostering a continuing cycle of joint-tissue damage.

Finally, the investigators showed that all these insults inflicted by the complement system — measured by microscopic examination of mouse joints — were mirrored by functional impairment. Bioengineered mice lacking a key complement-component protein, without which the complement system fails to activate, maintained their ability to walk normally, while normal mice developed a hindered gait due to severe osteoarthritis following meniscal injury.

“Recent findings suggest that low-grade complement activation contributes to the development of degenerative diseases including Alzheimer’s disease and macular degeneration. Our results suggest that osteoarthritis can be added to this list of diseases,” said Robinson.

Drugs that target the complement system may someday prove useful in preventing the onset of osteoarthritis in people who have suffered joint injuries, Robinson said, though he cautioned that this system is so crucial to our defense against microbial infection that systemic delivery of complement inhibitors would likely not be safe. But it is possible that a brief period of local administration of a complement inhibitor might provide benefit to patients developing osteoarthritis, while minimizing their risk for the development of infections.

“Right now we don’t have anything to offer osteoarthritis patients to treat their underlying disease,” Robinson said. “It would be incredible, for the one-third of humans over 60 who have it, to find a way to slow it down.”


Alas, it is our own immune system that tears down the cartilage. In rheumatoid arthritis, the inflammation is more severe and the destruction more rapid, but basically the old distinction doesn't hold: both rheumatoid arthritis and osteoarthritis are auto-immune diseases.  

Dali: Endless Enigma


~ “American doctors have been noticing an increase in osteoarthritis of the knee. Even correcting for body mass index and age, osteoarthritis of the knee is twice as common now as it was before the 1950s.

"That's an incredible difference," says Daniel Lieberman, a professor of human evolutionary biology at Harvard University and co-author of the study.

Conventional wisdom is that osteoarthritis of the knee results mostly from wear and tear, which is why, these days, it's more common among older people and those whose excess body weight puts extra stress on those joints.

"So, going into it, I suppose my expectation was that people in the past, especially early hunter-gatherers and early farmers, would have had a much higher prevalence of osteoarthritis than people do today," says Ian Wallace, the study's lead author. Surely all that running around, squatting, twisting and other activity in the days before cars and couches would have worn out joints quickly.

But that's not what the evidence showed.

"I was actually extremely surprised to find that [osteoarthritis] is much more common today" than it was in Americans long ago, says Wallace.

"Your joints aren't just like your automobile tires that wear out as you use them," he says. In fact, exercise helps nutrients diffuse into cartilage in the knee and keep it strong and healthy.

That's not to say that [less] exercise fully explains the trend that the Harvard researchers have noted.

"There may be dietary factors that may be important," Loeser suggests. And sports injuries, which he says "have become more and more common" may be contributing to arthritis, too.


So there it is: if you were born after WW2, your risk of knee arthritis is roughly double what it would have been in the past. We aren’t sure why, but we can conclude that the “wear-and-tear” explanation is incorrect. Sports and other injuries (e.g., accident-related trauma) are more than mere “wear and tear.” Otherwise, being physically active actually seems to be preventive, perhaps because exercise is ultimately anti-inflammatory. (As we age, however, the short-term inflammation that follows intense exercise becomes more pronounced and lasts longer.)


ending on beauty:

And then I rose
in the dazzle of light, to the pine trees
plunging and righting themselves in a furious wind.

To have died and come back
raw, crackling,
and the numbness

That clumsy
pushing and wheeling inside my chest, that ferocious
upturn —
I give myself to it. Why else
be in a body?

~ Chana Bloch, “Afterlife”

photo: Susan Rogers

Sunday, August 13, 2017


Eclipse in St. Petersburg; Alexandr Petrosyan

At times … I wish

I could meet in a duel

the man who killed my father

and razed our home,

expelling me

into

a narrow country.

And if he killed me,
I’d rest at last, 

and if I were ready— 

I would take my revenge!


But if it came to light,

when my rival appeared, 

that he had a mother 

waiting for him,
or a father who’d put 

his right hand over

the heart’s place in his chest
whenever his son was late
even by just a quarter-hour
for a meeting they’d set— 

then I would not kill him, 

even if I could. 


Likewise … I
would not murder him  

if it were soon made clear
that he had a brother or sisters  

who loved him and constantly longed to see him. 

Or if he had a wife to greet him 
and children who 
couldn’t bear his absence  

and whom his gifts would thrill.
Or if he had 
friends or companions,
neighbors he knew
or allies from prison 
or a hospital room, 
or classmates from his school …
asking about him 
and sending him regards. 



But if he turned 

out to be on his own— 
cut off like a branch from a tree—
without a mother or father,
with neither a brother nor sister,
wifeless, without a child,
and without kin or neighbors or friends,  

colleagues or companions,
then I’d add not a thing to his pain
within that aloneness—  

not the torment of death,
and not the sorrow of passing away.
Instead I’d be content 
to ignore him when I passed him by   

on the street— as I 

convinced myself 

that paying him no attention 

in itself was a kind of revenge. 

~ Taha Muhammad Ali (1931-2011); translated by Peter Cole, Yahya Hijazi, and Gabriel Levin

Just the second stanza of this poem makes me tear up . . . 

But if it came to light,

when my rival appeared, 

that he had a mother 

waiting for him,
or a father who’d put 

his right hand over

the heart’s place in his chest
whenever his son was late
even by just a quarter-hour
for a meeting they’d set— 

then I would not kill him, 

even if I could. 

This is one of the most humanitarian poems I’ve ever come across. Talk about the power of poetry to touch our hearts. 

It's the kind of poem that makes us feel more human; it expands our circle of empathy. 

Like most great poems, it’s amazingly simple: mother waiting, father touching his heart when the son is late, children thrilled by their father’s little gifts.

Mary: The best revenge may simply be survival. Avoiding the trap of becoming an echo of your enemy.

Oriana: Yes, becoming like your enemy is giving them victory. And it's always a defeat for humanity.

Taha Muhammad Ali

~ “There are three pervasive myths that are widely circulated about the "roots" of the Middle East conflict:

Myth 1: Judaism has nothing to do with Zionism.
Myth 2: Islam has nothing to do with Jihadism or anti-Semitism.
Myth 3: This conflict has nothing to do with religion.

To the "I oppose Zionism, not Judaism!" crowd, is it mere coincidence that this passage from the Old Testament describes so accurately what's happening today?

"I will establish your borders from the Red Sea to the Mediterranean Sea, and from the desert to the Euphrates River. I will give into your hands the people who live in the land, and you will drive them out before you. Do not make a covenant with them or with their gods." ~ Exodus 23:31-32

Or this one?

"See, I have given you this land. Go in and take possession of the land the Lord swore he would give to your fathers — to Abraham, Isaac and Jacob — and to their descendants after them." ~ Deuteronomy 1:8

There's more: Genesis 15:18-21, and Numbers 34 for more detail on the borders. Zionism is not the "politicization" or "distortion" of [ancient] Judaism. It is the revival of it.

And to the "This is not about Islam, it's about politics!" crowd, is this verse from the Quran meaningless?

"O you who have believed, do not take the Jews and the Christians as allies. They are [in fact] allies of one another. And whoever is an ally to them among you — then indeed, he is [one] of them. Indeed, Allah guides not the wrongdoing people." ~ Quran, 5:51

What about the numerous verses and hadith quoted in Hamas' charter? And the famous hadith of the Gharqad tree explicitly commanding Muslims to kill Jews?

[“The last hour would not come unless the Muslims will fight against the Jews and the Muslims would kill them until the Jews would hide themselves behind a stone or a tree and a stone or a tree would say: Muslim, oh the servant of Allah, there is a Jew behind me; come and kill him; but the Gharqad tree (the boxthorn) would not say, for it is the tree of the Jews.”]

Please tell me — in light of these passages written centuries and millennia before the creation of Israel or the occupation — how can anyone conclude that religion isn't at the root of this, or at least a key driving factor? You may roll your eyes at these verses, but they are taken very seriously by many of the players in this conflict, on both sides. Shouldn't they be acknowledged and addressed? When is the last time you heard a good rational, secular argument supporting settlement expansion in the West Bank?

Denying religion's role seems to be a way to be able to criticize the politics while remaining apologetically "respectful" of people's beliefs for fear of "offending" them. But is this apologism and "respect" for inhuman ideas worth the deaths of human beings?

People have all kinds of beliefs — from insisting the Earth is flat to denying the Holocaust. You may respect their right to hold these beliefs, but you're not obligated to respect the beliefs themselves. Religions don't need to be "respected" any more than any other political ideology or philosophical thought system. Human beings have rights. Ideas don't. The oft-cited politics/religion dichotomy in Abrahamic religions is false and misleading. All of the Abrahamic religions are inherently political. . . .

Settlement expansion is simply incomprehensible. No one really understands the point of it. Virtually every US administration has unequivocally opposed it. There is no justification for it except a Biblical one, which makes it slightly more difficult to see Israel's motives as purely secular.

At its very core, this is a tribal religious conflict that will never be resolved unless people stop choosing sides.

So you really don't have to choose between being "pro-Israel" or "pro-Palestine." If you support secularism, democracy, and a two-state solution — and you oppose Hamas, settlement expansion, and the occupation — you can be both.” ~

 the gharqad tree (boxthorn)

This is the most intelligent article I've ever read on this complex topic. I am afraid that only secularism can save the region. It won't happen in our lifetime, but eventually, I hope, eventually . . . 


I see no solution as long as people cling to their “Holy Books” which do exactly as the article says—instruct believers in hatred and violence toward those outside the “Chosen” group, whose struggle for dominance, and the ERADICATION (this must be understood) of the other groups is sanctioned, even demanded, by their particular “God.”

Those apologists who deny these instructions exist in the “Holy Books” are simply cherry picking what they want and ignoring what they don’t want to acknowledge. “Scripture” has been and continues to be used as justification for hatred and injustice—and it’s all there, without any need to edit or change anything. Words of a “jealous” “vengeful” and “angry” god.

To move forward, humanity must move away from these primitive, tribal divisions and demands toward a more enlightened, inclusive and humane model for social behavior. Us and Us, not Us and Them.


Apologists will always fish out some passage that appears universalist rather than tribal. But let’s face it: once you see the passages urging genocide, that’s pretty overwhelming.

Can we somehow salvage the poetic parts, and leave out the bloodthirsty tribalism? We could, say, keep the beauty of Psalm 137 (“By the rivers of Babylon”) if we delete the ending about dashing the enemy’s infants against the rocks. But that’s messing with the truth, always a dubious act. First, let “the faithful,” the sheep, become aware of just what it is that the “holy” scripture contains.

“The truth is rarely pure and never simple.” ~ Oscar Wilde

~ “Ours is a ‘banality of evil’ approach,” says Hammad Sheikh. Sheikh’s personal interest in the psychological origins of group violence began when he was growing up in Germany. “I could never believe that the Nazis were these evil people who had taken over. Millions of ordinary people had followed Hitler, and I met them. They had been fanatics. But in my childhood, they were nice old people shaking my hand and giving me chocolate.”

Not only are perpetrators of conflict not the cold-blooded psychopaths they’re often assumed to be; they may actually be distinguished for having an unusually high degree of compassion. In his studies of the neural mechanisms of prejudice and empathy, Emile Bruneau, a cognitive neuroscientist at MIT, has found that some terrorists scored higher than average on measures of empathy. Their intense empathy is limited, however, to members of their own group. “The problem is not that they lack empathy,” Bruneau says. “They have plenty. It’s just not distributed evenly.”

Leaders of modern states frequently assume that their opponents are out to maximize their largely material rewards and minimize their pain. They are thought to respond to incentives (“We’ll give you food and other aid”) and avoid disincentives (“We’ll bomb you”). But Atran, who has talked to far more terrorists and likely received far more death threats than any other social scientist, has found that this kind of horse-trading is usually anathema to people in conflict zones.

In fact, it’s anathema to most of us. That is because people of all cultures hold “sacred values”—things that are too cherished to be compromised. For example, you might relinquish a weekend day to work for money. But if your religion prohibits working on the Sabbath, no amount of money can compel you to do so. Anything—a nation, a religious landmark, a legal status—can be construed as sacred, at which point defending it is perceived as a matter of right and wrong, not of costs and benefits.

Negotiating transactionally with people who are motivated by moral imperatives is bound only to infuriate them. As Jeremy Ginges, a psychologist at the New School for Social Research, wrote in a paper published last year, “Regardless of the specific issue (whether it concerns the right to make salt or to protect an old growth rain forest, a ‘holy’ city, or a national boundary), all sacred values appear to be defined by a taboo against material trade­-offs.”

Ominously, a survey of some 1,400 Iranians conducted a few years ago by Atran and his colleagues found that 14 percent of them saw the maintenance of their country’s nuclear program as sacred.

A survey by Sheikh, Ginges, and Atran in 2013 found that 86 percent of Palestinians consider “protecting Palestinian rights over Jerusalem” as a value ranked just slightly less than “protecting the family” and equal to “fairness to others.” The “right of return”—the demand of Palestinians to be able to return to the ancestral homeland from which their families fled during Israel’s establishment in 1948—was held sacred by 78 percent.

These findings may sound like grounds for despair, but the researchers argue that acknowledgment of an adversary’s sacred values—even if they conflict with one’s own—can make negotiations more successful. This is not just because it allows negotiators to avoid the error of offering to horse-trade over an issue that’s impervious to negotiation. It’s because people often respond well to having their sacred values acknowledged, even if that recognition comes in the form of a gesture that makes no practical difference. As Atran and the political scientist Robert Axelrod wrote several years ago, by making “symbolic concessions of no apparent material benefit”—for example, an apology for a past wrong or an acknowledgment of the other side’s legitimate right to its position—negotiators “might open the way to resolving seemingly irresolvable conflicts.” In some cases, an apology means more than a very large pile of money.

The possibility of engineering people away from their natural prejudices and impulses sounds like the plot of a science-fiction story. It’s exhilarating to imagine a scenario where the causes of a suicidal willingness to fight could be identified and eliminated, where propaganda promoting group violence could be instantly negated by a well-tested antidote, and where psychological profiles help tailor a perfect anti-conflict message to each person’s distinct biases. We’re a long way from there, and no researcher is operating under the fantasy of discovering a magic bullet. But addressing these possibilities with scientific inquiry so far appears to be a push in the direction of a more humane future.” ~

Originally I was going to post an image of the American Nazis, but have decided to show a beautiful tree instead:
“The best time to plant a tree was 20 years ago. The second best time is now.” ~ Chinese proverb

~ “Hatred . . . is powerfully governed by the illusion that those we hate could (and should) behave differently. We don’t hate storms, avalanches, mosquitoes, or flu. We might use the term “hatred” to describe our aversion to the suffering these things cause us—but we are prone to hate other human beings in a very different sense. TRUE HATRED REQUIRES THAT WE VIEW OUR ENEMY AS THE ULTIMATE AUTHOR OF HIS THOUGHTS AND ACTIONS. Love demands only that we care about our friends and find happiness in their company. It may be hard to see this truth at first, but I encourage everyone to keep looking. It is one of the more beautiful asymmetries to be found anywhere.” ~


~ “Sinuses. Blind spots. External testicles. Backs and knees and feet shoddily warped into service for bipedal animals. Human birth canals barely wide enough to let the baby's skull pass — and human babies born essentially premature, because if they stayed in utero any longer they'd kill their mothers coming out (which they sometimes do anyway). Wind pipes and food pipes in close proximity, leading to a great risk of choking to death when we eat. Impacted wisdom teeth, because our jaws are too small for all our teeth. Eyes wired backwards and upside-down. The vagus nerve, wandering all over hell and gone before it gets where it's going. The vas deferens, ditto. Brains wired with imprecise language, flawed memory, fragile mental health, shoddy cost-benefit analysis, poor understanding of probability, and a strong tendency to prioritize immediate satisfaction over long-term gain. Birth defects. 15-20% of confirmed pregnancies ending in miscarriage (and that's just confirmed pregnancies — about 30% of all pregnancies end in miscarriage, and as many as 75% of all conceptions miscarry).

And that's just humans. Outside the human race, you've got giraffes with a vagus nerve traveling ten to fifteen feet out of its way to get where it's going. You've got sea mammals with lungs but no gills. You've got male spiders depositing their sperm into a web, siphoning it up with a different appendage, and only then inseminating their mates -- because their inseminating appendage isn't connected to their sperm factory. (To wrap your mind around this: Imagine that humans had penises on their foreheads, and to reproduce they squirted semen from their testes onto a table, picked up the semen with their head-penises, and then had sex.) You've got kangaroo molars, which wear out and get replaced — but only four times, after which the animals starve to death. You've got digger wasps laying their eggs in the living bodies of caterpillars — and stinging said caterpillars to paralyze them but not kill them, so the caterpillars die a slow death and can nourish the wasps' larvae with their living bodies.

You're going to look at all this, and tell me it was engineered this way on purpose?

Yes, there are many aspects of biological life that astonish with their elegance and function. But there are many other aspects of biological life that astonish with their clumsiness, half-assedness, inefficiency, pointless superfluities, glaring omissions, laughable failures, "fixed that for you" kluges and jury-rigs, and appalling, mind-numbing brutality. (See Some More of God's Greatest Mistakes for just a few of the most obvious examples.) If you're trying to reconcile all this with a powerfully magical creator god who made it this way on purpose, it requires wild mental contortions at best, and a complete denial of reality at worst.” ~


I learned much of this in my first college biology class, decades ago, so for me this was just a neat summary. But I have met those who seriously argue that god “designed” evolution. This is a common cognitive bias: if something exists, it must have been “made” for some particular purpose by an intelligent agent. An excellent book on the types of cognitive bias underlying religious beliefs is Jesse Bering’s “The Belief Instinct.”

A bit more from the article:

~ Evolution is messy. Evolution is wildly inefficient. See #3 above. It's not just the products of evolution that are inefficient, either. The process itself is inefficient -- inherently so, almost by definition. If you're an all-powerful magical being trying to create sentient life, evolution is the long, long, long way around. If you're trying to get from Point A to Point B, evolution is a slow, meandering walk down convoluted dirt roads, with thousands of stops on the way to visit your doddering uncles who never shut up.

And evolution is brutal. It's not just that the results of the process are often uncomfortable, frustrating, even painful. The process itself is inherently brutal. The process ensures that most animals die in dreadful suffering and terror: they die from starvation, from injury, from disease, from birth defects, from being torn to pieces and devoured by other animals.

If there were a god who was using evolution to direct life in the direction he wanted, it immediately begs the question: Why? Why on earth would anyone do this?” ~


~ “In her lead essay for the most recent Boston Review forum, “Beyond Blame,” Barbara Fried points out that the last four decades have been “boom years for blame,” with neoliberal policy increasingly holding the individual solely responsible for his fate. Freedom and dignity have become intertwined with personal responsibility—and blame is our new rallying cry. The growing fragility of our communities and families over the same time period has solidified the notion that one has only oneself to rely on. Former representative and presidential candidate Ron Paul epitomized the spirit of blame in 2011 when he passionately argued in a televised debate that the decision to forego health insurance was a fundamental right of Americans. When the moderator asked him if this would mean that someone without health insurance who was critically injured should die rather than receive government help, audience members could be heard shouting, “Yeah!” Take a risk and succeed, and you are a hero. Take a risk and fail, and you are to blame—even if it costs you your life. Risk and blame are the hallmarks of worthy personhood in contemporary American society.

Blame is clearly implicated in power and inequality, as its attribution favors the powerful. But the puzzling question is why people who do not benefit from a system of blame—that is, most Americans—cling so fiercely to its creed. Seeking an answer, I spent several years researching the American working class, the very people whose homes are underwater and whose college debt goes unpaid. I witnessed how blame was deployed in everyday life to solve problems—to anchor the self, judge worthiness, grant dignity, and make sense of failures. In short, I learned that blame is a strategy to make certain what is uncertain.

When jobs are short-term, families are fragile, institutions are hollow, and trust is in short supply, taking sole responsibility for one’s own fate lends a sense of control and meaning. Blame proves a vital mechanism for coping with the chaos, hopelessness, and insecurity that threaten daily to strip our lives of dignity and order. We numb the ache of betrayal and the hunger for connection by reaching for images of ourselves as masters of our own fates.

Self-blame is shored up by a multi-million dollar self-help industry. But its true power lies in its promise that we can will ourselves to happy and successful lives, in its ability to make a virtue out of failure, insecurity, and uncertainty. As a young woman, Kelly, a line cook who has lived on and off in her car, explained, “Life doesn’t owe me any favors. I can have a sense of my own specialness and individuality, but that doesn’t mean that anybody else has to recognize that or help me accomplish my goals.” Those who embrace blame tend to have little empathy for those who cannot pull themselves up by their bootstraps. If I have to go it alone, the logic goes, then everyone else should, too.

As Fried argues, blame is costly, both socially and politically. Blame divides potential communities of solidarity into winners and losers. Even more worrisome, the quest for personal responsibility and the eagerness to blame oneself for failure obscures the larger forces that have weakened our social safety net, our communities, and our families. Doing away with gratuitous blame—directed at others and at ourselves—requires building institutions that restore, carefully and thoughtfully, our collective supply of meaning, trust, and dignity.” ~

Brueghel: Detail of Netherlandish Proverbs

Ours is a highly competitive society where it’s very easy to see oneself as a “loser.” I have suffered from that immensely. Eventually I managed to see the broader context — those “larger forces” that the article merely hints at. I’ve come to redefine “success” — but only after years and years of needless suffering, including suicidal depression.

And it’s true that at least some people who were brought up the harsh way, who felt abandoned and had to cope somehow, appear to lack empathy for the unfortunate. I heard people say, “Nobody helped me, I had to work my way through college, I had to do everything by myself — so why should anyone else have it easy?” Once I heard a woman argue, “There was no birth control when I was young — I either had to take my chances or not have sex. Why should young women today be allowed to have fun and not have to pay for it?”

I find it a peculiar logic: “I had a hard life, so why should anyone have it easy?” It doesn’t follow. But it’s not about rationality: it’s about bottomless RESENTMENT.

But that kind of resentment at least acknowledges external facts such as “I had it rough.” Worse by far is sheer, irrational self-blame. You were told that if only you worked hard enough, “you can be anything you want to be” — and that didn’t happen. The first step toward healing is often seeing that “it’s not your fault.” It really is/was the circumstances. We don’t choose the most important factors: when and where we are born, who our parents are. We don’t choose our genes, or whether we are male or female. Our race and religion are typically decided by geography (location, location!). We don’t choose the huge social and historical forces that mold our lives.

(Speaking of religion, it may help to remember that after forty years of wandering through the desert, Moses didn’t get to enter the Promised Land. What a loser!)

I realize that the issue of free will versus determinism is ultimately unsolvable. I like the saying, “I can do what I want to do, but I can’t choose what I want.” Not that we always know what we want — sometimes that’s the most difficult thing to know, especially in big matters: what is it that we most want in this life? If indeed (as so many assume) we are on this earth to accomplish some purpose, some task, what is it? Most people have absolutely no idea, so they hide behind something like, “I just want my family to be happy.” And then what? “I still haven’t decided what I want to be when my children grow up,” I once overheard a woman say — a woman obviously smart enough to realize that she was no longer young enough not to have “found herself.” But truly, how many people “find themselves”? And do we need to bear the burden of having to figure out what our “real self” is like? What if it’s a fiction, like the “soul mate”?

Some people discover their vocation early: Frank Lloyd Wright knew he wanted to be an architect by the age of nine. But school and the culture at large seem perversely bent on derailing even a strong sense of early vocation — which most young people simply don’t have. They need to be told that’s OK — and that there will be help and guidance for them, rather than blame. “Don’t be so hard on yourself” and “Nobody’s perfect” should be the guiding mottoes for recovering self-blamers. Relax: life itself will guide you, and most people will be supportive.

Karen Horney, one of Freud’s pupils, “thought that both children and adults, overwhelmed by a threatening world, compensated for their anxiety by creating an ideal image of themselves — the ‘idealized self’ — which gradually constituted their sense of who they were. The result was their self-imposed subjection to the ‘tyranny of the should.’ The unending hunt to realize the perfectionist image inevitably trapped them within a ‘pride system,’ which veiled a hidden self-contempt and alienation. Life became a series of hostile inward encounters, with the actual self living in a constant tension, torn between the tyrannical demands of the ideal self and the insistent efforts of the submerged ‘real’ self to express its need for spontaneous growth.” (~ Peter Watson, The Age of Atheists, p. 359)

Except that there is no “real self” either. There is only constant evolution according to circumstances and the stage of life. Relax. Don’t struggle and strive so much. Let go. Go with the flow. Trust your unconscious mind. I can hardly believe that it took me decades to acquire this simple wisdom and undo — never completely, but almost — the damage of self-blame.


On the issue of self-blame — I think there are at least two directions this takes today. One eschews all responsibility — my favorite (?) example of this is those ads for rehab places that state “addiction can happen to anyone, anywhere, anytime,” as though it acts like a contagious disease, or like a lightning strike, and has little or nothing to do with personal choices or personal responsibility.

On the other side, there is the idea that you are responsible for everything, including things like cancer, mental illness, and poverty—if you had just fought harder, worked more, been a better person, and had the determination to “pull yourself up by your bootstraps” you would not be mired in any of these situations. Sort of a secular version of the righteous are rewarded and the unworthy punished: good people enjoy health and wealth, sinners are poor and sick.

At the times I was in the deepest stages of depression I felt to blame for everything that went wrong, in widening circles, starting with my own situation and moving outward, past the boundaries of delusion, until I was responsible, almost, for all the worst things in the world. At that extreme I could recognize the folly of this kind of thinking, even if I couldn’t stop it. I also found my reactions to self-blame were always more on the order of shame than guilt.


Though I lean more toward determinism, I am also keenly aware that anyone who knows me well could say: But look, you’ve decided not to be depressed anymore. You made a CHOICE. And you’ve met several others, both in person and online, who also made that choice, albeit for different main reasons (for two of the women, it was their family — “I didn’t want my family to suffer; I realized I had no right to make them suffer”).

And all I can say is “Ripeness is all.” When there is a confluence of strong influences, a person has no choice, ahem . . . but to make a life-changing choice. Or at least that’s how it feels psychologically — there is the subjective experience of making a choice. I’d better leave the conundrum right here, before it drives me crazy and the imaginary smoke of burnt neural circuits starts coming out of my ears.

Re: blaming yourself for everything. I know it well. Scott Peck makes this distinction: a neurotic feels they are to blame for everything, while a “character-disordered” individual always blames others. Between the two, I’d rather be a neurotic and deal with neurotics than with, say, an alcoholic who was in an accident but keeps claiming it was the other driver’s fault (and similar, and worse).

But if there is a confluence (hah! I like that word) between the two extremes, it’s that even a neurotic, who thinks everything is her fault, also feels that life is rigged against her. If I were still prone to depression, there is no way I could have my knee replacement without brooding about the unfairness of it — how some people run around perfectly pain-free while I’ve had these brutal, barbarous surgeries. And I’d have lots of good cries about it. So, let me celebrate PROGRESS. Better late in life than never. Let me celebrate the phrase “It’s too late in life for . . .”

(And my thanks to the poet Jack Gilbert, who helped me see this when I read a line of his: “It’s too late for discontent.”)

(PS: A minor point, but it’s helped me against self-blame, so perhaps it will help others: I imagined appearing in court, being judged for my various misdeeds. And I realized that in adulthood I’ve never done anything truly awful! The highest penalty I can imagine meriting would be “community service.”)



~ “Octopuses can squirt water at an annoyingly bright bulb until it short-circuits. They can tell humans apart (even those who are wearing the same uniform). And, according to Peter Godfrey-Smith, a philosophy professor at the University of Sydney and the City University of New York, they are the closest creature to an alien here on earth.

That’s because octopuses are the most complex animal with the most distant common ancestor to humans. There’s some uncertainty about which precise ancestor was most recently shared by octopuses and humans, but, Godfrey-Smith says, “It was probably an animal about the size of a leech or flatworm with neurons numbering perhaps in the thousands, but not more than that.”

This means that octopuses have very little in common with humans, evolution-wise. They have developed eyes, limbs, and brains via a completely separate route, with very different ancestors, from humans. And they seem to have come by their impressive cognitive functioning—and likely consciousness—by different means.

Octopuses display signs of curiosity, and Godfrey-Smith believes it’s extremely likely that they’re conscious beings. “I think the exploratory behaviors, the fact that they attend to things, they have good eyes, they evaluate, are little bits of good evidence that there’s something it’s like to be an octopus.”

Part of this is impressionistic; Godfrey-Smith acknowledges that they simply look like intelligent, conscious creatures. But they also perform certain tasks that are known to be conscious in humans. “Dealing with novelty, when you attend to a novel thing, is always conscious in humans,” he adds.

Given the distant common ancestry between octopuses and humans, conscious octopuses would mean that consciousness has evolved on earth twice. Godfrey-Smith believes it’s plausible that there are more than two branches of evolution where consciousness independently developed.” ~



Between one grey morning hour
And the next
The TV marine biologist
Shocked me to wonder
At the octopus
Drawn to a pattern
So unlike our own
Solving problems
With its nine brains
With its blood
With changes in the color
And texture of its skin
Learning maybe changing
In ways that seem
More than reflex
Or instinct can explain
Small and beautiful
And poisonous
Or a steaming heap
On someone’s dinner plate
A different answer
To the same long question
True as our own

We really know so very little about consciousness in other creatures; we have only begun to understand our own. And nature is so fond of redundancy that it would be foolish to assume we are the only self-conscious beings in this world. That assumption is part of the old idea that we are the pinnacle of creation, put here by a Divine Creator to take dominion over the earth and all its creatures, which were put there for our use and entertainment, with no purpose of their own.


This reminded me that in ancient Judaism a woman had no soul, while a man had one soul on weekdays and two souls on the Sabbath. Or so I was told. In any case, even in the developed world a lot of men (most?) seem to assume that the purpose of women is to serve the needs of men. So we don’t even have to go to the animal kingdom for an example of dominionism.


On August 11 there was a concert at the Gniezno Cathedral (near Poznań). Around 8:30, lightning struck. The lights went out, and some stained glass shattered and fell out. By the light of a few candles, the musicians played on and finished the concert. Then one of them played Vivaldi’s Four Seasons on the accordion. No one in the audience left either.


On the clouds and weather: we are getting smarter, and going in when storms threaten. Apparently Florida is some kind of lightning-strike capital; there have been at least three local lightning fatalities and one major fire here in the last few weeks. I still love the energy of storms, and the wonderful sunsets that often follow.


I too love the energy of violent weather. Sure, safety first — I’m glad you’re taking shelter. There’s nothing like watching a great thunderstorm while safe and dry. Or even with some degree of risk remaining, since nothing is 100% free of risk . . . I would have loved to be there at that concert in the Gniezno Cathedral.


ending on a baby elephant: