Saturday, May 27, 2017


Andrea Mantegna: Children Playing with Masks, 1495. Note the child's arm stuck out like a tongue. That's perhaps the most creative part of this unusual work.

My grandmother’s story before it’s forgotten: her parents dying
the father first. When the widow realizes the disease will take
    her too
she walks from house to house, sails from island to island
with her daughter. “Who can take care of Maria?”
A strange house on the other side of the bay takes her in.
They could afford to do it. But the ones that could afford it
    weren’t the good ones.
Piety’s mask cracks. Maria’s childhood ends too soon,
she’s an unpaid servant, in perpetual coldness.
Year after year. Perpetually seasick behind the
long oars, the solemn terror
at the table, the expressions, the pike skin crunching
in her mouth: be grateful, be grateful.
                    She never looked back.
But because of this she could see The New
and seize it.
Break out of the bonds.

I remember her, I used to snuggle against her
and at the moment she died (the moment she passed over?) she
    sent out a thought
so that I, a five year old, understood what had happened
a half an hour before they called.

I remember her. But in the next brown photo
someone I don’t know —
by the clothes from the middle of the last century.
A man about thirty, the powerful eyebrows,
the face that looks me right in the eye
whispering “Here I am.”
But who “I” am
is someone no one remembers any more. No one.

TB? Isolation?

Once he stopped
on the stony, grass-streaming slope coming up from the sea
and felt the black blindfold in front of his eyes.

~ Tomas Tranströmer, from Baltics, VI
translated by Samuel Charters

The last section brings us the story of the poet’s grandmother: her widowed mother, sick with TB, going from house to house, island to island (the setting is the Stockholm Archipelago), trying to find foster parents for her daughter. Then childhood’s painful end with the family who treated her like an unpaid servant. Finally, the closeness between the grandmother and the poet as a child — at the moment of her dying, that mysterious communication between the two.

Countless people have told a similar story about knowing exactly when someone they loved died. The speaker isn’t sure if death is a “passing over” to another realm of consciousness. Perhaps all we can say is that while we are alive, up to the last moment, our minds are interconnected in more ways than we know.

Then we come to the brief vignette of a stranger whom no one remembers, no one (Tranströmer’s emphatic repetition). Then the masterful lines about how that man knew death was coming for him, with the image of the black blindfold, as if before execution by a firing squad. But let’s not forget the single line before that passage: “TB? Isolation?”

The two are equated. The poet implies a person can die of isolation, and I suspect that at least some old people die precisely of that. We are social animals, and we absolutely need a signal from others that says “live!” to the cells of our bodies. We must feel needed, appreciated, cherished — dare we say “loved”?

“Needed” is probably sufficient. Needed at least by our cat and houseplants. That’s why a handful of nursing homes allow pets and houseplants, though this is still regarded as a radical, experimental practice. No one denies the benefits, but they are set aside because animals and plants mean more bother. Affection? The purring of a cat? Forget it. Make them take antidepressants.

But I digress. The important step at this point is to return to the beginning of this section, where the speaker says: “My grandmother’s story before it’s forgotten: . . . ” I think this is the paradox of being human: the poet appears almost desperate to save his grandmother’s story from oblivion, so she doesn’t become like the stranger whom no one remembers anymore.

But is being forgotten a tragic fate, or merely a universal one? Kings and queens, heroes, great artists, great discoverers — those are rare exceptions that indeed confirm the rule — and with time, their names too begin to fade. What seems to be left is our awareness that great multitudes of people lived before us. And here is what they say: I lived, I loved, I had my dreams, my joys, my sorrows. And we can nod, and perhaps remember, “No man is an island.” We are connected to them. It doesn’t matter if the connection is anonymous.

On further thought, I think Tranströmer is fully aware that even though he tells his grandmother's story, it will be quickly forgotten anyway. He doesn't have Shakespeare's (and various other poets') illusions about being able to make someone “immortal.” But for a moment he makes us care about the grandmother, and that moment of affection is enough. I also feel affection for her mother, who knew she was dying and was no doubt barely holding on, but tried so hard to find an adoptive family for her daughter. I can imagine little details here: trying to make her daughter look pretty in a little bonnet, or maybe a ribbon in her hair — all that caring, while dying.

So ultimately we are moved by something universal: a mother’s love. No matter how bad the foster parents turned out to be, Maria understood that her sick mother tried to make sure her daughter would not starve, and held on to life until she could be sure of that. No hardship and humiliation would stop her. I think Maria preserved the memory of her mother’s love, and that memory was her salvation — it prevented her spirit from being broken. She preserved her self-worth and was able to form a positive vision of a future self.

I think what makes us kind or unkind is almost never religion, but mainly the kindness or unkindness we received in childhood — and later on. Receiving sufficient love (even if abuse happens later on) is a kind of vaccine against the child’s growing up to become an abuser in his or her turn.

Grinda, part of the Stockholm Archipelago

One of the most reliable sex differences in reactions to marriage is who files for divorce. This difference has been documented at least as far back as 1867, and it still holds today in places such as Europe, Australia, and the U.S. Who is more likely to walk away from a marriage? Women. They initiated about 62 percent of divorces in the U.S. in 1867; the figure is now close to 70 percent.

Some marriages end with the death of a spouse, and that can be deeply distressing for both men and women. There are indications, though, that women adapt faster to bereavement than men do.

Once a marriage ends, women are much less likely than men to try marriage again. Rates of remarriage are almost twice as high for men as for women. Some of that can be explained by sex ratios more advantageous for men who want to remarry than for women, but that is unlikely to be the entire explanation for such a big difference.

There are also some indications that women savor their solitude more than men do. When asked whether they enjoy their time alone, women are more likely than men to say that they do.

Porsche introduced its first electric car as early as 1898! Cleaner, quiet, easy to start — alas, the electric car was favored by women, so it became stigmatized as a “woman's car.”

An electric car, England, 1896. The first practical electric car was invented in 1884.


~ “Eighty! I can hardly believe it. I often feel that life is about to begin, only to realize it is almost over . . . One’s reactions are a little slower, names more frequently elude one, and one’s energies must be husbanded, but even so, one may often feel full of energy and life and not at all “old.” Perhaps, with luck, I will make it, more or less intact, for another few years and be granted the liberty to continue to love and work, the two most important things, Freud insisted, in life.

My father, who lived to 94, often said that the 80s had been one of the most enjoyable decades of his life. He felt, as I begin to feel, not a shrinking but an enlargement of mental life and perspective. One has had a long experience of life, not only one’s own life, but others’, too. One has seen triumphs and tragedies, booms and busts, revolutions and wars, great achievements and deep ambiguities, too. One has seen grand theories rise, only to be toppled by stubborn facts. One is more conscious of transience and, perhaps, of beauty.

At 80, one can take a long view and have a vivid, lived sense of history not possible at an earlier age. I can imagine, feel in my bones, what a century is like, which I could not do when I was 40 or 60. I do not think of old age as an ever grimmer time that one must somehow endure and make the best of, but as a time of leisure and freedom, freed from the factitious urgencies of earlier days, free to explore whatever I wish, and to bind the thoughts and feelings of a lifetime together.” ~

But what about dying, you may ask. Writing two years before his death of cancer (and he did know at this point that he had a nasty kind of cancer — in remission, but bound to recur) Sacks seems unperturbed. He hopes to die “in harness,” working to the end. “I have no belief in (or desire for) any post-mortem existence, other than in the memories of friends and the hope that some of my books may still ‘speak’ to people after my death.”

(I’ve lost the link, but this probably appeared in the New York Times)

Campanula X-rays


This is how I felt in my late teens and twenties

 Artist: Luthyen


~ “That the weak liberal parties dominated the new [Kerensky-led] government was to be expected. What worried Lenin were the reports he was receiving that his own Bolsheviks were vacillating over the way forward. Theory had bound them, together with most of the left, to the Marxist orthodoxy that, at this stage, the revolution in Russia could be only bourgeois-democratic. Socialism was possible only in advanced economies like Germany, France or even the United States, but not in peasant Russia. (Leon Trotsky and his band of intellectuals were among the few dissenters from that view.)

Since the course of the revolution was thus preordained, all that socialists could do was offer support to the provisional government as it carried through the revolution’s first phase and developed a full-fledged capitalist society. Once this was completed, then they could agitate for a more radical revolution.

The Bolshevik slogan that embodied his tactical thinking was “peace, land and bread.” As for the revolution, he now argued that the international capitalist chain would break at its weakest link. Winning over the Russian workers and peasants to create a new socialist state would pave the way for an insurrection in Germany and elsewhere. Without this, he argued, it would be difficult to build any meaningful form of socialism in Russia.

From February to October, arguably the most open period in Russian history, Lenin won over his party, joined forces with Trotsky and prepared for a new revolution. The provisional government of Alexander Kerensky refused to withdraw from the war. Bolshevik agitators among the troops at the front assailed his vacillations. Large-scale mutinies and desertions followed.

Within the workers’ and soldiers’ councils, or soviets, Lenin’s strategy began to make sense to large numbers of workers. The Bolsheviks won majorities in the Petrograd and Moscow soviets, and the party was developing rapidly elsewhere. This merger between Lenin’s political ideas and a growing class consciousness among workers produced the formula for October.

Far from being a conspiracy, let alone a coup, the October Revolution was perhaps the most publicly planned uprising in history. Two of Lenin’s oldest comrades on the party’s central committee remained opposed to an immediate revolution and published the date of the event. While its final details were obviously not advertised beforehand, the takeover was swift and involved minimal violence.

That all changed with the ensuing civil war, in which the nascent Soviet state’s enemies were backed by the czar’s former Western allies. Amid the resulting chaos and millions of casualties, the Bolsheviks finally prevailed — but at a terrible political and moral cost.

The choice that followed the revolution of October 1917 was thus not between Lenin and liberal democracy. The real choice was to be determined instead by a brutal struggle for power between the Red and White armies, the latter led by czarist generals who made no secret that if they won, both Bolsheviks and Jews would be exterminated. Pogroms carried out by the Whites saw entire Jewish villages wiped out. A majority of Russian Jews fought back, either as members of the Red Army or in their own partisan units. Nor should we forget that a few decades later, it was the Red Army — originally forged in the civil war by Trotsky, Mikhail Tukhachevsky and Mikhail Frunze (the former two killed later by Stalin) — that broke the military might of the Third Reich in the epic battles of Kursk and Stalingrad. By then, Lenin had been dead for almost two decades.

A White propaganda poster

Weakened by a stroke for the last two years before he died in 1924, Lenin had time to reflect on the achievements of the October Revolution. He was not happy. The Revolution had to admit its mistakes and renew itself, he believed; otherwise, it would fail. Yet this lesson went unheeded after his death. His writings were largely ignored or deliberately distorted. No subsequent Soviet leader emerged with Lenin’s vision.

“His mind was a remarkable instrument,” wrote Winston Churchill, no admirer of Bolshevism. “When its light shone it revealed the whole world, its history, its sorrows, its stupidities, its shams, and above all, its wrongs.”

Of his successors, neither of the notable reformers — Nikita Khrushchev in the 1950s and ’60s and Mikhail Gorbachev in the 1980s — had the capacity to transform the country. The implosion of the Soviet Union owed almost as much to its degraded political culture — and, at times, the ridiculous deficiency of the bureaucratic elite — as it did to the economic stagnation and resource dependency that set in from the 1970s. Obsessed with mimicking the technological advances of the United States, its leaders cut the ground out from beneath their feet. In the revolution’s final, sorry chapter, not a few of its bureaucrats rediscovered themselves as millionaires and oligarchs — something Trotsky had predicted from exile in 1936.
In the national-conservative Russia of its president, Vladimir V. Putin, there are no celebrations this year of either the February Revolution or the October one.

“After their death,” Lenin wrote of revolutionaries, “attempts are made to convert them into harmless icons, to canonize them, so to say, and to hallow their names to a certain extent for the ‘consolation’ of the oppressed classes and with the object of duping the latter.” After his death, against the cries of his widow and sisters, Lenin was mummified, put on public display and treated like a Byzantine saint. He had predicted his own fate.


Lenin was actually flexible and pragmatic — for instance, he allowed limited capitalism (the NEP, later abolished by Stalin).

I think the article points out something obvious that generally escapes attention: it wasn't the Revolution per se that caused most casualties, but the Civil War that followed. The carnage can ultimately be traced to Kerensky’s failure to withdraw from WWI. To be sure, he was under tremendous pressure from the Western allies. And of course now we have the wisdom of hindsight.

Russia's Civil War: American Troops near Vladivostok, 1918. “The experience in Siberia for the soldiers was miserable. Problems with fuel, ammunition, supplies and food were widespread. Horses accustomed to temperate climates were unable to function in sub-zero Russia. Water-cooled machine guns froze and became useless.” (Wiki)

“. . . most men and women will grow up to love their servitude and will never dream of revolution.” ~ Aldous Huxley, Brave New World 


~ “The hardest thing I had to give up when I left Christianity was the concept of heaven. To be sure, I do think that existence would eventually be boring regardless of the state I found myself in — eternity is a long time to live (although I’ll admit — as a believer I thought this wouldn’t really be an issue). But there was also the disappointment that there were many family members and friends I would never see again.

I have to say — I think this is one of the reasons some people I know cling to religion so tightly. It’s this tenuous connection to the dead that gives them a vested interest in making sure belief in God and heaven and the rest is protected.

The easiest thing to give up, by contrast, was the concept of hell. Whenever I bring this up, many Christians insist that the image I have of hell isn’t accurate — in my experience, there are more attempts to sanitize the concept of hell than any other concept in the Bible. I think this speaks to how uncomfortable the basic concept is — it’s unpleasant to think that some people are going to heaven while others are going to hell (regardless of the way you define “hell”).

The most common thing I hear from Christians is that people shouldn’t worry about hell. God’s going to be just and it’ll all make sense when we die. But even the trust that it’s OK for this being to send some people to hell based on his own judgment (which is supposedly just, whatever it is) is disturbing — especially since the judgment of this supposed being doesn’t have to answer to anybody.

The encouragement to trust God’s judgment above our own experience and sense of empathy is, if God doesn’t exist, an encouragement to trust a fictional being created by a few people — and to let that trust trump our actual real-life experiences and relationships.

During my last few years as a Christian, this approach of “just trust God’s judgment” wasn’t enough for me. I was always trying to look behind the curtain. Also, I was beginning to trust my own love and valuation of other people so much that God’s supposed opinion of how worthy they were to go to heaven began to matter less and less. The refrain, “Oh, if you saw it from God’s view it would all make sense,” was an increasingly difficult position to take … my own empathy and trust in the beauty of other people I knew began bleeding through, and I began to see that the Bible — and the God in it — had less and less to do with the empathetic view of people that was growing in my heart.

And I eventually came to see that, although people are occasionally wrong, and although they do malicious things once in a while … no one deserves eternity in hell. So it’s a real relief not to believe in it.

And it’s really nice to live for the world that actually exists without feeling as if this life will be judged by a higher power.

It’s simpler. I enjoy life as it comes, embrace people who make this life better, and more soberly and honestly appreciate the lives of those who have gone before, knowing that there is only one me and that, for a while, I contributed another verse to existence in a unique way that no one can replace and that doesn’t seem likely to come around again.

Leaving the concept of hell has given me a lot of peace. Asking me if I miss the concept of heaven (as an abstractly beautiful place — I wouldn’t want to spend a moment with the God of the Bible up there) is like asking me if I wish I could believe there were a million dollars in my bank account. Sure, it would be nice to see a couple of buddies who have passed on after I die. But too much indicates that isn’t the case — that death is simply a part of existence, simply “there.” This honesty makes it easier to live with myself and with others; there’s not this nagging view that I’m starring in my own Truman Show. Things are more peaceful, in a way, when I don’t have to twist my mind to think things that don’t seem real… leaving a lot more brainspace to get to the business of living a life that is, at least to me, much more authentic.” ~

Tintoretto: Paradise (detail). Isn’t this the most repulsive image of paradise imaginable? 


Love and work: I need no other heaven. But then I remember beauty. I need beauty to survive, or life is not worth living.

“This I teach men: no longer to bury one’s head in the sand of heavenly things, but to bear it freely, an earthly head, which creates a meaning for the earth.” ~ Nietzsche

To turn to something more fun, here is “Devil carrying his soul basket”; Holkham Bible, England c. 1320-1330. Images of hell are a lot more common in religious art than images of heaven. Note all the extra faces. Note also that one of the souls is a monk — so people did not automatically assume that monks were holy. 

~ “It was on my 15th birthday in the Summer of 2005 in Northern Morocco that, by chance, I got my hands on a series of online articles about the European Enlightenment. What I read was as transfixing as it was transformative, and marked the beginning of a radical change in my life.

It was particularly impactful to learn that until the 18th century, Europe — now a developed and free continent — had been characterized by the same religious dogmatism, sectarianism, and attacks on free expression that today underpin many Muslim states. It was encouraging to read that those philosophers of Europe on the front lines of the struggle for freedom — advocates of individual liberty, intellectual openness, and the eradication of religious oppression — had accomplished so much while being so few. They were lone voices without popular support, suffering from persecution, and living in exile; akin to many secularists and intellectuals in the Muslim world today.

This Europe, with its literature, philosophy, and revolutionary heritage, was a source of inspiration. I was a young Moroccan who had a complicated history of religious education — obliged to leave school and join a Salafi madrasa at the age of 14 — but reading the works of Enlightenment intellectuals served as the catalyst of my own personal enlightenment.

The views of Spinoza on religion as an organized dogma, and the courage of Voltaire in the face of religious persecution, or Diderot and his belief in the importance of science and reason are particularly relevant and speak to the present challenges of the Islamic world.

To understand such fascination with the European Enlightenment in the Muslim world, we must be cognizant of the historical context of both.

Today, in France, Germany, and England, state-sponsored religious persecution is unheard of. One need not fear repercussions for speaking one’s mind or practicing one’s religion as one sees fit, or not at all.

Meanwhile, the situation in the Muslim world is radically different from today’s Europe, but at the same time very much akin to the Europe of Spinoza and Denis Diderot. The issues with which these thinkers grappled continue to pose a serious challenge to many around the world, anywhere from Tangier to Jakarta.

I encountered Europe in person in the Spring of 2011, when I arrived in Geneva as a political refugee. It was a great shock to discover that the Europe of the enlightenment — the Europe that I read about in the books that had moved me to write and fight for freedom — had ceased to exist. While it may still exist geographically — you can still see it and you can visit it — you can no longer unconditionally immerse yourself in its ideas or experience the values and humanist principles it was founded upon.

It is now another Europe. It is a Europe where artists and writers must censor themselves in fear of death threats. A Europe where making caricatures of Jesus is considered freedom of speech, but drawing Mohamed is hate speech. A Europe where many liberals and feminists bury their heads in the sand when faced with the suffering of apostates, women, and minorities in the Islamic world. At the same time, far-Right populists exploitatively portray themselves as the new voice of freedom and enlightenment values, while their actions demonstrate an utter rejection of these principles.

I was not surprised to see a French documentary about supporters of the far-Right National Front political party, who seemed to crave a return to the pre-enlightenment era, with one middle-aged man stating on camera that “I want Louis XIV to be back.”

It is equally unsurprising to see radical Islamists in Europe calling for Sharia law and demonizing the secular pluralist societies in which they live freely, while being supported by many liberals who confuse criticism of Islam with anti-Muslim bigotry or hate toward Muslims.

This betrayal of religious and sexual minorities in the Muslim world by some of those who identify as Leftists is disheartening. Their characterizations of ex-Muslims, feminists and liberals in the Islamic world as being euro-centrist or traitors of their own tribe are condescending and reek of ideological paternalism. However, there have also been many reasonable voices within the Left who don’t simply blame all the current misery in the Muslim world on Western foreign policy, but acknowledge and denounce the role of violent theocratic regimes such as in Iran and Saudi Arabia, and their responsibility in sponsoring terrorism and promoting the ideology of extremism.

So, has Europe ceased to exist? Perhaps the Europe which once inspired me never existed in the first place, and I am just confronted with the harsh realities of a continuing struggle towards enlightenment. Perhaps, in a moment of deficit, I was simply projecting my hopes and dreams onto my favorite philosophers and their books. Or maybe it did exist and still does; as long as I can speak my mind freely, as long as I can criticize the Left, the Right, and the Islamists without fear of persecution or jail, and as long as I am not taking my freedom for granted, Europe, this Europe which inspired me, will always exist.” ~

Somewhere in England, of course. It’s so easy to imagine the stately matron saying, “How do you do?”

"THE BIO-HUMANISM FALLACY: The assumption that you could eliminate our cultural infrastructure and people would remain civilized, as though human progress is now in our DNA, inborn in our bones.

When people say that Trump is a baby there's more to it than they notice. Without civilizing cultural constraints, we are all Trump at birth.

We don't fall from grace, we are cultured up from slime ball. Beastly behavior is largely our default state. The crass inhumanity of the dark past — that's still us, without culture to straighten us out.

The Bio-humanism fallacy is related to the illusion of autonomy. We feel free to be individuals not because we're less dependent on society but because the society we depend upon has become so reliable that we don't notice its effects." ~ Jeremy Sherman, Facebook


For the most part, I agree with this — and the movie “Lord of the Flies” was an unforgettable statement contradicting this fallacy (although we also need to remember that the movie starts out by presenting the boys’ attempt at fairness and democracy). In the history of humanity, there has been no mythical Golden Age, no paradise and fall from grace, but only a steady struggle to become less aggressive and more cooperative and civilized. So, did we really begin as slime balls whose progress toward civilization is an incomprehensible miracle?

No — and this is huge. No because of the neurobiology of the social species. We seem born pre-wired with a sense of fairness and with rudimentary empathy. We are the most social of the social species, and one most dependent on cooperation.

Alas, we are also wired for the in-group, out-group bias — which has led to many increasingly lethal conflicts, especially as family clans gave way to tribes, and tribes to nation-states and empires (by any other name). Thus, the naval officer at the end of Lord of the Flies instantly restores civilization by his mere presence — but what’s this warship we see anchored nearby? Even if that particular war was fought by the Allies in defense of civilization, the overall failure of humanity is blatantly obvious. (But we are inching forward, Pinker reminds us, and he has statistics to prove it.)

I wish everyone would become familiar with the Yale Baby Lab studies on empathy and the sense of fairness in pre-verbal infants — as well as the in-group, out-group bias. We, the most social of all species, don’t start as slime balls, but are in fact pre-wired for empathy and fairness and cooperation, as is true not just of primates but of elephants, wolves, dolphins, and several other species.

It's chiefly the out-group bias that can lead to atrocities. Kindness and fairness are in constant struggle against bullying and might-makes-right. Taking responsibility is a learned behavior — seeking excuses and trying to find scapegoats may seem “only natural.” I agree that it takes a lot of civilizing effort on top of the pre-wiring to end up with a kind, truly cooperative human being. And survival stress must not be too high, or stress of any kind, really, so that the cortex can perform its more subtle tasks.

At the same time, sure, survival comes first and a baby must scream if he perceives danger, without worrying about how annoying the screaming may be to others.

But people-pleasing starts early too, and may lead to excessive altruism. The complexities of the human situation are truly enormous. 



~ “Ah, to sleep, perchance … to shrink your neural connections? That’s the conclusion of new research that examined subtle changes in the brain during sleep.

The researchers found that sleep provides a time when the brain’s synapses — the connections among neurons — shrink back by nearly 20 percent. During this time, the synapses rest and prepare for the next day, when they will grow stronger while receiving new input — that is, learning new things, the researchers said.

Without this reset, known as “synaptic homeostasis,” synapses could become overloaded and burned out, like an electrical outlet with too many appliances plugged in to it, the scientists said.

“Sleep is the perfect time to allow the synaptic renormalization to occur … because when we are awake, we are ‘slaves’ of the here and now, always attending some stimuli and learning something,” said study co-author Dr. Chiara Cirelli of the University of Wisconsin-Madison Center for Sleep and Consciousness.

“During sleep, we are much less preoccupied by the external world … and the brain can sample [or assess] all our synapses, and renormalize them in a smart way,” Cirelli told Live Science.

Sleep is the price people pay for brains that are able to keep learning new things, the researchers said.

Russell Foster, director of the Sleep and Circadian Neuroscience Institute at the University of Oxford in the United Kingdom, who was not associated with the study, called it a “very nice, clear piece of work.” The findings support the notion that sleep is necessary for the consolidation of memories and thus learning, Foster said.

In a paper published in 2003, Cirelli and her colleague Giulio Tononi hypothesized about sleep’s role in the growth of synapses, which serve as avenues to ferry information among neurons. Synapses are constantly strengthening, or widening, during the day to accommodate the flow of traffic as the brain soaks up new experiences. But that strengthening cannot go on indefinitely, or else the synapses will become saturated — think “information overload.”

To find evidence for this, the researchers used a new form of electron microscopy that can discern the minuscule changes in the shrinking and subsequent expansion of these microscopic synapses at the nanometer level in mouse brains. They found that a few hours of sleep led to an 18 percent decrease in the size of the synapses on average.

Cirelli said that one interesting finding was that this pruning occurred in about 80 percent of the synapses but spared the largest ones. These larger synapses may be associated with the most stable and important memories, connections the brain does not want to lose, the researchers speculated. Yet, the way in which the brain decides what synaptic connections to prune is another mystery to explore, Cirelli said.
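The logic of the findings — synapses strengthen all day, then during sleep shrink by about 18 percent on average, with roughly 80 percent of them pruned while the largest are spared — can be caricatured in a few lines of Python. This is only an illustrative toy under my own assumptions (uniform random weights, a hard 20% cutoff for the spared synapses, function names invented for the sketch), not the researchers’ actual model:

```python
import random

def wake_phase(weights, n_events=1000, boost=0.01):
    # Daytime learning: random synapses are repeatedly potentiated,
    # so total synaptic weight drifts upward.
    for _ in range(n_events):
        weights[random.randrange(len(weights))] += boost
    return weights

def sleep_phase(weights, spare_fraction=0.2, scale=0.82):
    # Sleep-time renormalization: scale the weaker ~80% of synapses
    # down by ~18%, leaving the strongest spare_fraction untouched
    # (the "most stable and important memories" of the article).
    cutoff = sorted(weights)[int(len(weights) * (1 - spare_fraction))]
    return [w if w >= cutoff else w * scale for w in weights]

random.seed(1)
w = [random.uniform(0.1, 1.0) for _ in range(100)]  # 100 toy synapses
w = wake_phase(w)
before = sum(w)
w = sleep_phase(w)
after = sum(w)
print(f"total synaptic weight: {before:.1f} -> {after:.1f}")
```

The point of the sketch is simply that without the sleep step the total weight only ever grows, while the selective downscaling resets it each “night” without erasing the strongest connections.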

“It is critical to have pruning back at night, so that the huge amount of information encoded by temporary synapses during the day won’t overwhelm the brain,” said Foster. “Pruning ensures that only the most important information is retained.”

Foster said he can envision follow-on experiments based upon the Cirelli-Tononi work that would use mouse models to explore the connections among circadian rhythms (the body’s “internal clock”), sleep, synapse pruning and psychiatric disorders. Some of the key features of these disorders seem to be a disruption in neural circuitry, sleep disruption, and impaired cognition and memory, said Foster.

Foster added that resetting synapses may be a core feature of sleep, particularly for humans, with their advanced cognitive abilities compared to other animals. However, pruning is likely to be just one of many essential functions that takes place during the sleep phase, a period during which the body takes advantage of physical inactivity to perform a range of essential housekeeping activities, he said.” ~

ending on beauty

And then I rose
in the dazzle of light, to the pine trees
plunging and righting themselves in a furious wind.

To have died and come back
raw, crackling,
and the numbness

That clumsy
pushing and wheeling inside my chest, that ferocious
upturn —
I give myself to it. Why else
be in a body?

~ Chana Bloch (1940-2017), Afterlife

photo: Susan Rogers

Sunday, May 21, 2017


Sunset in Queensland


Again workers and schoolchildren
three-deep along the boulevards:
this time it was not
a cosmonaut
nor the Ethiopian emperor —
it was Brezhnev

who stood,
monolithic three-quarters profile,
in a long, open black car,
next to the nervous host —
himself a first secretary,
but how slight!
And Brezhnev an impassive mound.
His eyebrows
underlined his hat.

He stood heavy, rotund,
caped in a black coat.
Now and then his pale
pudgy hand
flopped slowly up and down
like a disturbed mollusk;
he did not bother to smile.

We stomped our feet
in the chill;
at his passage, when signaled,
feebly clapping.

For news and documentaries,
they used a soundtrack
with hurrah applause
and shouts of  long-live.
His huge dark back
took over the screen.

~ Oriana

When Brezhnev came to Warsaw, I was one of the thousands of schoolchildren dragged out to applaud his motorcade. Brezhnev scowled — he didn’t bother to smile at the unimportant Poles. Good optics were reserved for Americans.

Now we see this in reverse: our pouty POTUS is all smiles for the Russians! Grinning ear to ear! But someone commented that he looks like someone laughing without understanding the joke, and that Kislyak’s grin, by contrast, is a sly, mission-accomplished expression of satisfaction.

 Trump with Russian Ambassador Kislyak; American reporters were not allowed into the Oval Office, but Russian state-agency reporters were. 

 Trump, Kislyak, and Foreign Minister Lavrov

~ “They are of the same cloth, Putin, Lavrov, Trump — people who know what's what and what life is all about, why it’s all about money and power and not all that other crazy ephemeral loser talk about democracy, human rights and all that silly nonsense. Life’s about winning, and everything that’s not winning is losing. They are winners, and everyone else is losers.” ~ M. Iossel (who grew up in Leningrad and lived in Russia until the age of 30)


A minor point next to the lack of ethics, but I want to squeeze it in somewhere. His food addiction is a kind of substitute for alcoholism. I've seen this pattern in families. And sometimes an alcoholic switches to a food addiction.


~ “Me: “This man openly boasted of sexual assault, and gives away state secrets like candy to visitors.”

Them: “Oh yeah? Well, Bill Clinton was a serial adulterer and Hillary gave away secrets of her own.”

Me: “So you agree with me that such things render a person unfit for the office?”

[abrupt change of subject]

Even if we were to grant their point, it would be entirely beside the point. Their response fails to invalidate legitimate charges against the person they're defending, but it never occurs to them that they didn't actually say anything that disproved the charges. They're simply admitting they're willing to live with the flaws of their own guy, and they want to get the attention on to something else instead.

That's changing the subject, which is what you do when you don't have a good response that actually deals with the topic under consideration. You just hope no one notices what you did.” ~ Neil Carter


I did notice that the use of the “Tu Quoque” fallacy is the main tactic of the Trump camp. They don’t defend Trump’s actions. Instead, they say: “But Obama . . . But the Democrats . . . But Clinton . . .” — it’s extremely predictable. Trump fans seem to believe that the best defense is offense. Sometimes they attack you directly, but most often they attack Obama.

The worst strategy is to respond by pointing out that Obama didn’t do X, or that if he did, it was in a totally different context, or wasn’t anywhere near as bad, etc. Now they’ve got you: you are on the defensive, and Trump’s misconduct disappears from the discussion. Being aware of the logical fallacy, and quick to point it out, prevents being hijacked in this manner.

I admit I’ve fallen into this “But Obama! But Hillary!” trap a gazillion times. Bring up any blatant wrongdoing by Trump, and you’ll immediately hear that Obama (or Clinton, or the Democrats — even going back to LBJ) did something much worse (often involving stuff you never heard of, part of the disinformation spread by Fox News and right-wing talk radio). If you “bite” and start talking about Obama or whoever, you’re a goner — a punching bag in what’s about to turn into verbal battery, accused of being totally biased and “unwilling to consider the other side.”

(And don’t even think of bringing up right-wing terrorism; you immediately get clobbered with “but Stalin.” This isn’t just “big guns” — this is verbal nuclear warfare. There is no winning against a fallacy, but at least you can be aware and refuse to engage.)

To shift to another fallacy, this is somewhat like the Christian apologists saying, “Prove to me that god doesn’t exist. You can’t, can you? Got you, hah-hah!” That particular fallacy is the demand to “prove the negative.” You are not required to prove that an invisible teapot isn’t orbiting the moon, or that pink unicorns don’t exist somewhere in the universe (the universe is so vast, how could you be so arrogant as to assume they don’t exist somewhere?). The burden of proof is on the apologists. It’s a completely different fallacy than “tu quoque,” but it helps tremendously to know that it IS a fallacy, with a neat Latin name. Logic should be a required class even before college.

“It is not about the economy, for Trump supporters. It is about the legitimization of their inner darkness.” ~ M. Iossel

I remember one distinct time when Trump himself used the tu quoque fallacy. I think it was Bill O’Reilly interviewing him, hardly an “enemy.” The interviewer brought up Trump’s admiration for Putin, then said, “Putin is a killer.” Trump didn’t make the slightest attempt to deny the “killer” label. Instead, he replied with: “And what about us? Are we so innocent?”

Don’t even think of trying to defend America, or at least America as an ideal. Human rights, democracy — forget it. “Tu quoque” — you too. America is just as bad as Russia! And Putin has disappeared from the discussion, as if exonerated. But a fallacy is a fallacy. It helps tremendously to recognize it as such.

This is not about changing the mind of Trump supporters. They are unreachable (but then very few people — possibly none — are open-minded when it comes to politics). This is strictly about self-defense. 


"You know, our politics have become these pure acts of vindictiveness… People who felt like they were being treated cruelly decided to respond with an act of cruelty themselves. Donald Trump is an act of cruelty." ~ Stephen Colbert


~ "While adult psychopaths constitute only a tiny fraction of the general population, studies suggest that they commit half of all violent crimes. Ignore the problem, says Adrian Raine, a psychologist at the University of Pennsylvania, “and it could be argued we have blood on our hands.”

Researchers believe that two paths can lead to psychopathy: one dominated by nature, the other by nurture. For some children, their environment—growing up in poverty, living with abusive parents, fending for themselves in dangerous neighborhoods—can turn them violent and coldhearted. These kids aren’t born callous and unemotional; many experts suggest that if they’re given a reprieve from their environment, they can be pulled back from psychopathy’s edge.

But other children display callous and unemotional traits even though they are raised by loving parents in safe neighborhoods. Large studies in the United Kingdom and elsewhere have found that this early-onset condition is highly hereditary, hardwired in the brain—and especially difficult to treat. “We’d like to think a mother and father’s love can turn everything around,” Raine says. “But there are times where parents are doing the very best they can, but the kid—even from the get-go—is just a bad kid.”

Still, researchers stress that a callous child—even one who was born that way—is not automatically destined for psychopathy. By some estimates, four out of five children with these traits do not grow up to be psychopaths. The mystery—the one everyone is trying to solve—is why some of these children develop into normal adults while others end up on death row.

A trained eye can spot a callous and unemotional child by age 3 or 4. Whereas normally developing children at that age grow agitated when they see other children cry—and either try to comfort them or bolt the scene—these kids show a chilly detachment. In fact, psychologists may even be able to trace these traits back to infancy. Researchers at King’s College London tested more than 200 five-week-old babies, tracking whether they preferred looking at a person’s face or at a red ball. Those who favored the ball displayed more callous traits two and a half years later.

As a child gets older, more-obvious warning signs appear. Kent Kiehl, a psychologist at the University of New Mexico and the author of The Psychopath Whisperer, says that one scary harbinger occurs when a kid who is 8, 9, or 10 years old commits a transgression or a crime while alone, without the pressure of peers. This reflects an interior impulse toward harm. Criminal versatility—committing different types of crimes in different settings—can also hint at future psychopathy.

But the biggest red flag is early violence. “Most of the psychopaths I meet in prison had been in fights with teachers in elementary school or junior high,” Kiehl says. “When I’d interview them, I’d say, ‘What’s the worst thing you did in school?’ And they’d say, ‘I beat the teacher unconscious.’ You’re like, That really happened? It turns out that’s very common.”

Broadly speaking, Kiehl and others believe that the psychopathic brain has at least two neural abnormalities—and that these same differences likely also occur in the brains of callous children.

The first abnormality appears in the limbic system, the set of brain structures involved in, among other things, processing emotions. In a psychopath’s brain, this area contains less gray matter. “It’s like a weaker muscle,” Kiehl says. A psychopath may understand, intellectually, that what he is doing is wrong, but he doesn’t feel it. “Psychopaths know the words but not the music” is how Kiehl describes it. “They just don’t have the same circuitry.”

In particular, experts point to the amygdala — a part of the limbic system — as a physiological culprit for coldhearted or violent behavior. Someone with an undersize or underactive amygdala may not be able to feel empathy or refrain from violence. For example, many psychopathic adults and callous children do not recognize fear or distress in other people’s faces. Essi Viding, a professor of developmental psychopathology at University College London, recalls showing one psychopathic prisoner a series of faces with different expressions. When the prisoner came to a fearful face, he said, “I don’t know what you call this emotion, but it’s what people look like just before you stab them.”

Psychopaths not only fail to recognize distress in others, they may not feel it themselves. The best physiological indicator of which young people will become violent criminals as adults is a low resting heart rate, says Adrian Raine of the University of Pennsylvania. Longitudinal studies that followed thousands of men in Sweden, the U.K., and Brazil all point to this biological anomaly. “We think that low heart rate reflects a lack of fear, and a lack of fear could predispose someone to committing fearless criminal-violence acts,” Raine says.

Or perhaps there is an “optimal level of physiological arousal,” and psychopathic people seek out stimulation to increase their heart rate to normal. “For some kids, one way of getting this arousal jag in life is by shoplifting, or joining a gang, or robbing a store, or getting into a fight.” Indeed, when Daniel Waschbusch, a clinical psychologist at Penn State Hershey Medical Center, gave the most severely callous and unemotional children he worked with a stimulative medication, their behavior improved.

The second hallmark of a psychopathic brain is an overactive reward system especially primed for drugs, sex, or anything else that delivers a ping of excitement. In one study, children played a computer gambling game programmed to allow them to win early on and then slowly begin to lose. Most people will cut their losses at some point, Kent Kiehl notes, “whereas the psychopathic, callous unemotional kids keep going until they lose everything.” Their brakes don’t work, he says.

Faulty brakes may help explain why psychopaths commit brutal crimes: Their brains ignore cues about danger or punishment.

Researchers see this insensitivity to punishment even in some toddlers. “These are the kids that are completely unperturbed by the fact that they’ve been put in time-out,” says Eva Kimonis, who works with callous children and their families at the University of New South Wales, in Australia. “So it’s not surprising that they keep going to time-out, because it’s not effective for them. Whereas reward—they’re very motivated by that.”

This insight is driving a new wave of treatment. What’s a clinician to do if the emotional, empathetic part of a child’s brain is broken but the reward part of the brain is humming along? “You co-opt the system,” Kiehl says. “You work with what’s left.”


Many of the teenagers at Mendota grew up on the streets, without parents, and were beaten up or sexually abused. Violence became a defense mechanism. Caldwell and Van Rybroek recall a group-therapy session a few years ago in which one boy described being strung up by his wrists and hung from the ceiling as his father cut him with a knife and rubbed pepper in the wounds. “Hey,” several other kids said, “that’s like what happened to me.” They called themselves the “piñata club.”

But not everyone at Mendota was “born in hell,” as Van Rybroek puts it. Some of the boys were raised in middle-class homes with parents whose major sin was not abuse but paralysis in the face of their terrifying child. No matter the history, one secret to diverting them from adult psychopathy is to wage an unrelenting war of presence. At Mendota, the staff calls this “decompression.” The idea is to allow a young man who has been living in a state of chaos to slowly rise to the surface and acclimate to the world without resorting to violence.

Caldwell mentions that, two weeks ago, one patient became furious over some perceived slight or injustice; every time the techs checked on him, he would squirt urine or feces through the door. (This is a popular pastime at Mendota.) The techs would dodge it and return 20 minutes later, and he would do it again. “This went on for several days,” Caldwell says. “But part of the concept of decompression is that the kid’s going to get tired at some point. And one of those times you’re going to come there and he’s going to be tired, or he’s just not going to have any urine left to throw at you. And you’re going to have a little moment where you’re going to have a positive connection there.”

Forming attachments with callous kids is important, but it’s not Mendota’s singular insight. The center’s real breakthrough involves deploying the anomalies of the psychopathic brain to one’s advantage—specifically, downplaying punishment and dangling rewards. These boys have been expelled from school, placed in group homes, arrested, and jailed. If punishment were going to rein them in, it would have by now. But their brains do respond, enthusiastically, to rewards. At Mendota, the boys can accumulate points to join ever more prestigious “clubs” (Club 19, Club 23, the VIP Club). As they ascend in status, they earn privileges and treats—candy bars, baseball cards, pizza on Saturdays, the chance to play Xbox or stay up late. Hitting someone, throwing urine, or cussing out the staff costs a boy points—but not for long, since callous and unemotional kids aren’t generally deterred by punishment.

In fact, the program at Mendota has changed the trajectory for many young men, at least in the short term. Caldwell and Van Rybroek have tracked the public records of 248 juvenile delinquents after their release. One hundred forty-seven of them had been in a juvenile-corrections facility, and 101 of them—the harder, more psychopathic cases—had received treatment at Mendota. In the four and a half years since their release, the Mendota boys have been far less likely to reoffend (64 percent versus 97 percent), and far less likely to commit a violent crime (36 percent versus 60 percent). Most striking, the ordinary delinquents have killed 16 people since their release. The boys from Mendota? Not one.

The question they are trying to answer now is this: Can Mendota’s treatment program not only change the behavior of these teens, but measurably reshape their brains as well? Researchers are optimistic, in part because the decision-making part of the brain continues to evolve into one’s mid‑20s. The program is like neural weight lifting, Kent Kiehl, at the University of New Mexico, says. “If you exercise this limbic-related circuitry, it’s going to get better.”

To test this hypothesis, Kiehl and the staff at Mendota are now asking some 300 young men to slide into a mobile brain scanner. The scanner records the shape and size of key areas of the boys’ brains, as well as how their brains react to tests of decision-making ability, impulsivity, and other qualities that go to the core of psychopathy. Each boy’s brain will be scanned before, during, and at the end of his time in the program, offering researchers insights into whether his improved behavior reflects better functioning inside his brain.

No one believes that Mendota graduates will develop true empathy or a heartfelt moral conscience. “They may not go from the Joker in The Dark Knight to Mister Rogers,” Caldwell tells me, laughing. But they can develop a cognitive moral conscience, an intellectual awareness that life will be more rewarding if they play by the rules. “We’re just happy if they stay on this side of the law,” Van Rybroek says. “In our world, that’s huge.” ~


I can imagine B.F. Skinner reading this article with a sense of great satisfaction. Even when brain function is abnormal — whether because of early damage, or for genetic reasons that we don’t really understand — rewards work a lot better than punishment. This is true of normal children as well. Animal training used to be very cruel; these days it’s all about rewards.

I think that perhaps the greatest revolution in the history of humanity has been the slow stepping away from punishment, from cruelty. Child abuse? That used to be just the normal way to bring up children: don’t spare the rod. Until recently, it was never regarded as wrong. Skinner in particular showed us that rewards are truly more effective. You don’t even have to invoke kindness — it’s sheer pragmatism. And to see that this works even with psychopaths, with their lack of empathy and abnormally functioning brains — wow!

  The European white stork (believed to bring luck); Stefano Ronchi


~ “It is a mistake to seek purely secular explanations for Mr Trump’s bond with religious conservatives. For one thing, the president’s rhetoric is steeped in time-worn stories about a Christian nation under siege. He is the latest in a long line of politicians to cast believers as a faithful remnant, under attack from the sneering forces of modernity. More specifically, Mr Trump’s language is filled with echoes of a much-mocked but potent American religious movement with millions of followers, known by such labels as “positive thinking” or the “prosperity gospel”.

To historians of religion, like Kate Bowler of Duke University, when Mr Trump speaks of spiritual matters his words fairly ring with the cadences of prosperity preachers. In an address to graduating students at Liberty University on May 13th, Mr Trump promised his audience a “totally brilliant future”, and said that his presidency is “going along very, very well”. He ascribed both happy observations to “major help from God”. Lots of believers credit God for success, but Mr Trump went further. He described an America in which winners make their own dreams come true. He hailed a 98-year-old in the audience whose death by the age of 40 had been predicted by experts. He praised strivers who speak hopes aloud, ignoring doubters, and growled: “Nothing is easier or more pathetic than being a critic.”

That boosterism would sit happily in a sermon by preachers like Joel Osteen, routinely watched by television audiences of 7m, or Creflo Dollar, the Rolls-Royce-owning pastor of an Atlanta megachurch with 30,000 members. This is no accident. As Ms Bowler explained this month at the Faith Angle Forum, a twice-yearly conference about the interplay of politics and religion, as a young man Mr Trump attended a New York church led by Norman Vincent Peale, a “positive thinker” who also officiated at his first marriage. A prosperity preacher, Paula White, spoke at Mr Trump’s inauguration, despite grumbles about her hard-sell techniques, with worshippers prodded to make such “demon-slaying, abundance-bringing” donations as $229, chosen to honour I Chronicles 22:9, with its talk of Solomon earning respite from “enemies on every side”.

Prosperity preachers are often dismissed by mainstream theologians as pompadoured hucksters (think Oral Roberts, a pioneering televangelist) or as near-heretics, for suggesting that believers can achieve God-like powers over their own health and wealth. But they reflect a Trumpian worldview. “Blessed”, a book about the prosperity gospel by Ms Bowler, describes the fine line between telling boastful untruths and “positive confession”, by which a bankrupt might thank God for an imaginary gusher of money, or a deathly ill congregant might insist that she is already cured, in the belief that naming a desire will bring it about. Like the Trump family, megachurch pastors and their immaculately groomed wives and children are held up as models of divine favor: winners who have found the rungs of an invisible ladder to success.

Prosperity ministries revere celebrity—a Los Angeles church gave Jesus his own star, evoking the ones on Hollywood’s Walk of Fame. The movement has deep roots, stretching back to 19th-century touring mesmerists and Pentecostal healers, and to the Depression-era pastor whose version of Psalm 23 began: “The Lord is my Banker, My Credit is Good.”

It is a theology for self-made men who scorn the idea of luck. God gives him “confidence”, the president bragged last year. That is a very American creed.

Let me repost from the previous blog:


~ “The roots of what has come to be known as “Social Darwinism” can be traced back to the robber baron era in the latter nineteenth century. The idea that the economy of a successful capitalist society amounts to a cut-throat competitive struggle, much like what was supposed to be the case in the natural world, was inspired by the British social theorist Herbert Spencer. In fact, it was Spencer who coined the term “survival of the fittest,” not Darwin.

With Social Darwinist rhetoric, and policy proposals, being much in evidence these days, we should try to set the record straight about Darwin. In fact, Darwin's Darwinism was radically opposed to an individualistic, “nature, red in tooth and claw” political ideology (as the poet Alfred Lord Tennyson described it), especially in social species like honeybees and humankind. In his treatise on human evolution, The Descent of Man, published twelve years after The Origin of Species, Darwin recognized that humans evolved in interdependent cooperative groups, not as isolated individuals, and that cooperation was the key to our success.

Indeed, Darwin attributed our dominant position in nature and our remarkable cultural attainments to our evolved social, moral, and mental faculties, in combination with our language abilities. Following a discussion in The Descent devoted to the role of social behavior in various species, Darwin dealt at length with the subject of “Man as a Social Animal.” He concluded that our morality is a product of the evolutionary process, and he believed that our “social instincts,”  including even our capacity for “sympathy,” “kindness,” and the desire for social “approbation,” are rooted in human nature. The rudiments of these behaviors, he pointed out, can be found in other social species as well.

Darwin's Darwinism was grounded in a more accurate understanding of human nature, and of the circumscribed role of competition in any society. Social Darwinism represents a perversion of Darwin's views. It is time to consign it to the museum of antiquated ideologies.” ~

Pierre de Clausade (1910-1976): Coastal scene


At the same time we also know that if stress is very high during childhood, with survival at stake (if not physical, then psychological survival, e.g. struggling against bullies and/or an abusive parent), conscience becomes a luxury. It may simply never develop. The person ends up defensively grandiose and self-centered, boasting non-stop and striking out with fury, overreacting to minor or imaginary slights. That, of course, is the pathology of narcissism. The majority of humans are mostly cooperative, and that, as Darwin pointed out, has been the source of humanity’s great achievements.


Even Nietzsche, generally seen as a great individualist, saw that genius is neither rare nor isolated; what makes genius appear rare is that it takes just the right assemblage of “five hundred hands” to produce great achievement. First, Nietzsche asks us to imagine a Raphael born without hands; then, broadening the figure, he reminds us it takes many others to make genius possible. Some of those necessary “helpers” may be dead, and most will be unknown; all we can say with certainty is that nothing is solely the accomplishment of a single individual, but the result of a very complex network of cooperation.


“It takes hard work to attain nothingness. And then what do you have?” ~ the Jewish Buddha


Just talking about your life means “re-writing” it; it rewires the brain, literally changing the neural pathways. And talk about writing about your life as re-writing it — don’t get me started.


(this struck me: “Freud’s picture of the child, Phillips suggests, resembles at times Anti-Semitic perceptions of the Jew, “sensual, voracious, and transgressive, the iconoclast, the saboteur in a world of [adult] law and order.”) (and the last part, about people’s fear of pleasure and its necessity for survival)

“Although he wrote speculative accounts about the lives of Moses, Michelangelo, Shakespeare, and Leonardo da Vinci, Sigmund Freud had an intense aversion to biography. “To be a biographer,” he wrote in 1936, “you must tie yourself up in lies, concealments, hypocrisies, false colorings, and even in hiding a lack of understanding, for biographical truth is not to be had, and if it were to be had we could not use it…”

And yet, psychoanalysis, the treatment Freud invented, Adam Phillips points out, was predicated on reconstructions of the past. And on using childhood memories, recouped as knowledge, as resources in the making of an unknowable future.

Phillips celebrates Freud, about whom the most dogmatic thing he can find is his skepticism. And his ambivalence. He includes Freud’s work as part of “great modernist literature,” in which “coherent narratives of and about the past were put into question,” but also deems psychoanalysis to be, in no small measure, evidence of Freud’s resistance to modern culture.

Coming of age between two worlds, he argues, Freud endorsed Enlightenment values against the “superstition” of religion, and made some room for freedom, rationality, and choice, while exposing the irrationality of everything that is human, including the rationality of the Enlightenment.

In Becoming Freud, Phillips, the former Principal Child Psychotherapist at Charing Cross Hospital in London and the general editor of the Penguin Modern Classics translations of Freud’s work, uses the story (or, he would no doubt acknowledge, “a” story) of Freud’s early years to make a fascinating (and compelling) case that psychoanalysis is actually a distinctive form of biography, without a known beginning, middle, and end, in which a useful, personal, and private truth may be discerned through a conversation in which patients, often for the first time, speak about and for themselves, answer back, recover, revise, and re-right foundational life experiences.

He also indicates that psychoanalysis, the invention of a self-proclaimed “godless Jew,” was, among other things, about acculturation. No one, Freud insisted, could ever be fully assimilated or would wholly identify with or invest in his culture; however enabling, civilization was inevitably experienced, starting from infancy, as, in varying degrees, oppressive. Freud’s picture of the child, Phillips suggests, resembles at times Anti-Semitic perceptions of the Jew, “sensual, voracious, and transgressive, the iconoclast, the saboteur in a world of (adult) law and order.”

The whole history of psychoanalysis, Phillips asserts, came out of a simple observation: infants survive because someone looked after them and “something was driving them to be looked after.” Interested in how instinctual desire made itself known, Freud gave analysts a parental role, in which they listened carefully to the child. The psychoanalytic story, Phillips emphasizes, is about a couple, mother and child, soon joined by a father to “make the essential triangle.” In the sessions, which take place again with a couple — “though the world outside the consulting room is an always pressing third party” — the viability of appetite is at stake, as shaped by “news from the past for the future.”

Freud’s therapeutic method – “not quite a technique and not simply a talent; and not, it turned out, quite as effective as he wished” — gets people talking about their lives, their resistance to, fears about, and sabotaging of, pleasure. It induces patients to understand pleasure seeking and its relationship to their suffering and their survival.


One reason I’ve always hated to talk about my life has been the keen realization that everything I say is false — not a deliberate lie, but an unavoidably partial and false version, an enormous oversimplification, biased according to the moment and the context of the telling. Of course that happened also in poems, especially childhood poems: a painful sensation of inescapable lying, only partly redeemed by artistic merit.

We can’t even tell a dream without changing it so it makes more sense. Just telling it is interpretation. Even the way we manage to remember it is, usually leaving out so much, which then quickly dissipates.

But I’ve grown easier on myself, knowing that “absolute truth” is neither knowable nor desirable, and art has to be selective and simplify. Rather than an accurate life story — aside from the important realization that I am not to blame for all the bad things that happened; circumstances played a huge part — it’s more important to have a life philosophy that serves the present, making it worth living. Besides, I can always treasure-hunt and polish the good things I produced in the past, those “inaccurate” poems and prose memories that I enjoy sharing with others.

from another source:

~ “Freud remarks that the fullness of happiness cannot come from any one thing, at least not within a civilization where man’s instincts cannot be completely fulfilled. This is where Christianity fails, he believes, because it declares that the only source of true, lasting happiness is in God. When the Christian pursuing God does not find his desires met, Freud counts that as a failed path to happiness. The nonreligious man, he says, is then free to pursue another path, but the Christian is left to resign himself to the idea that it must not be God’s will for him to be happy.” ~


Freud was correct, I think, in specifying “love and work” as the most important things in life. Pursuing “god” does not work for most; few are capable of true mysticism, which is an intense relationship with an imaginary being. But human love can certainly be a source of happiness. I think any woman would question Freud’s assertion that all happiness is at bottom sexual; women can get enormous pleasure just from talking to a supportive friend — and the accompanying release of oxytocin can rival that during orgasm.

True nature lovers, e.g. those who hike a lot, derive a great deal of pleasure from simply standing on a ridge, taking in the panorama. Music lovers . . . but I don’t have to go on. Freud’s views of pleasure were strangely limited. Of course we must have pleasure in order to flourish, but it’s precisely civilization that made many kinds of pleasure available. 

For me, meaningful work is a primary source of pleasure.


“I told her the world was full of nice people. I'd have hated to try to prove it to her, but I said it, anyway.” ~ Jim Thompson, A Hell of a Woman

 Bette Davis in Dangerous, 1935

František Kupka: Portrait of Charles Baudelaire, 1907 (Baudelaire died in 1867, but these are his features, quite recognizably). The cigarette is modern.


Baudelaire drawing. Alas, I don’t know the name of the artist. 


I’m all for exercise, but nothing beats having centenarian genes — and for some of those centenarian women, the only exercise was knitting. But for the rest of us . . .

“People who chose to walk briskly for just 11 minutes per day (75 minutes each week) added 1.8 years to their life, compared to non-exercisers. That’s a nice boost for 11 minutes of walking per day! And it gets better. Those who walked 22 minutes every day (or 150 minutes/week or 30 minutes 5 days a week, following the federal recommendation) gained 3.4 years of life on average.

The people who increased their life span the most walked 43 minutes a day, lengthening their life by an average of 4.2 years. After 43 minutes, the benefits of longevity tended to level off. (Note to runners and other vigorous exercisers: You received the same benefit, but in about half the time.)”

ending on beauty:

This is what I want:
to die in the springtime,
beneath the blossoms —
midway through the Second Month,
when the moon is full.

~ Saigyō, tr. Steven Carter

Jardin des Plantes