Saturday, June 18, 2016

NIETZSCHE WRONG! SUFFERING WEAKENS; ROZEWICZ: AGAINST ICARUS; JESUS DIDN’T BELIEVE IN SOULS GOING TO HEAVEN; "SHELL SHOCK" IS PHYSICAL BRAIN INJURY

At one time I don’t know when
at one time I thought I had the right the duty
to shout at the plowman
look look listen you blockhead
Icarus is falling
Icarus the son of vision is drowning
leave your plow
leave your field
open your eyes
there Icarus
is drowning
or that shepherd
with his back turned to the drama
of wings sun flight
of fall

I said oh blind ones

But now I don’t know just when now
I know the plowman should till the field
the shepherd watch over his flock
the venture of Icarus is not their venture
this is how it must end
And there is nothing
disturbing
about a beautiful ship sailing on
to its port of call

~ Tadeusz Różewicz, from “A Didactic Tale”

Jaroslaw Anders:

~ In “A Didactic Tale,” Rozewicz meditates on the painting “The Fall of Icarus,” traditionally attributed to Brueghel. In the painting, which has inspired several poems, most notably Auden’s “Musée des Beaux Arts,” Icarus’s fatal fall is virtually unnoticed by the central figure of the painting, the plowman preoccupied with his mundane task. The earthbound gravity wins over the reckless upward striving of the human spirit. But unlike Auden’s stoical, detached observation (“everything turns away/Quite leisurely from the disaster”), Rozewicz unambiguously takes the side of the plowman.

At one time, he says, he might have thought he had the right to shout “look look listen you blockhead/Icarus is falling/Icarus the son of vision is drowning.” But now he understands that “the plowman ought to till the land/the shepherd watch over his flock/the venture of Icarus is not their venture.” It is the unremarkable folk, the common people, who keep the world from self-destructing. Like Brueghel’s plowman, they move around with their eyes fixed on the ground, preoccupied with the necessary tasks of caring and feeding. ~ “Against Color,” The New Republic, November 8, 2011

Oriana:

I think Auden is already on the side of the plowman. Auden saw the historical failure of lofty ideologies and great visions and ambitions. The plowman, the shepherd, the sailors -- these are the people who are of use to others, and who ultimately advance progress in small but essential ways. The myth itself was already meant as a cautionary tale. Our admiration was supposed to be for sensible, moderate Daedalus, the brilliant engineer, not for the foolish youngster with his hubris.

*

Cherub and Lupe Velez, the Mexican-American movie star. Her suicide in 1944 is described in the book "From Bananas to Buttocks."

 
NIETZSCHE WAS WRONG: WHAT DOESN’T KILL ME MAKES ME WEAKER


We now know, from dearly bought experience, much more about post-traumatic stress disorder than we used to. Apparently, one of the symptoms by which it is made known is that a tough veteran will say, seeking to make light of his experience, that “what didn’t kill me made me stronger.” This is one of the manifestations that “denial” takes. ~ Christopher Hitchens, “Mortality,” 2012

Oriana: “About suffering they were never wrong, the Old Masters.” So begins Auden’s great poem based on Brueghel’s painting, Landscape with the Fall of Icarus. The Old Masters knew that suffering goes on in untidy corners, ignored and soon forgotten. Nietzsche, on the other hand, was wrong about suffering. “That which does not kill us makes us stronger” is perhaps his most famous aphorism. The notion lives on — which is ironic, since Nietzsche’s own life was rather short and miserable, and all the worse for his suffering (the newest thinking is that it was brain cancer) — making it pretty obvious that the great philosopher was in denial.

Worse, the saying continues to resonate — and not just with Catholics, lapsed or otherwise, who were brainwashed to believe that “suffering is good for you.” We all got brainwashed.

Noam Shpancer: “One reason is that suffering, as Freud famously recognized, is an inevitable part of life. Thus we have developed many ways to try to ease it--one of which is bestowing upon it transformative powers (another is by believing in an afterlife, of which Freud disapproved; still another is cocaine, of which he was, for a time, a fan).

Another reason is that American culture, born of trauma and imbued with a hopeful can-do ethos, wants to believe this idea, finding it self-affirming. Once we have acquired a certain belief we tend to see, remember, and report mostly instances and events that support it. This is called confirmation bias.

Yet another reason we think trauma may be transformative is that we see variants of this process around us. Bacteria that are not killed entirely by an antibiotic will mutate and become resistant to it. People who go through the hardship of training tend to improve their performance. But human beings are not bacteria, and good training is not a traumatic event.

Now it is true that, in an evolutionary sense, those who survive a calamity are by definition the fittest. But it is not the calamity that made them so. For our minds, however, the leap is short between seeing the strong emerge from a calamity and concluding that they are strong because of the calamity.

Years ago, during my mandatory army service in Israel, I took part in anti-terrorist training that involved working with the K9 unit. I asked the unit commander where he found those vicious attack dogs of his. Most people, he said, believe that wild street dogs make the best anti-terrorist dogs, having survived the, well, dog-eat-dog world of the mean streets. But the truth is just the opposite. Street dogs are useless for this--or any other--work because they are unpredictable and not trainable. Dogs that have been well cared for, loved, and protected all their lives--those are the best anti-terrorist dog candidates.

And this is true for humans as well. Mayhem and chaos don't toughen you up, and they don't prepare you well to deal with the terror of this world. Tender love and care toughen you up, because they nurture and strengthen your capacity to learn and adapt, including learning how to fight, and adapting to later hardship.


Nietzschean — and country song — wisdom notwithstanding, we are not stronger in the broken places. What doesn't kill us in fact makes us weaker. Developmental research has shown convincingly that traumatized children are more, not less, likely to be traumatized again. Kids who grow up in a tough neighborhood become weaker, not stronger. They are more, not less likely to struggle in the world.”

https://www.psychologytoday.com/blog/insight-therapy/201008/what-doesnt-kill-you-makes-you-weaker



Oriana:

It’s being loved that makes us stronger. Connect, connect, connect! When hardship strikes, we need empathy. And that’s just “hardship.” In case of real trauma, we need a great deal of empathy and other help too, or we may never recover.

Chuck Lorre: "That which does not kill me makes me bitter.” I used to be a classic example of this. In the end I came to my senses, not wanting to waste any more life on bitterness. But some people stay bitter to the end, in depression in their seventies and eighties, getting more and more bitter. So sad.

People resist the idea that Nietzsche was wrong because they want to justify suffering as something that "toughens us up." Unfortunately this serves as an excuse for all kinds of cruelty. It’s interesting that Nietzsche saw religions as being at bottom systems of cruelty, yet didn’t see that the “suffering is good for you” argument is also in service to cruelty.

Some people even imagine that the more diseases we survive, the better for the body — that's part of the argument against vaccines. No, diseases harm us. Some of the damage is irreversible.

True — there are examples of people who, in response to tragedy, were transformed into heroes and/or activists. But we are eager to forget the countless victims whose lives have been destroyed by suffering. We don’t want to contemplate what drives people to suicide. Yet it bears repeating: suffering does not make us stronger. Being loved is what makes us stronger.
 





Charles:

Many people believe that suffering makes us stronger because it makes a more interesting story — the overcoming of suffering.

Oriana:

I’m constantly aware of how much more I could do without my handicap — I’d be a more interesting person, actually, having had more experiences — by now having traveled to Italy, Ireland, and Lithuania, and having met who knows how many interesting people who’d enrich my psyche.

And instead of reading about inflammation and trying to find out which remedies are the least harmful, I could be reading wonderful essays and poems.

So much energy goes into dealing with the suffering that could go into better avenues. Aren’t children better off when not crippled by polio? Should we bring it back because it “builds character”?

But it’s perfectly human not to want to acknowledge how destructive suffering is, and to go into denial about that obvious aspect. We latch on to the stories of exceptional individuals. Even they weren’t exactly made stronger by suffering — their life expectancy, for instance, got permanently shortened, and that “worn out” feeling and various other “souvenirs” of the illness or another catastrophe may never go away — but they found a creative adaptation to adversity.

Yes, that’s a more interesting story than the story of being slowly destroyed by suffering — which is what life is, in different ways, at a different pace, and with different degrees of intensity — and the degree and speed of destruction matter a lot. It’s marvelous to contemplate the stories of survival and some kind of accomplishment against all odds. The once-per-history Stephen Hawking. But when I think of people I’ve known who did succumb to cancer after long months of terrible suffering — perhaps they died with stoic heroism, without complaining, but no, they did not gain anything from the cancer that they would not gladly give back for even one more month of normal life. Perhaps even just one more week. Who knows, maybe even one more day.

The price of suffering is well-known to those who suffer, but they know their stories, unless crowned by a miraculous recovery, are not welcome. It’s similar to the stories of immigrants — unless they conform to what people want to hear, those stories are not welcome, and the enormity of suffering involved in being an immigrant remains unknown.

Let’s face it, the stories of suffering as suffering are totally unwelcome. You’re supposed to be destroyed quietly, almost secretly. Your obituary may mention how brave you were, how you could even manage to crack a joke. It omits the contributions you might otherwise have made during that “long battle with X.”






HOW SUFFERING MAKES US WEAKER (a prelude)

I’ll devote more space to this in an upcoming blog — one of the posts will be devoted to the life-long damage resulting from early trauma (including being bullied at school — it’s by no means trivial).

For now, let me briefly contemplate what happens when someone has a serious but non-fatal stroke. So, something bad happened that didn’t kill — but did it make the person stronger?

Consider also the resources spent on rehabilitation — and we are talking the best case here, where such resources are available. Perhaps in addition to the experts there is also a devoted spouse or parent available to continue the intensive training, trying to make the person relearn speech and basic skills and some knowledge about the world. Imagine several years of this intensive effort and expenditure — all of it just to make up for the loss, not to advance to higher ground than before. And no “resistance to a future stroke” is being built.




You may object that stroke is too extreme an example. Let’s take stammering, then. The King’s Speech was an excellent movie that showed how the future King George VI struggled to overcome his stammer. We are shown the childhood abuse that led to his “losing his voice.” And we are shown the heroic persistence of his speech therapist and his pupil, crowned with success — of sorts.

The king manages — but only barely. He does not become the inspiring speaker he perhaps would have become had the suffering not taken place, had the stammer not developed, and had the time spent trying to overcome the handicap been freed for developing public speaking (or another skill) to the level of excellence.

Did suffering make King George stronger? While the overcoming of his handicap is certainly in the “inspiring” category, my final verdict, when I ponder the suffering, is “What a waste.” Unfortunately, most suffering is that. Chronic stress doesn’t make us more resilient. On the contrary, even minor stress can be very damaging if it comes on top of chronic stress.

A stray thought: our denial about the ravages and sheer waste of suffering may in part be due to the example of athletic training. But that’s not suffering in the real sense — and besides, the philosophy of “no pain, no gain” is now being seriously questioned. No, we don’t want too much inflammation, and we most definitely don’t want muscle or tendon damage!

*

In an ideal world, we wouldn’t be perceived as soldiers. We would be singers, dancers, lovers; travelers and explorers; loved children and loving parents. It’s not good to walk with a pebble in your shoe, that constant irritation — even if it’s just a small pebble! — eclipsing the more worthwhile aspects of life. Do not go into denial and praise the pebble. If it’s possible to remove the pebble, by all means remove it.

Cardinalis cardinalis in flight
 

“SHELL SHOCK” (BLAST INJURY) IS PHYSICAL BRAIN DAMAGE

“In early 2012, a neuropathologist named Daniel Perl was examining a slide of human brain tissue when he saw something odd and unfamiliar in the wormlike squiggles and folds. It looked like brown dust; a distinctive pattern of tiny scars. Perl was intrigued. At 69, he had examined 20,000 brains over a four-decade career, focusing mostly on Alzheimer’s and other degenerative disorders. He had peered through his microscope at countless malformed proteins and twisted axons. He knew as much about the biology of brain disease as just about anyone on earth. But he had never seen anything like this.

The brain under Perl’s microscope belonged to an American soldier who had been five feet away when a suicide bomber detonated his belt of explosives in 2009. The soldier survived the blast, thanks to his body armor, but died two years later of an apparent drug overdose after suffering symptoms that have become the hallmark of the recent wars in Iraq and Afghanistan: memory loss, cognitive problems, inability to sleep and profound, often suicidal depression. Nearly 350,000 service members have been given a diagnosis of traumatic brain injury over the past 15 years, many of them from blast exposure. The real number is likely to be much higher, because so many who have enlisted are too proud to report a wound that remains invisible.

Perl and his lab colleagues recognized that the injury that they were looking at was nothing like concussion. The hallmark of C.T.E. is an abnormal protein called tau, which builds up, usually over years, throughout the cerebral cortex but especially in the temporal lobes, visible across the stained tissue like brown mold. What they found in these traumatic-brain-injury cases was totally different: a dustlike scarring, often at the border between gray matter (where synapses reside) and the white matter that interconnects it. Over the following months, Perl and his team examined several more brains of service members who died well after their blast exposure, including a highly decorated Special Operations Forces soldier who committed suicide. All of them had the same pattern of scarring in the same places, which appeared to correspond to the brain’s centers for sleep, cognition and other classic brain-injury trouble spots.

Then came an even more surprising discovery. They examined the brains of two veterans who died just days after their blast exposure and found embryonic versions of the same injury, in the same areas, and the development of the injuries seemed to match the time elapsed since the blast event. Perl and his team then compared the damaged brains with those of people who suffered ordinary concussions and others who had drug addictions (which can also cause visible brain changes) and a final group with no injuries at all. No one in these post-mortem control groups had the brown-dust pattern.

Perl’s findings, published in the scientific journal The Lancet Neurology, may represent the key to a medical mystery first glimpsed a century ago in the trenches of World War I. It was first known as shell shock, then combat fatigue and finally PTSD, and in each case, it was almost universally understood as a psychic rather than a physical affliction. Only in the past decade or so did an elite group of neurologists, physicists and senior officers begin pushing back at a military leadership that had long told recruits with these wounds to “deal with it,” fed them pills and sent them back into battle.

Trinitrotoluene, or TNT, was first used in artillery shells by the German Army in 1902. Soon after the First World War started in 1914, a rain of these devices was falling on the hapless men on each side of the front. It was a level of violence and horror far beyond the cavalry charges of earlier wars. Very quickly, soldiers began emerging with bizarre symptoms; they shuddered and gibbered or became unable to speak at all. Many observers were struck by the apparent capacity of these blasts to kill and maim without leaving any visible trace. The British journalist Ellis Ashmead-Bartlett famously described the sight of seven Turks at Gallipoli in 1915, sitting together with their rifles across their knees: “One man has his arm across the neck of his friend and a smile on his face as if they had been cracking a joke when death overwhelmed them. All now have the appearance of being merely asleep; for of the several I can only see one who shows any outward injury.”

One British doctor, Frederick Mott, believed the shock was caused by a physical wound and proposed dissecting the brains of men who suffered from it. He even had some prescient hunches about the mechanism of blast’s effects: the compression wave, the concussion and the toxic gases. In a paper published in The Lancet in February 1916, he posited a “physical or chemical change and a break in the links of the chain of neurons which subserve a particular function.” Mott might not have seen anything abnormal in the soldiers’ brains, even if he had examined them under a microscope; neuropathology was still in its infancy. But his prophetic intuitions made him something of a hero to Perl.

Mott’s views were soon eclipsed by those of other doctors who saw shell shock more as a matter of emotional trauma. This was partly a function of the intellectual climate; Freud and other early psychologists had recently begun sketching provocative new ideas about how the mind responds to stress. Soldiers suffering from shell shock were often described as possessing “a neuropathic tendency or inheritance” or even a lack of manly vigor and patriotic spirit. Many shell-shock victims were derided as shirkers; some were even sentenced to death by firing squad after fleeing the field in a state of mental confusion.

This consensus held sway for decades, even as the terminology shifted, settling in 1980 on “post-traumatic stress disorder,” a coinage tailored to the unique social and emotional strain of returning veterans of the war in Vietnam. No one doubted that blasts had powerful and mysterious effects on the body, and starting in 1951, the U.S. government established the Blast Overpressure Program to observe the effects of large explosions, including atomic bombs, on living tissue. One of my uncles recalls standing in the Nevada desert as an Army private in 1955, taking photographs of a nuclear blast amid a weird landscape of test objects: cars, houses and mannequins in Chinese and Soviet military uniforms. At the time, scientists believed blasts would mainly affect air pockets in the body like the lungs, the digestive system and the ears. Few asked what it would mean for the body’s most complex and vulnerable organ.

Daniel Perl is continuing to examine the brains of blast-injured soldiers. After five years of working with the military, he feels sure, he told me, that many blast injuries have not been identified. “We could be talking many thousands,” he said. “And what scares me is that what we’re seeing now might just be the first round. If they survive the initial injuries, many of them may develop C.T.E. years or decades later.”

Perl takes some solace from the past. He has read a great deal about the men who suffered from shell shock during World War I and the doctors who struggled to treat them. He mentioned a monument in central England called “Shot at Dawn,” dedicated to British and Commonwealth soldiers who were executed by a firing squad after being convicted of cowardice or desertion. It is a stone figure of a blindfolded man in a military storm coat, his hands bound behind him. At his back is a field of thin stakes, each of them bearing a name, rank, age and date of execution. Some of these men, Perl believes, probably had traumatic brain injuries from blasts and should not have been held responsible for their actions. He has begun looking into the possibility of obtaining brain samples of shellshocked soldiers from that war. He hopes to examine them under the microscope, and perhaps, a century later, grant them and their descendants the diagnoses they deserve.”

http://www.nytimes.com/2016/06/12/magazine/what-if-ptsd-is-more-physical-than-psychological.html?action=click&pgtype=Homepage&region=CColumn&module=MostEmailed&version=Full&src=me&WT.nav=MostEmailed&_r=0


 
INTROVERSION OR INTELLECT? (a more subtle understanding of introversion)


“What many people ascribe to introversion really belongs in the intellect/imagination domain. Intellect/imagination represents a drive for cognitive engagement of inner mental experience, and encompasses a wide range of related (but partially separate) traits, including intellectual engagement, intellectual curiosity, intellectual depth, ingenuity, reflection, introspection, imagination, emotional richness, artistic engagement, and aesthetic interests.

Traits such as sensitivity and social anxiety are also not part of the Big Five introversion-extraversion domain. To be sure, many people may think of themselves as introverted because they are highly sensitive. But research shows that sensory processing sensitivity is independent of introversion. The various manifestations of being a highly sensitive person — inhibition of behavior, sensitivity to environmental stimuli, depth of information processing, and physiological reactivity — are linked to neuroticism and intellect/imagination, not introversion.

Finally, there's a common misconception that all introverts enjoy solitary activities. However, that isn't a defining feature of introverts. Responses such as "Enjoy spending time by myself" and "Live in a world of my own" involve an equal blend of introversion and intellect/imagination. Contrary to popular conceptualizations of introversion, preferring to be alone is not the main indicator of introversion.

The desire for positive social attention seems to be a particularly strong indicator of extraversion [4]. For example, Jacob Hirsh and colleagues found that taking into account the rest of the Big Five personality traits (agreeableness, neuroticism, conscientiousness, and intellect/imagination), the following 10 behaviors were most uniquely predictive of extraversion (from a list of 400 activities):

1. Told a dirty joke.

2. Planned a party.

3. Entertained six or more people.

4. Told a joke.

5. Volunteered for a club or organization.

6. Tried to get a tan.

7. Attended a city council meeting.

8. Colored my hair.

9. Went to a night club.

10. Drank in a bar.

Why might the drive for social attention be so strongly linked to extraversion? One possibility is that many human rewards are social in nature. Our complex social lives are probably the dominant force in human evolution, driving the evolution of intelligence, creativity, language, and even consciousness. The human reward system, therefore, most likely evolved to be particularly responsive to social rewards.

There are costs to extraverted behavior, however. This includes time and energy that could be invested in other activities, such as accomplishing a goal (conscientiousness) or engaging with ideas and imagination (intellect/imagination). There is also the risk that inappropriate attention-seeking behavior can fall flat, leading to reduced attention-holding power. Finally, high levels of exploration of the environment can expose extraverted individuals to increased physical risks. For instance, extraverts are more likely to be hospitalized due to accident or illness, and are more likely to become involved in criminal or antisocial behaviors and get arrested.

It's important to distinguish, however, between the most prominent behavioral manifestation of extraversion (desire for social attention) and the core underlying mechanism of extraversion (reward sensitivity). Even though reward sensitivity need not be limited exclusively to social situations, high reward sensitivity likely motivates extraverts to seek out potentially rewarding positive social interactions, and fuels them to display behaviors that will increase social attention (e.g., friendliness, smiling, high energy, loudness, exhibitionism, positive emotions).

From a biological perspective, reward sensitivity is likely governed by dopamine. While dopamine is involved in a variety of cognitive and motivational processes, the unifying function of dopamine is exploration. According to Colin DeYoung, "the release of dopamine, anywhere in the dopaminergic system, increases motivation to explore and facilitates cognitive and behavioral processes useful in exploration."

A lot of introverts notice that they often need to be alone to recharge their batteries after vigorous social interactions, whereas extraverts appear to gain energy from social interactions. This can be explained by dopamine's function in energizing potentially rewarding social interactions, as well as its role in overcoming the cost of effort. For introverts, such interactions are more effortful and tiring due to their less active reward system.”

http://blogs.scientificamerican.com/beautiful-minds/will-the-real-introverts-please-stand-up/



Oriana:

Funny that a big indicator of extraversion is telling dirty jokes.

 I thought wanting to spend time alone so I can process experience was the very definition of introversion, but it does make more sense to speak of being high on the intellect/imagination dimension. By the way, this dimension is traditionally designated as “openness to experience” — which doesn’t seem to be an accurate equivalent, though openness to INNER experience would be part of “intellect/imagination.”

I think I have an openness to ideas. Experiences — I need to think about the possible cost, including unpleasant memories and impact on health.

I can even imagine myself becoming a lot more sociable — if I lived around interesting, educated people, for instance. The whole dimension of introversion is not terribly clear. So much depends on the context. When I visited Boston and met a lot of educated people I became so sociable I could hardly shut up.

So perhaps it's more about the quality of people introverts meet. With the right people, I am sociable; with those who are into small talk or women who talk exclusively about their children, I find excuses to leave.

I'd still need lots of solitude in order to process the social interactions. A little goes a long way because I need to relive anything significant and think about it — to let my mind loose on it. If I don't have enough time to process, then life seems to just flee like sand through the fingers and becomes pretty meaningless. But the processing of experience is part of the intellect/imagination dimension rather than introversion per se.

I still think there is something to the augmenter/reducer dimension introduced in the sixties by the psychologist Asenath Petrie. Reducers have weaker cortical arousal and need strong stimulation (e.g. noisy music); augmenters tend to magnify stimulation, so they prefer the quiet and the subtle. Reducers, who tend to be chronically understimulated, are more likely to be smokers and rely on coffee and other stimulants to raise their arousal level. They are easily bored. Augmenters seek out silence or soothing stimulation — doesn’t that sound like the classic description of an introvert?


JESUS NEVER SAID SOULS GO TO HEAVEN OR HELL AFTER WE DIE; HOW CHRISTIANS SWITCHED TO PLATO

“Neither Jesus nor any writer of the bible says anything about the soul going anywhere when they describe death. Nor do they identify the soul with the mind, or with the whole human being, as Christians began doing in the fourth century. Jesus certainly taught that there will be life after death — the resurrection — but he didn’t teach that there will be life right after death, which is what most Christians now believe.

Jesus talked about souls, but he didn’t think of them in the way that most Christians do. Like other first-century Jews, he thought of a person’s soul as what made him or her be alive [“the animating principle” — oddly enough, that’s how the Catholic Encyclopedia defines the soul]. Common words for soul were nefesh and ruach in Hebrew, and psyché and pneuma in Greek. Like the words for soul in most languages, these came from words for breath, wind, and what’s moving. The reason words meaning air that’s moving were used for what makes us alive is probably that people noticed that as long as we are alive, we breathe, and when we “breathe our last breath,” we die. So they thought that our breath is what keeps us alive. Nothing here was thought of as immortal, as the soul is immortal in Greek dualism. The soul was understood to be mortal just like the rest of the person, and at death, both were destroyed.

If Jesus thought of the soul as what makes a person alive, and not as the person’s nonphysical, immortal mind, where did the now popular idea that the soul is the nonphysical, immortal mind come from? That idea came not from the bible but from Greek philosophy. Greek-influenced Christians tended to be dualists, thinking of each person as two things: a mortal body being controlled by an immortal soul.

The most influential of those dualists was Augustine, who defined a human being as “a rational soul using a mortal and earthly body.” That definition would have puzzled Jesus because he thought of a human being as one thing — a living body, not two things — a soul, plus a body that it “uses.”

In switching to Platonic ideas about death liberating the immortal soul, Christian thinkers quietly put aside Jesus’ ideas, which he shared with the writers of the bible, that death destroys us. What Jesus added was that the destruction of death is not permanent because at the end of the world God will intervene in the natural order and resurrect everyone [in flesh], judge them, and reward or punish them.

In Jesus’ day, this idea of the resurrection was less than two centuries old and was not accepted by all Jews. The Sadducees rejected it because it was not well-grounded in the scriptures. If you read through the whole Old Testament — over one thousand pages — God says nothing at all about anyone living after they die. And just before he drives Adam and Eve out of the garden, he scolds Adam, saying, “You are dust, and to dust you shall return.”

There are just two sketchy prophetic passages in the OT that suggest a future resurrection, and it is not a resurrection of the human race. These passages were written at a time when Jews were being persecuted, and in both of them only Jews — maybe only some Jews — will be resurrected.

Any Jew who believed in the resurrection of the dead at the time of Jesus, then, had very little to base it on. Jesus is vague about what it will involve, except to suggest that everyone, not just some Jews, will be resurrected, and that there will be judgment after resurrection, followed by happiness for the good people and suffering for the bad. But whatever he said about the resurrection of the dead, it is clear that he did not say that people’s souls go to heaven or hell when they die.” ~ John Morreall, “Questions for Christians,” 2014. (John Morreall is professor and chair of the Department of Religious Studies at the College of William and Mary.)

(by that logic there are no human souls in heaven or hell right now, as the author explains in the chapter that follows; heaven or hell were to follow the bodily resurrection of the whole person)

Master of the Parrot, Madonna and Child. Note that Baby J looks like Dennis the Menace.
 

Oriana:

I am amazed at how much of what I was indoctrinated with is wrong from the point of view of first-century Jewish beliefs. Such fantastic fabrications! The Jewish beliefs were fabrications as well, but what I was taught were heresies of those original fabrications.

And surely at least SOME clergy knew there was no scriptural support for the idea that there are any human souls in heaven or hell? That nobody is in a "better place" until the resurrection? (never mind that the resurrection seems awfully delayed).

I was so heavily indoctrinated — or call it having had an excellent memory even as a child — that later I discovered that absurdities still cling to my psyche, e.g. the soul being somehow separate from the body and going somewhere after the body dies. I'd never say that I believe that, but some of this nonsense still clings like a stubborn weed and has to be uprooted from the psyche. So it helps that even Jesus — if historical Jesus ever existed — did not believe in a soul separate from the body.

But what helps most is simply the view of “soul” as consciousness, an emergent phenomenon that stems from brain function. Once brain function ceases, consciousness ceases the way flame is gone when the fuel is exhausted. Consciousness doesn’t “go anywhere.” It ceases. 



Monet: Antibes seen from Salis Gardens

 
ending on beauty:

 
And when the Heavens — disband —
And Deity conclude —
Then — look for me. Be sure you say —
Least Figure — on the Road —

~ Dickinson, 401

This being the Southwest, "no figure on the road" is the typical experience. After the biblical  “end of the world” (eagerly looked forward to by many), for a while at least, things here would look pretty much the same . . . or in Nebraska, say.

I’m glad for the companionship of clouds.

By the way, “heavens disband” because “the kingdom of heaven” did not mean a place in the clouds. It meant the future paradise here on earth. “Thy kingdom come” — not that we “go to heaven” — the Jews at the time of Jesus had no concept of “going to heaven” — but that heaven comes to earth.

 





Sunday, June 12, 2016

WHEN THE GODS STOPPED SPEAKING: OUR INNER MONOLOGUE; RANDOMNESS — OR SIGNS AND WONDERS? DEPRESSION AND S&M THEATER OF SUBPERSONALITIES

 
AUBADE

I let him brush against me,
let his face muss my hair.
I turn to the half-glimmer

of dawn in his eyes —
hold his hand and whisper,
“No, no, it’s impossible.”

I wake up and wonder,
is it all behind me,
that alphabet of glances,

silences — is it all behind us,
that fire and shiver,
lost to us like lilacs,

like the scent of rain —
gone from us forever
because we’re not young —

No, it’s the sacred
shyness of the soul — the heart’s
double truth, an eternal flame.

The flame says nothing is wasted,
not even youth on the young.
How high that highest lantern

shines above our fear of the dark.
He said nothing and I said no,
but in silence everything was said.

~ Oriana © 2016

Upon awakening, I was also keenly aware that the man in the dream was only a “mere suggestion” of a man. It wasn’t anyone I knew — in fact that was the whole point. He was a generic figure standing for the situation when two people meet and within a short time know that under different circumstances they’d likely become lovers.

But do we really know it? A poem like this needs to be kept short, so I didn’t go into “the heart’s double truth.” Infatuation is intoxicating — nature’s trap to get the woman pregnant. But thanks to our complex brain with its competing pathways, there is also usually a shyness about these matters . . . and there is hardly anything more erotic than those initial silences between potential lovers.

I’ll say no more.


MY STEM-CELL TREATMENT PROGRESS REPORT

After an interminable wait (“We thought you became a missing person in that office!” the formidable stem-cell coordinator joked — not funny), the X-rays were uploaded to the computer — another delay, because the temp technician didn’t know how to do it. Then I heard the PA’s voice (PA stands for “physician assistant” — they do some pretty sophisticated stuff these days) — a loud, powerful voice from the adjacent room (I was in the tiny sausage-like treatment room). And the voice exclaimed:

“But that’s an incredible improvement!”

And the PA, Patrick, rushed in to talk with me. “I’d love to see the X-rays,” I said. “Just a moment, let me take a picture.” Patrick rushed out. Soon he was back, his iPhone in his outstretched hand. “See, this is your previous X-ray: it’s bone on bone. Now look here: see all the space?”

~ “Does that mean that new cartilage is growing?”

~ “It’s growing.”

In a nutshell: cartilage is very difficult (many sources say “impossible”) to regenerate. Medical advice websites will tell you it simply doesn’t happen (so stop wasting your money on unproven pills and procedures, you fool). And the MRI made it clear that my left knee is extremely damaged. “Severe degeneration.” A damaged ligament, possibly torn. Bones near the knee becoming cystic rubble. The blackness of inflammation surrounding everything. A war zone. Think Syria.

~ “Yes, it’s the new cartilage that’s created the space.”

Patrick gently patted the area right above my knee on the left side. “No swelling,” he said. “Last time you were swollen.” I was amazed that he’d remember. I was still too taken aback by it all to tell him, in regard to swelling, that there were the good days and the bad days — the main factor seemed to be the amount of walking or trying out some recommended exercise (now I know better; more about this later).

And the stem cells aren’t yet done with their good work, Patrick said. I’ll probably have another treatment in December — at half price! Patrick got a big, big hug after that announcement.

But here is the bad news: there’s been very little functional improvement. So we discussed some options for that. Actually there is an adjuvant treatment that helps the stem cells: “Hyaluronic acid helps the stem cells,” Patrick said. He meant injections — hyaluronic acid is not absorbed orally.

As I’ve indicated, the mainstream medical view is that cartilage cannot heal. Stem cell treatment is still very new and untested — most MDs would warn against it. No insurance covers the cost. I knew from the start that I might be wasting my money, depleting my savings chasing a false hope. I took that risk. I just “had to do it.”

THE RAVAGES OF EXERCISE

There are also new ways to fight inflammation. I told Patrick that my experience has taught me that the key is fighting inflammation. One cause of inflammation is too much exercise, of the wrong sort (and almost all typical exercise is the wrong sort). A damaged joint, already inflamed, can get horribly inflamed after wrong, stressful exercise. In fact, in extreme cases, only the passive motion machine may do good rather than harm.

I realize that exercise is precisely what every website recommends for arthritis. It’s like the pressure to eat “whole grains” and stick to a low-fat, high-carbohydrate diet — except that Atkins was a powerful voice who showed otherwise — and all nutritionists condemned him. Some of them are still vocal, though study after study has validated Atkins.

Now, we all know the benefits of exercise for healthy people with undamaged joints, but when it comes to severe damage, painful inflammation can follow (my joint damage began with a torn meniscus when I shattered my knee in a fall down a slippery staircase; I seemed to recover from that, but months later too much walking on hard pavement led to chronic pain, which led me to seek medical attention; this led to a disastrous meniscus removal surgery that predictably caused terrible arthritis, as it always does — but that wasn’t known back then; the meniscus, a crucial part of the knee’s shock-absorption system, was regarded as a useless clump of tissue, its function unknown, so surgeons removed it with the same gusto with which they used to remove tonsils).

Even too much walking — and “too much” isn’t much by normal standards — can lead to awful pain. Walking, especially on a hard surface, is an impact exercise — not as high-impact as running, but it’s still impact. So much for the countless websites saying that nothing is so good for arthritis as walking! Worse, even “gentle stretching” is not at all gentle — there comes a point when any active movement carries a risk of pain.

 
When I reflect on my worst episodes of pain over the years, I now see it with terrible clarity: with only one exception they were all caused by overexercise. 


It took me decades to realize this. Talk about being a slow learner!

Riding a bike, trying to learn to play tennis, a tai-chi class, a yoga class, climbing a steep hill, trying to keep up with the rest of the group — one by one I discovered what I mustn’t do. I discovered it the hard way. As if my awful migraines weren’t enough . . . “Perhaps my life really is a punishment for something terrible I must have done in my past life,” I even said to a couple of friends in what now seems a fit of metaphysical insanity.

Just as “health food” can be the very worst diet for a particular individual, so “healthy exercise” can be the worst thing for a person with an injured joint.

SIXTY STEPS ONCE EVERY HOUR

I don’t know enough to comment on a special “light workout” recommended by a certain book — say 60 steps once every hour, and “micro-pushes” against a ball — but even with such minimal workout I would suggest proceeding with caution. If the pain worsens — and it may take a few days to find out — immediately stop any new activity. Stretching, trying to strengthen the quads — all risky, all may aggravate the pain. And pain indicates inflammation and more cartilage loss.

At the same time, we know that lack of motion is bad. I’d like to learn more about the passive continuous motion machines that are being introduced not only after knee replacement, but also after cartilage injuries — I think the light is beginning to dawn that cartilage CAN regenerate — but it takes time and just the right treatment. Passive motion machines show great promise.

The usual treatment is anti-inflammatories. The downside — and this will sound like a cruel joke — is an actual acceleration of joint damage. That, and if the doses are high and taken for a long time, a high risk of kidney failure, and an increased risk of a heart attack and stroke.

But a short-term use of high-dose NSAIDs can make a terrific difference — for a while.

The new anti-inflammatory developments, Patrick said, include microdoses of NSAIDs and the injections of ibuprofen directly into the knee.

THE BANDAGE MIRACLE

I told him I just discovered something much simpler: using a compression bandage to wrap the knee. I couldn’t tolerate various knee sleeves and braces, especially the dreadfully expensive ones meant to compensate for the misalignment of the knee that develops after meniscus removal. But the day before my X-ray appointment I happened to see an elderly man at the library, adjusting the elastic bandage around his knee. At home one of the first posts I read was by a man who’d just happened to discover that wrapping his knee brought great relief from pain. In a dresser drawer I found a generous length of elastic bandage — left from my first stem-cell treatment, I later realized.

On went the bandage, snap went the velcro-like sticky part that holds it, and — instant relief.

I’d been relying on a pain lotion, and having to constantly reapply it was quite disruptive. The bandage turned out to be more effective.

*

During the consultation, the word “surgery” was never mentioned.

Oh, by the way, my left shoulder also got some stem cells for bursitis and a possible tear. X-rays showed no bursitis or any other abnormality. I expected as much: the recovery was very quick.

*


Actually, there IS a bit of functional improvement, but definitely not the sort that many people desire: being able to do pretty much everything, even certain sports (running is out, but one retired nurse claims that after her knee replacement and lots of physical therapy she can sit in the lotus position). For me, more improvement is likely to come later, but given the absence of the meniscus, I'll never have full function — thanks to the original ignoramus surgeons who crippled not just me but many UCLA football players. As for knee replacement, the trauma of the surgery was too much to contemplate, as were the possible complications later. Infections and resulting amputations are rare, but they do happen to some patients.

For someone still at a relatively early to intermediate stage of joint deterioration, I wouldn't hesitate to recommend stem cells. At my advanced stage, it’s up to the person, after considering the many factors involved: age, state of fitness, presence or absence of cardiovascular disease, ability to tolerate some disability, and so on.

The interesting thing is that I was told back in 1992 that I needed TKR or I wouldn't be able to walk within two years. The orthopedic surgeon showed me the devastating X-rays. But because I remembered the horror of my first knee surgery, when my meniscus was removed, I decided to research alternative treatments. Glucosamine was just gaining publicity then, but the recommended doses were pathetically insufficient. Being a wild spirit, I started taking more and more of it; two years passed, then three, then four, and I was walking better than before. And not just walking — I was hiking in the mountains. Thank goodness I didn't have knee replacement back in the early nineties, when the artificial joints were of terrible quality and the procedure extremely invasive. Now I still haven't 100% ruled out TKR, but I'm very happy that a biological solution exists.

**

The big promise of stem cells: heart disease, brain diseases, and autoimmune diseases. The future is not yet, but you can read online about the first steps being made.

One last detail: as I was driving home, my brain started playing Rachmaninoff’s Second Piano Concerto, the grand, sweeping orchestral part. It was involuntary — it’s just the sort of thing my brain does now and then instead of commenting in words. I love it when it happens.

*

You can read about my stem cell treatment (the step-by-step procedure) here:

http://oriana-poetry.blogspot.com/2015/12/soviet-nostalgia-stem-cells-growing-up.html


OUR INNER MONOLOGUE (Julian Jaynes: When the gods stopped speaking — how his theory fares today)

“In the beginning of the book, Jaynes asks, “This consciousness that is myself of selves, that is everything, and yet nothing at all—what is it? And where did it come from? And why?” Jaynes answers by unfurling a version of history in which humans were not fully conscious until about 3,000 years ago, instead relying on a two-part, or bicameral, mind, with one half speaking to the other in the voice of the gods with guidance whenever a difficult situation presented itself. The bicameral mind eventually collapsed as human societies became more complex, and our forebears awoke with modern self-awareness, complete with an internal narrative, which Jaynes believes has its roots in language.

The kind of search that Jaynes was on—a quest to describe and account for an inner voice, an inner world we seem to inhabit—continues to resonate. The study of consciousness is on the rise in neuroscience labs around the world, but the science isn’t yet close to capturing subjective experience. That’s something Jaynes did beautifully, opening a door on what it feels like to be alive, and be aware of it.

He writes that the characters in The Iliad do not look inward, and they take no independent initiative. They only do what is suggested by the gods. When something needs to happen, a god appears and speaks. Without these voices, the heroes would stand frozen on the beaches of Troy, like puppets.

The combination of instinct and voices—that is, the bicameral mind—would have allowed humans to manage for quite some time, as long as their societies were rigidly hierarchical, Jaynes writes. But about 3,000 years ago, stress from overpopulation, natural disasters, and wars overwhelmed the voices’ rather limited capabilities. At that point, in the breakdown of the bicameral mind, bits and pieces of the conscious mind would have come to awareness, as the voices mostly died away. That led to a more flexible, though more existentially daunting, way of coping with the decisions of everyday life—one better suited to the chaos that ensued when the gods went silent. By The Odyssey, the characters are capable of something like interior thought, he says. The modern mind, with its internal narrative and longing for direction from a higher power, appears.

He cites a carving of an Assyrian king kneeling before a god’s empty throne, circa 1230 B.C. Frequent, successive migrations around the same time in what is now Greece, he takes to be a tumult caused by the breakdown. And Jaynes reflects on how this transition might be reverberating today. “We, at the end of the second millennium A.D., are still in a sense deep in this transition to a new mentality. And all about us lie the remnants of our recent bicameral past,” he writes, in awe of the reach of this idea, and seized with the pathos of the situation. “Our kings, presidents, judges, and officers begin their tenures with oaths to the now-silent deities, taken upon the writings of those who have last heard them.”

It’s easy to find cracks in the logic: Just for starters, there are moments in The Iliad when the characters introspect, though Jaynes decides they are later additions or mistranslations. But those cracks don’t necessarily diminish the book’s power. Particularly, [Dennett] thinks Jaynes’ insistence on a difference between what goes on in the minds of animals and the minds of humans, and the idea that the difference has its origins in language, is deeply compelling.

“There is such a difference between the consciousness of a chimpanzee and human consciousness that it requires a special explanation, an explanation that heavily invokes the human distinction of natural language,” though that’s far from all of it, he notes. “It’s an eccentric position,” he admits wryly. “I have not managed to sway the mainstream over to this.”

It’s a credit to Jaynes’ wild ideas that, every now and then, they are mentioned by neuroscientists who study consciousness. In his 2010 book, Self Comes to Mind, Antonio Damasio, a professor of neuroscience, and the director of the Brain and Creativity Institute at the University of Southern California, sympathizes with Jaynes’ idea that something happened in the human mind in the relatively recent past. “As knowledge accumulated about humans and about the universe, continued reflection could well have altered the structure of the autobiographical self and led to a closer stitching together of relatively disparate aspects of mind processing; coordination of brain activity, driven first by value and then by reason, was working to our advantage,” he writes.

In 2009 [Kuijsten] highlighted brain-imaging studies suggesting that auditory hallucinations begin with activity in the right side of the brain, followed by activation on the left, which sounds similar to Jaynes’ mechanism for the bicameral mind. He hopes that as time goes on, people will revisit some of Jaynes’ ideas in light of new science.

Ultimately, the broader questions that Jaynes’ book raised are the same ones that continue to vex neuroscientists and lay people. When and why did we start having this internal narrative? How much of our day-to-day experience occurs unconsciously? What is the line between a conscious and unconscious process? These questions are still open. Perhaps Jaynes’ strange hypotheses will never play a role in answering them. But many people—readers, scientists, and philosophers alike—are grateful he tried.

http://nautil.us/issue/24/error/consciousness-began-when-the-gods-stopped-speaking
 

Oriana: 

I continue to be fascinated by the “inner chatter,” as well as by a question that first arose during religion lessons: why did multiple people hear god speak in biblical times, but now there is only silence? Jaynes may be onto something — and he came up with his “bicameral” theory well before neuroimaging and all we’ve learned about hallucinations and dreams.

Emil Cioran said, “The Greeks awakened to philosophy the moment their gods were no longer adequate; ideas begin where Olympus leaves off. To think is to stop venerating.” But when and why did the gods become inadequate? Here Jaynes's theory may be relevant.

To me the lingering death of religion, along with the shift toward naturalistic explanations and humanism as opposed to the worship of an imaginary Superman in the Sky, is the greatest story and adventure of humanity, the most dramatic development in cultural evolution. First, granted, came the development of language, and nothing compares to that quantum leap — but that was so long ago . . . The development of modern consciousness is relatively recent.

“Cultural” doesn’t mean that the biology of brain function is not involved. On the contrary, we know that speech rewires the brain. Perceiving our inner voice as simply our thoughts rather than a god speaking to us (also, think of a schizophrenic hearing voices) — what a breakthrough! Amazingly, there are those — certain Jungians in particular — who prefer the archaic — or maybe “schizophrenic” is a more accurate term — idea that our thoughts are being broadcast to us from the astral realm, and the brain is only a radio.

Detail of "Dormition of the Virgin" at Cloisters, New York. Photo: Leonard Kress
  
DEPRESSION AND THE SADOMASOCHISTIC PLAY OF SUB-PERSONALITIES (THE ANTI-SELF VS THE WISE ONE)

“The real cause of depression, and all the rest of psychiatric symptoms, follows from the way one’s unique consciousness is formed in the brain all through development from embryonic life to age twenty. Our developmental experience is mapped in the limbic system and the cortex as incredibly complex circuits of neuronal maps that reflect the impacts of love, respect, deprivation, and abuse as digested by one’s unique temperament. These brain maps generate human consciousness — which is organized as a drama in the theater of the brain with a cast of personas, feeling relationships between them, scenarios, plots, set designs and landscapes. The internal play is the consummate creation of the human genome. Once established, beginning at age three, the representational play operates via top-down cortical processing, and is the invisible prism through which we live our lives.

Serotonin and the other neurotransmitters operate in the synapses of our limbic cortical brain maps connecting the trillions of neurons that create the mappings that form our plays. Serotonin has no life of its own. It is merely a brain mechanism that serves the neuronal organization of consciousness, the play itself. The way the limbic-cortical brain maps our experience reflects the actuality of our experience. IF OUR CHARACTER PLAY IS TOO DAMAGED BY DEPRIVATION AND ABUSE, IT GENERATES AN INVISIBLE SADOMASOCHISTIC PLAY THAT IS FILLED WITH ATTACK AND HUMILIATION, endless war. Consequently the activated internal play is one of continuous internal fighting between personas. As such it feeds on the serotonin supply on an ongoing basis. It is inevitable that the supply will be overtaxed. This is not the result of a serotonin problem. It is built in from a damaged characterological play. It is not a question of ‘if’, but only ‘when’ serotonin will be overused and depression will appear.”

https://www.psychologytoday.com/blog/the-theater-the-brain/201405/no-it-s-not-the-neurotransmitters

Oriana:

I feel uneasy about anyone who states, “The REAL cause of X is Y.” But the two paragraphs I chose do speak to my own experience. I used to have an acute sense of an “anti-self” within. The anti-self wanted me to fail just to prove that I was a total failure — especially in poetry (alas, I did pay for becoming a poet with a life-long sense of failure) and in love, but also in professional life — in everything that mattered.

I realize that the play of sub-personalities is a simplified conceptualization — but the anti-self felt real, and kept producing images of my dead body, or my body falling in an act of suicide. So “depression as an internal theater” makes some sense to me, as does the sadomasochistic tone of it, “attack and humiliation.”

As a result of insight, or perception shift, my own “anti-self” appears to have vanished, while the “wise one” keeps pointing out that having “failed” (in quotation marks because success is a matter of definition) at this or that doesn’t mean that I don’t have a lot to give. “You can fail at everything you do, and still be a success as a person.” But it was interesting to read this, and to find myself nodding my head.

The rest of the article is controversial, I know. I step away from the drug issue. I simply don’t know. “I did it my way.”

But I have also experienced a mood elevating effect from anti-inflammatories, so the issue of inflammation and depression is also of great interest to me. And the fact that anti-inflammatories produce mood elevation shows that physiological factors do have an impact and must be taken into consideration — without denying that the sadomasochistic automatic thinking is critical in deepening and perpetuating depression.


RANDOMNESS VERSUS “SIGNS FROM GODS”
 
With billions of people intersecting, interacting globally, continuously, we can figure that billions of things bounce off each other globally, continuously. Good things, bad things, unpredictable things, kind and terrible things, strange and unexplainable things.

 But the Law of Large Numbers degrades to nonsense when we try connecting dots for personal meaning and comfort. We want the dots to spell a message that we, among all people on earth, are special to God. That's when Jesus shows up on toast. That's when a hurricane kills dozens in New Orleans — because — as Pat Robertson put it — God hates gays.

People connect dots for personal reasons. A family survives a tornado in Kansas that blows their neighbors off the map, and they declare on national TV that God answered their prayers. Too bad about the neighbors.

Since time began we have been story-telling, pattern-seeking, meaning-making animals. But religious dots, like history, are connected only by survivors.
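The Law of Large Numbers point above can be made concrete with a quick back-of-the-envelope sketch, sometimes called Littlewood's law. The numbers below — a population of 300 million and a "one in a million" daily event — are illustrative assumptions of mine, not figures from the text:

```python
import random

# Illustrative assumptions (not from the text): a population roughly
# the size of the U.S., and a "miracle" so rare it happens to one
# person in a million on any given day.
POPULATION = 300_000_000
P_MIRACLE = 1e-6

# Linearity of expectation: even a one-in-a-million event is expected
# to happen to hundreds of people every single day.
expected = POPULATION * P_MIRACLE
print(f"Expected 'miracles' per day: {expected:.0f}")   # 300

# Monte Carlo sanity check at a smaller scale with the same expected
# count: a million people, one-in-a-thousand event, mean of 1000.
rng = random.Random(42)
hits = sum(1 for _ in range(1_000_000) if rng.random() < 1e-3)
print(f"Simulated 'miracles' in one day: {hits}")
```

With a binomial mean of 1000 and a standard deviation of about 31, the simulated count lands near 1000 on essentially every run. Hundreds of "impossible" survivals and losses a day is exactly what randomness predicts; no pattern-seeking required.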

Chris Kammal, a Florida paramedic:

 “I work as a medic fireman. I see death and mayhem routinely. I have run thousands of calls over the last 23 years and so many of them were people in extreme crisis, or already dead when we got there.

I was asked the other day if I've ever seen miracles or things that can't be explained. My response was that I see many things beyond explanation but never anything miraculous. The miraculous implies that forces outside this world intervened, defying the basic principles of nature. But there's no evidence that the laws of physics have ever been suspended to save someone who was standing in the way.

A tree falling on a child doesn't reverse course because a mom cries out to God. There's no evidence that the universe has ever been manipulated by outside forces — in spite of ancient mythology or miraculous bible stories.

I've seen cars recognizable only by their tires, yet everyone came out alive with light injuries. I've seen cars with only moderate damage, with everyone dead.

I've seen a college freshman waiting at school for his mom to pick him up, but before she got there another driver had a heart attack, jumped the curb and killed the kid. Random is the rule.

The universe does not care who you are, where you come from, how religious you are, or how much money you have. We are all potential victims. Of course, good information and alert thinking help avoid problems before they happen. But you can't always avoid a drunk barreling the wrong way in your lane, or a tsunami that washes 430,000 innocent people to their deaths.

When you are at wit's end, or your life is on the line, or you're down in the foxhole of war, you might or might not pray. Prayer helps people transcend the circumstances they're trapped in; it can ease the stress in their minds. They need hope, even if it's fantastical. Personally, I think it's no different than doing a line of coke, or smoking a joint.

It's my belief that if more of the world embraced the truth of randomness, we would spend less time being afraid of imaginary, omnipotent gods in the sky, and spend more time helping our fellow human beings.” 

https://www.facebook.com/einsteinsgod/posts/1140743799280488?



Oriana:

“A tree falling on a child doesn’t reverse course because a mom cries out to God” — I don’t think there is a single adult in the world who believes otherwise. Gravity rules, not god — the laws of nature in general. The best discussion of this that I’ve come across happens to be the chapter on “Signs and Wonders” in Jesse Bering’s “The Belief Instinct” — one of the most important books I’ve ever read. Its intellectual clarity is breathtaking.


We are wired for seeking patterns. For instance, we tend to see faces even in inanimate objects: trees, cars. Our pattern-seeking cognitive bias, the practically irresistible tendency to connect the dots, can easily lead us astray. 

Speaking of cognitive errors, my father often pointed out that the basic error was the assumption of divine omnipotence. Once we remove that, more interesting concepts of something like a deity can be developed (though I'm not sure we need to cling to “deity”; when Rabbi Kushner speaks about the “power of ideals,” it’s a lot clearer to keep calling it the “power of ideals” rather than hanging the god baggage on it). The healing power of empathy and of our own unconscious is of special interest to me — how certain neural pathways are capable of taking over, and how the brain can rewire itself in an instant.

Paul Klee, The Forgetful Angel, 1939

ending on beauty:

 
The truth is, we are nearer to heaven
Each time we lie down.
Take a look at the cat
Rolled over with its feet in the air.

~ Charles Simic, from “Midsummer”


 

Saturday, June 4, 2016

WHY WE MARRY THE WRONG PERSON; #1 PREDICTOR OF DIVORCE; WOMEN AS PRIZE; JESUS: JEHOVA'S WITNESS? ONE MUSHROOM A DAY

 Melk Abbey, Austria

IN THE MIRROR

In the mirror in a cheap motel
I saw us while we made love.
I peeked shyly, afraid we’d look
ugly, two heaving

animals. Instead I saw
our bodies glow
in the room’s drowsy dusk,
our skin’s faint light

lacing our outline,
our curves and hollows
fused against the hum
of the freeway that divided our lives.

And I used to think
that sex had to do
with the feeling of power.
In the mirror I saw

two creatures clinging
to each other,
pale and mothlike,
printed on the dark.

~ Oriana © 2016
 
WHY WE WILL MARRY THE WRONG PERSON ~ ALAIN DE BOTTON

~  It’s one of the things we are most afraid might happen to us. We go to great lengths to avoid it. And yet we do it all the same: We marry the wrong person.

Partly, it’s because we have a bewildering array of problems that emerge when we try to get close to others. We seem normal only to those who don’t know us very well. In a wiser, more self-aware society than our own, a standard question on any early dinner date would be: “And how are you crazy?”

Perhaps we have a latent tendency to get furious when someone disagrees with us or can relax only when we are working; perhaps we’re tricky about intimacy after sex or clam up in response to humiliation. Nobody’s perfect. The problem is that before marriage, we rarely delve into our complexities. Whenever casual relationships threaten to reveal our flaws, we blame our partners and call it a day. As for our friends, they don’t care enough to do the hard work of enlightening us. One of the privileges of being on our own is therefore the sincere impression that we are really quite easy to live with.

Our partners are no more self-aware. Naturally, we make a stab at trying to understand them. We visit their families. We look at their photos, we meet their college friends. All this contributes to a sense that we’ve done our homework. We haven’t. Marriage ends up as a hopeful, generous, infinitely kind gamble taken by two people who don’t know yet who they are or who the other might be, binding themselves to a future they cannot conceive of and have carefully avoided investigating.

For most of recorded history, people married for logical sorts of reasons: because her parcel of land adjoined yours, his family had a flourishing business, her father was the magistrate in town, there was a castle to keep up, or both sets of parents subscribed to the same interpretation of a holy text. And from such reasonable marriages, there flowed loneliness, infidelity, abuse, hardness of heart and screams heard through the nursery doors. The marriage of reason was not, in hindsight, reasonable at all; it was often expedient, narrow-minded, snobbish and exploitative. That is why what has replaced it — the marriage of feeling — has largely been spared the need to account for itself.

What matters in the marriage of feeling is that two people are drawn to each other by an overwhelming instinct and know in their hearts that it is right. Indeed, the more imprudent a marriage appears (perhaps it’s been only six months since they met; one of them has no job or both are barely out of their teens), the safer it can feel. Recklessness is taken as a counterweight to all the errors of reason, that catalyst of misery, that accountant’s demand. The prestige of instinct is the traumatized reaction against too many centuries of unreasonable reason.

But though we believe ourselves to be seeking happiness in marriage, it isn’t that simple. What we really seek is familiarity — which may well complicate any plans we might have had for happiness. We are looking to recreate, within our adult relationships, the feelings we knew so well in childhood. How logical, then, that we should as grown-ups find ourselves rejecting certain candidates for marriage not because they are wrong but because they are too right — too balanced, mature, understanding and reliable — given that in our hearts, such rightness feels foreign. We marry the wrong people because we don’t associate being loved with feeling happy.

We make mistakes, too, because we are so lonely. No one can be in an optimal frame of mind to choose a partner when remaining single feels unbearable. We have to be wholly at peace with the prospect of many years of solitude in order to be appropriately picky; otherwise, we risk loving no longer being single rather more than we love the partner who spared us that fate.

Finally, we marry to make a nice feeling permanent. We imagine that marriage will help us to bottle the joy we felt when the thought of proposing first came to us: Perhaps we were in Venice, on the lagoon, in a motorboat, with the evening sun throwing glitter across the sea, chatting about aspects of our souls no one ever seemed to have grasped before, with the prospect of dinner in a risotto place a little later. We married to make such sensations permanent but failed to see that there was no solid connection between these feelings and the institution of marriage.

Indeed, marriage tends decisively to move us onto another, very different and more administrative plane, which perhaps unfolds in a suburban house, with a long commute and maddening children who kill the passion from which they emerged. The only ingredient in common is the partner. And that might have been the wrong ingredient to bottle.


 
The good news is that it doesn’t matter if we find we have married the wrong person.
 
We mustn’t abandon him or her, only the founding Romantic idea upon which the Western understanding of marriage has been based the last 250 years: that a perfect being exists who can meet all our needs and satisfy our every yearning.

We need to swap the Romantic view for a tragic (and at points comedic) awareness that every human will frustrate, anger, annoy, madden and disappoint us — and we will (without any malice) do the same to them. There can be no end to our sense of emptiness and incompleteness. But none of this is unusual or grounds for divorce. Choosing whom to commit ourselves to is merely a case of identifying which particular variety of suffering we would most like to sacrifice ourselves for.

This philosophy of pessimism offers a solution to a lot of distress and agitation around marriage. It might sound odd, but pessimism relieves the excessive imaginative pressure that our romantic culture places upon marriage. The failure of one particular partner to save us from our grief and melancholy is not an argument against that person and no sign that a union deserves to fail or be upgraded.

The person who is best suited to us is not the person who shares our every taste (he or she doesn’t exist), but the person who can negotiate differences in taste intelligently — the person who is good at disagreement. Rather than some notional idea of perfect complementarity, it is the capacity to tolerate differences with generosity that is the true marker of the “not overly wrong” person. Compatibility is an achievement of love; it must not be its precondition.

Romanticism has been unhelpful to us; it is a harsh philosophy. It has made a lot of what we go through in marriage seem exceptional and appalling. We end up lonely and convinced that our union, with its imperfections, is not “normal.” We should learn to accommodate ourselves to “wrongness,” striving always to adopt a more forgiving, humorous and kindly perspective on its multiple examples in ourselves and in our partners.

http://www.nytimes.com/2016/05/29/opinion/sunday/why-you-will-marry-the-wrong-person.html?action=click&contentCollection=Opinion&module=Trending&version=Full&region=Marginalia&pgtype=article


Oriana:

My life’s wisdom on relationships: it’s not about finding someone who is a perfect fit, but about becoming a loving person. However, I'd like to add that there are people with whom I feel comfortable being myself —  with whom I seem to become "more myself" — and those with whom a mask is needed, a narrowing of personality rather than an enlargement. You have to censor yourself, pretend, put on a false persona — as with an elderly relative whom you don’t want to offend. The fit is good enough when I feel I can be completely myself.

But I think the wisest part of this particular article is the statement that it's very important what the person is like when there is disagreement. In other words, the key is “conflict management.”

This reminds me of Marcus Aurelius: “Today you will meet someone who will annoy you.” But de Botton goes beyond that, to remind us that we too will annoy someone. This is the start of wisdom and humility: to know that it’s not just our partner who will inevitably keep frustrating and annoying us — we will annoy our partner in equal measure, without meaning to.

It takes a while to learn how not to escalate those unavoidable petty annoyances, to swallow one’s ego and not have to be right every time, to forswear the urge to punish and engage in blame games, nourishing resentments for years. It’s not always mutual. Sometimes the husband (it’s generally the husband, especially the “dominator” type) is convinced that he has a happy marriage, only to have a rude awakening when the wife suddenly asks for a divorce.

I wonder, though, whether we’d ever have had the motivation to get married if we had known at the start about all the disappointments and annoyances ahead — and if we hadn’t married, we would have missed the joy of that magical first year, which for many people is the best of life.

Someone estimated that only 17% of marriages are happy. It’s not unusual for married people to admit they would not marry the same partner if they could do it over again (and for women to admit they would not have had kids). Conflict resolution, sure, but also some people are mismatched to start with, and after that first year, when a lot of sex covers up a lot of incompatibility, the differences only grow and grow. In any restaurant one can see couples so bitter and depressed it's scary.


But past a certain age there is fear and inertia. Perhaps "speed dating for seniors" would be a solution — for both partners, while still married on paper. It would let them dip their toes in what it might be like to interact with someone else (getting rid of some delusions, too — there is no ideal “soul mate,” no “twin flame,” no savior — to change partners is to change problems).

(But am I sure? Perhaps it’s an inevitable part of life to keep on dreaming about “The One.” As long as it’s not obsessive, a fantasy relationship with a perfectly fulfilling partner is just part of our secret life, a bit of private imaginary joy. “But my fantasy relationships have always been satisfying,” a friend once said with perfect seriousness, after reflecting on the fact that she’d never known lasting love.)

And maybe Margaret Mead was right about the need for three different marriages as we go through the stages of life.

But speaking of stages of life, another phenomenon — and de Botton obviously doesn't have the space to write about everything — is that every few years it's a different marriage. Even a mismatched marriage can grow better and more cooperative as the partners’ good traits also deepen.

Maybe it all starts with accepting our own flaws. If we learn not to be harsh on ourselves, to understand that no one (not even ourselves!) is perfect, we probably won't be harsh on the partner. I also think that we don't have the right to punish the partner. Nothing is as sad as couples locked in warfare — already past eighty, but still trying to "win" that unwinnable war.

And yet I want to return to my idea that marriage is a pact of non-abandonment. “For richer, for poorer, in sickness and in health” — nothing surpasses these vows. This is the foundation of the dignity of marriage: we don’t abandon the person because of a diagnosis of cancer, for all the suffering it’s going to cause us too. We hold hands when one of us is devastated by having just lost his or her job. The thought that someone will be there for you in the hour of your greatest need is the sacred flame at the very heart of marriage. 


Duchamp: The Bride, 1912


CONTEMPT #1 FACTOR PREDICTING DIVORCE

“John Gottman began gathering his most critical findings in 1986, when he set up “The Love Lab” with his colleague Robert Levenson at the University of Washington. Gottman and Levenson brought newlyweds into the lab and watched them interact with each other. With a team of researchers, they hooked the couples up to electrodes and asked the couples to speak about their relationship, like how they met, a major conflict they were facing together, and a positive memory they had. As they spoke, the electrodes measured the subjects' blood flow, heart rates, and how much sweat they produced. Then the researchers sent the couples home and followed up with them six years later to see if they were still together.

From the data they gathered, Gottman separated the couples into two major groups: the masters and the disasters. The masters were still happily together after six years. The disasters had either broken up or were chronically unhappy in their marriages. When the researchers analyzed the data they gathered on the couples, they saw clear differences between the masters and disasters. The disasters looked calm during the interviews, but their physiology, measured by the electrodes, told a different story. Their heart rates were quick, their sweat glands were active, and their blood flow was fast. Following thousands of couples longitudinally, Gottman found that the more physiologically active the couples were in the lab, the quicker their relationships deteriorated over time.

The disasters showed all the signs of arousal—of being in fight-or-flight mode—in their relationships. Having a conversation sitting next to their spouse was, to their bodies, like facing off with a saber-toothed tiger. Even when they were talking about pleasant or mundane facets of their relationships, they were prepared to attack and be attacked. This sent their heart rates soaring and made them more aggressive toward each other. For example, each member of a couple could be talking about how their days had gone, and a highly aroused husband might say to his wife, “Why don’t you start talking about your day. It won’t take you very long.”

The masters, by contrast, showed low physiological arousal. They felt calm and connected together, which translated into warm and affectionate behavior, even when they fought. It’s not that the masters had, by default, a better physiological make-up than the disasters; it’s that masters had created a climate of trust and intimacy that made both of them more emotionally and thus physically comfortable.

Gottman wanted to know more about how the masters created that culture of love and intimacy, and how the disasters squashed it. In a follow-up study in 1990, he designed a lab on the University of Washington campus to look like a beautiful bed and breakfast retreat. He invited 130 newlywed couples to spend the day at this retreat and watched them as they did what couples normally do on vacation: cook, clean, listen to music, eat, chat, and hang out. And Gottman made a critical discovery in this study—one that gets at the heart of why some relationships thrive while others languish.

Throughout the day, partners would make requests for connection, what Gottman calls “bids.” For example, say that the husband is a bird enthusiast and notices a goldfinch fly across the yard. He might say to his wife, “Look at that beautiful bird outside!” He’s not just commenting on the bird here: he’s requesting a response from his wife—a sign of interest or support—hoping they’ll connect, however momentarily, over the bird.

People who turned toward their partners in the study responded by engaging the bidder, showing interest and support in the bid. Those who didn’t—those who turned away—would not respond or respond minimally and continue doing whatever they were doing, like watching TV or reading the paper. Sometimes they would respond with overt hostility, saying something like, “Stop interrupting me, I’m reading.”

These bidding interactions had profound effects on marital well-being. Couples who had divorced after a six-year follow up had “turn-toward bids” 33 percent of the time. Only three in ten of their bids for emotional connection were met with intimacy. The couples who were still together after six years had “turn-toward bids” 87 percent of the time. Nine times out of ten, they were meeting their partner’s emotional needs.

Contempt, they have found, is the number one factor that tears couples apart. People who are focused on criticizing their partners miss a whopping 50 percent of positive things their partners are doing and they see negativity when it’s not there. People who give their partner the cold shoulder—deliberately ignoring the partner or responding minimally—damage the relationship by making their partner feel worthless and invisible, as if they’re not there, not valued. And people who treat their partners with contempt and criticize them not only kill the love in the relationship, but they also kill their partner's ability to fight off viruses and cancers. Being mean is the death knell of relationships.

Kindness, on the other hand, glues couples together. Research independent from theirs has shown that kindness (along with emotional stability) is the most important predictor of satisfaction and stability in a marriage. Kindness makes each partner feel cared for, understood, and validated—feel loved. “My bounty is as boundless as the sea,” says Shakespeare’s Juliet. “My love as deep; the more I give to thee, / The more I have, for both are infinite.” That’s how kindness works too: there’s a great deal of evidence showing the more someone receives or witnesses kindness, the more they will be kind themselves, which leads to upward spirals of love and generosity in a relationship.

For the hundreds of thousands of couples getting married this month—and for the millions of couples currently together, married or not—the lesson from the research is clear: If you want to have a stable, healthy relationship, exercise kindness early and often.

We’ve all heard that partners should be there for each other when the going gets rough. But research shows that being there for each other when things go right is actually more important for relationship quality. How someone responds to a partner’s good news can have dramatic consequences for the relationship.

In one study from 2006, psychological researcher Shelly Gable and her colleagues brought young adult couples into the lab to discuss recent positive events from their lives. The psychologists wanted to know how partners would respond to each other’s good news. They found that, in general, couples responded to each other’s good news in four different ways, which they called passive destructive, active destructive, passive constructive, and active constructive.

Active constructive responding is critical for healthy relationships. In the 2006 study, Gable and her colleagues followed up with the couples two months later to see if they were still together. The psychologists found that the only difference between the couples who were together and those who broke up was active constructive responding. Those who showed genuine interest in their partner’s joys were more likely to be together. In an earlier study, Gable found that active constructive responding was also associated with higher relationship quality and more intimacy between partners.

There are many reasons why relationships fail, but if you look at what drives the deterioration of many relationships, it’s often a breakdown of kindness. As the normal stresses of a life together pile up—with children, career, friends, in-laws, and other distractions crowding out the time for romance and intimacy—couples may put less effort into their relationship and let the petty grievances they hold against one another tear them apart. In most marriages, levels of satisfaction drop dramatically within the first few years together. But among couples who not only endure, but live happily together for years and years, the spirit of kindness and generosity guides them forward.

http://www.theatlantic.com/health/archive/2014/06/happily-ever-after/372573/?utm_source=atlfb

Oriana:

I especially like the part about being there for the person not only in hardship, but also for sharing joy. This is not obvious, but it makes sense.

And the part about making a “bid” for connection, even if it’s just “Look at that pretty bird that just landed near the patio!” Perhaps we’d prefer to be alone with our thoughts and not be interrupted, but we do need to respond. That’s also part of the marriage (or relationship) contract: when a partner speaks, we respond.

The vast majority of marriages are unhappy. We should accept this as a fact and seriously try to figure out the causes. Is it something about our culture, or is marriage simply a disastrous institution bound to make most couples unhappy? 

Jan Bogaerts, Still Life with Cherries, 1937
 
*

WE MOSTLY HATE THE “NEAREST AND DEAREST”
 
“We hate those we love — or loved. People expressed the most ill will toward those they are closest to on a daily basis — acquaintances, friends, family, exes. Even within the family, the “nearest and dearest” arouse the most hatred — fathers especially, followed by mothers, in-laws, sibs. Curiously, very few hate their own significant others — just 1 in 100 — but far more hate a friend’s boyfriend or girlfriend.

Ex-husbands are among the most hated, especially for women between 28 and 32 years of age. Also in that elite company are co-workers. Far more women than men name “a friend” as the most hated person in their lives.

The good news is that hatred is uncommon. Over a lifetime, people say they hate about five people on average. Men are more likely to report feeling hate as they get older, peaking in the late 30s and then declining until the late 50s. But most of us don’t experience hate on a regular basis. Indeed, most people say they never feel hatred at all.”

http://www.huffingtonpost.com/wray-herbert/the-anatomy-of-everyday-h_b_5380440.html

 
Oriana:

Nothing surprising about parents and siblings being the most common hate objects. However, I am surprised that hating the boss isn't mentioned. As for hating one's lover, that too happens — mostly in the mixed form of love-hate. The abused partner is bound to feel at least some hate — or anyone who feels trapped in a bad marriage or relationship.

Most people may SAY they never feel hatred, but I doubt if they are telling the truth. I was definitely an intense hater when I was young. Being in a subordinate position was practically enough to provoke hatred of the dominator. Oddly enough, I didn’t actually hate my bosses — I only despised their incompetence. I knew I could do a better job, and I resented the high salaries they were getting for doing so little and so badly.

I'm relatively free of hatred now. That’s probably due mainly to having autonomy at last. I have momentary flare-ups of hatred after being mistreated, or remembering how someone ripped me off — but after a minute, the emotion is gone. Maturity taught me to shrug off such things, and put my energy into appreciating the good things while they last. Life feels less intense now, but the rarity of hatred is a positive development. Now and then I enjoyed the huge energy of hatred, a wave of it rolling through my body, but mostly it felt bad. I’ll take tranquillity any time.


 

*

WOMEN AS A PRIZE VERSUS WOMEN AS PEOPLE
 
Arthur Chu had an interesting article in the Daily Beast: “Your Princess Is in Another Castle: Misogyny, Entitlement, and Nerds.” He states that men are still growing up with the idea that the right woman = princess = a prize they win or “earn.” In Chu’s words: “The overall problem is one of a culture where instead of seeing women as, you know, people, protagonists of their own stories just like we are of ours, men are taught that women are things to “earn,” to “win.” That if we try hard enough and persist long enough, we’ll get the girl in the end. Like life is a video game and women, like money and status, are just part of the reward we get for doing well.

So what happens to nerdy guys who keep finding out that the princess they were promised is always in another castle? When they “do everything right,” they get good grades, they get a decent job, and that wife they were promised in the package deal doesn’t arrive?”

Chu replies that the frustrated man is then likely to become misogynous. But I could hardly go on reading about misogyny, struck as I was by the idea of woman as a reward. And I remembered a young man I met in college. He said he had a girlfriend. Then he added with great bitterness: “But she has a big chin.”

When you look at fairy tales, you see the disastrous situation at the extreme: the princess, sometimes in a beggarly disguise (hence the “Cinderella complex”), or imprisoned in a tower, simply awaits the prince, who has to prove his valor in order to win her like a prize in a contest. Or, to dignify this with the vocabulary of Joseph Campbell’s concept of the hero’s journey, the hero has to overcome many obstacles to be rewarded with love. He must show himself worthy of that prize by killing dragons.

Campbell admitted that his lectures on The Hero’s Journey often ended with women in the audience asking, “But what about the woman’s journey?” This puzzled Campbell. His standard reply was that a woman doesn’t go on the hero’s journey. She waits for the hero to win her through heroic deeds. She is his prize. Campbell noticed that women were very disappointed to hear this. He’d say, “Don’t you understand that you are this GREAT THING, the hero’s REWARD?” This was news to the women; no, they didn’t understand.

I think I understand why they didn’t understand. The culture does not value the feminine. There is the expression “a real man”; a young man needs to prove that he is a “real man.” I heard that expression countless times; I can’t think of a single time when anyone praised a woman for being a “real woman.” We wouldn’t know what it means. It’s not a cultural ideal.

On the other hand, there is the expression “a trophy wife.” It’s not unusual for a rich older man to marry a good-looking young woman, and show her off much as he’d show off a sports car. The trophy wife can indeed be seen as a prize that he’d won. But that situation is hardly the cultural ideal. The woman tends to be stereotyped and despised as a gold digger, and the man is likely to be seen as an old fool who was taken in by her. The very word “trophy” does not carry the same positive connotations as “reward.”

To be a prize or trophy, a woman has to be young and beautiful. The most common compliment that women past thirty begin to receive is being told that they look younger than their age. This storm of compliments becomes a hurricane after a woman turns forty. Turning forty becomes a crisis because that’s the expiration date for a woman as prize. In the past it used to be sooner. A woman who didn’t manage to get married by 25 was regarded as an old maid. So there has been progress, but the basic assumption persists.

I suspect that the idea that a man has to “earn” a beautiful woman was also stronger in the past. It was one way to manipulate men into enlisting in the military. “Only the brave deserve the fair” — I am thrilled to come across this saying only in old novels and historical movies.

*

Getting back to Campbell’s surprise that women didn’t understand they were “this great thing, the hero’s reward” — what surprises me is that a man as brilliant as Campbell didn’t see that a lot of women struggled with low self-esteem. I had low self-esteem, so of course I never thought of myself as a prize to be won. Reading the article reminded me of a relationship I had at 26. That young man, who definitely saw himself as a winner, did seem to perceive me as a prize that he was bound to win if only he persisted, got top grades, a good job etc (in fact that was my one and only chance to marry a college professor, ostensibly my dream — but I’d have to leave California for Columbus, Ohio. Was there any man for whom I’d leave California? Nah. That was pretty eye-opening).

Yet for a while he seemed absolutely sure I’d marry him. I interpreted the situation more or less correctly: his self-image as a winner made him over-confident. But I didn’t quite have the right label for myself in that relationship. Now I have that label: I was to be one of his big prizes, one of the rewards for his hard academic work and being such a good boy.

The child he’d have with me would also be a prize: no doubt bright, the kind of child it would be a joy to take to the museum of natural history, say, and share in the wonder of seeing a dinosaur skeleton for the first time. In retrospect I feel it was a compliment that he wanted to marry me and have me be the mother of his child; a compliment, yes, but an uncomfortable one, from someone who also said, “I'm not as conservative as you think; I’d allow my wife to work.” (I don’t mean to stereotype him: later he understood and respected the fact that I wanted not just “a room of my own,” but a whole “life of my own.” By that time he seemed to understand that was the way of educated women; they were complex people with their own lives. “No man can own you,” he said.)

Unfortunately or perhaps fortunately, I just wasn’t attracted to him, and for me it was a minor relationship, though one that taught me an important lesson about being loved when you yourself are not in love — how very uncomfortable that is. I decided that between loving without return and being loved without being able to love back, I’d take the unrequited love any time. Ah, life’s ironies. This complete mix-up of mythology versus reality. For his younger self, the unnecessary suffering because of the false premise that anyone is a prize to be won or lost, rather than a struggling and imperfect human being with a journey of their own.

I have a feeling that a lot of women had rather low self-esteem back when the main job open to women was being a secretary (speaking of which, the young man who tried to “win” me, a Ph.D., ended up with a church secretary). The overwhelming majority of women didn’t see themselves as a prize. They saw themselves as inferior and inadequate. They would never be beautiful enough, poised enough. They’d never be Miss America.

I wonder if we’d have been happier if we thought of ourselves as being “that great thing,” a prize to be won by a hero. Since attraction is mostly non-rational and it’s the unconscious that decides, it’s not about “winning” anything, and definitely not about “deserving.” You either are attracted or you aren’t, for reasons that are mysterious, since they reside in the unconscious. Attraction happens or it doesn’t happen. We invoke “chemistry,” and that may yet turn out to be correct. Some speculate that chemical signals are being sent and received, something like scent, but we can’t consciously detect them.

Or maybe the answer lies in our early years. The relationship with our parents is certainly important, but I doubt that it determines everything. Rather, perhaps X reminds you of your favorite cousin, while Y reminds you of someone you absolutely detested in the fourth grade. But Y finds you terrifically attractive, again for reasons that have little to do with who you really are. Meanwhile you pine for the aloof X. It’s a mess that can make you wonder if someone up there has a perverse sense of humor.

That mess is completely irrelevant to my present life, and I feel very lucky that it’s irrelevant. Once creative work became the center of my life, a lot of nonsense dropped away. The work became the real reward, and I myself — my unique experiences that could be turned into poetry — became my own prize.

For some women I know, this happened when they became mothers: they became empowered, adult. Giving birth gave them a sense of accomplishment; furthermore, to the child they were the boss, so they gained confidence from that too. For other women, as for me, it was discovering their vocation or some activity or cause where they could excel and be of service. Sure, at some level the yearning for the ideal mate still lingers, but that fantasy (the Prince is not so much a prize as a Savior, perhaps a heavier trip) is no longer central.

If we must think in terms of a reward, then the life we have right now is the reward — even though there will be times when it will feel more like punishment. Only maturity knows that love is not so much a feeling as a behavior: we show in a myriad ways how much we value both those who are special to us and people in general. We learn the pleasure that comes from giving and kindness.

And just as important, I think, we learn about the importance of luck, the power of circumstances. We don’t see ourselves and others so much in terms of winning and losing, as if hard work would inevitably make us “winners.” It’s not about winning or losing. The “winners” may end up paying a big price and the “losers” may actually turn out to be quite happy. And being loved — being loved by the right person, that is, someone we genuinely like and feel comfortable with — is a lot more important than having a girlfriend or spouse with a perfect chin.

Life is a lot more complicated than any fairy tale — both good and bad, all intertwined and constantly changing. The marriage as it is now is not as it was in the beginning, nor will it be the same five years from now. It could be said that every five or six years we are in a different marriage. The spouse changes, we change, the marriage changes. It’s fascinating to stay with the same person long enough to see this unfolding. It’s marvelous to work through the stage of power struggle and get to the stage of cooperation — now there is a real reward!

But mainly, rather than try to control life and win prizes, we learn to make the best of it. Pardon the cliché, but it’s true: maturity arrives when we learn to count our blessings. They are surprisingly numerous.

Image: Botticelli, Dante meets Beatrice, 1450. Dante’s princess married someone else, a rich banker. The only other thing we know is that she died at twenty-four.

Tangentially, Dante’s Beatrice comes across as a surprisingly unpleasant person: a scold, a dominatrix. Why would anyone imagine his princess as a shrew? I suppose that some people’s love fantasies may have a masochistic element. The church taught self-loathing: you are a sinner; of course that pure saint, Beatrice, would despise you. She deigns to act as a savior, but has to administer a severe tongue-lashing when the two first meet. Just . . . has to. Besides, being shamed and humiliated is good for you. Still, one wonders about Dante’s mother and/or wife (his actual wife, Gemma Donati, with whom he had five children — not his imaginary beloved).


*

WAS JESUS A JEHOVAH'S WITNESS?
 

Jesus was definitely not a Catholic. In fact, the historical Jesus was probably closer to a Jehovah’s Witness.

From an Amazon review of Bart Ehrman’s “Did Jesus Exist?: The Historical Argument for Jesus of Nazareth”:

Ehrman believes, quite convincingly, that Jesus was an apocalyptic prophet in the Jewish tradition of his time. Jesus believed and preached that God would soon intervene and destroy the forces of evil, bring in his good kingdom on earth, and install him on his throne. There is just one problem. Jesus was wrong. In fact, he was mistaken about a lot of things. People don't want to hear that, Ehrman points out.

The difficulty, Ehrman believes, is that this historical Jesus is obviously far too historical for modern tastes. Ehrman is right. Out of the context of his time, the overriding message of Jesus is preposterous, leaving anyone grasping for a meaningful faith nowhere to go, no inspiring message to believe in. Jesus the wisdom sage or Jesus the social revolutionary, for example, might offer solace, guidance, and hope, but a Jesus predicting the end times leaves us only a corpse.

*

This is precisely the problem: the “historical Jesus is obviously far too historical for modern tastes.” In fact, a question arises as to the sanity of Jesus: today he’d be discussed as a case of paranoid schizophrenia with the “Messiah complex,” fairly common in schizophrenic delusions.

So no, the historical Jesus is not the chocolate Jesus who’d fraternize with kindly atheists. And here we are already visualizing a surprised atheist entering heaven together with her atheist dog.

A part of this blog that may be of special interest is the rise of feminism coinciding with the supportive “god within” — as opposed to the external “god of punishment” (GOP).

http://oriana-poetry.blogspot.com/2013/05/chocolate-jesus.html


 
By the way, Jehovah’s Witnesses don’t believe that we “go” anywhere after death. Rather, they accept apocalyptic Judaism’s idea that death is a dreamless sleep from which a small minority (presumably only JWs) will be resurrected when the Kingdom arrives. Ancient Israelites did not believe in a disembodied soul. Life began with the first breath, and ended with the last breath — but Yahweh could breathe the breath of life into a dead body.

*

One mushroom a day keeps cancer away: ERGOTHIONEINE
 
Regular mushroom consumption (approximately one button mushroom per day) has been associated with a 64% decrease in the risk of breast cancer. (This common mushroom variety is best eaten cooked rather than raw, because it contains the toxin agaritine, which is reduced by cooking.)

The best news about mushrooms is a powerful micronutrient called ergothioneine. Ergothioneine is an antioxidant and anti-inflammatory which mushrooms have in very high concentrations. Cooking actually releases this powerful nutrient from the mushroom cells. Mushrooms also have high levels of polyphenols that give them a higher antioxidant level than green pepper and zucchini.



*

ending on beauty




Charles:

I love the opening poem. The Picasso painting is perfect.

Favorite sentence in the marriage article: "The person who is best suited to us is not the person who shares our every taste (he or she doesn’t exist), but the person who can negotiate differences in taste intelligently — the person who is good at disagreement.”

You described this in two words, “conflict management.”

The vast majority of marriages are miserable, but is that better than being single and miserable? The grass is always greener on the other side.

It is also interesting to note that in the vast majority of murders, victim and killer are among each other’s “nearest and dearest.”

A man can perceive a woman as a real human being AND a prize.


Oriana:

I have to agree: a caring, supportive partner is in fact a big prize to have won in life. For the sake of the argument, in my essay I used the definition of “prize” that’s closer to “trophy” for display and ego-stroking purposes. Hence the example of the young man complaining that his girlfriend has a big chin. With maturity we usually know better what to value in a partner, and how lucky we are to have a relationship with more benefits than drawbacks.

An excellent point about victims of murders being usually among the “nearest and dearest.” Well, plenty of opportunity for anger, which can turn to homicidal rage.

It was different being single and young and poverty-stricken — then I could really see the benefits of marriage and the startling freedom it provided: I could work or not work, I could take classes or just go to various lectures and workshops in town, etc. At the same time, I absolutely loved living alone; I could never erase the knowledge that I deeply preferred solitude. If I’d had enough income to live on without toiling full-time, there is no way I would have remarried — a sickening confession, but the vast majority of women perfectly understand the economic factor in marriage — though a husband can also provide other benefits, like being a handyman (a good handyman is harder to find than a good lover — how is that for a shocking statement!).

And I know some married writers, both men and women, whose now-retired spouses want to travel all the time, and require “companionship.” The writers “grin and bear it,” knowing there is a price for everything. And the travel-loving spouse may still provide various benefits the writers enjoy. As long as there is no secret hatred . . . (now there’s a touchy topic: secret, decades-long hatred in marriage).

But then writers and others who are “married to their work” are not typical of people in general. One needs to try out marriage to know if one likes not just living together, but living together in a more committed fashion (a marriage is a legal and economic contract) — and of course it’s never perfect. And thank goodness we have divorce. And pets. Surveys have found that many people prefer their pets to their spouses. This surprises no one.

Still, it’s striking that nowadays, probably thanks largely to Social Security and reverse mortgages (i.e., sufficient income), widows do not generally remarry, even when they have the opportunity and aren’t yet too advanced in age. Even more striking is the very existence of the expression “the merry widow.” Funny that we never hear about the “merry widower,” but we do hear of men who die within a short time after the wife’s death. It’s also telling that there is no “marriage advantage” for women in terms of health and life expectancy.

Perhaps we should face the fact that it’s not about marriage per se. Our great desire is to be loved, and to be secure in being loved — not to have to fear abandonment.