Saturday, June 18, 2016

NIETZSCHE WRONG! SUFFERING WEAKENS; RÓŻEWICZ: AGAINST ICARUS; JESUS DIDN’T BELIEVE IN SOULS GOING TO HEAVEN; "SHELL SHOCK" IS PHYSICAL BRAIN INJURY

At one time I don’t know when
at one time I thought I had the right the duty
to shout at the plowman
look look listen you blockhead
Icarus is falling
Icarus the son of vision is drowning
leave your plow
leave your field
open your eyes
there Icarus
is drowning
or that shepherd
with his back turned to the drama
of wings sun flight
of fall

I said oh blind ones

But now I don’t know just when now
I know the plowman should till the field
the shepherd watch over his flock
the venture of Icarus is not their venture
this is how it must end
And there is nothing
disturbing
about a beautiful ship sailing on
to its port of call

~ Tadeusz Różewicz, from “A Didactic Tale”

Jaroslaw Anders:

~ In “A Didactic Tale,” Rozewicz meditates on the painting “The Fall of Icarus,” traditionally attributed to Brueghel. In the painting, which has inspired several poems, most notably Auden’s “Musée des Beaux Arts,” Icarus’s fatal fall is virtually unnoticed by the central figure of the painting, the plowman preoccupied with his mundane task. The earthbound gravity wins over the reckless upward striving of the human spirit. But unlike Auden’s stoical, detached observation (“everything turns away/Quite leisurely from the disaster”), Rozewicz unambiguously takes the side of the plowman.

At one time, he says, he might have thought he had the right to shout “look look listen you blockhead/Icarus is falling/Icarus the son of vision is drowning.” But now he understands that “the plowman ought to till the land/the shepherd watch over his flock/the venture of Icarus is not their venture.” It is the unremarkable folk, the common people, who keep the world from self-destructing. Like Brueghel’s plowman, they move around with their eyes fixed on the ground, preoccupied with the necessary tasks of caring and feeding. ~ “Against Color,” The New Republic, November 8, 2011

Oriana:

I think Auden is already more on the side of the plowman. Auden saw the historical failure of lofty ideologies, great visions and ambitions. The plowman, the shepherd, the sailors: these are the people who are of use to others, and who ultimately advance progress in small but essential ways. The myth itself was already meant as a cautionary tale: our admiration was supposed to be for the sensible, moderate Daedalus, the brilliant engineer, not for the foolish youngster with his hubris.

*

Cherub and Lupe Vélez, the Mexican-born Hollywood star. Her suicide in 1944 is described in the book "From Bananas to Buttocks."

 
NIETZSCHE WAS WRONG: WHAT DOESN’T KILL ME MAKES ME WEAKER


We now know, from dearly bought experience, much more about post-traumatic stress disorder than we used to. Apparently, one of the symptoms by which it is made known is that a tough veteran will say, seeking to make light of his experience, that “what didn’t kill me made me stronger.” This is one of the manifestations that “denial” takes. ~ Christopher Hitchens, “Mortality,” 2012

Oriana: “About suffering they were never wrong, the Old Masters.” So begins Auden’s great poem based on Brueghel’s painting, Landscape with the Fall of Icarus. The Old Masters knew that suffering goes on in untidy corners, ignored and soon forgotten. Nietzsche, on the other hand, was wrong about suffering. “That which does not kill us makes us stronger” is perhaps his most famous aphorism, and the notion lives on. This is ironic, given that Nietzsche’s own life was rather short and miserable, and all the worse for his suffering (the newest thinking is that it was brain cancer); it seems pretty obvious that the great philosopher was in denial.

Worse, the saying continues to resonate, and not just with Catholics, lapsed or otherwise, who were brainwashed to believe that “suffering is good for you.” We all got brainwashed.

Noam Shpancer: “One reason is that suffering, as Freud famously recognized, is an inevitable part of life. Thus we have developed many ways to try to ease it--one of which is bestowing upon it transformative powers (another is by believing in an afterlife, of which Freud disapproved; still another is cocaine, of which he was, for a time, a fan).

Another reason is that American culture, born of trauma and imbued with a hopeful can-do ethos, wants to believe this idea, finding it self-affirming. Once we have acquired a certain belief we tend to see, remember, and report mostly instances and events that support it. This is called confirmation bias.

Yet another reason we think trauma may be transformative is that we see variants of this process around us. Bacteria that are not killed entirely by an antibiotic will mutate and become resistant to it. People who go through the hardship of training tend to improve their performance. But human beings are not bacteria, and good training is not a traumatic event.

Now it is true that, in an evolutionary sense, those who survive a calamity are by definition the fittest. But it is not the calamity that made them so. For our minds, however, the leap is short between seeing the strong emerge from a calamity and concluding that they are strong because of the calamity.

Years ago, during my mandatory army service in Israel, I took part in anti-terrorist training that involved working with the K9 unit. I asked the unit commander where he found those vicious attack dogs of his. Most people, he said, believe that wild street dogs make the best anti-terrorist dogs, having survived the, well, dog-eat-dog world of the mean streets. But the truth is just the opposite. Street dogs are useless for this--or any other--work because they are unpredictable and not trainable. Dogs that have been well cared for, loved, and protected all their lives--those are the best anti-terrorist dog candidates.

And this is true for humans as well. Mayhem and chaos don't toughen you up, and they don't prepare you well to deal with the terror of this world. Tender love and care toughen you up, because they nurture and strengthen your capacity to learn and adapt, including learning how to fight, and adapting to later hardship.


Nietzschean — and country song — wisdom notwithstanding, we are not stronger in the broken places. What doesn't kill us in fact makes us weaker. Developmental research has shown convincingly that traumatized children are more, not less, likely to be traumatized again. Kids who grow up in a tough neighborhood become weaker, not stronger. They are more, not less likely to struggle in the world.”

https://www.psychologytoday.com/blog/insight-therapy/201008/what-doesnt-kill-you-makes-you-weaker



Oriana:

It’s being loved that makes us stronger. Connect, connect, connect! When hardship strikes, we need empathy. And that’s just “hardship.” In case of real trauma, we need a great deal of empathy and other help too, or we may never recover.

Chuck Lorre: "That which does not kill me makes me bitter.” I used to be a classic example of this. In the end I came to my senses, not wanting to waste any more of my life on bitterness. But some people stay bitter to the end, depressed in their seventies and eighties, growing ever more bitter. So sad.

People resist the idea that Nietzsche was wrong because they want to justify suffering as something that "toughens us up." Unfortunately this serves as an excuse for all kinds of cruelty. It’s interesting that Nietzsche saw religions as being at bottom systems of cruelty, yet didn’t see that the “suffering is good for you” argument is also in service to cruelty.

Some people even imagine that the more diseases we survive, the better for the body — that's part of the argument against vaccines. No, diseases harm us. Some of the damage is irreversible.

True, there are examples of people whom tragedy transformed into heroes or activists. But we are quick to forget the countless victims whose lives were simply destroyed by suffering. We don’t want to contemplate what drives people to suicide. Yet it bears repeating: suffering does not make us stronger. Being loved is what makes us stronger.

Charles:

Many people believe that suffering makes us stronger because it makes a more interesting story — the overcoming of suffering.

Oriana:

I’m constantly aware of how much more I could do without my handicap. I’d be a more interesting person, actually, having had more experiences: by now I would have traveled to Italy, Ireland, and Lithuania, and met who knows how many interesting people who would have enriched my psyche.

And instead of reading about inflammation and trying to find out which remedies are the least harmful, I could be reading wonderful essays and poems.

So much energy goes into dealing with suffering, energy that could be put to better use. Aren’t children better off when not crippled by polio? Should we bring polio back because it “builds character”?

But it’s perfectly human not to want to acknowledge how destructive suffering is, and to go into denial about that obvious aspect. We latch on to the stories of exceptional individuals. Even they weren’t exactly made stronger by suffering — their life expectancy, for instance, got permanently shortened, and that “worn out” feeling and various other “souvenirs” of the illness or another catastrophe may never go away — but they found a creative adaptation to adversity.

Yes, that’s a more interesting story than the story of being slowly destroyed by suffering, which, in different ways, at different paces and different degrees of intensity, is what life is; and the degree and speed of destruction matter a lot. It’s marvelous to contemplate the stories of survival and of some kind of accomplishment against all odds: the once-per-history Stephen Hawking. But when I think of people I’ve known who did succumb to cancer after long months of terrible suffering, perhaps dying with stoic heroism, without complaining: no, they did not gain anything from the cancer that they would not gladly give back for even one more month of normal life. Perhaps even just one more week. Who knows, maybe even one more day.

The price of suffering is well known to those who suffer, but they know that their stories, unless crowned by a miraculous recovery, are not welcome. It’s the same with the stories of immigrants: unless they conform to what people want to hear, they are not welcome, and the enormity of suffering involved in being an immigrant remains unknown.

Let’s face it, the stories of suffering as suffering are totally unwelcome. You’re supposed to be destroyed quietly, almost secretly. Your obituary may mention how brave you were, how you could even crack a joke. It omits the contributions you might otherwise have made during that “long battle with X.”

HOW SUFFERING MAKES US WEAKER (a prelude)

I’ll give this more space in an upcoming blog; one of the posts will deal with the lifelong damage resulting from early trauma (including being bullied at school, which is by no means trivial).

For now, let me briefly contemplate what happens when someone has a serious but non-fatal stroke. So, something bad happened that didn’t kill — but did it make the person stronger?

Consider also the resources spent on rehabilitation, and we are talking about the best case here, where such resources are available. Perhaps in addition to the experts there is also a devoted spouse or parent available to continue the intensive training, trying to help the person relearn speech, basic skills, and some knowledge of the world. Imagine several years of this intensive effort and expenditure, all of it just to make up for the loss, not to advance to higher ground than before. And no “resistance to a future stroke” is being built.




You may object that a stroke is too extreme an example. Let’s take stammering, then. The King’s Speech was an excellent movie showing how the future King George VI struggled to overcome his stammer. We are shown the childhood abuse that led to his “losing his voice,” and the heroic persistence of both the speech therapist and his royal pupil, crowned with success of sorts.

The king manages, but just barely. He does not become the inspiring speaker he perhaps would have become had the suffering never taken place, had the stammer never developed, and had the time spent trying to overcome the handicap been freed to develop public speaking (or another skill) to the level of excellence.

Did suffering make King George stronger? While the overcoming of his handicap is certainly in the “inspiring” category, my final verdict, when I ponder the suffering, is “What a waste.” Unfortunately, most suffering is that. Chronic stress doesn’t make us more resilient. On the contrary, even minor stress can be very damaging if it comes on top of chronic stress.

A stray thought: our denial about the ravages and sheer waste of suffering may in part be due to the example of athletic training. But that’s not suffering in the real sense — and besides, the philosophy of “no pain, no gain” is now being seriously questioned. No, we don’t want too much inflammation, and we most definitely don’t want muscle or tendon damage!

*

In an ideal world, we wouldn’t be perceived as soldiers. We would be singers, dancers, lovers; travelers and explorers; loved children and loving parents. It’s not good to walk with a pebble in your shoe, its constant irritation (even if it’s just a small pebble!) eclipsing the more worthwhile aspects of life. Do not go into denial and praise the pebble. If it’s possible to remove the pebble, by all means remove it.

Cardinalis cardinalis in flight
 

“SHELL SHOCK” (BLAST INJURY) IS PHYSICAL BRAIN DAMAGE

“In early 2012, a neuropathologist named Daniel Perl was examining a slide of human brain tissue when he saw something odd and unfamiliar in the wormlike squiggles and folds. It looked like brown dust; a distinctive pattern of tiny scars. Perl was intrigued. At 69, he had examined 20,000 brains over a four-decade career, focusing mostly on Alzheimer’s and other degenerative disorders. He had peered through his microscope at countless malformed proteins and twisted axons. He knew as much about the biology of brain disease as just about anyone on earth. But he had never seen anything like this.

The brain under Perl’s microscope belonged to an American soldier who had been five feet away when a suicide bomber detonated his belt of explosives in 2009. The soldier survived the blast, thanks to his body armor, but died two years later of an apparent drug overdose after suffering symptoms that have become the hallmark of the recent wars in Iraq and Afghanistan: memory loss, cognitive problems, inability to sleep and profound, often suicidal depression. Nearly 350,000 service members have been given a diagnosis of traumatic brain injury over the past 15 years, many of them from blast exposure. The real number is likely to be much higher, because so many who have enlisted are too proud to report a wound that remains invisible.

Perl and his lab colleagues recognized that the injury that they were looking at was nothing like concussion. The hallmark of C.T.E. is an abnormal protein called tau, which builds up, usually over years, throughout the cerebral cortex but especially in the temporal lobes, visible across the stained tissue like brown mold. What they found in these traumatic-brain-injury cases was totally different: a dustlike scarring, often at the border between gray matter (where synapses reside) and the white matter that interconnects it. Over the following months, Perl and his team examined several more brains of service members who died well after their blast exposure, including a highly decorated Special Operations Forces soldier who committed suicide. All of them had the same pattern of scarring in the same places, which appeared to correspond to the brain’s centers for sleep, cognition and other classic brain-injury trouble spots.

Then came an even more surprising discovery. They examined the brains of two veterans who died just days after their blast exposure and found embryonic versions of the same injury, in the same areas, and the development of the injuries seemed to match the time elapsed since the blast event. Perl and his team then compared the damaged brains with those of people who suffered ordinary concussions and others who had drug addictions (which can also cause visible brain changes) and a final group with no injuries at all. No one in these post-mortem control groups had the brown-dust pattern.

Perl’s findings, published in the scientific journal The Lancet Neurology, may represent the key to a medical mystery first glimpsed a century ago in the trenches of World War I. It was first known as shell shock, then combat fatigue and finally PTSD, and in each case, it was almost universally understood as a psychic rather than a physical affliction. Only in the past decade or so did an elite group of neurologists, physicists and senior officers begin pushing back at a military leadership that had long told recruits with these wounds to “deal with it,” fed them pills and sent them back into battle.

Trinitrotoluene, or TNT, was first used in artillery shells by the German Army in 1902. Soon after the First World War started in 1914, a rain of these devices was falling on the hapless men on each side of the front. It was a level of violence and horror far beyond the cavalry charges of earlier wars. Very quickly, soldiers began emerging with bizarre symptoms; they shuddered and gibbered or became unable to speak at all. Many observers were struck by the apparent capacity of these blasts to kill and maim without leaving any visible trace. The British journalist Ellis Ashmead-Bartlett famously described the sight of seven Turks at Gallipoli in 1915, sitting together with their rifles across their knees: “One man has his arm across the neck of his friend and a smile on his face as if they had been cracking a joke when death overwhelmed them. All now have the appearance of being merely asleep; for of the several I can only see one who shows any outward injury.”

One British doctor, Frederick Mott, believed the shock was caused by a physical wound and proposed dissecting the brains of men who suffered from it. He even had some prescient hunches about the mechanism of blast’s effects: the compression wave, the concussion and the toxic gases. In a paper published in The Lancet in February 1916, he posited a “physical or chemical change and a break in the links of the chain of neurons which subserve a particular function.” Mott might not have seen anything abnormal in the soldiers’ brains, even if he had examined them under a microscope; neuropathology was still in its infancy. But his prophetic intuitions made him something of a hero to Perl.

Mott’s views were soon eclipsed by those of other doctors who saw shell shock more as a matter of emotional trauma. This was partly a function of the intellectual climate; Freud and other early psychologists had recently begun sketching provocative new ideas about how the mind responds to stress. Soldiers suffering from shell shock were often described as possessing “a neuropathic tendency or inheritance” or even a lack of manly vigor and patriotic spirit. Many shell-shock victims were derided as shirkers; some were even sentenced to death by firing squad after fleeing the field in a state of mental confusion.

This consensus held sway for decades, even as the terminology shifted, settling in 1980 on “post-traumatic stress disorder,” a coinage tailored to the unique social and emotional strain of returning veterans of the war in Vietnam. No one doubted that blasts had powerful and mysterious effects on the body, and starting in 1951, the U.S. government established the Blast Overpressure Program to observe the effects of large explosions, including atomic bombs, on living tissue. One of my uncles recalls standing in the Nevada desert as an Army private in 1955, taking photographs of a nuclear blast amid a weird landscape of test objects: cars, houses and mannequins in Chinese and Soviet military uniforms. At the time, scientists believed blasts would mainly affect air pockets in the body like the lungs, the digestive system and the ears. Few asked what it would mean for the body’s most complex and vulnerable organ.

Daniel Perl is continuing to examine the brains of blast-injured soldiers. After five years of working with the military, he feels sure, he told me, that many blast injuries have not been identified. “We could be talking many thousands,” he said. “And what scares me is that what we’re seeing now might just be the first round. If they survive the initial injuries, many of them may develop C.T.E. years or decades later.”

Perl takes some solace from the past. He has read a great deal about the men who suffered from shell shock during World War I and the doctors who struggled to treat them. He mentioned a monument in central England called “Shot at Dawn,” dedicated to British and Commonwealth soldiers who were executed by a firing squad after being convicted of cowardice or desertion. It is a stone figure of a blindfolded man in a military storm coat, his hands bound behind him. At his back is a field of thin stakes, each of them bearing a name, rank, age and date of execution. Some of these men, Perl believes, probably had traumatic brain injuries from blasts and should not have been held responsible for their actions. He has begun looking into the possibility of obtaining brain samples of shellshocked soldiers from that war. He hopes to examine them under the microscope, and perhaps, a century later, grant them and their descendants the diagnoses they deserve.”

http://www.nytimes.com/2016/06/12/magazine/what-if-ptsd-is-more-physical-than-psychological.html


 
INTROVERSION OR INTELLECT? (a more subtle understanding of introversion)


“What many people ascribe to introversion really belongs in the intellect/imagination domain. Intellect/imagination represents a drive for cognitive engagement of inner mental experience, and encompasses a wide range of related (but partially separate) traits, including intellectual engagement, intellectual curiosity, intellectual depth, ingenuity, reflection, introspection, imagination, emotional richness, artistic engagement, and aesthetic interests.

Traits such as sensitivity and social anxiety are also not part of the Big Five introversion-extraversion domain. To be sure, many people may think of themselves as introverted because they are highly sensitive. But research shows that sensory processing sensitivity is independent of introversion. The various manifestations of being a highly sensitive person — inhibition of behavior, sensitivity to environmental stimuli, depth of information processing, and physiological reactivity — are linked to neuroticism and intellect/imagination, not introversion.

Finally, there's a common misconception that all introverts enjoy solitary activities. However, that isn't a defining feature of introverts. Responses such as "Enjoy spending time by myself" and "Live in a world of my own" involve an equal blend of introversion and intellect/imagination. Contrary to popular conceptualizations of introversion, preferring to be alone is not the main indicator of introversion.

The desire for positive social attention seems to be a particularly strong indicator of extraversion [4]. For example, Jacob Hirsh and colleagues found that taking into account the rest of the Big Five personality traits (agreeableness, neuroticism, conscientiousness, and intellect/imagination), the following 10 behaviors were most uniquely predictive of extraversion (from a list of 400 activities):

1. Told a dirty joke.

2. Planned a party.

3. Entertained six or more people.

4. Told a joke.

5. Volunteered for a club or organization.

6. Tried to get a tan.

7. Attended a city council meeting.

8. Colored my hair.

9. Went to a night club.

10. Drank in a bar.

Why might the drive for social attention be so strongly linked to extraversion? One possibility is that many human rewards are social in nature. Our complex social lives are probably the dominant force in human evolution, driving the evolution of intelligence, creativity, language, and even consciousness. The human reward system, therefore, most likely evolved to be particularly responsive to social rewards.

There are costs to extraverted behavior, however. This includes time and energy that could be invested in other activities, such as accomplishing a goal (conscientiousness) or engaging with ideas and imagination (intellect/imagination). There is also the risk that inappropriate attention-seeking behavior can fall flat, leading to reduced attention-holding power. Finally, high levels of exploration of the environment can expose extraverted individuals to increased physical risks. For instance, extraverts are more likely to be hospitalized due to accident or illness, and are more likely to become involved in criminal or antisocial behaviors and get arrested.

It's important to distinguish, however, between the most prominent behavioral manifestation of extraversion (desire for social attention) and the core underlying mechanism of extraversion (reward sensitivity). Even though reward sensitivity need not be limited exclusively to social situations, high reward sensitivity likely motivates extraverts to seek out potentially rewarding positive social interactions, and fuels them to display behaviors that will increase social attention (e.g., friendliness, smiling, high energy, loudness, exhibitionism, positive emotions).

From a biological perspective, reward sensitivity is likely governed by dopamine. While dopamine is involved in a variety of cognitive and motivational processes, the unifying function of dopamine is exploration. According to Colin DeYoung, "the release of dopamine, anywhere in the dopaminergic system, increases motivation to explore and facilitates cognitive and behavioral processes useful in exploration."

A lot of introverts notice that they often need to be alone to recharge their batteries after vigorous social interactions, whereas extraverts appear to gain energy from social interactions. This can be explained by dopamine's function in energizing potentially rewarding social interactions, as well as its role in overcoming the cost of effort. For introverts, such interactions are more effortful and tiring due to their less active reward system.”

http://blogs.scientificamerican.com/beautiful-minds/will-the-real-introverts-please-stand-up/



Oriana:

Funny that a big indicator of extraversion is telling dirty jokes.

I thought wanting to spend time alone so as to process experience was the very definition of introversion, but it does make more sense to speak of being high on the intellect/imagination dimension. By the way, this dimension is traditionally designated as “openness to experience,” which doesn’t seem to be an accurate equivalent, though openness to INNER experience would be part of “intellect/imagination.”

I think what I have is an openness to ideas. As for experiences, I need to consider their possible cost, including unpleasant memories and the impact on health.

I can even imagine myself becoming a lot more sociable — if I lived around interesting, educated people, for instance. The whole dimension of introversion is not terribly clear. So much depends on the context. When I visited Boston and met a lot of educated people I became so sociable I could hardly shut up.

So perhaps it's more about the quality of people introverts meet. With the right people, I am sociable; with those who are into small talk or women who talk exclusively about their children, I find excuses to leave.

I'd still need lots of solitude in order to process the social interactions. A little goes a long way because I need to relive anything significant and think about it — to let my mind loose on it. If I don't have enough time to process, then life seems to just flee like sand through the fingers and becomes pretty meaningless. But the processing of experience is part of the intellect/imagination dimension rather than introversion per se.

I still think there is something to the augmenter/reducer dimension introduced in the sixties by the psychologist Asenath Petrie. Reducers have weaker cortical arousal and need strong stimulation (e.g. noisy music); augmenters tend to magnify stimulation, so they prefer the quiet and the subtle. Reducers, who tend to be chronically understimulated, are more likely to be smokers and to rely on coffee and other stimulants to raise their arousal level. They are easily bored. Augmenters seek out silence or soothing stimulation. Doesn’t that sound like the classic description of an introvert?


JESUS NEVER SAID SOULS GO TO HEAVEN OR HELL AFTER WE DIE; HOW CHRISTIANS SWITCHED TO PLATO

“Neither Jesus nor any writer of the bible says anything about the soul going anywhere when they describe death. Nor do they identify the soul with the mind, or with the whole human being, as Christians began doing in the fourth century. Jesus certainly taught that there will be life after death — the resurrection — but he didn’t teach that there will be life right after death, which is what most Christians now believe.

Jesus talked about souls, but he didn’t think of them in the way that most Christians do. Like other first-century Jews, he thought of a person’s soul as what made him or her be alive [“the animating principle” — oddly enough, that’s how the Catholic Encyclopedia defines the soul]. Common words for soul were nefesh and ruach in Hebrew, and psyche and pneuma in Greek. Like the words for soul in most languages, these came from words for breath, wind, and what’s moving. The reason words meaning moving air were used for what makes us alive is probably that people noticed that as long as we are alive, we breathe, and when we “breathe our last breath,” we die. So they thought that our breath is what keeps us alive. Nothing here was thought of as immortal, as the soul is immortal in Greek dualism. The soul was understood to be mortal just like the rest of the person, and at death, both were destroyed.

If Jesus thought of the soul as what makes a person alive, and not as the person’s nonphysical, immortal mind, where did the now popular idea that the soul is the nonphysical, immortal mind come from? That idea came not from the bible but from Greek philosophy. Greek-influenced Christians tended to be dualists, thinking of each person as two things: a mortal body being controlled by an immortal soul.

The most influential of those dualists was Augustine, who defined a human being as “a rational soul using a mortal and earthly body.” That definition would have puzzled Jesus, who thought of a human being as one thing, a living body, not two things: a soul plus a body that it “uses.”

In switching to Platonic ideas about death liberating the immortal soul, Christian thinkers quietly put aside Jesus’ ideas, which he shared with the writers of the bible, that death destroys us. What Jesus added was that the destruction of death is not permanent because at the end of the world God will intervene in the natural order and resurrect everyone [in flesh], judge them, and reward or punish them.

In Jesus’ day, this idea of the resurrection was less than two centuries old and was not accepted by all Jews. The Sadducees rejected it because it was not well-grounded in the scriptures. If you read through the whole Old Testament — over one thousand pages — God says nothing at all about anyone living after they die. And just before he drives Adam and Eve out of the garden, he scolds Adam, saying, “You are dust, and to dust you shall return.”

There are just two sketchy prophetic passages in the OT that suggest a future resurrection, and it is not a resurrection of the human race. These passages were written at a time when Jews were being persecuted, and in both of them only Jews — maybe only some Jews — will be resurrected.

Any Jew who believed in the resurrection of the dead at the time of Jesus, then, had very little to base it on. Jesus is vague about what it will involve, except to suggest that everyone, not just some Jews, will be resurrected, and that there will be judgment after the resurrection, followed by happiness for the good people and suffering for the bad. But whatever he said about the resurrection of the dead, it is clear that he did not say that people’s souls go to heaven or hell when they die.” ~ John Morreall, “Questions for Christians,” 2014. (John Morreall is professor and chair of the Department of Religious Studies at the College of William and Mary.)

(By that logic there are no human souls in heaven or hell right now, as the author explains in the chapter that follows; heaven or hell were to follow the bodily resurrection of the whole person.)

Master of the Parrot, Madonna and Child. Note that Baby J looks like Dennis the Menace.
 

Oriana:

I am amazed at how much of what I was indoctrinated with is wrong from the point of view of first-century Jewish beliefs. Such fantastic fabrications! The Jewish beliefs were fabrications as well, but what I was taught were heresies of those original fabrications.

And surely at least SOME clergy knew there was no scriptural support for the idea that there are any human souls in heaven or hell? That nobody is in a "better place" until the resurrection? (never mind that the resurrection seems awfully delayed).

I was so heavily indoctrinated (or call it having had an excellent memory even as a child) that I later discovered absurdities still clinging to my psyche, e.g. the soul being somehow separate from the body and going somewhere after the body dies. I’d never say I believe that, but some of this nonsense sticks like a stubborn weed and has to be uprooted. So it helps that even Jesus, if a historical Jesus ever existed, did not believe in a soul separate from the body.

But what helps most is simply the view of “soul” as consciousness, an emergent phenomenon that stems from brain function. Once brain function ceases, consciousness ceases, the way a flame goes out when its fuel is exhausted. Consciousness doesn’t “go anywhere.” It ceases.



Monet: Antibes seen from Salis Gardens

 
ending on beauty:

 
And when the Heavens — disband —
And Deity conclude —
Then — look for me. Be sure you say —
Least Figure — on the Road —

~ Dickinson, 401

This being the Southwest, "no figure on the road" is the typical experience. After the biblical “end of the world” (eagerly looked forward to by many), things would look pretty much the same, for a while at least, here . . . or in Nebraska, say.

I’m glad for the companionship of clouds.

By the way, “heavens disband” because “the kingdom of heaven” did not mean a place in the clouds. It meant the future paradise here on earth. “Thy kingdom come” means not that we “go to heaven” (the Jews at the time of Jesus had no concept of “going to heaven”) but that heaven comes to earth.

 




