Saturday, February 24, 2018


Natural Bridge Cave, Calaveras County, California


Beyond seven rivers,
beyond seven seas,
there was a princess who lived
on a glass mountain.

She had mirrors for companions.
She had see-through dreams.
The knights’ horses’ hooves
kept slipping.

But a hero, the youngest, the fool,
in return for a crumb to a crone
at the crossroads, with the usual one tooth,
won a horse with diamond hooves.


Every morning I slid
my hands into solitude
as into a basin of water,
and waited for the knight.

Only diamond like a dancing star
could carve a stairway
up the mirrored slope.
Ice-blue windows blew open,

crystal doors slammed shut.
An owl flew to the top:
“If not you, then who?
Who else will be a fool?”


The mountain is slippery and steep.
Its sheen half-blinds me with reflections,
a glassy, insomniac light.
I carve one step, then another.

It may take me a lifetime, I know.
But the princess in the tower of cloud
can hear — could always hear —
starry beat of diamond hooves.

~ Oriana

I offer this poem in puzzlement and frustration. Can a poet love her own poem even when it fails with the English-language audience? And one that no longer reflects the speaker’s current concerns or beliefs?

Completely unknown here, the Glass Mountain is a popular fairy tale in Poland. The version I know does not feature a tree with golden apples or the guardian eagle. Rather, the princess’s jealous father has her imprisoned in a castle on top of a steep, slippery glass mountain to keep her safe from men. But suitors keep trying, only to slip down the glass slope. Finally a good guy, the youngest son, an underdog who shows his kindness by sharing his meager food with a crone, is rewarded with a steed that has diamond hooves — and those hooves work!

The phrase “the glass mountain” has entered the language to mean an extreme challenge — perhaps an impossible one. “Trying to make Marek stop drinking is my glass mountain,” a woman may say while her listeners sigh heavy sighs (I wonder if “Codependent No More” has been translated into Polish).

Whoever is trying to climb a glass mountain is usually mistaken about the Princess (the prize), and especially about having the horse with the diamond hooves (so to speak). An aspiring artist may think that talent is his diamond hooves in his climb toward recognition, while in fact it would take even more talent and strangeness (or, let’s face it, pathology) — plus connections (location! location!), enormous energy, willingness to make huge sacrifices, and superhuman hard work.

In the broadest sense, life itself is a glass mountain we keep trying to scale, in a somewhat Sisyphean mode. But let’s say we do make it to the top. We can linger there only so long before it’s time to descend — and then we’re told we’re “over the hill”! (A saving feature: those later, “over-the-hill” years tend to be happier than youth.)

But the poem is strangely dear to me — perhaps because it reflects my starry-eyed younger self — both as the imprisoned princess and as the knight who starts coming to her rescue (note my brave attempt at a “liberated” solution — I try to be my own Prince Charming!). In the broader sense perhaps that younger, trusting self never dies — we continue to count on luck, or on what believers would call “grace.” It’s the “optimism bias.”

There are times when it’s best to cut one’s losses and stop trying to climb the glass mountain (or, for a country, to stop fighting an unwinnable war). This is difficult because we are apparently hard-wired for optimism — no matter how poor the odds.

(A shameless digression, a personal side issue: at this time I don’t have any Polish-speaking friends, so I literally have no one who grew up familiar with the fairy tale and thus understands the meaning of “glass mountain.” If I called a problem “my glass mountain,” it would elicit blank stares. If I called trying to eliminate the Electoral College or repeal the Second Amendment a glass mountain, I would again encounter sheer incomprehension. This is the unavoidable loneliness of someone who comes from another culture. There are worse fates.)


Your “glass mountain” tale seemed very familiar to me. I specifically remember the princess on the mountain story as being in one of my grade school readers. Though I remember the mountain as made of ice, not glass — maybe one of those “revisions” memory is prone to. But of course since the story is an obscure one in this culture, references to it would not be understood in the way you knew.

Yes, the kind of isolation you speak of, no one there to speak with you in your beloved native tongue, is a very lonely thing. After all, I am sure those first words we learn are for each of us the “real” words, connecting us to the “real” world, and not this strange translation.

You also say, more than once this week, "There are worse fates." And there are indeed — with this phrase I think we at once comfort and scold ourselves, voicing the pain and mother's admonition to "count your blessings" in four brief words.


Very keen observation on the first words being the “real words.” One reason is the emotional conditioning we acquire. But that’s also the liberating aspect of writing and speaking in a non-native language — its words are abstract chunks of letters and sounds, so you can use profanities and/or discuss matters like sex without feeling anything (or not much). That’s why one of my poems has the title “I Can Be a Poet Only in English.” Likewise, it’s not that offensive if someone uses vulgar language in your presence — again, because these are not “real words.”

~ “We like to think of ourselves as rational creatures. We watch our backs, weigh the odds, pack an umbrella. But both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being. People hugely underestimate their chances of getting divorced, losing their job or being diagnosed with cancer; expect their children to be extraordinarily gifted; envision themselves achieving more than their peers; and overestimate their likely life span (sometimes by 20 years or more).

The belief that the future will be much better than the past and present is known as the optimism bias. It abides in every race, region and socioeconomic bracket. Schoolchildren playing when-I-grow-up are rampant optimists, but so are grownups: a 2005 study found that adults over 60 are just as likely to see the glass half full as young adults.

Collectively we can grow pessimistic. But private optimism, about our personal future, remains incredibly resilient. A survey conducted in 2007 found that while 70% thought families in general were less successful than in their parents' day, 76% of respondents were optimistic about the future of their own family.

Even if that better future is often an illusion, optimism has clear benefits in the present. Hope keeps our minds at ease, lowers stress and improves physical health. In fact, a growing body of scientific evidence points to the conclusion that optimism may be hardwired by evolution into the human brain.

Hardwired for Hope?

I would have liked to tell you that my work on optimism grew out of a keen interest in the positive side of human nature. The reality is that I stumbled onto the brain's innate optimism by accident. After living through Sept. 11, 2001, in New York City, I had set out to investigate people's memories of the terrorist attacks. I was intrigued by the fact that people felt their memories were as accurate as a videotape, while often they were filled with errors. A survey conducted around the country showed that 11 months after the attacks, individuals' recollections of their experience that day were consistent with their initial accounts (given in September 2001) only 63% of the time. They were also poor at remembering details of the event, such as the names of the airline carriers. Where did these mistakes in memory come from?

Scientists who study memory proposed an intriguing answer: memories are susceptible to inaccuracies partly because the neural system responsible for remembering episodes from our past might not have evolved for memory alone. Rather, the core function of the memory system could in fact be to imagine the future — to enable us to prepare for what has yet to come. The system is not designed to perfectly replay past events, the researchers claimed. It is designed to flexibly construct future scenarios in our minds. As a result, memory also ends up being a reconstructive process, and occasionally, details are deleted and others inserted.

To test this, I decided to record the brain activity of volunteers while they imagined future events — not events on the scale of 9/11, but events in their everyday lives — and compare those results with the pattern I observed when the same individuals recalled past events. But something unexpected occurred. Once people started imagining the future, even the most banal life events seemed to take a dramatic turn for the better.  

Mundane scenes brightened with upbeat details as if polished by a Hollywood script doctor. You might think that imagining a future haircut would be pretty dull. Not at all. Here is what one of my participants pictured: "I was getting my hair cut to donate to Locks of Love [a charity that fashions wigs for young cancer patients]. It had taken me years to grow it out, and my friends were all there to help celebrate. We went to my favorite hair place in Brooklyn and then went to lunch at our favorite restaurant."

I asked another participant to imagine a plane ride. "I imagined the takeoff — my favorite! — and then the eight-hour-long nap in between and then finally landing in Krakow and clapping for the pilot for providing the safe voyage," she responded. No tarmac delays, no screaming babies. The world, only a year or two into the future, was a wonderful place to live in.

The Human Time Machine

To think positively about our prospects, we must first be able to imagine ourselves in the future. Optimism starts with what may be the most extraordinary of human talents: mental time travel, the ability to move back and forth through time and space in one's mind. Although most of us take this ability for granted, our capacity to envision a different time and place is in fact critical to our survival.

It is easy to see why cognitive time travel was naturally selected for over the course of evolution. It allows us to plan ahead, to save food and resources for times of scarcity and to endure hard work in anticipation of a future reward. It also lets us forecast how our current behavior may influence future generations. If we were not able to picture the world in a hundred years or more, would we be concerned with global warming? Would we attempt to live healthily? Would we have children?

While mental time travel has clear survival advantages, conscious foresight came to humans at an enormous price — the understanding that somewhere in the future, death awaits. Ajit Varki, a biologist at the University of California, San Diego, argues that the awareness of mortality on its own would have led evolution to a dead end. The despair would have interfered with our daily function, bringing the activities needed for survival to a stop. The only way conscious mental time travel could have arisen over the course of evolution is if it emerged together with irrational optimism. Knowledge of death had to emerge side by side with the persistent ability to picture a bright future.

 Using a functional magnetic resonance imaging (fMRI) scanner, we recorded brain activity in volunteers as they imagined specific events that might occur to them in the future. Some of the events that I asked them to imagine were desirable (a great date or winning a large sum of money), and some were undesirable (losing a wallet, ending a romantic relationship). The volunteers reported that their images of sought-after events were richer and more vivid than those of unwanted events.

This matched the enhanced activity we observed in two critical regions of the brain: the amygdala, a small structure deep in the brain that is central to the processing of emotion, and the rostral anterior cingulate cortex (rACC), an area of the frontal cortex that modulates emotion and motivation. The rACC acts like a traffic conductor, enhancing the flow of positive emotions and associations. The more optimistic a person was, the higher the activity in these regions was while imagining positive future events (relative to negative ones) and the stronger the connectivity between the two structures.

The findings were particularly fascinating because these precise regions — the amygdala and the rACC — show abnormal activity in depressed individuals. While healthy people expect the future to be slightly better than it ends up being, people with severe depression tend to be pessimistically biased: they expect things to be worse than they end up being. People with mild depression are relatively accurate when predicting future events. They see the world as it is. In other words, in the absence of a neural mechanism that generates unrealistic optimism, it is possible all humans would be mildly depressed.

Even when the incidents that befall us are the type of horrific events we never expected to encounter, we automatically seek evidence confirming that our misfortune is a blessing in disguise. No, we did not anticipate losing our job, being ill or getting a divorce, but when these incidents occur, we search for the upside. These experiences mature us, we think. They may lead to more fulfilling jobs and stable relationships in the future. Interpreting a misfortune in this way allows us to conclude that our sunny expectations were correct after all — things did work out for the best.

It seems that our brain possesses the philosopher's stone that enables us to turn lead into gold and helps us bounce back to normal levels of well-being. It is wired to place high value on the events we encounter and put faith in its own decisions. This is true not only when forced to choose between two adverse options (such as selecting between two courses of medical treatment) but also when we are selecting between desirable alternatives. 

Imagine you need to pick between two equally attractive job offers. Making a decision may be a tiring, difficult ordeal, but once you make up your mind, something miraculous happens. Suddenly — if you are like most people — you view the chosen offer as better than you did before and conclude that the other option was not that great after all. According to social psychologist Leon Festinger, we re-evaluate the options postchoice to reduce the tension that arises from making a difficult decision between equally desirable options.

In a brain-imaging study I conducted with Ray Dolan and Benedetto De Martino in 2009, we asked subjects to imagine going on vacation to 80 different destinations and rate how happy they thought they would be in each place. We then asked them to select one destination from two choices that they had rated exactly the same. Would you choose Paris over Brazil? Finally, we asked them to imagine and rate all the destinations again. Seconds after picking between two destinations, people rated their selected destination higher than before and rated the discarded choice lower than before.

In our experiment, after a decision was made between two destinations, the caudate nucleus rapidly updated its signal. Before choosing, it might signal "thinking of something great" while imagining both Greece and Thailand. But after choosing Greece, it now broadcast "thinking of something remarkable!" for Greece and merely "thinking of something good" for Thailand.

The Puzzle of Optimism

While the past few years have seen important advances in the neuroscience of optimism, one enduring puzzle remained. How is it that people maintain this rosy bias even when information challenging our upbeat forecasts is so readily available? Only recently have we been able to decipher this mystery, by scanning the brains of people as they process both positive and negative information about the future. The findings are striking: when people learn, their neurons faithfully encode desirable information that can enhance optimism but fail at incorporating unexpectedly undesirable information. When we hear a success story like Mark Zuckerberg's, our brains take note of the possibility that we too may become immensely rich one day. But hearing that the odds of divorce are almost 1 in 2 tends not to make us think that our own marriages may be destined to fail.

Why would our brains be wired in this way? It is tempting to speculate that optimism was selected by evolution precisely because, on balance, positive expectations enhance the odds of survival. Research findings that optimists live longer and are healthier, plus the fact that most humans display optimistic biases — and emerging data that optimism is linked to specific genes — all strongly support this hypothesis. Yet optimism is also irrational and can lead to unwanted outcomes. The question then is, How can we remain hopeful — benefiting from the fruits of optimism — while at the same time guarding ourselves from its pitfalls?

I believe knowledge is key. We are not born with an innate understanding of our biases. The brain's illusions have to be identified by careful scientific observation and controlled experiments and then communicated to the rest of us. Once we are made aware of our optimistic illusions, we can act to protect ourselves. The good news is that awareness rarely shatters the illusion. The glass remains half full. It is possible, then, to strike a balance, to believe we will stay healthy, but get medical insurance anyway; to be certain the sun will shine, but grab an umbrella on our way out — just in case.” ~


While we may be unable to fight off our own optimism bias — and perhaps we shouldn’t, given the health benefits — we may still find the optimism of others excessive, childish, downright crazy, and just deeply annoying in its cheerful, chirping idiocy. Well, that’s life. Eighty percent of the population think they are above average.

To me the annoying person is not the typical optimist, or the besotted lottery player. It’s the cheerleader who assures you “You can do it!” even though past a certain age you do know your abilities and your limitations, and no, you are not going to become an astrophysicist or a piano virtuoso or a business tycoon no matter how hard you try. Some situations really are glass mountains, and the princess in the castle on the top is not even your type.

But on the whole we can’t help being optimistic. There are worse fates.


The article on our ability to imagine the future, and on the evidence that optimism may be a bias hard-wired in our brains, is very interesting. Extending the argument a bit, perhaps the failure of that function — the inability to imagine anything but darkness ahead, culminating in the absolute darkness of death — may result not in mild but in the most severe depressions, the kind that may end in suicide. Without hope, there will be no survival, no push toward any future at all.


The optimism bias is so widespread that I wonder why it ever fails. But fail it does — reality can be just too compelling, and the higher the hope, the greater the despair in case of a crash, along with our inability to spin that crash (though normally we are fantastic spin doctors).

I reluctantly admit that T. S. Eliot was right when he said that we can bear only so much reality. Well, tough — so we can’t afford to be completely realistic and have to either kid ourselves or keep ourselves distracted (my main solution — but am I kidding myself without knowing it? I try not to form any expectations — but how many hidden, optimistic expectations is my life actually based on?)

Roman glass hydria, 4th century CE


Yet overdoing positive fantasies can backfire:


Yes, you read this correctly: they recovered MORE SLOWLY. This is almost as startling as the results of the famous 2006 STEP prayer study, in which those patients who knew they were being prayed for experienced significantly more post-surgery complications. But perhaps we shouldn’t be surprised. We have several studies by now that show that cultivating fantasies of already having accomplished the goal tends to lessen your efforts to achieve the goal. I think it’s related to the phenomenon well-known to writers: “talking out the book” may prevent them from ever writing the book. The drive to write is gone.

What also caught my attention is one of the readers’ comments:

“Positive thinking demands ego, demands your mental hard drive. Not that you shouldn't think — please, reality test things. But letting go of thinking and willing and pushing can have something to do with activating intuitive thinking, which can be much more powerful and useful.”

Here is an excerpt from the New Yorker article:

~ “Since publishing “The Secret,” in 2006, the Australian author Rhonda Byrne has been writing self-help manifestos based on the idea that people who think positive thoughts are rewarded with happiness, wealth, influence, wisdom, and success.

There’s no denying that many people have found comfort in Byrne’s ideas. Like religion, they offer an appealing, non-technical solution to life’s biggest problems while demanding nothing more of their adherents than faith. (Indeed, “The Secret” features verses from Matthew and Mark, promoting the idea that people receive in life what they seek in prayer.) But while many people give religion a pass because it claims to focus on questions that can’t be answered with science, the same is not true of success. Though Byrne presents her ideas without evidence, we can measure their worth with data.

According to a great deal of research, positive fantasies may lessen your chances of succeeding. In one experiment, the social psychologists Gabriele Oettingen and Doris Mayer asked eighty-three German students to rate the extent to which they “experienced positive thoughts, images, or fantasies on the subject of transition into work life, graduating from university, looking for and finding a job.” Two years later, they approached the same students and asked about their post-college job experiences. Those who harbored positive fantasies put in fewer job applications, received fewer job offers, and ultimately earned lower salaries. The same was true in other contexts, too. Students who fantasized were less likely to ask their romantic crushes on a date and more likely to struggle academically. Hip-surgery patients also recovered more slowly when they dwelled on positive fantasies of walking without pain.

Heather Barry Kappes, a management professor at the London School of Economics, has published similar research with Oettingen. I asked Kappes why fantasies hamper progress, and she told me that they dull the will to succeed: “Imagining a positive outcome conveys the sense that you’re approaching your goals, which takes the edge off the need to achieve.” Oettingen and Kappes asked two groups of undergraduates to imagine the coming week. One group fantasized that the week would go as well as possible, whereas the other group conjured a more neutral version of the week. One week later, when the students returned to the lab, the positive fantasizers felt that they had accomplished less over the previous week.

I asked Oettingen whether positive fantasies might sometimes be useful. She suggested that they might, if a person considered the specific steps that he would take to overcome the barriers to success. Kappes said that fantasies might be useful when you’re unable to satisfy a need—when you’re famished and hours from eating, for example—because they temporarily blunt the pang. There’s nothing wrong with getting lost in fantasy, as long as you aren’t ultimately hoping to indulge in the real thing.” ~

An "unclean spirit" or a medieval Yoda with a punk haircut?


There is also a connection with false memory — the brain starts believing you've already done something, so no more effort is needed. The drive to do it dissipates.
I know from my own experience how easy it is to create a false memory, to fool the brain into the sense of “mission accomplished!” Hence also the danger of praying instead of doing.

I love the statement that “you succeed in your head, but not in the world.” That's substituting positive thinking (“an oxymoron,” as someone observed) for action, and some people do just that. They imagine that saying “I'm rich” for 15 minutes a day will make them rich! That's the harm of it — when people substitute fantasy and/or affirmations for action, e.g., substituting the fantasy of walking for the hard work of physical therapy.

I've learned to “think at the keyboard” because if I think it out in my head with perfect clarity, there's a chance I won't type it out. But salvation lies in the fact that if I do sit down at the keyboard, something else emerges, something not originally thought up.

“Just do it!” is perhaps the most useful mantra there is — especially for women, who are more prone to hesitating and overthinking. “Life rewards action” is another very useful saying.

Still, there can be a lot of pleasure in a fantasy when the real thing is impossible but the grief over that impossibility is no longer acute. A friend of mine knew at one point she would never again experience a real romance — so she indulged in fantasies as a deliberate substitute. It’s possible that it spared her bitterness. She remained sweet-tempered, and, to everyone’s astonishment, in the last months of her life, in spite of her age, obesity, and disability, she did end up with a companion! Another instance of “you never know” — life has infinite surprises.

Likewise, going over some bad scenarios may be a very positive action — it shows us that those might not be actual huge disasters — somehow or other, we would find ways to cope. And what really happens is generally never quite what we expect — so that simple faith: somehow we’ll cope — is a lot more useful than fantasizing about wonderful outcomes.

The comment about intuitive thinking as opposed to “positive thinking” is also worth noting. 


That push [to do things], that drive, is what enables accomplishment. The constant drivel about "being positive" and rejecting "negativity" is not only annoying, it reminds me of the whole scenario we have created where, with our children, "everyone's a winner," everyone gets a trophy, no matter what they have or have not accomplished. The same with the extreme value put on "self-esteem," which we are instructed is of essential value, must always be maintained and supported. But what qualities, virtues, attributes, actions, or productions exist as the basis for all this glorious self-esteem? It seems to me often there is nothing there at all, and the prevailing culture doesn't even require it.

If you are a winner and a champion simply for existing, why try? You already have it, deserve it all, so there is no need to strive, to learn, to work, to do any of the hard things that lead to true accomplishment.

I have seen this with some post surgical patients, who do not work at their rehabilitation, perhaps because they imagined the surgery as some sort of miracle, that would produce the desired results without any effort of their own, other than their dream of magically induced recovery.

This is very like those believers in prayer and positive thinking who think such things are sure fire pathways to the future outcomes they desire. Have faith, pray, trust you will get what you want — or even something better — because god loves you and you deserve it.


One of the culprits here is the advertising industry. “You deserve the best” has become a major slogan, used to push anything from toothpaste and soda pop to luxury cars. It’s certainly fine to buy the best if you can afford it, but “deserve” doesn’t enter into this.

But the greatest damage has come with the New Age delusion of believing that it’s enough to “put it to the universe” that you desire something and the universe will magically comply with that wish — or else “something even better” is coming to you. Your secret super-power is simply repeating affirmations!

Alas, the way the brain works, affirmations may create an unconscious false memory of “mission accomplished,” and a consequent lessening or even absence of the drive to take a concrete action. That’s one of the first lessons writers learn: keep your mouth shut about a current project, or you’ll “talk it out” and end up with zero drive to do the writing. 

* *

On the frozen Neva, St. Petersburg, ca. 1910

"Trump? Putin? Internet? Soviet Union? Lenin, Stalin, Hitler? FOX News? Lady Gaga?

Life is life. Death is death. Without the former, there can be no latter, and vice versa. Or maybe not. It’s a brutally cold day today, but what would you expect. Winter is winter. Russia is Russia. Life goes on. We’ve all been dead for a long time now. The ice on the Neva is strong." ~ M. Iossel


“Actions are held to be good or bad, not on their own merits, but according to who does them. There is almost no kind of outrage — torture, imprisonment without trial, assassination, the bombing of civilians — which does not change its moral color when it is committed by ‘our’ side. The nationalist not only does not disapprove of atrocities committed by his own side, he has a remarkable capacity for not even hearing about them.” ~ George Orwell

This is so applicable to the current puzzlement over how Trump can get away with any outrage.

~ that's because to his base he's “One of Us.” That's all that matters — you root for your team.

Here it would be expected to have an image of one kind of atrocity or another . . . but let us instead detox by looking at a beautiful animal. 

~ “I was looking at a CT scan of one of the victims of the shooting at Marjory Stoneman Douglas High School, who had been brought to the trauma center during my call shift. The organ looked like an overripe melon smashed by a sledgehammer, with extensive bleeding. How could a gunshot wound have caused this much damage?

The reaction in the emergency room was the same. One of the trauma surgeons opened a young victim in the operating room, and found only shreds of the organ that had been hit by a bullet from an AR-15, a semi-automatic rifle which delivers a devastatingly lethal, high-velocity bullet to the victim. There was nothing left to repair, and utterly, devastatingly, nothing that could be done to fix the problem. The injury was fatal.

A year ago, when a gunman opened fire at the Fort Lauderdale airport with a 9mm semiautomatic handgun, hitting 11 people in 90 seconds, I was also on call. It was not until I had diagnosed the third of the six victims who were transported to the trauma center that I realized something out-of-the-ordinary must have happened. The gunshot wounds were the same low velocity handgun injuries as those I diagnose every day; only their rapid succession set them apart. And all six of the victims who arrived at the hospital that day survived.

Routine handgun injuries leave entry and exit wounds and linear tracks through the victim's body that are roughly the size of the bullet. If the bullet does not directly hit something crucial like the heart or the aorta, and they do not bleed to death before being transported to our care at a trauma center, chances are, we can save the victim. The bullets fired by an AR-15 are different; they travel at higher velocity and are far more lethal. The damage they cause is a function of the energy they impart as they pass through the body. A typical AR-15 bullet leaves the barrel traveling almost three times faster than, and imparting more than three times the energy of, a typical 9mm bullet from a handgun. An AR-15 rifle outfitted with a magazine cartridge with 50 rounds allows many more lethal bullets to be delivered quickly without reloading.

With an AR-15, the shooter does not have to be particularly accurate. The victim does not have to be unlucky. If a victim takes a direct hit to the liver from an AR-15, the damage is far graver than that of a simple handgun shot injury. Handgun injuries to the liver are generally survivable unless the bullet hits the main blood supply to the liver. An AR-15 bullet wound to the middle of the liver would cause so much bleeding that the patient would likely never make it to a trauma center to receive our care.

One of my ER colleagues was waiting nervously for his own children outside the school. While the shooting was still in progress, the first responders were gathering up victims whenever they could and carrying them outside the building. Even as a physician trained in trauma situations, though, there was nothing he could do at the scene to help to save the victims who had been shot with an AR-15. Most of them died on the spot, with no fighting chance at life.

A medical professor taught me about the dangers of drawing incorrect conclusions from data with the example of gum chewing, smokers, and lung cancer. He said smokers may be more likely to chew gum to cover bad breath, but that one cannot look at the data and decide that gum chewing causes lung cancer. It is the same type of erroneous logic that focuses on mental health after mass shootings, when banning the sale of semi-automatic rifles would be a far more effective means of preventing them.” ~
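The surgeon's comparison ("almost three times faster... more than three times the energy") follows directly from the kinetic-energy formula KE = ½mv²: energy grows with the square of velocity, which more than offsets the rifle bullet's lighter mass. A minimal sketch, using illustrative ballistics figures that are my own assumption (roughly an 8 g 9mm bullet at 360 m/s versus a 4 g 5.56mm AR-15 bullet at 990 m/s; actual loads vary):

```python
def kinetic_energy_joules(mass_kg: float, velocity_mps: float) -> float:
    """Kinetic energy: KE = 1/2 * m * v^2 (mass in kg, velocity in m/s)."""
    return 0.5 * mass_kg * velocity_mps ** 2

# Illustrative (assumed) figures for a typical load of each round.
ke_9mm = kinetic_energy_joules(0.008, 360)   # ~8 g bullet at ~360 m/s
ke_ar15 = kinetic_energy_joules(0.004, 990)  # ~4 g bullet at ~990 m/s

print(f"9mm:   {ke_9mm:.0f} J")
print(f"AR-15: {ke_ar15:.0f} J")
print(f"velocity ratio: {990/360:.2f}x, energy ratio: {ke_ar15/ke_9mm:.2f}x")
```

With these assumed figures the muzzle-velocity ratio comes out near 2.75× and the muzzle-energy ratio near 3.8×, consistent with the quoted description even though the rifle bullet weighs half as much.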


Desiderius Erasmus of Rotterdam, the leading figure of the Northern Renaissance, is widely considered the greatest of early humanists. Five hundred years ago, he faced a populist uprising led by a powerful provocateur, Martin Luther, that resulted in divisions no less explosive than those we see in America and Europe today.

Between 1500 and 1515, Erasmus produced a small library of tracts, textbooks, essays, and dialogues that together offered a blueprint for a new Europe. The old Europe had been dominated by the Roman Church. It emphasized hierarchy, authority, tradition, and the performance of rituals like confession and taking communion. But a new order was emerging, marked by spreading literacy, expanding trade, growing cities, the birth of printing, and the rise of a new middle class intent on becoming not only prosperous but learned, too.

Erasmus became the most articulate spokesman for this class. Moving from city to city in search of good libraries, fine wine, sparkling conversation, and skilled printers, he produced a new “design for living” based on the principles of tolerance, pluralism, concord, and virtuous conduct. In his 1515 essay Dulce bellum inexpertis (“War is sweet only to those who have not experienced it”), he denounced the ceaseless wars waged by rash princes. In The Education of a Christian Prince (1516), he offered a guide to good governance, urging sovereigns to pursue not their own interests but those of the people. In The Praise of Folly (1511), he mocked the pretensions and delusions of kings and courtiers, popes and theologians—part of his campaign to discredit the ruling class and open the way for renewal.

At the heart of Erasmus’s program was his revision of the New Testament. To reform Christendom, he felt, the text on which it was based had to be purified. This was the Vulgate, the Latin translation of the Bible. For a thousand years, this document had served as the scriptural foundation of the Roman Church. Many of its doctrines and institutions were based on specific words and phrases in the Vulgate. Yet a close inspection of the text raised many questions about its sacred status. It was marred by spelling mistakes, grammatical errors, clumsy constructions, and scribal blunders.

In 1500, Erasmus set out to learn Greek so that he could read the Gospels and Epistles in the language in which they had originally been written. (After the fall of Rome, knowledge of Greek had more or less disappeared from the Latin West.) He also began hunting down old manuscripts of the Greek New Testament; by comparing and collating them, he hoped to conjecture what their authors truly meant.

In early 1516, after months of exhausting writing, editing, and proofreading in the print shop of Johann Froben in Basel, Switzerland, the work was done. In addition to providing a revised Latin translation of the New Testament and a parallel Greek text (the first ever printed), Erasmus offered hundreds of annotations explaining the changes he had made. In them, he argued for a new way of reading the Bible—not as a collection of miracles, prophecies, and supernatural acts, but as the story of a transcendent being whose simplicity, humility, and compassion could encourage readers to change their ways and follow a more pious path.  

Erasmus of Rotterdam, an engraving by Albrecht Dürer, 1526
The publication of Erasmus’s revised New Testament was a milestone in biblical studies. It gave scholars the tools to read the Bible as a document that, while divinely inspired, was a human product that could be deconstructed and edited in the same manner as a text by Livy or Seneca. As copies began circulating, the magnitude of Erasmus’s achievement was immediately recognized. Not since Cicero had an intellectual figure so dominated Western discourse as Erasmus did in that enchanted spring of 1516. “Everywhere in all Christendom your fame is spreading,” wrote John Watson, a rector in England with whom he was friendly. “By the unanimous verdict of all scholars, you are voted the best scholar of them all, and the most learned in both Greek and Latin.”

The term “Erasmian” came into use to describe those who shared his vision. But those Erasmians represented only a small sliver of society. Erasmus wrote exclusively in Latin, for the highly educated, Latin-speaking elite. Dazzled by his readings in ancient Greek, Erasmus began promoting knowledge of that language as no less essential than Latin. “Almost everything worth learning is set forth in these two languages,” he wrote in one of his many educational texts. In these, Erasmus proposed a new curriculum for Europe, with instruction in Latin and Greek at its core.

Around the same time that the Erasmians were celebrating the dawn of a new enlightened era, a very different movement was gathering in support of Martin Luther. An Augustinian friar then in his early thirties, Luther had developed his own, unique gospel, founded on the principle of faith. Man, he thought, can win divine grace not through doing good works, as the Latin Church taught, but through belief in Christ. No matter how sincerely one confessed, no matter how many alms one gave, without faith in the Savior, he reasoned, no one can be saved. When Luther made this “discovery” [sola fide, “faith alone”] in around 1515, he felt that he had become “altogether born again and had entered paradise itself through open gates.”

In his famous 1520 tract To the Christian Nobility of the German Nation, Luther (writing in the vernacular) offered his own reform program. Along with a piercing attack on Rome’s oppressive practices, he proposed twenty-seven measures to protect both the souls and pocketbooks of the German people. He also rejected the idea that the clergy make up a separate spiritual class superior to the laity. All Christians, he declared, are priests of equal standing, free to read and interpret the Bible for themselves. Such attacks on privileged elites endeared him to Herr Omnes, “Mr. Everyman.”

Because of such defiance, Luther was ordered to appear before Charles V, the Holy Roman Emperor, at the Diet of Worms in April 1521. Refusing to recant his writings, Luther made his famous stand on behalf of his conscience as a Christian. For that, he could have been seized on the spot and burned as a heretic, but with the German people mobilizing behind him, any effort to arrest him would have caused a riot. So Luther was able to leave Worms, resume his writing, and set in motion the Reformation.

Initially, Luther admired Erasmus and his efforts to reform the Church, but over time Luther’s inflammatory language and his stress on faith instead of good works led to a painful separation. The flashpoint was the debate over whether man has free will. In dueling tracts, Erasmus suggested that he does, while Luther vehemently objected; after that, the two men considered each other mortal enemies.

Beyond that immediate matter of dispute, however, their conflict represented the clash of two contrasting world views — those of the Renaissance and the Reformation. Erasmus was an internationalist who sought to establish a borderless Christian union; Luther was a nationalist who appealed to the patriotism of the German people. Where Erasmus wrote exclusively in Latin, Luther often used the vernacular, the better to reach the common man. Erasmus wanted to educate a learned caste; Luther, to evangelize the masses. For years, they waged a battle of ideas, with each seeking to win over Europe to his side, but Erasmus’s reformist and universalist creed could not match Luther’s more emotional and nationalistic one; even some of Erasmus’s closest disciples eventually defected to Luther’s camp. Erasmus became an increasingly marginal figure, scorned by both Catholics, for being too critical of the Church, and Lutherans, for being too timid. In a turbulent and polarized age, he was the archetypal reasonable liberal.

Even as his reputation faded, Erasmus worked to complete his blueprint for Europe. In The Complaint of Peace, he decried the nationalist enmities that were splitting the continent. “The English are hostile to the French, for no other reason than that they are French,” he wrote. “The Scots are disliked by the British, solely for being Scots. Germans don’t agree with French, Spaniards don’t agree with either. What perversity—for the mere name of a place to divide people when there is so much which could bring them together!”

Disturbed by the growing bitterness between Catholics and Protestants, Erasmus called on Christians to put aside their private hatreds and bitter quarrels, and instead nurture a spirit of accommodation so that peace could reign. On Mending the Peace of the Church, as he titled the tract, was a resonant appeal for religious tolerance—a formative document in the development of that tradition.

As his end approached, Erasmus sought to warn his fellow Christians of the catastrophe he saw looming — in vain. After his death, in 1536, Europe descended into a century of religion-fueled violence, culminating in the Thirty Years’ War (1618–1648) — the continent’s most destructive conflict before World War I. Erasmus’s ideas about tolerance, peace, and clemency were ruthlessly suppressed. Both Catholics and Protestants dismissed him as a weak, vacillating man who lacked ardor and conviction, and whose commitment to an irenic form of Christianity founded on the Gospels was as objectionable as it was obsolete.

Yet Erasmus’s vision of a united Europe in which people of differing beliefs share a common citizenship would live on, providing an intellectual haven amid the eruptions of nationalism, xenophobia, racism, and nihilistic violence that periodically ravaged the continent. Despite its snobbism and elitism, Erasmian humanism offered an alternative to the apocalypse.

Luther underwent his own reverses. When, in 1524–1525, the German peasants—inspired in part by his writings—rose up against their spiritual and secular overlords, Luther, fearing anarchy, denounced them as mad dogs who deserved to be stabbed, smitten, and slain. With that, the common man turned irrevocably against Luther. A wrenching dispute over whether the body of Christ is present in the bread of communion led to an irreparable breach with the Swiss branch of the Reformation. And Luther’s uncompromising insistence on the rectitude of his own beliefs alienated many moderates, and not just Catholic ones.

By the time of his death, in 1546, Luther had become an isolated reactionary, his work eclipsed by a younger and more dynamic reformer, John Calvin. Even so, Luther would go down in history as the founder of Protestantism, the man who broke the spiritual stranglehold of the Roman Church. Luther’s brand of Bible-based ardor founded on pure faith would exercise a profound influence on Western culture, not least in America.

from Wiki:

~ “Free will does not exist,” Luther argued in De Servo Arbitrio, his reply to Erasmus (translated into German by Justus Jonas in 1526), because sin makes human beings completely incapable of bringing themselves to God. Noting Luther's criticism of the Catholic Church, Erasmus described him as "a mighty trumpet of gospel truth" while agreeing, "It is clear that many of the reforms for which Luther calls are urgently needed.” He had great respect for Luther, and Luther spoke with admiration of Erasmus's superior learning.

[This mutual admiration didn’t last.] In a letter to Nikolaus von Amsdorf, Luther objected to Erasmus’ Catechism and called Erasmus a "viper," "liar," and "the very mouth and organ of Satan.”

Monks opposed to the Reformation accused Erasmus of having:

    “prepared the way and was responsible for Martin Luther. Erasmus, they said, had laid the egg, and Luther had hatched it. Erasmus wittily dismissed the charge, claiming that Luther had hatched a different bird entirely.” ~


Interesting to learn that the Thirty Years’ War was the worst until WW1 — and that toward the end of his life Luther had become an "isolated reactionary." Apparently he was also an ardent believer that the end of the world was “at hand.” (I still admire Luther’s courage, but he was a very flawed human being.)

 “Sin away,” he allegedly said, confident that faith alone (the correct version of the faith, that is) was sufficient for entry to paradise. I can see how this would relieve the stress of trying to “earn” heaven by being perfectly good. At the same time, sola fide can imply a dangerous extreme of taking no responsibility. 

If our deeds, good or bad, don’t count — on our own, we deserve only hellfire, and besides, predestination rules — then so much for the much quoted — and wise — verse, “Without works, faith is dead.” That also happens to be the wisdom of Judaism: the important thing is not our beliefs, but our actions. 

In the end, whether or not we believe in the perpetual virginity of Mary matters very little. And yet wars were fought over imaginary problems of this sort. Oh human folly indeed! Oh gentle Erasmus, too civilized and broad-minded for your times — and alas, even for our own.


~ “It was subtle of God to learn Greek when he wished to become an author, and not to learn it better.” ~ Friedrich Nietzsche on the New Testament

~ “[The latest translator, David Bentley Hart] clearly agrees with Nietzsche on the quality of the book’s koine Greek. He finds the Gospel of Matthew “rarely better than ponderous,” that of Mark “awkwardly written throughout,” and that of John “syntactically almost childish,” while Paul’s letters are “maladroit, broken, or impenetrable” and Revelation is “almost unremittingly atrocious.” Sometimes he does convey the original’s sheer goofiness: “Fallen, fallen, Babylon the Great who has given all the gentiles to drink from the wine of the vehemence of her whoring” (Rev. 14:8). No wonder Hunter Thompson said he did not have to worry about running out of LSD in a hotel. He could trip on the Gideon Bible’s Revelation.” ~

But the latest translator runs into a serious problem. “Hart claims that he will not let his own theological views color his translation. But he clearly does not believe in hell — at least not in a permanent hell . . . Rather than rely on . . . common sense, he labors to oust hell from the text of the Bible.”

And this leads to all kinds of awkwardness, including this:

~ “When Jesus at the Second Coming (Matt. 25:46) divides the damned from the saved, [Hart] says, “These will go to the chastening of that Age, but the just to the life of that Age” (for KJV “everlasting fire . . . life eternal”).

The devil, without a hell to tend, is demoted by Hart to “the Slanderer.” . . . Thus we get at Matthew 25:41: “Go from me, you execrable ones, into the fire of the Age prepared for the Slanderer and his angels” (for Tyndale and KJV, “Depart from me, ye cursed, into everlasting fire which is prepared for the devil and his angels”).” ~

The lead image for this article in the New York Review of Books is the notorious painting of the Madonna spanking Baby J by Max Ernst:

Note the halo dropping to the floor.

Finally, I can’t resist sharing an example of Paul’s traffic-jam sentences as translated by Hart:

“But because of false prophets secretly brought in, who stole in so as to spy upon freedom, which we have in the Anointed One Jesus, so that they could enslave us — to whom we did not yield in subordination for even an hour, so that the truth of the good tidings might remain with you and from those who were esteemed as something — precisely what sort of something at that time does not matter to me (God does not take a man at his face) — for to me these estimable men had nothing to add; rather, to the contrary, seeing that I have been entrusted with the good tidings for those of the foreskin, just as Peter for a mission for those of the circumcision — for he who was operating in Peter for a mission to those of the circumcision was also operating in me for the gentiles — and, recognizing the grace given to me, James and Cephas and John — who appeared to be the pillars — gave their hands in fellowship to me and to bar-Nabas, that we should go to the gentiles and they to the circumcision, if only we should remember the poor — the very thing, indeed, that I was eager to do (Gal. 2:4-13)”

“If only we should remember the poor” — then indeed we can gloss over gentiles and genitals. 

What is more important, however, is the reviewer’s conclusion:

~ “Fresh translations of familiar texts are useful because they make us reexamine what we thought we knew. Hart has certainly made me think more deeply about the centrality of the world’s end to the entirety of the New Testament. (. . . ) Every aspect of the New Testament should be read in light of this “good news” that the world will shortly be wiped out.” ~

Domenichino: The Ecstasy of St Paul


Poor Paul! The reviewer ascribes his “traffic-jam sentences” to Paul’s being in a hurry — Paul really believed that the end of the world was imminent. I suspect that it’s rather that Paul’s growing blindness (related to his epilepsy) forced him to dictate, so we get a transcript of speech rather than true writing.

This review is important because, among other things, it brings up the issue that’s typically overlooked when discussing translations: the translator’s own beliefs have a way of influencing the translation. Specifically, the reviewer shows that Hart’s non-belief in hell makes a ludicrous mess of his attempts to deal with passages that mention hell by twisting words so as to erase hell from the text.

Obviously a translator is wrong to twist a text so that it seems to support his beliefs — though I agree that as long as hell (and thus the need to be “saved” from it) is the foundation of Christianity, then Christianity is not a “religion of love.” Who knew that a translator's word choice could have such huge implications . . . Not that this latest translation will have any impact. For one thing, it's too late for true impact.

Another thing that occurred to me is that, if it’s the actual word of god (or even if it’s just “divinely inspired”), we’d expect the bible to be stylistically exquisite. And the King James Version has been praised as such. But as I think back to the words I so often heard in Polish, they strike me as awkward and old-fashioned rather than beautiful. My guess is that the KJV translation is a lucky exception. Also, there exists an allegedly more accurate translation of the Hebrew Bible by Robert Alter — and the text is supposed to be more “forceful,” but also dry and largely devoid of beauty.

Since there can be no “objective” translation, devoid of the translator’s biases and theology, perhaps, at least when it comes to “holy” scriptures, we should stop trying?

One last thing: Bart Ehrman (and possibly Neil Carter too) said that it’s precisely reading the New Testament in the original Greek that leads some seminarians to leave the faith. The usual reason given for this is that these more critical seminarians see that a particular term could be translated in several different ways, so we can’t be sure what the text really means. What they were sure about suddenly becomes fuzzy at best. Likewise, the contradictions between the different gospels become more blatant. But I suspect that without the seductive beauty of the KJV, the text itself disappoints just too many times.

(Ehrman: “It often proves difficult enough to establish what the words of the NT mean; the fact that in some instances we don't know what the words actually were does more than a little to exacerbate the problem. I say that many interpreters would like to ignore this reality; but perhaps that isn't strong enough. In point of fact, many interpreters, possibly most, do ignore it, pretending that the textual basis of the Christian scriptures is secure, when unhappily, it is not.”)

(A shameless digression: I remember being shaken when I learned that Homer describes Penelope’s hand as “thick” — pachos. I recognized the root in pachyderm, and “thick” indeed fits better than “strong” [presumably because it’s muscular] or any other word. All of Homer’s translators are desperate to soften the original — not one dares say simply “thick.” No, translation is not neutral.)


“Things got so bad we couldn't lower our standards fast enough.” ~ Carrie Fisher, Wishful Drinking



~ “Unrefined extra virgin olive oil, a chief component of the Mediterranean diet, has been given significant credit for the diet’s health-promoting ability, especially with its rich polyphenol content.

Today, substantial new findings further validate extra virgin olive oil’s benefits for cardiovascular, bone, and brain health. Several of these studies were large-scale clinical trials on humans.

One study in particular caught mainstream media attention. This study, with nearly 19,000 participants, showed that those who consumed the highest quality foods, and who most closely adhered to a true Mediterranean diet, were the ones who were most likely to derive the benefits, including sharp reductions in coronary heart disease and stroke.

The high oleic acid (monounsaturated fat) content of olives was initially thought to be the main source of olive oil’s health benefits. Today, more researchers contend that the health benefits stem from olive oil’s high polyphenol content, which includes oleuropein, tyrosol, and hydroxytyrosol.

Increasing evidence suggests that the polyphenol hydroxytyrosol should be given the most credit as it makes up approximately 50% of extra virgin olive oil’s total polyphenol content.

In 2017, the American Journal of Clinical Nutrition published a study that evaluated the effects of hydroxytyrosol on a cohort of 1,851 men and women.

To be eligible for this study all participants had to be at high-risk for cardiovascular disease.

The participants averaged age 67 and had either type II diabetes or at least three or more major risk factors: smoking, hypertension, dyslipidemia, overweight/obesity, or a family history of premature cardiovascular disease.

The subjects were divided randomly into one of three different intervention groups:

    Group 1: Traditional Mediterranean diet supplemented with extra virgin olive oil.
    Group 2: Traditional Mediterranean diet supplemented with nuts.
    Group 3: Control, low-fat diet.

To measure hydroxytyrosol ingestion, urinary levels of its metabolite (homovanillyl alcohol) were measured.

Results showed that higher urinary levels of homovanillyl alcohol resulted in sharply lower risks of cardiovascular events and mortality.

Compared to those in the lowest quintile, individuals in the third or higher quintile levels of homovanillyl alcohol had at least 56% reduced risk of a cardiovascular event (heart attack, stroke, or death from cardiovascular cause). In addition, subjects in the highest quintile of homovanillyl alcohol had, on average, 9.5 years longer life after the age of 65.

The highest urinary levels of homovanillyl alcohol were obtained by the subjects whose intervention included a traditional Mediterranean diet with the addition of extra virgin olive oil.

Research on olive oil has been primarily focused on its cardiovascular support. However, a growing amount of research in the last decade has shown that it also reduces the risk of Alzheimer’s disease.

In a revealing animal study, researchers examined the effects of extra virgin olive oil on mice genetically prone to develop neurodegenerative changes typical of Alzheimer’s disease, such as amyloid plaques. These six-month-old mice were divided into two groups, one fed a standard diet, and the other group the same standard diet plus extra virgin olive oil. After six months, they found significant differences in their behavior and neuropathology. The researchers concluded that extra virgin olive oil exerted a beneficial effect on all major characteristics of Alzheimer’s disease, including behavior and neuropathology.

To test their neuropathology, all major biomarkers for Alzheimer’s disease (beta-amyloid, tau proteins, and synaptophysin) were recorded.

Beta-amyloid and tau are deleterious proteins that, through many mechanisms, cause cellular dysfunction and death. Synaptophysin is a protein marker of synaptic integrity.

The research found that in the mice fed extra virgin olive oil, there was a significant decrease in these deleterious proteins and an increase in the beneficial synaptophysin.

The researchers credited these beneficial effects to extra virgin olive oil’s ability to increase autophagy.

Autophagy is how cells rid themselves of debris that interferes with normal healthy cellular function.” ~

ending on beauty:

My song is snow in March,
in May. My song is eighty degrees
the next day. My song is, I couldn’t decide

what to get you, so here, everything.

~ Chen Chen
