Saturday, August 29, 2015



In high school I kept
a diary in English,
so if the teacher
caught me, and she did,
she wouldn’t understand. 
I had a small vocabulary
and even less to say.
“The weather is getting warm,”
I confessed in a foreign language.

My first class in Los Angeles,
on June evenings, 
in the palm-plumed dusk,
was a typing course. 
For rhythm, the instructor played
“The Yellow Rose of Texas”
above the cross-fire
of night students pounding
on the jamming keys.
I machined a sinister idiom:
Dear Sir: Due to circumstances
beyond our control —

College was a subordinate clause.
I bartered my youth
for footnotes to Plato.
I was a mouse in the auditorium,
scribbling neat, useless notes.
One time I graded three hundred
freshman papers on the death penalty.
I didn’t want to graduate.
Life was penalty enough.

To survive I had to learn
a third language,
an on-off code in the brain
it takes nightmares to crack:
words husked from the grain of things,
Adamic names that fit
animals like their own pelts —
fluent as flowers,
rare as rubies,
occult atoms in lattice of sleep.

To be silent and let it speak.

~ Oriana © 2015


Castle, Lower Silesia



The similarity between the US and Russia

Oriana: Soon after I arrived, the US seemed to me like the propaganda image of the Soviet Union in fancy magazines like Soviet Life, meant for foreign readers — a paradise of prosperity and pre-hippie wholesomeness — except more Orwellian (advertising being more sophisticated and scientifically tested than political propaganda). 

The prosperity was a surface, I discovered almost immediately — though it was a large surface, endless square miles of suburbia — not a propaganda image, as with the Soviet glossies. What astonished me, though, was that Americans thought theirs was a classless society, while in every city and town the first things I’d see were “good neighborhoods” and “bad neighborhoods”: the quiet tree-lined lanes and well-kept lawns, and half a mile away, black-walled liquor stores and rusty old cars, stained old mattresses rotting away in the weedy yards. That was beyond any poverty I witnessed in Poland. This kind was sinister, malignant, without hollyhocks and sunflowers planted in front of the humble windows.

Populism was like the “proletariat” raised to the second power. The cowboy instead of Europe's aristocratic ideal, and the "new Soviet man." The cult of the pioneer. Dostoyevski thought that Russia's destiny lay in Asia, the East/Siberia being like the American West. Lenin, Trotsky, Stalin — they all actually had a lot of admiration for the US as a role model. They wanted to be like the US, to match its development, its conquest of the vacant continent.


It’s odd that in spite of the official worship of Lenin -- at least given the abundance of portraits on the walls and carried in parades -- in school we were never told that he was warm and charming. And that's a repeated theme not only in Hammer's account: the warm smile, the jovial handshake, his ability to charm anyone, a peasant or an intellectual, have been noted by others as well, including H.G. Wells. 

Hammer: “Lenin rose from his desk and came to meet us at the door. He was smaller than I had expected — a stocky little man about five feet three, with a large, dome-shaped head and auburn beard, wearing a dark gray sack suit, white soft collar and black tie. His eyes twinkled with friendly warmth as he shook hands and led me to a leather-cushioned chair beside his big flat desk. We sat so close that our knees almost touched.

The room was very small and unpretentious, full of books, magazines and newspapers in half a dozen languages.  . . .    

During the hour or more our conversation lasted, I was completely absorbed by Lenin’s personality. His powers of concentration were enormous. When he talked to you, he made you feel you were the most important person in his life. He had a way of holding his face close to yours, his left eye squinting but his right eye transfixing you as if it were trying to pierce your innermost soul. By the time we were through, I felt embraced, enveloped, as if I could trust him completely.

. . . Our two countries, the United States and Russia, Lenin explained, were complementary. Russia was a backward land with enormous treasures in the form of undeveloped resources. The United States could find here raw materials and a market for machines, and later for manufactured goods. Above all, Russia needed American technology and methods, American machines, engineers and instructors. Lenin picked up a copy of Scientific American.

“Look here,” he said, running rapidly through the pages, “this is what your people have done. This is what progress means: buildings, inventions, machines, development of mechanical aids to human hands. Russia today is like your country was during the pioneer stage. We need the knowledge and spirit that has made America what she is today.”

[they proceed to discuss a trade exchange between Russia and the West]

“In looking back over the years at this memorable interview, I have tried my hardest to recollect the most striking feature of it all. I think it is this — that before entering Lenin’s room I had been so greatly impressed by the terrific veneration which he aroused among his followers that I somehow expected to meet a superman, a strange and terrible figure, aloof and distant from mankind.

Instead it was just the opposite. To talk with Lenin was like talking with a trusted friend, a friend who understood. His infectious smile and colloquial speech, his sincerity and natural ways, put me completely at ease.

Lenin had been called ruthless and fanatical, cruel and cold. I refuse to believe it. It was his intense human sympathy, his warm personal magnetism and utter lack of self-assertion or self-interest, that made him great and enabled him successfully to hold together and produce the best from the strong and conflicting wills of his associates.”


Hammer on FDR:

“Nobody could equal the speed of his mind, the warmth of his character and the charm of his personality. Add to this an unmatched capacity for decisive executive action. FDR was the consummate “can-do” president.

I am often asked to compare Lenin with FDR. They shared many qualities — not the least of which was that they were both approachable, unintimidating men who did not stand for a moment on the dignity of their office. In the presence of both men I felt the same captivating excitement and the same alertness. No half-baked idea, no vagueness could be risked with either man: they would spot it in a flash and dismiss it.

Both had a strong sense of humor, though Lenin did not laugh as much as FDR, who loved nothing more than a joke and who was always on the lookout for the ridiculous and the absurd; but then Lenin did not have much to laugh about. The condition of the Soviet Union and the colossal strains of his work did not leave much room for humor.

Like FDR, Lenin was capable of dazzling intellectual flexibility. People I meet today, especially journalists who interview me, are astonished to hear that Lenin told me, in effect, that Communism was not working and that the Revolution needed American capital and technical aid.

The world has largely forgotten how deeply the Bolsheviks admired the industrial achievements of America and how common it was to hear them say that they wanted to make Russia a “socialist United States.” Lenin was greatly responsible for promoting this attitude, and it is not appreciated how pragmatic and realistic was the cast of his mind. Perhaps I can best convey this by saying that my own father — who in many ways typified the idealistic Communist sympathizer of those times — was far more romantic in his socialism than Lenin. It remains one of the ironies of my life that I found the father of world Communism to be less pure as a Communist and more pragmatic than my own father. (pp. 119-120, “Hammer”)


The murderer kills because he seeks “justice.” ~ Steven Pinker (in a lecture on the culture of honor versus the culture of dignity)

Humans are moralizing animals, with a curious need to pass judgment on others and see that they get punished. Pinker studied the causes of violence, including the most common motive for homicide. According to police records, it’s not material gain; it’s “justice.” The killer is carrying out capital punishment; his victim deserves to die for this or that reason. Likewise, wars tend to be justified using the language of moral principles. Pinker suggests we need to think of morality less in terms of blame and punishment and more in terms of minimizing harm and maximizing flourishing.

It seems to me that when progressives speak of justice, it’s likely to mean human rights, equal opportunity, equal pay, etc. When conservatives speak of justice, they mean punishment, vengeance. It’s not an absolute difference, but a tendency.



“The story of Barabbas the criminal, whom Pilate offers to kill instead of Jesus, is predicated on the supposed Jewish custom of releasing a prisoner at Passover . . . BUT THERE WAS NO SUCH JEWISH CUSTOM. In fact, it flies in the face of deeply held Jewish (and, for that matter, Roman) beliefs about justice.” ~ Joel M. Hoffman, The Bible’s Cutting Room Floor, 2014


“Though we expect to find copying mistakes and other variations in our current versions of Josephus’s writings, we don’t in general suspect that the message of his texts was purposely altered in any significant way, except concerning the pivotal Jesus, James, and John the Baptist. And here nearly every conceivable position has been proposed by scholars, including that Josephus didn’t mention any of these people, but later Christians added texts about them; that Josephus converted to Christianity, but his texts were changed to hide his belief in Jesus Christ; and that we have nearly perfect copies of what Josephus wrote.

What we can do is work from the preponderance of the scholarly evidence. Fortunately, that points us in a relatively clear direction. The passages about James and John are mostly authentic. The passage about Jesus is not. . . . No one seems to have been aware of this particular passage until the fourth century.” ~ Joel Hoffman, The Bible’s Cutting Room Floor

God is dead; but given the way of men, there may still be caves for thousands of years in which his shadow will be shown. — And we — we still have to vanquish his shadow, too. ~ Nietzsche, The Gay Science

Like billions of other people, I don't believe in the existence of Zeus. That's why I don't call myself an agnostic. If you can be sure that Zeus doesn't exist, you can be sure that the Christian god doesn't exist. Basically that was the moment of insight that made me an atheist at 14. That was it: I understood that Christianity was just another mythology.

But my atheism is also a very deep intuitive response. The emotional certainty is really primary, since intellectually I could at least weakly defend agnosticism. The emotional certainty hit me first in connection with the absence of the afterlife, because who wouldn't want some kind of afterlife to exist? And I do understand the longing for a "real god." One who would be a companion, who'd understand and guide without judging, like an ideal friend. But there is not even a definition of "it" (I think the pronoun for a "real god" would have to be "it"), much less evidence.

In practical terms, atheism in the West comes down to not believing in the Christian god any more than in Zeus. Mythology is mythology -- a fascinating field of study, by the way. In my teens, when I developed an interest in classical mythology, I had no idea where it would lead . . .



About 50,000 years ago, Homo sapiens developed the capacities for “innovation, planning depth, and abstract and symbolic thought,” as a study published in Current Anthropology puts it. Until recently, not much was known about why our species veered toward more sophisticated sensibilities.

A group of anthropologists and biologists at Duke University had a theory: It’s because our skulls changed shape. This would have led to, as their study argues, a “change in average human temperament toward a less aggressive, more socially tolerant individual.”

To test their hypothesis, the team measured more than 1,400 skulls—1,367 modern ones from 30 ethnicities; 41 from between 10,000 and 38,000 years ago; and 13 ancient ones from more than 80,000 years ago—paying special attention to the brow ridge, face shape, and endocranial volume. “The study was motivated,” the researchers say, “by us trying to find a biological explanation—with evidence—of what could explain the huge explosion of culture around 50,000 years ago.”

After taking stock of their painstaking measurements, the researchers were surprised by how well their data supported their hypothesis. They found that there had indeed been a structural change in the human cranium—specifically, our brow ridges shrank and the upper parts of our faces got shorter. It happened in the late Pleistocene, and the shift indicated a lowered level of testosterone acting on the skeleton. This “feminization” of our heads made us less violent and more genteel.

The researchers think that sexual selection could have been what feminized our skulls.


A new MRI study at University College London indicates that the secret to happiness is low expectations. Author and neuroscientist Robb Rutledge says, “Happiness depends not on how well things are going but whether things are going better or worse than expected.”

This rings very true in my experience. I once expected to make it big, and when I didn’t, I eventually got over that expectation, and have been much happier ever since. Every little success these days is a surprise and delight.

Like happiness, compassion is always in part a function of lowered expectations. We’re happier to accept other people’s difficult behaviors when we expect less from them.

It’s all about managing the “aspirational gap,” the gap between what is and what could be, what you have and what you expect. It’s all about expectation management.

As my sole comment on this, let me repeat the second paragraph:

“This rings very true in my experience. I once expected to make it big [Oriana: in my case, gain a national recognition as a poet], and when I didn’t, I eventually got over that expectation, and have been much happier ever since. Every little success these days is a surprise and delight.”

And I’d add: THINK SMALL. Do less, and do it slowly, easily. Take small steps. Lie down.



I’ve been using Xylitol for two years now. Aside from plain dextrose with its nice quick energy boost to the brain, Xylitol is my favorite sweetener. The only caveat: as with everything, MODERATION.

1. XYLITOL HAS ONLY A NEGLIGIBLE IMPACT ON BLOOD SUGAR AND INSULIN LEVELS. This means that unlike sugar, there are no highs and lows: no roller coaster for either your energy or your mood, and no subsequent cravings for more sweets and carbohydrates. No adrenal fatigue, no weight gain, no increase in cortisol levels. In fact, xylitol can help keep you hormonally balanced through its insulin stabilization factors. And as I learned in my training with endocrinologist Dr. Diana Schwarzbein, healthy insulin response is essential to healthy aging and healthy hormones, as well as affecting cholesterol levels, the incidence of Type II diabetes, high blood pressure, and much more.

2. Tooth & Gum Health XYLITOL ALKALINIZES THE MOUTH. It not only reduces bacterial growth but actually inhibits and interferes with the development of plaque and of harmful bacterial strains such as strep. The Journal of the American Dental Association said “Xylitol is an effective preventive agent against dental caries… Consumption of xylitol-containing chewing gum has been demonstrated to reduce caries in Finnish teenagers by 30-60%. Studies conducted in Canada, Thailand, Polynesia and Belize have shown similar results…” A study conducted at Harvard School of Dental Medicine concluded that “Xylitol can significantly decrease the incidence of dental caries” — which is why more and more dentists are recommending it, in toothpastes, gums and candies. There is some indication that xylitol may work against biofilm, which would also be advantageous in the mouth. Sugar, of course, increases the acidity of the mouth and the body as a whole, as well as bacterial growth and the incidence of cavities.

3. Alkalinity XYLITOL IS ALKALINIZING, MAKING US LESS HOSPITABLE TO HARMFUL BACTERIA, VIRUSES, AND FUNGI.  Keeping the body alkaline makes it easier and more likely for you to stay healthy and balanced in every way.  Sugar, in contrast, creates an acidic environment, feeding destructive microbes and weakening the immune system.

4. Bone Health ANIMAL STUDIES SUGGEST THAT REGULAR CONSUMPTION OF XYLITOL CAN IMPROVE BONE STRENGTH DURING AGING, probably because of the increased consumption of calcium, as well as the alkalinizing effect. The more acidic your system, the more the body will leach calcium from bones and teeth to re-balance itself.

5. Yeast/Candida Xylitol is the only sugar that does not feed yeast. In fact, it contributes to its destruction.  This means it is not only safe for those grappling with candida, it is actually beneficial.  This is not true of any of the other sugars or sugar alcohols, including sorbitol, mannitol, maltitol, erythritol, as well as fructose, honey, maple syrup, agave, malt, molasses, coconut sugar, etc.


Never again will I kneel in my small country, by a river,
So that what is stone in me could be dissolved,
So that nothing would remain but my tears, tears.

~ Milosz, “From the Rising of the Sun”

Photo: Edward Byrne

Sunday, August 23, 2015


Ludovico Mazzolino, God the Father


But the woman was growing restless.
You see, there was no
narrative. No verbs. Paradise

is all description. No subjunctive
sighs or regrets,
no frolicking future tense.

When time like a ruddy fruit
hung from the branches of the galaxy,
I told the woman the truth

with the two-way tongue of a snake.
All those fluent ribs,
opalescent scales — it was Me

undulating in the subtle serpent.
Oh let them be as gods!
I laughed for joy when I saw

the woman bite into the tart flesh.
The multitudes of Me whirled
a wild polka through the nebulas:

At last! At last!
I’ve managed to create
a being that could disobey Me.

Enough hosannas of flowers,
the beaky orange birds.
I did not curse my brave 

children, nor did I strew  
thistles before them in their path.
I blessed them. To the woman

I said, “You are the Tree of Life.”
To the man, “Love her —
she’ll be your strength.”

Yes I knew
suffering would happen.
Yes, because I love stories.

~ Oriana © 2015

I hope that this presents a more pleasant — if still deeply problematic — Mr. Deity (in Polish, children politely refer to god as “Mr. God”).

Problematic, but not out-and-out abusive. (By the way, that’s also a Jungian concept: god as trauma.) Not the god of "Acts of God" (actually a valid legal term that my insurance, for one, does use -- never mind that to them god is dead).

Note the crown. Because most “holy scriptures” go back to an era of absolute kings, it’s understandable that phrases such as “heavenly king” came into being. Early paintings often show god on the throne. Sometimes it’s a double throne, for the father and the son. And sometimes it’s a double throne for Jesus and Mary. But below is a more traditional presentation, simply god the father on his throne (Germany, late 1400s), with Jesus and Mary clearly subordinate.

(for me the take-away message is that children see god as Superman, but later tend to drop the physical attributes. But I suspect that what people say and what they really believe may be different things — it’s hard to relate to a disembodied non-person.)

How people think about their god is actually of intense interest to psychologists, in particular the human tendency to think about gods in anthropomorphic terms (think of all those pictures of God as a bearded white guy in the sky), in contrast with the ‘theologically correct’ Christian view of god as a disembodied force with few human characteristics.

For example, perhaps people think about god as being a kind of disembodied human mind. Or perhaps they find that difficult. Perhaps if put under time pressure they instinctively think of god as being more like a regular person. Or maybe they think of god as being something like a superhero – a regular person but with a few extraordinary powers (this is the idea that memorable supernatural beings are ‘minimally counter-intuitive’).

Andrew Shtulman (Occidental College, Los Angeles) and Marjaana Lindeman (University of Helsinki) ran three studies with a similar set-up. Basically, they asked a bunch of people whether god has a variety of attributes – things like whether god can know things, make plans, be happy, or see things – as well as body-related things like breathing, exerting force, having a brain, etc.

What they found was that people tended to attribute psychological properties to god over physical properties. If you come from a Christian background, you might find that unsurprising.

What was surprising, however, is that they conducted one of the studies with Hindus in India. Hinduism embraces anthropomorphism much more than the Judaeo-Christian religions, but they found that Hindus were only slightly more likely to attribute physical properties to god – and, like the Finns and Americans in their study, they were much more likely to give their god psychological properties.

Strangely enough, they found that religious people were more likely than the non-religious to attribute both physical and psychological properties to god. It seems that being religious is not a guarantee of being more theologically correct!

They found similar results after putting people under time pressure (asking their subjects to give an answer as quickly as possible). The results clearly showed that people in both the USA and India find it easier to attribute psychological properties to god than physical ones. They quickly rejected most physical properties and, on the occasions when they accepted them, it took longer to make that decision.

Put together, these results show that people don’t tend to think of gods in a ‘minimally counter-intuitive’ way. That’s interesting because it’s been shown that children do think about gods in this way. It seems to be something we are educated out of as we grow up.

But the results also show that people don’t have different instinctive and deliberative views of god. It’s not that they instinctively think in anthropomorphic terms, only to reject it on sober reflection.

And that’s odd, because people definitely do think about gods in anthropomorphic ways that are fundamentally incompatible with theologically correct ideas. For example, people frequently talk about god as if he had limited knowledge, or couldn’t be everywhere at once.

Shtulman and Lindeman explain it like this:

“The logical inconsistency between (a) claiming that God is omniscient and (b) imposing limitations on God’s knowledge in a story-recall task may not be obvious to most people even at an explicit level. Barrett and Keil assumed that people could only make such a mistake if they held representationally distinct God concepts activated in cognitively distinct tasks, but our data suggest that many people are psychologically content to attribute logically incompatible properties.”


Human, all too human . . . I should have suspected that Hindus don't really see Ganesha as having the head of an elephant.
“My idea of deity is a great, luminous, oblong blur” ~ an American evangelist in the nineteen thirties, quoted by Robert Hughes in “The Shock of the New”

“Oblong” still points to its origin in the human body, but already shows a tendency to see god in more abstract and cosmic terms.

The familiar image in Raphael’s Transfiguration (c. 1520), where Jesus is presumably already in his radiant form, given that Elijah and Moses can be seen, seems awkward and childish (and strangely heavy-thighed). An oblong blur would be more dignified, and not so out of keeping with the modern world.

Imagine such a figure appearing at Home Depot. He would not be seen as a home owner, and thus would be ignored by the staff.

Raphael, Transfiguration, c. 1520


Then there is the Abrahamic religions’ invariable assumption that humans are evil by nature. After a lecture by a “progressive” rabbi I saw that this applied to “progressive” Judaism as well — why else would humans need a god? To be the “eye in the sky” and keep them behaving, since humans are evil. This condemnation of human nature as innately depraved is most extreme in Christianity. Humans are so evil that it requires the blood of Jesus to make them sufficiently “pure” to pass through the Pearly Gates.
True, there is the motto: “Hate the sin but love the sinner.” I’ve been combing my mind for memories of anyone who hated the sin but loved the sinner, and coming up blank. If we hate unreliability, for instance, we’ll at least dislike those who can’t be relied on to be there when they promised to show up, who don’t call back, who don’t do what they said they’d do. I don’t go out of my way to show my displeasure, but the possibility of a true closeness doesn’t exist. Life has taught me that unreliable people are chaos-makers and time-wasters.

Can psychology replace religion? I think the so-called “positive psychology” has a lot of potential that way. It tries to provide an affirmative, viable philosophy of life. It focuses on a person’s strengths, not flaws. It studies happiness, not pathology; optimal function, not dysfunction. Its founding father was Abraham Maslow (and, going farther back, I think Alfred Adler, with his emphasis on the pursuit of “mastery” rather than sex). It’s odd that positive psychology is still relatively obscure. My guess is that people conflate it with positive thinking and self-help books, and thus it’s not taken seriously.

In a paper titled “Untangling What Makes Cities Livable: Happiness in Five Cities,” Abraham Goldberg, a professor at University of South Carolina Upstate, and his team conducted a statistical analysis of happiness in New York City, London, Paris, Toronto, and Berlin. They analyzed earlier Gallup happiness surveys and collected their own data, and found that people’s happiness was coming from an unexpected place.

The usual markers of happiness are colloquially known as the “Big Seven”: wealth (especially compared to those around you), family relationships, career, friends, health, freedom, and personal values, as outlined by London School of Economics professor Richard Layard in Happiness: Lessons from a New Science. According to the Goldberg study, however, what makes people happiest isn’t even in the Big Seven. Instead, happiness is most easily attained by living in an aesthetically beautiful city. The things people were constantly surrounded by—lovely architecture, history, green spaces, cobblestone streets—had the greatest effect on their happiness. The cumulative positive effects of daily beauty worked subtly but strongly.

The times that people recorded the highest levels of happiness and life satisfaction were during sexually intimate moments (on a date, kissing, or having sex) and during exercise (when endorphins are being released).

But the next three types of moments where people recorded the highest levels of happiness were all related to beauty: when at the theater, ballet, or a concert; at a museum or an art exhibit; and while doing an artistic activity (e.g. painting, fiction writing, sewing).

In The Architecture of Happiness, Alain de Botton weighs the feeling of walking into an “ugly” McDonalds in the Westminster area of London compared to the feeling of entering the “beautiful” Westminster Cathedral across the street. He says that because of the harsh lighting, the plastic furniture, and the cacophonous color scheme (all those bright yellows and reds), one tends to feel immediately “anxious” in the McDonalds.

What one feels in the Westminster Cathedral, however, is a calmness brought on by a series of architectural and artistic decisions: the muted colors (grays and bleak reds), the romantic yellow lighting that bursts out onto Victoria Street, the intricate mosaics, and the vaulted ceilings. Although the Westminster Cathedral has the same principal elements of architecture as the McDonald’s—windows, doors, floors, ceilings, and seats—the cathedral helps people to relax and reflect, while the fast food restaurant causes one to feel stressed and hurried.

It seems humans appreciate beauty partly because it conjures the feelings we tend to associate with happiness: calmness, a connection to history or the divine, wealth, time for reflection and appreciation, and, perhaps surprisingly, hope.

People's physical beauty can help with dating, and often it spells a path to economic success. But the beauty around us—the sky-high nave of Westminster Cathedral, the ability to appreciate a simple lunch—offers hope that life can inch closer to perfection.

“So long as we find anything beautiful, we feel that we have not yet exhausted what [life] has to offer,” writes Nehamas. “That forward-looking element is … inseparable from the judgment of beauty.”

Beauty often starts with something small. For the participants in the Goldberg study, it is about the appearance of a city; in the Monet painting, it is the appreciation of eating in the countryside; for Plato and many other philosophers, beauty is about achieving knowledge. But just because beauty can begin with the appreciation of colors, cuisine, and colonnades does not make it a superficial pursuit. As the 18th-century French writer Stendhal wrote, “Beauty is the promise of happiness.”

For me beauty is far more than the “promise” of happiness; it’s happiness itself. I am made happy by beauty, and unhappy by ugliness.

Actually, the study confirmed that for most people, Freud was right: happiness is sex. But after that (and, surprisingly, after exercise) comes living in a beautiful place and enjoying beauty in various forms. That’s why, all along the California coast, crowds gather near the beach to watch the sunset. Sometimes I think if I were paralyzed and unable to write, life would still be worth living — if I could still watch another Pacific sunset.

It matters!

“Imagine for a minute that you were at a coffee shop and were offered the option of being served coffee in either a lovely porcelain cup or in a not-so-lovely plastic cup. Which cup should you pick?

According to Aradhna, when we drink coffee—or for that matter, when we eat or drink anything else—we taste it not just with our taste-buds, but also with our other senses. The sense of smell, as most people know, is inextricably intertwined with the sense of taste. (Without being able to smell, for example, some people claim that we cannot distinguish between potato and apple. I’ve never checked this myself, but after seeing this video—particularly the latter half—I have a good mind to try it out on my kids!) Indeed, Aradhna argues, it is not just the sense of smell that is intertwined with taste; virtually all the other senses, including touch, sound, and sight, are too, which is why the texture of chips (soft vs. crisp), the sound they make when we bite them (their “crunchiness”), as well as their color (golden yellow vs. white or brown), can all significantly affect our enjoyment of them.

According to Aradhna, the reason all of our senses matter is because all sensory inputs are ultimately combined into one overall evaluation in the part of our brain called the orbitofrontal cortex. In other words, we literally cannot distinguish the extent to which different sensory inputs contributed to our overall enjoyment of food. This may be one reason why people’s brains light up more—meaning there is evidence at the neurological level that people derive greater enjoyment—when they taste the same wine from a bottle that they think is more (vs. less) expensive.

A question that follows from the perspective of someone who wishes to maximize their pleasure from drinking a cup of coffee, then, is: should one attach importance to the cup? Or, put in more general terms, does “packaging” matter? Does the cover of a book matter for enjoying its content? Does a person’s physical attractiveness matter for enjoying their company?

The answer, according to Aradhna, would be a resounding “Yes!” While she points to one reason why we enjoy something more when it is presented in a more pleasing manner—namely, that our brain combines all sensory inputs into one overall evaluation—findings from yet another stream of research, on “halo effects,” reveal another reason why superficialities matter. Halo effect findings reveal that, when something is more pleasing to our senses, we impute a whole bunch of other positive qualities to it. Thus, for example, a good-looking person is thought to be more intelligent, competent, and warm, which is why attractive people earn more money than their less-attractive counterparts. Halo effects seem to apply, within some limits, to inanimate stimuli as well, which is why we enjoy a shopping environment more when it looks and smells good.

In sum, presentation matters. The cup from which we drink matters, perhaps not as much as the coffee itself, but it can certainly add significantly to, or detract significantly from, our enjoyment of the coffee. Likewise, it stands to reason that we enjoy a book more when its cover is better-designed and a hotel room more when it is more put-together, etc.”

Sensory pleasure really IS important for our health and well-being, both emotional and physical (there is really no separating the two). The beauty of our surroundings — and presumably of the objects we use — is in fact important for happiness. When it comes to matters like housing, it can be critically important. The so-called superficialities are not entirely superficial: we do judge a book by its cover, among other things. A beautiful cover will attract readers. Oscar Wilde even said that beauty is more profound than truth.

“The notion that happiness is actually attainable belongs to the second half of the 18th century, as Freud pointed out. Previously there had been a general consensus that no one can be called happy until he carries his happiness down to the grave in peace. Paradiso was strictly for the pages of Dante. In Greenland, for example, the Greenlanders bought into Christianity on account of its persuasive description of pain and suffering. The vale of tears was real. And then Captain James Cook, and his French counterpart, Louis-Antoine de Bougainville, embarked upon their great voyages. Bougainville’s Voyage autour du monde (1771) seems to suggest that this journey had less to do with discovery or French imperialism, than the pursuit of happiness. What’s more, Bougainville suggests that happiness was actually found—in Tahiti.

Bougainville stresses two things. First, that the Tahitians live a life of wellbeing, and don’t have to work too hard either. Second, that the women—and to some extent the men too—throw themselves willingly at French sailors, which adds significantly to the happiness of French sailors. There are of course darker strands to the narrative—Bougainville mentions at least one murder, and hints that in fact sexual bliss may actually have been obtained in exchange for a few nails or other useful items. But nevertheless, I think we can say that Bougainville was concerned less with the pursuit of happiness itself, than with the fact it had finally been located and lived out in the southern hemisphere. It was just a question of transporting the south back into the north (as Margaret Mead would ultimately argue, in her 1928 bestseller Coming of Age in Samoa). Captain Cook got there a little bit late in the day but it was his crew who were the first Europeans to witness surfing. Thus “the most supreme pleasure” (as surfing was described by William Anderson, Cook’s surgeon on the Resolution) was just the ticket to “allay all perturbation of mind.”

These travelers' tales of transcendence had a powerful impact on subsequent thinkers. Freud, for one. His theory of the id and the ego transposes the 18th century map of the world, specifically the north/south divide, onto the map of the human psyche. The “southern” id was having all the fun—the pleasure principle—while the more northerly ego was reining in the hedonistic savage self with a good dose of the “reality principle.”

This article argues that equating happiness with hedonism is wrong. Personally, I need meaningful work and beauty. Freud said “love and work” — for me love and beauty and work are all fused. If I have beauty and the freedom to do the work I love, then every cell in my body gets the message I need to keep on living: “You are loved.”


This is a society where it’s very easy to see oneself as a “loser.” I have suffered immensely from self-blame. Eventually I managed to see the broader context and redefine “success” — but only after years and years of needless suffering, including suicidal depression.

“Ron Paul epitomized the spirit of blame in 2011 when he passionately argued in a televised debate that the decision to forego health insurance was a fundamental right of Americans. When the moderator asked him if this would mean that someone without health insurance who was critically injured should die rather than receive government help, audience members could be heard shouting, “Yeah!” Take a risk and succeed, and you are a hero. Take a risk and fail, and you are to blame—even if it costs you your life. Risk and blame are the hallmarks of worthy personhood in contemporary American society.

But the puzzling question is why people who do not benefit from a system of blame—that is, most Americans—cling so fiercely to its creed. Seeking an answer, I spent several years researching the American working class, the very people whose homes are underwater and whose college debt goes unpaid. I witnessed how blame was deployed in everyday life to solve problems—to anchor the self, judge worthiness, grant dignity, and make sense of failures. In short, I learned that blame is a strategy to make certain what is uncertain.

Self-blame is shored up by a multi-million dollar self-help industry. But its true power lies in its promise that we can will ourselves to happy and successful lives, in its ability to make a virtue out of failure, insecurity, and uncertainty. As Kelly, a line cook who has lived on and off in her car, explained, “Life doesn’t owe me any favors. I can have a sense of my own specialness and individuality, but that doesn’t mean that anybody else has to recognize that or help me accomplish my goals.” Those who embrace blame tend to have little empathy for those who cannot pull themselves up by their bootstraps. If I have to go it alone, the logic goes, then everyone else should, too.

As Fried argues, blame is costly, both socially and politically. Blame divides potential communities of solidarity into winners and losers. Even more worrisome, the quest for personal responsibility and the eagerness to blame oneself for failure obscures the larger forces that have weakened our social safety net, our communities, and our families. Doing away with gratuitous blame—directed at others and at ourselves—requires building institutions that restore, carefully and thoughtfully, our collective supply of meaning, trust, and dignity.

A reader’s comment:

Corporations should be required to have human resources departments that offer effective and LEGITIMATE support to employees, and are not just the corporation’s pathetic hiring/firing/chastising back-stabbing arm. As employees feel stronger and are actually trusted by corporations, they will take more initiative and pride in their work . . . you know, like how it used to be in America in the 1950s! The worst thing to happen to the USA is libertarians and people who hate other people.

Another reader:

About welfare leading to unsustainable taxation: My view is that we could afford to pay out a good deal more welfare without so much as noticing the cost. The problem is that the 0.1% and multinational corporations are paying negligible taxes, and receiving enormous amounts of welfare. The government — until recently through the Pentagon, now more through the NIH, because the cutting edge of the economy has shifted from technology to biology — makes enormous investments to produce innovations like the computer and the Internet, and then gives those over to private interests to exploit. This is the socialization of risk and the privatization of profit.



I can't help it, I'm still thinking of Lenin's strange death — most likely poisoned by Stalin, first using a small dose, which didn't do the job but caused an odd, medically unexplained pain in the eyes, and then the massive dose which caused convulsions (stroke doesn't cause convulsions, but many poisons do). There were orders not to do tissue toxicology. Lenin was planning to remove Stalin from power. He also had a plan to democratize (at least to a degree) the Central Committee. Imagine an alternate history . . .

This idea is haunting me again, along with the thought that throughout history, many rulers have been assassinated.


ZUCCHINI, YELLOW CROOKED-NECK SQUASH, AND THE MIRACLE OF PECTIN: healing inflammatory intestinal disorders, protection against diabetes

We tend to think about squashes, both summer and winter, as starchy vegetables. This thinking is correct, since about 85-90% of the total calories in squashes (as a group) come from carbohydrate, and about half of this carbohydrate is starch-like in composition and composed of polysaccharides. But we also tend to think about polysaccharides as stagnant storage forms for starch that cannot do much for us in terms of unique health benefits. Here our thinking is way off target!

Recent research has shown that the polysaccharides in summer squash include an unusual amount of pectin—a specially structured polysaccharide that often includes special chains of D-galacturonic acid called homogalacturonan. It's this unique polysaccharide composition in summer squash that is being linked in repeated animal studies to protection against diabetes and better regulation of insulin. We expect to see future studies on humans confirming these same types of benefits from consumption of summer squash.


Beyond blood-sugar regulation, pectin is credited with a range of other benefits:
~ lowers cholesterol

~ lowers high triglycerides

~ helps prevent colon cancer and prostate cancer

~ promotes stable blood glucose and lowers the risk of diabetes

~ helps relieve gastroesophageal reflux (“heartburn”)

~ alleviates both diarrhea and constipation

~ may help arthritic joints by stimulating the production of synovial fluid

~ may help prevent the formation of gall stones

~ may help lower blood pressure


~ and a nod toward my “years of perdition”


“So did your Monday start with a bang?”
asks Tess, the phlebotomist.
I answer with an underwater stare
since I’m about to faint, having sat

for an hour on an empty stomach
in the waiting room, a parade
of humanity in wheelchairs
and their jumping-jack caretakers

surging to the little window, then back
against the wall. Tess inserts
another vial into the butterfly
needle. My veins are baby-fine

but my blood is dark red.
This worries me: my blood so dark
with the years, but silent about
the shipwreck of my life.

The phlebotomist ponders
my fish stare, pulls out the
needle with a kindly smile:
“I see. So this has been the bang.”

~ Oriana © 2015

Saturday, August 15, 2015



I want to know
the first word,
older than fire
and more necessary —

Was it a cry of warning?
Or a child’s wail for
touch, mother syllable
of familiar heat?

A woman’s god-creating
attempt to name a lover,
a man’s god-shattering
attempt to name himself?

Or was the first word
god, manifest
music of thought,
emptied of frightened flesh?

Was it a yes, a no,
a yes, but – ?
And I want to know
the first lie.


Perhaps there were many
first words, first secrets,
first denials —
gossiping over a carcass

of a woolly mammoth,
complaining (the foremost
marker of culture 

is complaining).


Now we have too many words.
Words instead of children 

sit in our laps. That’s why we 
talk so much about silence.


How moist the newborn
first word must have been,
like the scent
of the earth after rain.

Someone’s grinding acorns,
baking the first bread.
Someone pierces bone,
making the first flute.

water starts flowing
into water,
the grain of stone
closes over stone.

~ Oriana © 2015



“Emile Durkheim is the philosopher who can best help us to understand why Capitalism makes us richer and yet frequently more miserable; even – far too often – suicidal.

Durkheim lived through the immense, rapid transformation of France from a largely traditional agricultural society to an urban, industrial economy. He could see that his country was getting richer, that Capitalism was extraordinarily productive and, in certain ways, liberating. But what particularly struck him, and became the focus of his entire career, were the psychological costs of Capitalism. The economic system might have created an entire new middle class, but it was doing something very peculiar to people’s minds. It was — quite literally — driving them to suicide in ever increasing numbers.

Edouard Manet, The Suicide, 1881

This was the immense insight unveiled in Durkheim’s most important work, Suicide, published in 1897. The book chronicled a remarkable and tragic discovery: that suicide rates seem to shoot up once a nation becomes industrialized and Consumer Capitalism takes hold. Durkheim observed that the suicide rate in the Britain of his day was double that of Italy; but in even richer and more advanced Denmark, it was four times higher than in the UK. Furthermore, suicide rates were much higher amongst the educated than the uneducated; much higher in Protestant than in Catholic countries; and much higher among the middle classes than among the poor.

Durkheim’s focus on suicide was intended to shed light on a more general level of unhappiness and despair at large in society. Suicide was the horrific tip of the iceberg of mental distress created by Capitalism.

1. Individualism

Under Capitalism, it is the individual (rather than the clan, or ‘society’ or the nation) that now chooses everything: what job to take, what religion to follow, who to marry… This ‘individualism’ forces us to be the authors of our own destinies. How our lives pan out becomes a reflection of our unique merits, skills and persistence.

If things go well, we can take all the credit. But if things go badly, it is crueller than ever before, for it means there is no one else to blame. We have to shoulder the full responsibility. We aren’t just unlucky any more, we have chosen and have messed up. Individualism ushers in a disinclination to admit to any sort of role for luck or chance in life. Failure becomes a terrible judgement upon oneself. This is the particular burden of life in modern Capitalism.

2. Excessive expectations

Capitalism raises our hopes. Everyone – with enough effort – can become the boss. Everyone should think big. You are not trapped by the past – Capitalism says – you are free to remake your life. The opportunities grow enormous…as do the possibilities for disappointment.

The cheery, boosterish side of Capitalism attracted Durkheim’s particular ire. In his view, modern societies struggle to admit that life is often quite simply painful and sad. Our tendencies to grief and sorrow are made to look like signs of failure rather than, as should be the case, a fair response to the arduous facts of the human condition.

3. Too much freedom

Capitalism relentlessly undermined social norms. States became more complex, more anonymous and more diverse. People didn’t have so much in common with each other any more.

What kind of career should you have? Where should you live? What kind of holiday should you go on? What is a marriage supposed to be like? How should you bring up children? Under Capitalism, the collective answers get weaker, less specific. There’s a lot of reliance on the phrase: ‘whatever works for you.’ Which sounds friendly but also means that society doesn’t much care what you do and doesn’t feel confident it has good answers to the big questions of your life.

In very confident moments we like to think of ourselves as fully up to the task of reinventing life, or working everything out for ourselves. But, in reality, as Durkheim knew, we are often simply too tired, too busy, too uncertain – and there is nowhere to turn.

4. Atheism

Durkheim was himself an atheist, but he worried that religion had become implausible just as its communal side would have been most necessary to repair the fraying social fabric. Despite its factual errors, Durkheim appreciated the sense of community that religion offered: “Religion gave men a perception of a world beyond this earth where everything would be rectified; this prospect made inequalities less noticeable, it stopped men from feeling aggrieved.”

Durkheim took the dark view that inequality would be very hard to eradicate (perhaps impossible), so we would have to learn, somehow, to live with it. This led him to a warmer appreciation of any ideas that could soften the psychological blows of reality.

Durkheim also saw that religion created deep bonds between people. The king and the peasant worshipped the same God, they prayed in the same building using the same words. They were offered precisely the same sacraments. Riches, status and power were of no direct spiritual value.

Capitalism had nothing to replace this with. Science certainly did not offer the same opportunities for powerful shared experiences. The Periodic Table might well possess transcendent beauty and be a marvel of intellectual elegance – but it couldn’t draw a society together around it.

Durkheim was especially taken with elaborate religious rituals that demand participation and create a strong sense of belonging. A tribe might worship its totem, men might undergo a complex process of initiation. The tragedy – in Durkheim’s eyes – was that we had done away with religion at precisely the time when we most needed its collective consoling dimensions and had nothing much to put in its place.

Monet: The Fair at the Church of Saint-Jacques, Dieppe, 1901

5. Weakening of the nation and of the family

In the 19th century, it had looked, at certain moments, as if the idea of the nation might grow so powerful and intense that it could take up the sense of belonging and shared devotion that once had been supplied by religion. But the excitement of a nation at war had, Durkheim saw, failed to translate into anything very impressive in peacetime.

Family might similarly seem to offer the experience of belonging that we needed. But Durkheim was unconvinced. We do indeed invest hugely in our families, but they are not as stable as we might hope. And they do not provide access to a wider community.

 John Singer Sargent: The Daughters of Edward Darley Boit, 1882

Increasingly, the ‘family’ in the traditional expansive sense has ceased to exist. It boils down to the couple agreeing to live in the same house and look after one or two children for a while. But in adulthood these children do not expect to work alongside their parents; they don’t expect their social circle to overlap with their parents very much and don’t feel that their parents’ honor is in their hands.

Our looser, more individual sense of family isn’t necessarily a bad thing. It just means that it’s not well placed to take up the task of giving us a larger sense of belonging – of giving us the feeling that we are part of something more valuable than ourselves.

Durkheim is a master diagnostician of our ills. He shows us that modern economies put tremendous pressures on individuals, but leave us dangerously bereft of authoritative guidance and communal solace.

He didn’t feel capable of finding answers to the problems he identified but he knew that Capitalism would have to uncover them, or collapse. We are Durkheim’s heirs – and still have ahead of us the task he accorded us: to create new ways of belonging, to take some of the pressure off the individual, to find a correct balance between freedom and solidarity and to generate ideologies that allow us not to take our own failures so personally and sometimes so tragically.”

This is a longish article, but a video summarizes it in only seven minutes:

Degas, Absinthe, 1876


“. . . to take some of the pressure off the individual, to generate ideologies that allow us not to take our own failures so personally and sometimes so tragically.” I’d use the word “life philosophy.” And to develop a life philosophy one has to have both sufficient intelligence and sufficient experience. I finally understood that the “excuse of youth” doesn’t expire at the age of thirty, say. Or forty, or even beyond. Wisdom comes when it comes (IF it comes).

And it helps tremendously to meet others who freely admit, “I didn’t realize what was important until I turned 58” — or 61, 69, 75 — put in any figure here. There is no shame in taking a long time to understand what’s really precious and important. And it takes life experience. If you are the kind of woman (this seems to apply to women in particular) who requires living by herself before she ceases to be mostly a caretaker and a “service person,” and can at last “find herself,” then the experience of divorce or widowhood may be necessary.

I agree that the danger of depression and suicide goes up tremendously with the individualism fostered by capitalism. But I’ll take it any time over the collective pressures of earlier times. Women were virtually slaves, and men paid a price as well, their talents stifled as they had to go into the family business.

Hopefully we’re past excessive individualism in the sense of blaming the individual for every “failure,” rather than seeing it as misfortune due to circumstances. With the trend away from belief in absolute free will, we are beginning to see the enormous role of circumstances. This is only the beginning of the deeper understanding that began with the growth of psychology and neuroscience. A more psychological perspective should lead us away from blaming, contempt, and hatred, and toward more compassion.

There is of course still a segment of society that explains poverty as sin and misfortune as god’s punishment. It’s a noisy view — “This woman got raped because she wore sexy clothes”; “These school children got shot because we don’t have school prayer” — but it is increasingly the voice of the lunatic fringe.

As for the family, I’ve seen a strengthening rather than a weakening. True, a minority of women choose to use a sperm bank and become single mothers. Given the risk of waiting too long for Mr. Right and missing one’s chance to have a biological child, that decision is not exactly outrageous. A more interesting trend is that of educated people marrying later in life and having lasting marriages, with the father involved in child rearing. Those families can be extremely close, and yes, the retired parents will move so as to be near their children and grandchildren.

Even without as much closeness as that, what I see around me is people raising children with a lot more love than was typical of Durkheim’s day, when common child-rearing practices would strike us as abusive. Family love has become enormously important, and is perhaps the most successful replacement for religion.

Finally, when we look at suicide rates in various countries, we see that the most developed capitalist countries are not in the lead (except for Japan, with its shame culture). Lithuania and Russia are far ahead of France, Germany, and Switzerland — probably because of the combined ravages of alcoholism and economic hardship.

Thus, Durkheim’s analysis is only partly correct. But his theorizing remains important because it highlights social connectedness and close personal ties. We are beginning to speak of the “connectome” — the way we’ve popularized “genome” and “biome.” A human being is not an isolated individual. People’s lives are meaningful within their social group. The worship of individualism seems finally to have crested. Now it’s connection, connection, connection.

And yes, the task is ahead of us. The future may bring us more eco-farms and artisanal communities where people can cultivate the satisfying sense of connection that marks the best of pre-capitalism. 

Bruegel the Elder, The Kermess, 1570-1580



Recently I had the disquieting experience of being compared to a mass shooter or a wife beater (sic) for saying something that revealed my atheism. My comments were compared to “shooting at random.” I mentally reeled in astonishment. Nothing I’ve ever said or done in my life merited comparison with the actions of a mass shooter — least of all a forgettable Facebook comment. This holds even if we apply the unbelievably high standard set by Jesus in the Sermon on the Mount: even if you didn’t commit murder, if you are angry with your brother or sister in your heart, you will be just as “subject to judgment.” “And anyone who says, ‘You fool!’ will be in danger of the fire of hell.”

But I didn’t call anyone a fool. My tone was moderate, polite.

Then I recalled what I’ve noticed before: some religious people feel threatened by the very existence of atheists. And I found an article that “explained it all” by the excellent “Godless in Dixie” blogger, Neil Carter: “Why Even Nice Atheists Are Offensive to the Faithful.”

“Greta Christina pretty much nailed it when she said:

‘Religion relies on social consent to perpetuate itself. But the simple act of coming out as an atheist denies it this consent. Even if atheists never debate believers or try to persuade them out of their beliefs; even if all we ever do is say out loud, “Actually, I’m an atheist,” we’re still denying our consent. And that throws a monkey wrench into religion’s engine.’

In other words, atheists offend simply by existing. Just as the Emperor’s new wardrobe choice could only be successful if everyone agreed to not speak ill of it, so there’s an unspoken rule that says the worst thing an apostate can do is admit out loud that she has left the faith. Nothing more need be said in order to offend. She offends now by existing, and she offends by openly admitting who and what she is.

Which is why more of us need to do this. Not much more is really required in order to make a difference. The mere act of “coming out” as an atheist denies Fundamentalism the consent it requires in order to remain a coercive force over the lives of people. Do all expressions of the Christian faith demand such obeisance? No, definitely not. But the ones that do won’t go away just because we ignore them. And frankly, I don’t think it makes much difference to the more liberal strains of religion what the rest of us believe, so long as we agree to try to leave the world a better place than how we found it.” Those are my feelings as well.

I would temper this encouragement to come out with a warning that for some it might not be such a good idea. Some just aren’t in a position to do it at all. Some will need to wait until they are in a stronger place themselves so that they can endure the onslaught of negativity their apostasy will engender. But for those who can, it helps the rest of us each time yet another person steps forward and says, however they say it, “Actually, I’m an atheist.”

I posted this on Facebook and received this comment: "I can understand perfectly. I myself consider being religious as a character flaw and i just can't get over it. But we all have our flaws. I never considered "a man of God" to be complimentary. I don't even think the Clergy really believe it. It's a living."

I replied:

From the start I had a heavy suspicion that at least some priests didn't believe the stuff. Some must have felt doomed to hell for unbelief and hating the god talk — their faces looked absolutely tragic. Same with nuns. Living a lie, wasted lives, having denied themselves human love . . . I was a sensitive child, and could tell if someone was unhappy, especially extremely unhappy. And that disturbed me: seeing those pale tragic faces above the black robes.

There are plenty of stories of non-believing clergy if you search online. There are whole books, memoirs. And yes, it is a living. The Clergy Project is an organization that helps agnostic and atheist priests and ministers leave the church and find a secular job.


Considering that atheists used to get burned at the stake if anyone found out, it’s interesting that we have a wealth of historical material documenting the existence of atheists going back as far as the ancient Hindu culture. There is reason to think that doubters (to use a milder term) existed as long as religion existed, even in cultures where doubt was severely penalized. The reason for the death penalty for atheism, I suspect, has been the shaky nature of religious faith. Smart people could figure out that the official “knowledge” (in primeval times, there was no separate word for religion) didn’t add up, and that animal sacrifice and other rituals did no good. Possibly a lot of people suspected as much, but tried to stifle doubt within themselves, finding elaborate excuses for god’s silence and absence.

And this goes on even in modern times: religious people take offense that seems completely out of proportion. Say that prayer doesn’t work, and you are going to be compared to a mass shooter. I would never have believed it — and then it happened to me, and over an even milder statement . . .



“The myth might have arisen from the Nobel Prize-winning research of Roger Sperry, which was done in the 1960s. Sperry studied patients with epilepsy, who were treated with a surgical procedure that cut the brain along a structure called the corpus callosum. Because the corpus callosum connects the two hemispheres of the brain, the left and right sides of these patients' brains could no longer communicate.

Sperry and other researchers, through a series of clever studies, determined which parts, or sides, of the brain were involved in language, math, drawing and other functions in these patients. But then popular-level psychology enthusiasts ran with this idea, creating the notion that personalities and other human attributes are determined by having one side of the brain dominate the other. Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant.

The neuroscience community never bought into this notion, lead author Jeff Anderson said, and now we have evidence from more than 1,000 brain scans showing absolutely no signs of left or right dominance.

They found no evidence that people preferentially use their left or right brain. All of the study participants were using their entire brain equally throughout the course of the experiment.

The preference to use one brain region more than others for certain functions, which scientists call lateralization, is indeed real, Anderson said. For example, speech emanates from the left side of the brain for most right-handed people. This does not imply, though, that great writers or speakers use their left side of the brain more than the right, or that one side is richer in neurons.

There is a misconception that everything to do with being analytical is confined to one side of the brain, and everything to do with being creative is confined to the opposite side, Anderson said. In fact, IT IS THE CONNECTIONS AMONG ALL BRAIN REGIONS THAT ENABLE HUMANS TO ENGAGE IN BOTH CREATIVITY AND ANALYTICAL THINKING.

"It is not the case that the left hemisphere is associated with logic or reasoning more than the right," Anderson told LiveScience. "Also, creativity is no more processed in the right hemisphere than the left."

Anderson's team examined brain scans of participants ages 7 to 29 while they were resting. They looked at activity in 7,000 brain regions, and examined neural connections within and between these regions. Although they saw pockets of heavy neural traffic in certain key regions, on average, both sides of the brain were essentially equal in their neural networks and connectivity.

"We just don't see patterns where the whole left-brain network is more connected, or the whole right-brain network is more connected in some people," said Jared Nielsen, a graduate student and first author on the new study.

At the same time, the left-brain/right-brain metaphor has become ingrained:

“The left-brain right-brain myth will probably never die because it has become a powerful metaphor for different ways of thinking – logical, focused and analytic versus broad-minded and creative. Take the example of Britain’s Chief Rabbi Jonathan Sacks talking on BBC Radio 4 earlier this year. “What made Europe happen and made it so creative,” he explained, “is that Christianity was a right-brain religion … translated into a left-brain language [Greek]. So for many centuries you had this view that science and religion are essentially part of the same thing.”

There is more than a grain of truth to the left-brain right-brain myth. While they look alike, the two hemispheres of the brain do function differently. For example, it’s become almost common knowledge that in most people the left brain is dominant for language. The right hemisphere, on the other hand, is implicated more strongly in emotional processing and representing the mental states of others. However, the distinctions aren't as clear-cut as the myth makes out; for instance, the right hemisphere is involved in processing some aspects of language, such as intonation and emphasis [Oriana: and figurative language, i.e. metaphor and irony].

But it’s important to remember that in healthy people the two brain hemispheres are well-connected. In most of what we do, the hemispheres have evolved to operate together, sharing information across the neural bridge of the corpus callosum.

It’s tricky to combat that belief system [in being right-brained or left-brained] by saying the truth is really more complicated. But it’s worth trying, because it would be a shame if the simplistic myth drowned out the more fascinating story of how our brains really work.

We have tons of books that talk about simplifying your life in terms of getting rid of excess stuff (clothes, books, furniture, etc.) — but not that many people advocate focusing on just one thing, or maybe two — and getting rid of the endless trivial tasks that consume our time (and time is life, the most important wealth). I’ve just listened to an interview with Greg McKeown on NPR. His message is that we mustn’t spread ourselves thin, trying to do it all. We should do far less, sticking to the essential. We should be very selective: “The main thing is to keep the main thing the main thing.”

“What we need to do is decide that we are going to become an essentialist — that we are not going to get caught up in that furor of the frenzied, frenetic nonsense — and instead pursue those things that really matter most to us.”

Ah, but the uncertainty about what to choose to do! This is where the OR statement becomes crucial. For instance: If my goal is to be a good writer who gives something of true value to my readers, do I surf the pictures of baby animals on Facebook, or do I read a challenging book?

It’s also been called the red light/green light principle: Will doing X get me closer to my goal? The yes answer is a green light; no is a red light.


There is no guarantee of choosing well: three days into the book, I may decide the challenging book has been a waste of time after all. What writer hasn’t been haunted by the thought of having wasted years on the wrong project — perhaps his whole life? Here is something wonderful on this subject:

W. G. Sebald writes about a particular brand of melancholy that attends scholars and writers and weavers, a kind of melancholy born of concentrating for long periods of time on intricate patterns. They worry, he writes, about having pulled too long at the wrong thread. Sebald himself writes about a day he gets so lost in footnotes, escaping the factual by virtue of stranger and stranger details buried in the marginalia. At one point, Sebald looks up and realizes that his elderly neighbor, who has been engaged in a lifelong process of reading an encyclopedia, has only reached the letter K and, now, it is clear, he will never finish what he started. Sebald starts to see the library as an immense creature that feeds on words and gives birth to words. ~ Janice Greenwood

I think we will never know whether we’ve wasted much of our life pulling at the wrong thread. We must risk making a wrong choice. But I love the man who got only to the letter K. May he live until P!


The main thing is not to take on too many projects at once. “We manage best when we manage small,” the poet Linda Gregg reminds us. It’s better to do one thing extraordinarily well than a dozen things badly.

Another point, somewhat tangential. People object to the idea of doing less by saying, “But I have so much to do! If I don’t try to do it all, I’ll die before it ever gets done!”

No, it will never get done. In modern life, the stream of activities doesn’t end just because you need to take the time to die. As writers, we are often advised to start “in medias res” — in the middle of the narrative, without introduction and preliminary details. Perhaps the same applies to endings. It’s better to end one’s life in medias res, I think, to know we’ll never get to the end of that mess, never know the moral of the story, the last line, than to try to catch up on everything and never do anything — no matter how small — at the level of excellence.