Saturday, August 15, 2015



I want to know
the first word,
older than fire
and more necessary —

Was it a cry of warning?
Or a child’s wail for
touch, mother syllable
of familiar heat?

A woman’s god-creating
attempt to name a lover,
a man’s god-shattering
attempt to name himself?

Or was the first word
god, manifest
music of thought,
emptied of frightened flesh?

Was it a yes, a no,
a yes, but – ?
And I want to know
the first lie.


Perhaps there were many
first words, first secrets,
first denials —
gossiping over a carcass

of a woolly mammoth,
complaining (the foremost
marker of culture 

is complaining).


Now we have too many words.
Words instead of children 

sit in our laps. That’s why we 
talk so much about silence.


How moist the newborn
first word must have been,
like the scent
of the earth after rain.

Someone’s grinding acorns,
baking the first bread.
Someone pierces bone,
making the first flute.

water starts flowing
into water,
the grain of stone
closes over stone.

~ Oriana © 2015



“Émile Durkheim is the philosopher who can best help us to understand why Capitalism makes us richer and yet frequently more miserable; even, far too often, suicidal.

Durkheim lived through the immense, rapid transformation of France from a largely traditional agricultural society to an urban, industrial economy. He could see that his country was getting richer, that Capitalism was extraordinarily productive and, in certain ways, liberating. But what particularly struck him, and became the focus of his entire career, were the psychological costs of Capitalism. The economic system might have created an entire new middle class, but it was doing something very peculiar to people’s minds. It was — quite literally — driving them to suicide in ever increasing numbers.

Edouard Manet, The Suicide, 1881

This was the immense insight unveiled in Durkheim’s most important work, Suicide, published in 1897. The book chronicled a remarkable and tragic discovery: that suicide rates seem to shoot up once a nation becomes industrialized and Consumer Capitalism takes hold. Durkheim observed that the suicide rate in the Britain of his day was double that of Italy; but in even richer and more advanced Denmark, it was four times higher than in the UK. Furthermore, suicide rates were much higher amongst the educated than the uneducated; much higher in Protestant than in Catholic countries; and much higher among the middle classes than among the poor.

Durkheim’s focus on suicide was intended to shed light on a more general level of unhappiness and despair at large in society. Suicide was the horrific tip of the iceberg of mental distress created by Capitalism.

1. Individualism

Under Capitalism, it is the individual (rather than the clan, or ‘society’ or the nation) that now chooses everything: what job to take, what religion to follow, who to marry… This ‘individualism’ forces us to be the authors of our own destinies. How our lives pan out becomes a reflection of our unique merits, skills and persistence.

If things go well, we can take all the credit. But if things go badly, it is crueller than ever before, for it means there is no one else to blame. We have to shoulder the full responsibility. We aren’t just unlucky any more, we have chosen and have messed up. Individualism ushers in a disinclination to admit to any sort of role for luck or chance in life. Failure becomes a terrible judgement upon oneself. This is the particular burden of life in modern Capitalism.

2. Excessive expectations

Capitalism raises our hopes. Everyone – with enough effort – can become the boss. Everyone should think big. You are not trapped by the past – Capitalism says – you are free to remake your life. The opportunities grow enormous…as do the possibilities for disappointment.

The cheery, boosterish side of Capitalism attracted Durkheim’s particular ire. In his view, modern societies struggle to admit that life is often quite simply painful and sad. Our tendencies to grief and sorrow are made to look like signs of failure rather than, as should be the case, a fair response to the arduous facts of the human condition.

3. Too much freedom

Capitalism relentlessly undermined social norms. States became more complex, more anonymous and more diverse. People didn’t have so much in common with each other any more.

What kind of career should you have? Where should you live? What kind of holiday should you go on? What is a marriage supposed to be like? How should you bring up children? Under Capitalism, the collective answers get weaker, less specific. There’s a lot of reliance on the phrase: ‘whatever works for you.’ Which sounds friendly but also means that society doesn’t much care what you do and doesn’t feel confident it has good answers to the big questions of your life.

In very confident moments we like to think of ourselves as fully up to the task of reinventing life, or working everything out for ourselves. But, in reality, as Durkheim knew, we are often simply too tired, too busy, too uncertain – and there is nowhere to turn.

4. Atheism

Durkheim was himself an atheist, but he worried that religion had become implausible just as its communal side would have been most necessary to repair the fraying social fabric. Despite its factual errors, Durkheim appreciated the sense of community that religion offered: “Religion gave men a perception of a world beyond this earth where everything would be rectified; this prospect made inequalities less noticeable, it stopped men from feeling aggrieved.”

Durkheim took the dark view that inequality would be very hard to eradicate (perhaps impossible), so we would have to learn, somehow, to live with it. This led him to a warmer appreciation of any ideas that could soften the psychological blows of reality.

Durkheim also saw that religion created deep bonds between people. The king and the peasant worshipped the same God, they prayed in the same building using the same words. They were offered precisely the same sacraments. Riches, status and power were of no direct spiritual value.

Capitalism had nothing to replace this with. Science certainly did not offer the same opportunities for powerful shared experiences. The Periodic Table might well possess transcendent beauty and be a marvel of intellectual elegance – but it couldn’t draw a society together around it.

Durkheim was especially taken with elaborate religious rituals that demand participation and create a strong sense of belonging. A tribe might worship its totem, men might undergo a complex process of initiation. The tragedy – in Durkheim’s eyes – was that we had done away with religion at precisely the time when we most needed its collective consoling dimensions and had nothing much to put in its place.

Pissarro: The Fair at the Church of Saint-Jacques, Dieppe, 1901

5. Weakening of the nation and of the family

In the 19th century, it had looked, at certain moments, as if the idea of the nation might grow so powerful and intense that it could take up the sense of belonging and shared devotion that once had been supplied by religion. But the excitement of a nation at war had, Durkheim saw, failed to translate into anything very impressive in peacetime.

Family might similarly seem to offer the experience of belonging that we needed. But Durkheim was unconvinced. We do indeed invest hugely in our families, but they are not as stable as we might hope. And they do not provide access to a wider community.

 John Singer Sargent: The Daughters of Edward Darley Boit, 1882

Increasingly, the ‘family’ in the traditional expansive sense has ceased to exist. It boils down to the couple agreeing to live in the same house and look after one or two children for a while. But in adulthood these children do not expect to work alongside their parents; they don’t expect their social circle to overlap with their parents very much and don’t feel that their parents’ honor is in their hands.

Our looser, more individual sense of family isn’t necessarily a bad thing. It just means that it’s not well placed to take up the task of giving us a larger sense of belonging – of giving us the feeling that we are part of something more valuable than ourselves.

Durkheim is a master diagnostician of our ills. He shows us that modern economies put tremendous pressures on individuals, but leave us dangerously bereft of authoritative guidance and communal solace.

He didn’t feel capable of finding answers to the problems he identified but he knew that Capitalism would have to uncover them, or collapse. We are Durkheim’s heirs – and still have ahead of us the task he accorded us: to create new ways of belonging, to take some of the pressure off the individual, to find a correct balance between freedom and solidarity and to generate ideologies that allow us not to take our own failures so personally and sometimes so tragically.”

(This is a longish article, but there is a video that summarizes it in only seven minutes.)

Degas, Absinthe, 1876


“. . . to take some of the pressure off the individual, to generate ideologies that allow us not to take our own failures so personally and sometimes so tragically.” I’d use the word “life philosophy.” And to develop a life philosophy one has to have both sufficient intelligence and sufficient experience. I finally understood that the “excuse of youth” doesn’t expire at the age of thirty, say. Or forty, or even beyond. Wisdom comes when it comes (IF it comes).

And it helps tremendously to meet others who freely admit, “I didn’t realize what was important until I turned 58” — or 61, 69, 75 — put in any figure here. There is no shame in taking a long time to understand what’s really precious and important. And it takes life experience. If you are the kind of woman (this seems to apply to women in particular) who requires living by herself before she ceases to be mostly a caretaker and a “service person,” and can at last “find herself,” then the experience of divorce or widowhood may be necessary.

I agree that the danger of depression and suicide goes up tremendously with the individualism fostered by capitalism. But I’ll take it any time over the collective pressures of earlier times. Women were virtually slaves, and men paid a price as well, their talents stifled as they had to go into the family business.

Hopefully we’re past excessive individualism in the sense of blaming the individual for every “failure,” rather than seeing it as misfortune due to circumstances. As belief in absolute free will wanes, we are beginning to see the enormous role of circumstances. This is only the beginning of a deeper understanding that started with the growth of psychology and neuroscience. A more psychological perspective should lead us away from blaming, contempt, and hatred, and toward more compassion.

There is of course still a segment of society that explains poverty as sin and misfortune as god’s punishment. It’s a noisy view (“this woman got raped because she wore sexy clothes”; “these school children got shot because we don’t have school prayer”) — but it’s increasingly the voice of the lunatic fringe.

As for the family, I’ve seen a strengthening rather than a weakening. True, a minority of women choose to use a sperm bank and become single mothers. Given the risk of waiting too long for Mr. Right and missing one’s chance to have a biological child, that decision is not exactly outrageous. A more interesting trend is that of educated people marrying later in life and having lasting marriages, with the father involved in child rearing. Those families can be extremely close, and yes, the retired parents will move so as to be near their children and grandchildren.

Even without as much closeness as that, what I see around me is people raising children with a lot more love than was typical of Durkheim’s day, when typical child rearing practices would strike us as abusive. Family love has become enormously important, and is perhaps the most successful replacement for religion.

Finally, when we look at suicide rates in various countries, we see that the most developed capitalist countries are not in the lead (except for Japan, with its shame culture). Lithuania and Russia are far ahead of France, Germany, and Switzerland — probably because of the combined ravages of alcoholism and economic hardship.

Thus, Durkheim’s analysis is only partly correct. But his theorizing remains valuable because it points to the importance of social connectedness and close personal ties. We are beginning to speak of the “connectome” — the way we’ve popularized “genome” and “biome.” A human being is not an isolated individual. People’s lives are meaningful within their social group. The worship of individualism seems to have crested. Now it’s connection, connection, connection.

And yes, the task is ahead of us. The future may bring us more eco-farms and artisanal communities where people can cultivate the satisfying sense of connection that marks the best of pre-capitalism. 

Brueghel the Elder, The Kermess, c. 1567



Recently I’ve had a disquieting experience of being compared to a mass shooter or a wife beater (sic) for saying something that revealed my atheism. My comments were compared to “shooting at random.” I mentally reeled in astonishment. Nothing I’ve ever said or done in my life merited being compared with the actions of a mass shooter, much less some forgettable Facebook comment. This holds even if we apply the unbelievably high standard set by Jesus in the Sermon on the Mount: even if you didn’t commit murder, if you are angry with your brother or sister in your heart, you will be just as “subject to judgment.” “And anyone who says, ‘You fool!’ will be in danger of the fire of hell.”

But I didn’t call anyone a fool. My tone was moderate, polite.

Then I recalled what I’ve noticed before: some religious people feel threatened by the very existence of atheists. And I found an article that “explained it all” by the excellent “Godless in Dixie” blogger, Neil Carter: “Why Even Nice Atheists Are Offensive to the Faithful.”

“Greta Christina pretty much nailed it when she said:

‘Religion relies on social consent to perpetuate itself. But the simple act of coming out as an atheist denies it this consent. Even if atheists never debate believers or try to persuade them out of their beliefs; even if all we ever do is say out loud, “Actually, I’m an atheist,” we’re still denying our consent. And that throws a monkey wrench into religion’s engine.’

In other words, atheists offend simply by existing. Just as the Emperor’s new wardrobe choice could only be successful if everyone agreed to not speak ill of it, so there’s an unspoken rule that says the worst thing an apostate can do is admit out loud that she has left the faith. Nothing more need be said in order to offend. She offends now by existing, and she offends by openly admitting who and what she is.

Which is why more of us need to do this. Not much more is really required in order to make a difference. The mere act of “coming out” as an atheist denies Fundamentalism the consent it requires in order to remain a coercive force over the lives of people. Do all expressions of the Christian faith demand such obeisance? No, definitely not. But the ones that do won’t go away just because we ignore them. And frankly, I don’t think it makes much difference to the more liberal strains of religion what the rest of us believe, so long as we agree to try to leave the world a better place than how we found it.”

Those are my feelings as well.

I would temper this encouragement to come out with a warning that for some it might not be such a good idea. Some just aren’t in a position to do it at all. Some will need to wait until they are in a stronger place themselves so that they can endure the onslaught of negativity their apostasy will engender. But for those who can, it helps the rest of us each time yet another person steps forward and says, however they say it, “Actually, I’m an atheist.”

I posted this on Facebook and received this comment: “I can understand perfectly. I myself consider being religious a character flaw and I just can’t get over it. But we all have our flaws. I never considered ‘a man of God’ to be complimentary. I don’t even think the clergy really believe it. It’s a living.”

I replied:

From the start I had a heavy suspicion that at least some priests didn't believe the stuff. Some must have felt doomed to hell for unbelief and hating the god talk — their faces looked absolutely tragic. Same with nuns. Living a lie, wasted lives, having denied themselves human love . . . I was a sensitive child, and could tell if someone was unhappy, especially extremely unhappy. And that disturbed me: seeing those pale tragic faces above the black robes.

There are plenty of stories of non-believing clergy if you search online. There are whole books, memoirs. And yes, it is a living. The Clergy Project is an organization that helps agnostic and atheist priests and ministers leave the church and find a secular job.


Considering that atheists used to get burned at the stake if anyone found out, it’s interesting that we have a wealth of historical material documenting the existence of atheists going back as far as the ancient Hindu culture. There is reason to think that doubters (to use a milder term) existed as long as religion existed, even in cultures where doubt was severely penalized. The reason for the death penalty for atheism, I suspect, has been the shaky nature of religious faith. Smart people could figure out that the official “knowledge” (in primeval times, there was no separate word for religion) didn’t add up, and that animal sacrifice and other rituals did no good. Possibly a lot of people suspected as much, but tried to stifle doubt within themselves, finding elaborate excuses for god’s silence and absence.

And this goes on even in modern times: religious people take offense that seems completely out of proportion. Say that prayer doesn’t work, and you may be compared to a mass shooter. I would never have believed it — and then it happened to me, over a much milder statement . . .



“The myth might have arisen from the Nobel Prize-winning research of Roger Sperry, which was done in the 1960s. Sperry studied patients with epilepsy, who were treated with a surgical procedure that cut the brain along a structure called the corpus callosum. Because the corpus callosum connects the two hemispheres of the brain, the left and right sides of these patients' brains could no longer communicate.

Sperry and other researchers, through a series of clever studies, determined which parts, or sides, of the brain were involved in language, math, drawing and other functions in these patients. But then popular-level psychology enthusiasts ran with this idea, creating the notion that personalities and other human attributes are determined by having one side of the brain dominate the other. Popular culture would have you believe that logical, methodical and analytical people are left-brain dominant, while the creative and artistic types are right-brain dominant.

The neuroscience community never bought into this notion, lead author Jeff Anderson said, and now we have evidence from more than 1,000 brain scans showing absolutely no signs of left or right dominance.

They found no evidence that people preferentially use their left or right brain. All of the study participants were using their entire brain equally throughout the course of the experiment.

The preference to use one brain region more than others for certain functions, which scientists call lateralization, is indeed real, Anderson said. For example, speech emanates from the left side of the brain for most right-handed people. This does not imply, though, that great writers or speakers use their left side of the brain more than the right, or that one side is richer in neurons.

There is a misconception that everything to do with being analytical is confined to one side of the brain, and everything to do with being creative is confined to the opposite side, Anderson said. In fact, IT IS THE CONNECTIONS AMONG ALL BRAIN REGIONS THAT ENABLE HUMANS TO ENGAGE IN BOTH CREATIVITY AND ANALYTICAL THINKING.

"It is not the case that the left hemisphere is associated with logic or reasoning more than the right," Anderson told LiveScience. "Also, creativity is no more processed in the right hemisphere than the left."

Anderson's team examined brain scans of participants ages 7 to 29 while they were resting. They looked at activity in 7,000 brain regions, and examined neural connections within and between these regions. Although they saw pockets of heavy neural traffic in certain key regions, on average, both sides of the brain were essentially equal in their neural networks and connectivity.

"We just don't see patterns where the whole left-brain network is more connected, or the whole right-brain network is more connected in some people," said Jared Nielsen, a graduate student and first author on the new study.”

At the same time, the usage has become ingrained:

“The left-brain right-brain myth will probably never die because it has become a powerful metaphor for different ways of thinking – logical, focused and analytic versus broad-minded and creative. Take the example of Britain’s Chief Rabbi Jonathan Sacks talking on BBC Radio 4 earlier this year. “What made Europe happen and made it so creative,” he explained, “is that Christianity was a right-brain religion … translated into a left-brain language [Greek]. So for many centuries you had this view that science and religion are essentially part of the same thing.”

There is more than a grain of truth to the left-brain right-brain myth. While they look alike, the two hemispheres of the brain do function differently. For example, it’s become almost common knowledge that in most people the left brain is dominant for language. The right hemisphere, on the other hand, is implicated more strongly in emotional processing and representing the mental states of others. However, the distinctions aren't as clear cut as the myth makes out - for instance, the right hemisphere is involved in processing some aspects of language, such as intonation and emphasis [Oriana: and figurative language, i.e. metaphor and irony].

But it’s important to remember that in healthy people the two brain hemispheres are well-connected. In most of what we do, the hemispheres have evolved to operate together, sharing information across the neural bridge of the corpus callosum.

It’s tricky to combat that belief system [in being right-brained or left-brained] by saying the truth is really more complicated. But it’s worth trying, because it would be a shame if the simplistic myth drowned out the more fascinating story of how our brains really work.”

We have tons of books that talk about simplifying your life in terms of getting rid of excess stuff (clothes, books, furniture, etc.) — but not that many people advocate focusing on just one thing, or maybe two, and getting rid of the endless trivial tasks that consume our time (and time is life, the most important wealth). I’ve just listened to an interview with Greg McKeown on NPR. His message is that we mustn’t spread ourselves thin, trying to do it all. We should do far less, sticking to the essential. We should be very selective: “The main thing is to keep the main thing the main thing.”

“What we need to do is decide that we are going to become an essentialist — that we are not going to get caught up in that furor of the frenzied, frenetic nonsense — and instead pursue those things that really matter most to us.”

Ah, but the uncertainty about what to choose to do! This is where the OR statement becomes crucial. For instance: If my goal is to be a good writer who gives something of true value to my readers, do I surf the pictures of baby animals on Facebook, or do I read a challenging book?

It’s also been called the red light/green light principle: Will doing X get me closer to my goal? The yes answer is a green light; no is a red light.


There is also no guarantee the challenging book will prove worthwhile; three days in, I may decide it has been a waste of time after all. What writer hasn’t been haunted by the thought of having wasted years on the wrong project — perhaps his whole life? Here is something wonderful on this subject:

W. G. Sebald writes about a particular brand of melancholy that attends scholars and writers and weavers, a kind of melancholy born of concentrating for long periods of time on intricate patterns. They worry, he writes, about having pulled too long at the wrong thread. Sebald himself writes about a day he gets so lost in footnotes, escaping the factual by virtue of stranger and stranger details buried in the marginalia. At one point, Sebald looks up and realizes that his elderly neighbor, who has been engaged in a lifelong project of reading an encyclopedia, has only reached the letter K and now, it is clear, will never finish what he started. Sebald starts to see the library as an immense creature that feeds on words and gives birth to words. ~ Janice Greenwood

I think we will never know whether we’ve wasted much of our life pulling at the wrong thread. We must risk making a wrong choice. But I love the man who got only to the letter K. May he live until P!


The main thing is not to take on too many projects at once. “We manage best when we manage small,” the poet Linda Gregg reminds us. It’s better to do one thing extraordinarily well than a dozen things badly.

Another point, somewhat tangential. People object to the idea of doing less by saying, “But I have so much to do! If I don’t try to do it all, I’ll die before it ever gets done!”

No, it will never get done. In modern life, the stream of activities doesn’t end just because you need to take the time to die. As writers, we are often advised to start “in medias res” — in the middle of the narrative, without introduction and preliminary details. Perhaps the same applies to endings. It’s better to end one’s life in medias res, I think, to know we’ll never get to the end of that mess, never know the moral of the story, the last line, than to try to catch up on everything and never do anything — no matter how small — at the level of excellence.
