Saturday, June 9, 2018


Pacific Beach, California; Nathan Rupert



Sylvia, I saw him,
your man in black —
but not with a Mein Kampf look.
More the look of an old dog
that had long ceased to care.

He was seventy perhaps.
Sunset was cooling its late sheen
as he waded into the waves,
dressed in a formal black suit,
as if for a wedding.

But when the water reached
almost up to his mouth,
he slowly turned back. 
The sun had already set.
Dripping his dark, he labored out.

He shuffled on with a blank stare
between the pier and the electric plant.
In wet, heavy clothes, he slogged
toward the lights of the street
across the darkening sand.

Sylvia, I thought, where
are you now, do you exist
except in your poems?
And where would you be
if you had labored to live?

Sheer curiosity has kept me alive,
wanting to know how far
I could go, how far the world
would go. Yet one Pacific sunset,
its torches casting a fiery glow,

dressed not in black but white,
hair streaming loose, a veil,
I too could step into that pearl
wall of surf, up to my mouth,
and in that wordless love

not know whether I too
would turn around.

~ Oriana


“Now more than ever seems it rich to die,” Keats writes in “Ode to a Nightingale” — a line that instantly engraved itself on my psyche. Even on first exposure, before I actually witnessed how people die, I realized that it’s pure fantasy. But it was a fantasy I could identify with: to die as if into beauty, to enter beauty.

This theme is also important in another poem of mine, “Lorelei” — whom I also imagine as drowning herself by walking into a river during a glorious sunset:

Now this embrace of liquid light,
as though a mirror dissolved, and she

were entering herself,
slowly closing around the edges.

I knew a young man whose suicide fantasy was to shoot himself while listening to Beethoven’s Ode to Joy. I suspect that when he did finally do it, he wasn’t in the state of mind in which he could remember it.

To live for beauty and to die into beauty — I have lived long enough to realize that reality does not work that way. Reality tends to be a great ironist. It mocks our loveliest dreams. But that doesn’t stop some of us from dreaming, dissolving into that liquid light.


Friar Lawrence:
These violent delights have violent ends
And in their triumph die, like fire and powder,
Which, as they kiss, consume. The sweetest honey
Is loathsome in his own deliciousness
And in the taste confounds the appetite.
Therefore love moderately. Long love doth so.
Too swift arrives as tardy as too slow.

~ Romeo and Juliet, Act 2, Scene 6

But what are words of wisdom next to the proverbial raging hormones . . .

Still, it’s unusual in literature, and especially in poetry, which loves passion, to find advice to love moderately.

The last line reminds me of the Roman proverb, Festina lente — make haste slowly. And of course there’s “more haste, less speed” and “haste makes waste.”

But back to moderate love. Excess is generally a bad idea; it predicts a crash, Icarus’s fall into the sea. In “Twelfth Night,” Orsino seems to understand it:

If music be the food of love, play on,
Give me excess of it, that, surfeiting,
The appetite may sicken, and so die.


Of course the Bard was also a master of comedy, and some of it has become even funnier with time:


~ “2. Never go to bed angry.

This advice pushes couples to solve their problems right away. Yet everyone has their own methods of dealing with disagreements, and research indicates that about two-thirds of recurring issues in marriage are never resolved because of personality differences — you’re unlikely to work out that fight about the dishes no matter how late you stay up.

In our “Love Lab,” where we studied physiological reactions of couples during arguments (including coding of facial muscles related to specific emotions), we found that when couples fight, they are so physiologically stressed — increased heart rate, cortisol in the bloodstream, perspiring, etc. — that it is impossible for them to have a rational discussion. With one couple, we intentionally stopped their argument about a recurring issue by saying we needed to adjust some of our equipment. We asked them to read magazines for 30 minutes before resuming the conversation. When they did so, their bodies had physiologically calmed down, which allowed them to communicate rationally and respectfully. We now teach that method to couples — if you feel yourself getting overwhelmed during a fight, take a break and come back to it later, even if that means sleeping on it.” ~


I loved the story about the experimenters asking the arguing couple to read a magazine for half an hour because they (the researchers) “needed to adjust some of our equipment.” Once the stress hormones are down, rationality and respect have a chance to prevail.

We know a lot about stress and the harmful behavior that can result. So much depends on our physical state — so much. More knowledge of the effects of stress and “bad mood” could work wonders for how people relate — not just for marriages or romantic relationships.

Alas, many people believe that anger needs to be expressed — that this is good (“cathartic”) both for the angry person and for the relationship, even though experience shows how destructive anger can be, and how expressing it often only escalates the rage. Buddha was wiser than that: he advised watching (“witnessing”) the anger and letting it pass. But in this experiment simply reading a magazine was enough to lower stress and restore calm, rationality, and respect. This is great news: you don’t have to be an accomplished meditator to defuse anger. You can lower the stress in a variety of ways, and then calmly discuss the issues (if they need discussing, that is; sometimes it’s best to let things be.)

“I’ll tickle your catastrophe,” to quote the Bard again. No, don’t touch — at least not in anger. Let time and distraction work their magic.

Man climbing a frozen waterfall —“because it’s there,” I suppose. My philosophy: you have to let a man do “guy things” just as you have to let a cat be a cat and a dog a dog. 


The idea that anger must be expressed, like so many culturally popular ideas, is both wrongheaded and toxic. Expressing it does not seem to resolve anger, but to amplify it. In abusive toxic masculinity, anger is like a sacrament, an exaltation of power. "Anger management" is exactly what they don't want . . . they savor the intoxication of rage and its violent expression. For these folks there can be no "take a minute to cool off" — they want the heat.


Yes to everything. Studies have confirmed that expressing anger only amplifies it. And yes, there is that exaltation of power: an excellent insight. The man may claim to be a victim, but anger is generally a display of power and dominance. In domestic violence, there is also the abuser’s “See what you made me do” — blaming the actual victim. 

Stradano: Inferno, Canto 8: Anger


“I’m stuck in a vast old Victorian hotel with endless rooms and hallways trying to check out, but I can’t,” he said. “I spend a lot of time in hotels, but this one is menacing because I just can’t leave it. And then there’s another part to this dream, always, where I’m trying to go home but I can’t quite remember where that is.” ~ Anthony Bourdain, his recurrent dream


In my case, it was an endless house in which I’d find myself through some absurd circumstances, and then couldn’t find an exit. One possible interpretation is getting so caught up in people-pleasing that you start telling the person what you think they want to hear, and keep wading deeper and deeper into falsehood. 


Both Bourdain's nightmare and your own are metaphors — for entrapment and frustration and the inability to find a solution.  Maybe, even, for despair. I have had an opposite dream several times, of a house familiar to me where I suddenly discover new rooms, new floors, new gardens — all beautiful and strange. There was even one dream where a small garden, once entered, had endless space inside . . . so, an infinity inside a small defined compass. That may be my favorite dream ever.


I had a recurrent dream of discovering new rooms as well. Eventually I’d find the absolutely perfect room, quiet and lovely, where I wanted to set up my work space — sometimes it had a bed in it, too, also welcome. My own special space where no one would disturb me. But my joy in the dream never lasted. As soon as I settled in, a crowd of people would pour in. I’d wake up in a rage.

I envy you your “infinite garden” dream. That sounds delightful.



    ~ “I thought I was doing the right thing, I was obeying orders, and now, of course, I see that it was unnecessary and wrong. But I don’t know what you mean by being upset….I didn’t personally murder anybody. I was just the director of the extermination program at Auschwitz.”
    ~ Rudolf Hoess, April 11, 1946 at Nuremberg.

Several years ago I began researching Nazi mens rea, the legal term for a criminal defendant’s mental state at the time a crime is committed, in order to explore what it means to obey unethical orders. How do evil people convince others to do their dirty work? What effect do hateful ideologies and propaganda have on individual agency? Can complicity in crimes against humanity be explained by obedience to hierarchies or coercion?

I discovered that any attempt to pin down the origin and nature of atrocities foundered when shifting from systemic failures onto issues of individual moral culpability.

The Nuremberg Trials disturbed observers not simply with revelations of mass atrocities, but also by the Nazis’ seeming normalcy and lack of remorse. Dr. Leon Goldensohn spent seven months studying the mental health of the Nuremberg defendants on assignment from the U.S. Army. Goldensohn regularly interviewed both defendants and witnesses, 33 in total. His notes were published in The Nuremberg Interviews: An American Psychiatrist’s Conversations with the Defendants and Witnesses.

Goldensohn, himself a Jew, treated the defendants as subjects in a study, hoping for signs of a distinctive Nazi pathology. He didn’t find one. There were common patterns of behavior and repetitive answers, but from Goldensohn’s notes it’s clear that each Nazi made their own impression on the doctor. Defendants for the most part used their time with him to rehearse their testimony.

The court drama played out over the original charge of conspiracy. Some scholars argue that the prosecution’s wish to prove that the defendants had all collaborated together in an organized conspiracy towards the Final Solution led them to exaggerate the intentionality and coherence of Nazi planning and policy. At trial, defense counsels were quick to point out the enormous confusion of authority in the Third Reich. In the hopes of having the conspiracy charges dismissed, Nazi defendants pled ignorance of the atrocities, blaming the compartmentalized system of Nazi administration.

There is evidence from Goldensohn’s interviews that even during the trial, Hermann Goering, the highest ranking Nazi defendant, was maintaining party discipline in prison. Goering’s plan for the defense was copied by the majority of defendants and involved ignoring the atrocities, or in the alternative, blaming Goebbels and Himmler, both conveniently dead. He disparaged lower-level officials’ claims when they contradicted his own, and proudly took responsibility for all but the extermination camps. No one living would account for those.

Goering was smooth:

    We Germans consider an oath of fealty more important than anything….Mind you, I said almost anything. I don’t consider the extermination of women and children as proper even if an oath were taken. I myself can hardly believe that women and children were exterminated. It must have been that criminal Goebbels, or Himmler, who influenced Hitler to do such a dastardly thing.

The disconnect between language and reality was astounding. By Nazi reasoning, Goering’s stolen art was a major disgrace, whereas killing Jews was merely distasteful. In an interview with Otto Ohlendorf, commander of Einsatzgruppe D, one of the paramilitary death squads, Goldensohn couldn’t hide his disgust.

    LG: Did your wife know of this business of the Einsatzgruppe?

    OO: No.

    LG: Have you seen her since 1941–42?

    OO: I saw her, but never talked to her about those things. I didn’t think it was good conversation for a woman.

    LG: But it’s all right to shoot women, not all right to talk to them about shootings?

    OO: In the first place, I didn’t shoot women. I merely supervised.

Hans Fritzsche, one of the few defendants to be acquitted, was the head of the Radio Division in Goebbels’s Ministry of Propaganda from 1942 onwards. His defense:

    Pure idealism on my part. I can defend everything point by point. But I won’t try to do that, because everything I did, I did before the world public. On the other side of the picture is the fact that on the basis of my work, 5 million people were murdered and untold atrocities took place. It is purely a question of judgment as to whether a connection can be established clearly between these two things.

Fritzsche felt no personal responsibility for his actions spreading anti-Semitic propaganda. His idealism, aka his Hitler-worship, was to blame. It’s not that they hated Jews, you see. They were simply devoted to the Führer. The Führer made it legal to kill Jews; if it’s legal, it’s not murder.

Fifteen years later, Eichmann still blamed idealism for everything.

The psychiatrists who examined Eichmann pronounced him normal, or as one psychiatrist said, “more normal, at any rate, than I am after having examined him.” Still, the disconnect between systemic crimes and personal culpability remained. By Eichmann’s reasoning, his fixation on the Jewish question was the result of idealism. He was once quoted as saying, “Had I been born Jewish, I’d have been the most fervent Zionist!” As Arendt explained: an idealist was not merely a man who believed in an idea.

    An idealist lived for his idea…and was prepared to sacrifice for his idea everything and, especially, everybody.

Eichmann might have personal feelings on a subject but he would never permit them to interfere with his actions if they came into conflict with his idea. This blind fanaticism allowed for some form of conscience so long as it did not obstruct the Nazi in the execution of his duties. For Eichmann that meant planning the deportation of Europe’s Jewish population.

Hannah Arendt was furious that Eichmann, like the Nuremberg defendants before him, had distanced himself from his crimes through mental gymnastics. She cut to the chase, arguing that he was guilty of crimes against humanity because the subjective element, his mens rea, was objective by virtue of complete obedience to the Führerprinzip. Eichmann not only “obeyed orders, he obeyed the law.”

She advocated rethinking criminal intent altogether in cases of crimes against humanity. Arendt seized upon Eichmann’s distortion of Kant’s categorical imperative:

“Act in such a way that the Führer, if he knew your action, would approve it.”

Eichmann had abdicated his ability to think for himself, she said. In relinquishing himself to Hitler, Eichmann became strictly liable for the crimes he committed on Hitler’s behalf.

Strict liability in such cases resolved the issue of criminal intent and made it difficult for those who benefited from the regime to then disavow it later, as “so-called inner emigrants.”

~ [Inner emigrants] were people who frequently had held positions, even high ones, in the Third Reich and who, after the end of the war, told themselves and the world at large that they had always been “inwardly opposed” to the regime. The question here is not whether or not they are telling the truth; the point is, rather, that no secret in the secret-ridden atmosphere of the Hitler regime was better kept than such “inward opposition.” As a rather well-known “inner emigrant,” who certainly believed in his own sincerity, once told me, they had to appear “outwardly” even more like Nazis than ordinary Nazis did, in order to keep their secret. ~

One such emigrant was Oswald Pohl, head of the SS Economic and Administrative Main Office, and witness at Nuremberg. In interviews, Goldensohn pushed back on Pohl’s answers:

    Had he ever objected to the whole business?

    OP: No. Nobody asked for my opinion. It would have done no good to protest anyway….I did not participate in the murder of the Jews.

    I remarked that nevertheless, he did run all the concentration camps.

    Yes, but the camps had nothing to do with it…. Some of my present wife’s best friends were Jewish. That is proof enough of how I feel.

South African psychologist Pumla Gobodo-Madikizela conducted prison interviews with the infamous Apartheid death squad leader Eugene de Kock, serving consecutive life sentences and barred from amnesty under the Truth and Reconciliation Commission. De Kock was said to have repented and shown remorse, but in her book, A Human Being Died That Night, Gobodo-Madikizela writes that he exhibited outright similarities to the Nazis, particularly in his views on racism. Just like Eichmann’s and Streicher’s claims to Zionism, de Kock insisted that his zealously nationalist father equated Afrikaner nationalism with the ANC’s struggle for freedom, that his father could not possibly have been a racist because he spoke multiple African languages, and “had he been black, he would have joined the ANC.”

When we consider history, we see that such mental gymnastics are not coincidental. If they were unique to Nazis, the Klan would not be marching and lynching postcards would not exist. When Trump said he could shoot someone on 5th Avenue and his followers would support him, he was right. To paraphrase Nixon, “It’s not murder when the President does it.” Destroying society’s moral compass promotes the politics of hate from a practical perspective.

In light of this, we must continue to study the nature of genocide and mass atrocities, not in an attempt to find definitive answers, but rather to illuminate the boundaries of what’s knowable. Expanding our collective imagination of what’s humanly possible is crucial if we’re ever going to stop embracing old horrors with new technologies.” ~

Having fun at Auschwitz


The part about idealism was especially interesting to me, statements like “If I were Jewish, I'd be a fervent Zionist.” This is probably true, and shows a self-awareness we didn’t suspect was there. The content of fanaticism is almost accidental; fanaticism itself, whether religious or nationalist or anything else, has certain universal features. It’s the fervency that counts, the total dedication, the devotion to “duty” that obliterates other considerations.

When that kind of dedication happens to be to the cause of developing, say, a polio vaccine, we certainly don’t object. Then our attitude is admiration. Then we speak highly of idealism and service to humanity.

“Act in such a way that the Führer, if he knew your action, would approve it.” Doesn’t that sound like something preached to us in childhood, but with another figure in place of the Führer?

Idealism in service to humanity is wonderful when the results are indeed positive. But when the outcome is the Inquisition or labor camps, it’s precisely the “idealism” defense that is of great interest. Religious and political idealism pushed to an extreme has consistently resulted in disaster. The point is that we need to understand how a genocidal ideology can take hold of people’s minds. Is it enough to propagandize heavily, presenting a certain group of people as vermin that need to be exterminated? Or is it enough simply to speak of a threat to the fatherland or the religious community?

Many scholars have decided that we will never really understand the “enigma” of large-scale atrocities. I don’t think it helps to speak of “pure evil” that allegedly lurks within each of us like Original Sin — nor are we really ahead (well, somewhat) thanks to the concept of the “banality of evil.” We need to take a close, hard look at idealism.

The Nazi flag in front of the Cologne Cathedral, 1937


The most important line of the article on idealism/fanaticism seems to me “Expanding our collective imagination of what’s humanly possible is crucial” to avoiding future atrocities such as genocide, to avoiding recurrent “Holocausts.” Ascribing these events to instances of evil, to the action of “monsters,” gets us nowhere. The people who carried out these huge crimes against humanity refuse to see their actions as criminal or morally reprehensible . . . not merely in the phrase “I was only carrying out orders,” but in the conviction that what they did was lawful and sanctioned by their leader, however distasteful the tasks may have been. This kind of devotion and “idealism” is actually fanaticism, the commitment to subsume one’s will absolutely to the leader of a religion, a state, a party, or a cult. A leader like Hitler, Manson, or Trump — whose followers will allow him anything, and admit no limits (moral or otherwise) to what they will do for him.

The lesson in all these instances is that there are no limits on humanly possible evil. No internal moral calculator preserves judgment on the essential “good” or “evil” of actions. The idealized, fanatically followed leader’s wishes determine the moral status of the fanatic’s act; the leader’s orders must be carried out. And I believe these orders, no matter how horrific, how cruel, how evil and unjust, can be fervently, even gladly carried out by his followers, who no longer recognize the face of evil, only that of their leader, and his reflection in themselves.


That was one of Hannah Arendt’s arguments: that people like Eichmann refused to think for themselves. The abdication of moral judgment stemmed from the refusal to think.

Thinking increases loneliness — Arendt saw that too. People tend to do whatever is easiest and least unpleasant. Following the crowd, adoring the leader — there is instant gratification in that, as opposed to dealing with difficult complexities and taking the risks that come with non-conformity. Arendt got vilified for refusing to simplify, for daring to think with her kind of depth and originality.


~ “James Baldwin submitted an essay, “Freaks and the American Ideal of Manhood,” to Walter Lowe Jr., the first African American editor of Playboy magazine. Its radical thesis—that misguided notions of masculinity were at the root of America’s moral quandary—was new for Baldwin (at least in emphasis) and a direct challenge to the magazine’s primary demographic.

Founded in 1953 by Hugh Hefner, Playboy originally targeted and appealed to white, heterosexual, middle- and upper-class male consumers, depicting a life of glamor, status, sophistication, and sexual freedom. As Elizabeth Fraterrigo notes in Playboy and the Making of the Good Life in Modern America, in the postwar era the magazine was considered the “premier arbiter of American beauty” and possessed “tremendous cultural power.”

For its critics, however, the magazine’s elevation of men as swinging bachelors and women as objects of lust made it an egregious example of sexism in the media. Gloria Steinem, feminist activist and founding editor of Ms. magazine, responded to Hugh Hefner’s claim that Playboy celebrated the beauty of the female body by countering: “There are times when a woman reading Playboy feels a little like a Jew reading a Nazi manual.” Its popularity, however, was in many ways connected to its reliance on sexual conventions and “girl next door” fantasies as well as its subversion of traditional racial, sexual, and gender constrictions.

. . . Playboy offered an odd juxtaposition of titillation, fantasy, serious journalism, and cultural commentary, illustrating many of the paradoxical possibilities and seductive illusions inherent in American popular culture. Baldwin’s “Freaks and the American Ideal of Manhood” appeared in the January 1985 issue, which featured actress Goldie Hawn on the cover. His subject matter was aimed squarely at the magazine’s readership. What did it mean to be a man? How was masculinity represented in films, television, ads, and celebrity culture? How did it converge with race and sexuality? And what were its implications for the nation as a whole?

While not immediately recognized as such, “Freaks” has become one of his most widely regarded and cited essays. Baldwin’s interest in the subversive possibilities of androgyny aligned in many ways with the rising black feminist movement and anticipated subsequent developments in queer theory and cultural studies. Surveying the landscape of the Reagan era, he recognizes the tensions between the era’s more traditional representations of masculinity (symbolized by President Reagan and many Hollywood blockbuster movies) and its queer alternatives (represented, among other ways, in the emerging New Pop Cinema). In place of America’s longstanding myths about what a man should be, he calls for a new vision of identity, not constructed by fear of the Other or violent hierarchies, but by reciprocity, complexity, border crossing, and becoming.

“Masculinity,” writes Abigail Solomon-Godeau, “however defined, is, like capitalism, always in crisis. And the real question is how both manage to restructure, refurbish, and resurrect themselves for the next historical turn.” It was certainly a relevant question at the dawn of the 1980s. By the end of the Carter presidency, the American ideal of manhood was perceived to be in trouble. Men had gone soft, the narrative went, and the nation, as a result, was weaker, more vulnerable, and uncertain. In the 1970s, explained Robert Bly, poet and leader of the mythopoetic men’s movement, we “began to see all over the country a phenomenon that we might call the ‘soft male.’ . . . [T]hey’re not interested in harming the earth or starting wars. There’s a gentle attitude toward life in their whole being and style of living. But many of these men are not happy.”

This unhappiness, Bly elaborated, had to do with no longer having role models, in the home or in popular culture, of strong, authentic masculinity. Instead, argued Bly, we saw everywhere domesticated, emasculated men. White men in particular felt anxious about their new roles in the wake of inroads by minorities, feminists, and gays. Far from being “Masters of the Universe,” a term popularized in the 1980s to describe the hyper-masculine hero (He-Man) of a children’s cartoon series, as well as the Gordon Gekko-like characters in Tom Wolfe’s novel The Bonfire of the Vanities (1987), many white men, in reality, no longer felt in control of the small orbit that was their lives. In 1979, film icon and conservative activist John Wayne died, symbolically representing the passing of a more traditional, triumphant vision of white masculinity. What America lost and desperately needed again, Bly and others declared, were real men—men who reclaimed a “deep masculinity,” a warrior mentality that had gone missing in post–civil rights culture.

For James Baldwin, seductive as this worldview might be, it was a fantasy — a fantasy America had been telling itself for decades while evading its more complex realities. “Reagan is a symptom of the American panic just as Maggie Thatcher is a symptom of the British panic,” he wrote. “They want to thrust themselves, you and me, back into the past.” Reagan’s “Morning in America” was nowhere close to the world Baldwin grew up in, nor was it the reality Baldwin witnessed in the 1980s. “There is an unadmitted icy panic coiled beneath the scaffolding of these present days,” he wrote in his 1984 preface to Notes of a Native Son. The country, Baldwin recognized, had changed — just not in the ways most Americans assumed. For all of the country’s institutional, sociopolitical, and technological advances, Baldwin contended that America’s dominant narratives remained much the same.

At the root of America’s failure to mature as a country, Baldwin argues in “Freaks,” are the mostly unacknowledged ways in which racial anxieties overlap with issues of gender, sexuality, desire, and power. “There seems to be a vast amount of confusion in the Western world concerning these matters,” he writes. Part of this confusion had to do with the tendency to reduce all concepts to simplistic either/or categories. For Baldwin, these binaries pervaded the American psyche and its resulting myths, narratives, and representations: there were “cowboys and Indians, good guys and bad guys, punks and studs, tough guys and softies, butch and faggot, black and white.” Such a rigidly bifurcated view of identity, Baldwin argues, is so “paralytically infantile that it is virtually forbidden—as an unpatriotic act—that the American boy evolve into the complexity of manhood.” How, he wondered, was it possible for a black man—indeed, any individual—to escape, resist, or reimagine these limiting types?

“Freaks” offers by far the most personal and developed analysis of sexuality and masculinity. It also offers his most compelling thesis: that in spite of received dualistic expectations about what it means to be a boy or girl, man or woman, we are all in fact both. This notion of “androgyny,” as he terms it, does not obviously mean that everyone is biologically both male and female, but that the “hermaphrodite reveals in intimidating exaggeration, the truth concerning every human being—which is why the hermaphrodite is called a freak.” The androgyne, similarly, evokes both fascination and fear in American culture—fascination because she/he seems exotic and different, and fear because he/she feels uncomfortably familiar. In embodying a liminal space “in the middle,” in ambiguity, the androgyne becomes problematic for those invested in protecting established borders of identity.

Baldwin’s essay, however, is not just about the androgynes we think we see. It doesn’t require that a man wear eyeliner or a woman have short hair. Regardless of one’s physical appearance or perceived characteristics, he argues, “there is a man in every woman and a woman in every man. . . . The last time you had a drink, whether you were alone or with another, you were having a drink with an androgynous human being.” That is, even the most masculine figures, whether or not they reveal or understand it, contain the “spiritual resources” of both genders. “I know,” says Baldwin, “that the macho men—truck drivers, cops, football players—these people are far more complex than they want to realize.” ~


What really stood out for me here is perhaps idiosyncratic: I loathed Reagan profoundly, wondering sometimes if there was something wrong with me since he was so popular with millions of Americans, while I hated him with all my being. This article made something very obvious finally obvious to me: Reagan was a second-rate John Wayne. He represented the cowboy ideal. Europe has (or, arguably, used to have) the aristocratic ideal; America had the cowboy ideal. A great deal followed — and still follows — from that.

John Guzlowski:

I think you're right. And I think gun culture comes from the John Wayne myth too. A man with a gun can solve any problem.


And one of the crazy aspects here is that Reagan was an actor, someone who heavily relied on make-up, for instance — on appearance, on make-believe male toughness and cowboy-like folksiness. Yet acting is not really a “masculine” profession. A great actor (not that Reagan was that) is certainly an artist, and an artist is almost the opposite of a cowboy (and much more successful with women; a cowboy is in love with his horse).

But we mustn't think that toxic masculinity is an exclusively American problem. That is the temptation, I know: the cowboy is so extremely American. But let's not forget that the term "machismo" is not Anglo . . .


“Now I’ve taken a closer look at my desk and realized that nothing good can be produced on it.” This text breaks off, followed by a note: “Wretched, wretched, and yet well intended. It’s midnight . . . The burning light bulb, the quiet apartment, the darkness outside, the last waking moments entitle me to write, even if it’s the most wretched stuff. And I hastily make use of this right. This is just who I am.” ~ Franz Kafka (quoted by Reiner Stach)


For me, reading Kafka's letters was exhilarating. Finally, here was someone more neurotic than I was! You can hardly imagine how good it made me feel, bursting into happy laughter — especially since I was surrounded by all the “think positive” garbage.

At the same time, a writer who feels that his or her work is worthless is not necessarily neurotic. It can in fact be simply honest: only a fraction of anyone’s work is of some lasting value. But to get to that fraction, we have to keep producing, and we can do that only by giving ourselves the permission to produce “even if it’s the most wretched stuff.” Note that Kafka ends up giving himself that permission.

~ “Most commencement addresses focus on highlights. They consider the biggest, most impressive events in a successful person's life. But such a curated selection isn't representative of what that life actually looked like. Even for the most auspicious life, the highlights—winning the championship or the big prize—comprise a small percentage of the total. Most of life is about trudging through minor and inconsequential work.
I began with the story of philosopher and mathematician Bertrand Russell. In May of 1910, Russell published a work called Principia Mathematica. He and his intellectual partner, Alfred North Whitehead, had worked on it for ten years. They constructed Principia with the goal of providing what they called "a logical foundation for mathematics." Essentially, they were not satisfactorily convinced that 1+1=2 and thought someone ought to do some digging to see if the math, so to speak, really adds up. It’s not like this was a side project, either. For three of those ten years, Russell and Whitehead worked eight to ten hours per day, eight months of the year. And for their efforts they received, upon publication of their book, a resounding negative fifty pounds. It cost them money to publish it.

The upshot is that a guy named Kurt Gödel came along and mathematically proved that not only was all of Principia Mathematica totally wrong, but any attempt to create a logical foundation for mathematics was doomed to fail on principle. This is Gödel's famous Incompleteness Theorem, and it nullified a decade's worth of Russell’s work.

Of course, that isn’t how the story ends for Russell. He went on to become one of the most famous philosophers of the 20th century, even winning the 1950 Nobel Prize in Literature, "in recognition of his varied and significant writings in which he champions humanitarian ideals and freedom of thought.” If you were going to summarize Russell's life, it'd be tempting to just talk about the highlights. But, when considered as a whole, most of it would look more like trudging through his work on Principia rather than winning the Nobel. Even the most exciting life is still one that is mostly boring.

You will find broadly the same commitment to triviality in any successful person. Take Van Gogh, for instance. There was a period early in his career, lasting a few years, in which he refused to paint. He composed only sketches with pen and pencil. He felt he had to master the basics before moving on to the good stuff. In order to become one of the world’s greatest painters, you have to do a lot of things that are not painting.

The reason I felt that this was an important message for my colleagues and me to consider was that we were about to embark on the not-painting phase of our painting careers. I thought it’d be worthwhile to survey the landscape in front of us. Our first reaction, when faced with the prospect of investing a lot of time into menial tasks, is to assume that we're falling short of working toward our larger goals. But that's not necessarily true. Those periods of low-level execution are, in fact, a crucial step in the process. Or, as I put it in my draft, “Attaining greatness, in any field, amounts to, above all else, minutiae, tedium, and monotony.” I thought someone had to say it.” ~


I didn't know this about Bertrand Russell's Principia. It seems that all the little bios mention it as an achievement. Wow, how ironic. But it can certainly happen . . . Think of the countless scientists devoting endless hours to what ultimately proved to be wrong-headed projects. But: "they also serve." Mistakes can also be building blocks. In any case, they are inevitable.


~ “In a new study, published in Cerebral Cortex on May 29, neuroscientists explain how they generated “personally relevant” spiritual experiences in a diverse group of subjects and scanned their brains while these experiences were happening. The results indicate that there is a “neurobiological home” for spirituality. When we feel a sense of connection with something greater than the self—whether transcendence involves communion with God, nature, or humanity—a certain part of the brain appears to activate.

The study suggests that there is a universal, cognitive basis for spirituality, as opposed to a cultural grounding for such states. This new discovery, researchers say, could help improve mental health treatment down the line.

Previous studies have examined the brain activity of Buddhist monks or Catholic nuns, say—people who are already spiritually inclined and familiar with the practice of cultivating transcendent states. But this research analyzed subjects from different backgrounds with varying degrees of religiosity, and totally different individual notions of what constitutes a spiritual experience.

“Although studies have linked specific brain measures to aspects of spirituality, none have sought to directly examine spiritual experiences, particularly when using a broader, modern definition of spirituality that may be independent of religiousness,” the study explains. Because there are many types of transcendent moments with varying degrees of meaning to different people, it’s been difficult to test the general effects of spirituality, as opposed to religiosity. So for this study, the researchers generated individual scripts that put each subject in their own relevant transcendent state.

With each of the 27 subjects—all healthy young adults—the researchers created a personal script based on each person’s self-reported previous spiritual experiences. The scientists then scanned the subjects’ brain activity while generating such a state in them.

During their varied transcendent states, all subjects showed similar activity patterns in the parietal cortex, which processes sensation, spatial orientation, and language, and is thought to influence attention, among other functions. In other words, whether the thing that makes a person feel connected to something greater involves church, trees, or a stadium full of sports fans, it appears to have the same effect on the brain. 

The effect on the brain is distinct from the effect of other forms of relaxation, according to researchers. “We observed in the spiritual condition, as compared with the neutral-relaxing condition, reduced activity in the left inferior parietal lobule (IPL), a result that suggests the IPL may contribute importantly to perceptual processing and self-other representations during spiritual experiences,” the study explains.

These changes in the brain may help explain why, during spiritual experiences, the barrier between the self and others can be reduced or even eliminated altogether. Although we need some separation between ourselves and everyone else for protection and to manage reality, removing the barrier every so often is also valuable.

“Spiritual experiences are robust states that may have profound impacts on people’s lives,” explains Yale psychiatry and neuroscience professor Marc Potenza, in a statement about the work. “Understanding the neural bases of spiritual experiences may help us better understand their roles in resilience and recovery from mental health and addictive disorders.”

Spiritual experiences involve “pronounced shifts in perception [that] buffer the effects of stress,” the study says. The findings suggest that those experiences can be accessed by everyone, and that transcendence isn’t dependent upon religiosity. That makes studying spiritual experiences and figuring out how to use such states for improved mental health easier for scientists. Next, the researchers hope to test a bigger group of subjects of all ages.

Beyond mental health, scientists study spirituality because the human quest for meaning is timeless and universal. By cultivating spiritual experiences in addition to strengthening our intellectual abilities, people can lead emotionally richer lives and develop more open minds, scientists say.

As Tony Jack, director of the Brain, Mind and Consciousness lab at Case Western Reserve University—who was not involved in this study—explains to WKSU, analytical thinking and spiritual, empathic thinking rely on different neural pathways and processes. They don’t happen simultaneously in the brain, but both modes are necessary, like breathing in and breathing out. “You can’t do both at the same time, but you need both to stay healthy and well,” he says. ~


I find the word “spiritual” to be so vague as to be almost useless. “Experience of awe,” on the other hand, means a lot to me. For me personally, awe is almost always about beauty. Without beauty, I would not find life worth living.

But I can also understand (I think I can) those who go into rapture during sports events. Singers in a choir — goes without saying. Creative activity at its most fulfilling — apparently I'm not the only one who can deliriously repeat my own best lines to myself! (Bob Hass was the first one to admit to that; I was tremendously relieved.)

Reading great poetry can also be a “spiritual” experience in the sense discussed here. One interesting aspect of it is that the content of the poetry can be very dark. Consider this description of hell from Milton’s Paradise Lost:

No light; but rather darkness visible
Served only to discover sights of woe,
Regions of sorrow, doleful shades, where peace
And rest can never dwell, hope never comes
That comes to all, but torture without end
Still urges, and a fiery deluge, fed
With ever-burning sulphur unconsumed.

The pleasure in the vivid language and unusual phrases (“darkness visible” is especially memorable) fills me with joy that seems paradoxical, given the subject. 

Likewise, it’s one thing to contemplate a subject such as the fall of Icarus as a moral lesson, and another to “sink into it” (yes, a groaner, I know) in art. 


~ “In 2012, Marcus Claesson and Ian Jeffery from University College Cork in Ireland and their colleagues reported that gut flora changes among some older adults, and they correlated changes in the type of bacteria with frailty and mortality. They found that institutionalized older adults have a different gut flora than community-dwelling older adults and younger people. And they related this diminished flora—caused by a restricted diet—to diminished physical capacity.

But it was only in December 2014 that Martin Blaser from New York University and Glenn Webb from Vanderbilt University, Nashville, Tennessee, tried to explain how bacteria are designed to kill older adults. They argue that modern medical problems, such as inflammation-induced early cancer, resistance to infectious diseases, and degenerative diseases, are all responses to bacterial change as we get older, and that this has an evolutionary cause.

Using mathematical models, the authors show how bacteria evolved because they contributed to the stability of early human populations: enhancing the survivability of younger adults while increasing the vulnerability of older adults. Such an evolutionary process has advantages, but in the modern world, bacteria's legacy is now a burden on human longevity. Although this mathematical model has many flaws—primarily the identification of specific bacteria responsible for specific diseases—it allows gerontologists to see aging as a balance, not an all-or-nothing event.

Bacteria are necessary in balancing the cellular activities in our human body. In one example, scientists are using the bacteria that cause botulism to eradicate tumors. Linlin Guo and her colleagues from the Buck Institute for Research on Aging in California increased lifespan in flies by altering bacteria in their intestines. It seems that bacteria are an important system in the body, one that might influence longevity. Our body is a universe of organic activity.

from another source:

~ “The connection between microbes and lifespan dates back to Elie Metchnikoff—an eccentric Russian Nobel laureate whom the microbiologist Paul de Kruif once described as a “hysterical character out of one of Dostoevsky’s novels.” He believed that intestinal microbes produced toxins that caused illness, senility, and aging, and were “the principal cause of the short duration of human life”. (His claim, though baseless, apparently started a fashion for colectomy in the early 20th century.) On the other hand, he also thought that some microbes could prolong life by producing lactic acid, which killed their harmful cousins. That was why, Metchnikoff believed, Bulgarian peasants who regularly drank sour milk would often become centenarians.

In 1908, Metchnikoff wrote about his ideas in a book called The Prolongation of Life: Optimistic Studies—an ironic title given that the man was a profound pessimist who had twice tried to kill himself. Still, he also quite literally put his money where his mouth was by regularly drinking sour milk, and created a fad that would culminate in the modern probiotics industry. Metchnikoff died at the age of 71, and his claims haven’t quite stood the test of time. But more recently, several groups of scientists have shown that animal microbiomes can indeed influence the lifespans of their hosts.

Dario Valenzano, for example, showed that the killifish—an extremely short-lived fish that’s being increasingly used in studies of aging—lives longer if old individuals consume the poop of younger ones, suggesting either that old microbiomes quicken the deaths of these fish, or that young microbiomes can prolong their lives.

For Wang, the ultimate goal is to develop genetically engineered strains of bacteria that can improve human health—a souped-up, life-extending probiotic for modern-day Metchnikoffs to quaff. But that won’t be easy. Despite a lot of research and development, existing probiotics are largely underwhelming, because it is very hard to get these bacteria to stably colonize the gut. “That’s a challenge for the entire field, and we’re collaborating with others to find different ways around it,” says Wang.

A different option would be to find microbe-made chemicals like colanic acid that could have anti-aging effects on their own. “Making people live longer and healthier is very different from treating diseases,” explains Wang. “If I talk to a patient and say I have a magic drug that can cure their disease but has side effects, I think they’d take it. But if you tell a healthy person that you have a compound that would extend their life by five years, but has side effects we don’t know about… I would be hesitant. That’s why I’m looking to the microbiome. Maybe we can find natural compounds that come from the microbes that we can use to boost our health. They’d be safe because they’re already there.”

Indeed, the team has already shown that colanic acid can also extend the life of fruit flies, and can affect the mitochondria of mammalian cells in the same way that it did those of the worms. (Colanic acid stimulates these tiny power plants to split apart, making extra copies of themselves. It also switches on a group of genes that help mitochondria deal with stressful conditions, and that have been previously linked to longer life in worms.) “I don’t want to speculate too much, but that makes us positive,” Wang says. “We’re now starting experiments with mice.” ~

ending on beauty:

I tell the forest
a story about my mother
and the way she cooked
rice and chestnuts

The forest listens
and grows hungry

~ John Guzlowski

Only in poetry can you get away with talking to the forest and making it hungry.
