TRISTAN
as if they knew. . . that the mask is real,
that everything is real, that the dream awaits.
Everything dreams, Robert wrote,
watching birds turn the color of twilight
as he sat in his rickety armchair
at his only window. The heater
leaked gas; to stay alive he kept
the window open like a quarter moon
to the sea, the dark and the rain.
The empty light bulb socket
sprouted gray braids of dust.
The aroma of souls — lavender
and sweat — do not want to die yet,
Robert wrote, and Robert is dead.
I echo beneath the voices,
innocent as the future,
Robert wrote, becoming the past,
growing as archaic as the grass.
Dew thinks only of diamonds,
Robert wrote, becoming spray.
The cliffs drip wet sea moss.
A seagull like a white cross
pauses in mid-air
over a wounded man
in an oarless boat, sending up
a melody from his harp —
making music, like all of us,
to keep from being afraid.
~ Oriana © 2016
**
LOST IN REVISION
This is the polished final version. The poem has found its form and its music. It is “well-crafted.” But I mourn for the richness when I look at what I removed. I mourn all that is “lost in revision.” It’s one way that art is gained at the expense of life.
This used to be the first stanza:
The long journeys of your small hands,
Robert wrote. We were not lovers
but one New Year’s Eve
we lay side by side on his bed and held hands;
he a hopeless alcoholic, I in love
without hope with someone else.
*
And this was near the end:
I think of Tristan,
the incredible trust he had
to lie down in an oarless boat,
waves lapping against the slender wood,
the boat rocking, a perilous cradle —
the trust almost of an unborn child
that the journey will take you
where you need to go.
I see now that the lines I left out are pretty much all the explication the poem needs.
By the way, I took his hand after we read one of Neruda’s Twenty Love Poems together. I didn’t mean that gesture as more than companionship, and he respected that. I forget how long we lay that way, in complete silence, in the city night that was a kind of perpetual half-dusk. Was there some dim lamp? If so, he must have turned it off to save on the electric bill. This was the opposite of the “American dream” — but we had poetry, we had Neruda.
But since I’ve gotten into the confessional mode with that omitted first stanza about that unforgettable New Year’s Eve, I might as well reveal how Robert died. He did go on to have a turbulent relationship with a poet who now and then found herself on the mental ward (she was diagnosed as a paranoid schizophrenic) and in halfway houses for the mentally ill. In any case, the two had a quarrel, broke up, and Robert went to a park near the ocean cliffs, sat under his favorite tree — the largest tree in the park — and began drinking. He favored vodka, the most concentrated form of alcohol. He drank until he passed out.
As I imagine this scene under the sprawling tree, I see a tiny volume of poems in his hand: Neruda’s “Twenty Love Poems.” He practically knew them by heart.
His body was found the following day. The autopsy found a lethal blood alcohol level. But at first the police couldn’t identify the body because someone had stolen Robert’s wallet. It’s not known if this happened while Robert was still alive but unconscious, or after he was already dead.
Eventually he was identified — I am not sure about the details. He lived alone, so I don’t think anyone reported him missing. Perhaps the police did find the wallet, minus the money.
The news reached me and affected me deeply, so I wrote the poem, quoting lines from Robert’s own poems. He had a gift for the lyrical line, though it was hard for him to sustain any coherent content. He used to show me new poems, then ask, “But what does it mean?”
I used to give many poetry readings back then, so I decided to read “Tristan” at my next one — held in a bookstore in the town where Robert used to live. Someone who’d known both me and Robert invited his mother.
It was still relatively soon after his death, so I expected the mother to be devastated. I worried somewhat that perhaps she’d start sobbing. She was perfectly relaxed and even cheerful. “I knew he wouldn’t live long,” she said later. “Robert was not of this world. You know, ‘my kingdom is not of this world’ . . . But at least he died peacefully, under his favorite tree.” She smiled and fanned herself with the folded pages of my poem, a signed copy of which I’d handed to her as a kind of pious memorial offering.
So there it was: a proverbial romantic poet who literally drank himself to death, and his mother — not “grieving,” not “inconsolable,” but seemingly carefree and content, even happy, clinging to these two magical phrases: “not of this world” and “he died peacefully under his favorite tree.”
It took several years and a pretty dramatic life experience of my own before I understood the mother’s apparent contentment. When someone has a lot of pathology, his death can come as a relief. I missed Robert’s beautiful lines, but they meant nothing to his mother. She had other children. I suspect that to her he was dead long before he was actually dead.
She spoke of wanting to publish Robert’s poems. She wanted the pages printed on special paper, light blue with white clouds. “He was not of this world” — she was perfectly satisfied with this formula. The publishing project, clouds or no clouds, came to nothing.
Roots and Wings — this anthology was to Robert what Rilke’s New Poems were to me — the “founding text” that showed us what poetry was and served as a poetic awakening.
*
And the girlfriend? After she received the news of Robert's death, she swallowed a quantity of pills — she always had a supply of several powerful tranquilizers. But as soon as she did, she called emergency services, was taken to a hospital, and had her stomach pumped. The last I heard about her, she’d joined a bible society and was walking door to door as a missionary. She stopped writing “because everything important was already written in the bible.”
I don’t know if his mother is still alive. If she is, I suspect that when the topic comes up (though perhaps it never does, unless a stranger, unaware, points to a figure in an old family photo), she still talks about how Robert died “peacefully, under his favorite tree.” No one mentions that he literally drank himself to death. Instead, he’s been turned into a legend of a good death, a beautiful death. And perhaps that’s better. Perhaps, for a mother, that’s the best way.
“Tales of ordinary madness” is how Charles Bukowski would likely dismiss this story. Ordinary alcoholism, ordinary mental illness. But I'm glad I had the ability to transmute it all into some lines of my own, however romanticized. Beauty is its own excuse.
And I'm wondering about Robert’s words I quote in the epigraph:
as if they knew . . . that the dream awaits.
What awaited was his death at thirty-six. Yet while Robert wanted to be out of emotional pain, and turned to alcohol for that, he was never suicidal. It’s not that he “wanted to live” — that’s not a meaningful phrase in this case. He wanted to write, to string together beautiful images. The dream was the continued outpouring of lyrical poems. And that dream didn’t “await.” He was living it.
The aroma of souls — lavender
and sweat — do not want to die yet —
He also wanted to be loved, but he was too damaged for that. I'm tempted to say that he died for lack of love — poetry couldn’t save him. I used to keep some loose pages of his handwritten poems — the lines I quote in “Tristan” come from those pages. Then I moved a few times, and the pages got lost. For me, what remains is mainly that poem of mine and the dusky memories it brings, as if the ocean breeze touched a wind harp. Not much, but not nothing.
*
My first draft of the poem includes this last stanza that I eventually omitted:
He drank himself to death;
he was no Tristan,
says the sober corrector.
He lived in a drunken yet lyrical
squalor, pleads a more merciful muse.
He was hopeless yet he sang
his brief song — like all of us,
to keep from being afraid.
WHY TRUMP MIGHT WIN
“A recent article in Politico points to two traits that best predict whether you are a Trump supporter: Authoritarianism, and fear of terrorism. Matthew MacWilliams found that Trump’s bump in the polls is connected to support from “Americans with authoritarian inclinations.”
Authoritarian personalities OBEY, follow strong leaders, and tend to respond very negatively, and aggressively, to outsiders, like immigrants, Muslims, and visible minorities. When they feel threatened, persons inclined to authoritarianism support any policy that they think will help keep them “safe”. You know, build a wall, ban Muslims, establish a database to track Muslim American citizens.
This gets into other correlates of authoritarianism, such as militarism (being “hawkish”) and nationalism (“my country is the best in the history of the world”). Again, if you support a strong military, and believe your country is the best ever, you are more likely to justify use of that military might against persons or countries whose policies or actions work against the national interests of your country. “Carpet bombing” (killing civilians in the effort to kill one’s enemies) and torture become legitimate options to those who score high on authoritarianism and zeal, and score low on critical thinking.
The trend over the last decade and a half has been authoritarians moving in droves from the Democratic Party to the Republican Party. As Democrats continue to support the rights of various groups (e.g., civil rights, gay rights, immigrant and refugee rights, equal pay for equal work, etc.), there is less emphasis on one group being naturally stronger, better, or more deserving than other groups; hence the shift of authoritarians further to the right of the political spectrum.
Here’s the danger: A lot more persons from the political middle can join the authoritarian column and end up supporting the Donald. How? As folks on the hard right continue to fear monger, and fan the flames of prejudice and suspicion, more Americans are expressing fears of imminent terrorist attacks (i.e., they feel threatened). For instance, MacWilliams reported that 52% of voters who expressed the greatest fear that another terrorist attack will happen in the US in the next year were authoritarians. That is, they are susceptible to Trump’s campaign themes and messaging.
Scary? Yes. Inevitable? No. But we must stop thinking of Trump supporters as a “small, pitiful band” of older, uneducated white malcontents, and acknowledge that Trump is riding the crest of a burgeoning wave of authoritarianism.”
https://www.psychologytoday.com/blog/intersections/201601/trump-presidency-can-happen-and-here-s-why
Oriana:
This article goes back to January of 2016 — it seems back then it was still possible for some to think of Trump’s supporters as a “small” band, rather than a widespread movement. How naive that seems now that Trump has become not just the “frontrunner,” but the Republican nominee. Now almost everyone will admit that it would take just one terrorist attack similar to the shootings in San Bernardino to elect Trump. In fact such an attack wouldn’t even have to be on US soil. The secret of Trump's appeal? It's as basic as the fear of death.
Trump, the star of his own reality TV show, becoming president? Let’s remember that some dismissed the possibility of Ronald Reagan, an actor, and a mediocre actor at that, becoming president.
And it’s not just about macho authoritarianism and racism. Trump does happen to have a powerful message that appeals to workers who feel insecure about their jobs. Supporters of free trade may disagree with that message, but there it is, with Trump as the unlikely champion of the American working class. And he constantly presents himself as someone who can’t be bought, independent of the power of lobbyists.
Look at how many times we've said it before: Trump is a joke, he can't last, he's just too revolting. He was supposed to drop out any time. Instead he became the nominee. So it was all wishful thinking after all — we refused to believe his own boast that he could go out into the street and shoot someone and still not lose any votes — that he could say anything, do anything, and get away with it. And time after time, he did get away with it, winning state after state. True, in the last few months, when people are presented with the stark choice, maybe the psychological climate will be different, and maybe even those with the lowest IQ and least education will see it's all talk and ego and bs, and that Trump can't really bring back the lost jobs or defeat ISIS or anything — but all bets are off.
The main bet here would be on the appeal to reason versus the appeal to emotions. Appeals to the jury are emotional, the stock market is “emotional,” and people vote mainly on the basis of whether or not they like the candidate — or so the research people have told us. People seem to decide early on if they like a candidate, and stick to it, defending their preference against rational arguments.
But then I also cling to this — though it may be true only for a small minority: religion is supposed to be 100% emotional, yet there are many examples, myself included, where the longing for the afterlife was as strong as anyone's, but after much tormented thinking, in one moment [sic: it was literally a moment], the rational mind took over and said, look, this is all mythology, these are all ancient fairy tales, all made up, every single god, every single religion.
And the emotional longing — along with a very real fear of hell — was completely impotent against this insight. So reason CAN prevail. Can it prevail also in this election? Can a candidate with the least emotional appeal win because her opponent — for all the wild enthusiasm he arouses in his fans — is just too evil (racism) and incompetent?
Re: the Nazi similarities — there is one saving factor, and that is, the US is not really in economic collapse the way Germany was in the Thirties. There is no out-of-control inflation etc. There is a lot of rage — but it simply may not be enough as long as there isn't the economic collapse — or, as some argue, another serious terrorist attack to fan the fear.
*
As an example of stunning courage in opposing authoritarianism, this:
https://www.youtube.com/watch?v=4y5a7VJhrZA
This Egyptian human rights activist knows that his life is in danger; all the “apostates” like him, Salman Rushdie, Ayaan Hirsi Ali and Wafa Sultan — and a surprising number of others (surprising only because of the danger) — are what the Soviet dissidents used to be. They are the real heroes of our times.
Ayaan Hirsi Ali
YOUNG VOTERS, IDEALISM, AND THE OUTSIDER POSITION
Voters between the ages of 18 and 29 are in the process of becoming adults, but don’t yet associate themselves with established centers of power. As such, they look at life through a different set of lenses than their older counterparts who have already formed roles in society—from powerful to powerless.
1. Challengers of the Status Quo
Young adults are part of the in-between generation. Unlike their older adult counterparts who have accepted aspects of democracy they may not like or agree with, younger adults have a psychological need to challenge the status quo. In fact, it’s an important role youth play in all democratic societies. Leaders who earn the youth vote are successful at communicating passionate messages of hope and change. Those leaders connect emotionally with youth in ways that help these younger voters feel seen, heard, and understood.
2. Believers in the Common Good
Voters between 18 and 29 are evolving from the self-focus of their teen years to a felt sense of the common good. They have reached a stage of development where they believe laws should be changed to meet the needs of the greatest number of people. While their voting decisions are often issue-related, those decisions are not based on narcissistic self-interest. Young voters care about student debt and single-payer health care because those issues connect emotionally with a young adult’s sense of ethics and fairness for all. Candidates who earn the youth vote in today’s democracies connect with young people on issues related to social justice and equality.
3. Speakers of Truth to Power
Cognition and emotional judgment change as young people move through stages of moral development. Young voters are generally more idealistic than older voters and have strong moral-ethical convictions tied to their civic identities. At this stage of life, they feel empowered to speak out on issues they believe in, even when those in power hold different beliefs. Political candidates who are not members of the powerful elite often connect with youth in ways that no other candidates can, solely because of their perceived outsider positions.
4. Followers of Role Models
Young adults are influenced by role models, and the qualities they associate with them are often reflected in the candidates for whom young people cast their votes. In the research study I conducted for my book, Tomorrow's Change Makers: Reclaiming the Power of Citizenship for a New Generation, five qualities of role models stood out. Candidates are viewed as role models by young people when they possess 1) a capacity to infect youth with their passion; 2) a clear set of values and an ability to live those values in the world; 3) a focus on others rather than themselves; 4) an acceptance of people different from themselves; and 5) an ability to overcome obstacles in their lives. Leaders who earn the youth vote show many or all of these characteristics that are valued by young adults.
5. Advocates for Ethics
At probably no other time in their adult development do twenty-something voters exhibit such a high awareness of ethics and ethical behavior. Research has shown that ethical problems related to political candidates are catalysts for decision-making by young adults. Leaders who are able to insulate themselves from perceived ethical failures will appeal most to youth.
https://www.psychologytoday.com/blog/the-moment-youth/201605/the-psychology-behind-how-young-people-vote
And you don’t have to be young to feel like an outsider, and not identify with the power elites. One of the most interesting experiences I've had was teaching creative writing in prisons. "Poets always identify with the prisoners," a supervisor told me. "It's the outsider status.”
Aside from that, there is youth’s hunger for idealism. It’s a rare politician who feeds that hunger. Sometimes it seems that JFK was the last president who knew how to inspire the young.
WHILE RECITING THE APOSTLES’ CREED (favorite moments of truth series)
“I'd been on patrol, and I went to church that evening. It was an Anglican church, quite high church (I always liked the ceremony) and I was standing up, reciting the Apostles' Creed (which to this day I could recite word for word) and suddenly I realized I didn't believe a word of it, and probably never had. And I never went back to church after that, except for the occasional funeral.”
~ Arthur Hailey, in Walden Book Report, July 1998
Oriana:
Hailey reports this moment of truth happened to him in Cyprus in 1944, when he was an RAF pilot. Even this proximity of death did not disable his cognitive function: when the “I don’t believe a word of it” insight was ready, there was no stopping it.
It’s different for everyone, but going over a familiar text and suddenly seeing it in a completely different light is one way. I didn’t get to see a single page of the bible until after I’d already had my epiphany: “it’s just another mythology.” I was certainly familiar with the Genesis story of creation, but seeing the actual text, phrase by archaic phrase — creation in six days, the solid firmament like a tin roof with the “waters above the firmament,” Eve from the rib, etc. — made me think, again but with deeper conviction, “This really IS mythology.” And not subtle literary mythology, but big-time archaic mythology, hopelessly tribal. Instantly I knew it could not be saved by a metaphoric reading. It could only be seen as yet another creation myth, not even as entertaining as some of them are.
One of my two favorite stories of suddenly seeing the light. It happens in so many different ways. "Apostasy is autobiography", I am tempted to say, but that's my love of alliteration speaking. Yet I am truly fascinated. Sometimes there is one distinct moment of realization, and sometimes a very gradual process of shedding the beliefs and gaining more and more clarity. And the journey isn't over after that. More reasons, more answers come like waves lapping toward the shore.
My #1 favorite is the priest who was re-reading the proofs of the existence of god before mass, just to pass the time — something he’d done many times before — but this time, suddenly, he saw that every single proof was invalid.
(Apostasy! How I love that word — that lightning flash of reason, and the courage that has to follow.)
In general, I am fascinated by those moments when the voice of reason suddenly prevails — not just the moment when one sees that religion is man-made, but, for instance, the moment when I saw that it was too late in life for depression. It’s a common belief that reason is basically impotent against emotions — that you can never count on reason prevailing all of a sudden. And yet now and then that’s precisely what happens: reason prevails, and it can indeed be “all of a sudden.”
BETHLEHEM IN JUDEA DID NOT EXIST IN JESUS’ LIFETIME
Archaeological excavations have shown that Bethlehem in Judaea likely did not exist as a functioning town between 7 and 4 B.C., when Jesus is believed to have been born. Studies of the town have turned up a great deal of Iron Age material from 1200 to 550 B.C. as well as material from the sixth century A.D., but nothing from the first century B.C. or the first century A.D. Aviram Oshri, a senior archaeologist with the Israeli Antiquities Authority, says, “There is surprisingly no archaeological evidence that ties Bethlehem in Judaea to the period in which Jesus would have been born.”
Many archaeologists and theological scholars believe Jesus was actually born in either Nazareth or Bethlehem of Galilee, a town just outside Nazareth, citing biblical references and archaeological evidence to support their conclusion. Throughout the Bible, Jesus is referred to as “Jesus of Nazareth,” not “Jesus of Bethlehem.” In fact, in John (7:41- 43) there is a passage questioning Jesus’ legitimacy because he’s from Galilee and not Judaea, as the Hebrew Scriptures say the Messiah must be.
“If the historical Jesus were truly born in Bethlehem,” Oshri adds, “it was most likely the Bethlehem of Galilee, not that in Judaea. The archaeological evidence certainly seems to favor the former, a busy center [of Jewish life] a few miles from the home of Joseph and Mary, as opposed to an unpopulated spot almost a hundred miles from home.” In this Bethlehem, Oshri and his team have uncovered the remains of a later monastery and the largest Byzantine church in Israel, which raises the question of why such a huge house of Christian worship was built in the heart of a Jewish area. The Israeli archaeologist believes that it’s because early Christians revered Bethlehem of Galilee as the birthplace of Jesus.
“There is no doubt in my mind that these are impressive and important evidence of a strong Christian community established in Bethlehem of Galilee a short time after Jesus’ death,” he says. Oshri, however, doubts that Bethlehem of Galilee will be recognized as the birthplace of Jesus any time soon. “Business interests are too important,” he says. “After all this time, the churches do not have a strong interest in changing the Nativity story.”
http://ngm.nationalgeographic.com/geopedia/Bethlehem
Oriana:
Much had to be fabricated in order to make the historical Jesus (if he did exist, he was likely one of the many itinerant apocalyptic preachers, possibly a schizophrenic cult leader) conform to the prophecies about the Messiah. Even with all these contortions (like the invention of a census that allegedly required everyone to return to their ancestral town — no census works that way), it was not a good fit.
Slaughter of the innocents, flight into Egypt, reading in a (non-existent) synagogue in Nazareth, people there wanting to throw him off a (non-existent) mountain — I needed to have those things debunked detail by detail before words like FICTION and MYTHOLOGY could have a full effect.
If I had known even a fraction of what I know now — e.g. that Mark, Matthew, Luke, and John were not the real names of the evangelists, and Mary and Joseph were not the real names of the parents of Jesus — or that Bethlehem in Judea wasn't a town in the times of Jesus — I would have liberated myself much sooner. I wouldn’t have had those moments of terror — what if I'm wrong and thus destined for the jaws of hell? But that information wasn’t really accessible when my doubt became serious.
A silver star marks the spot where Jesus was allegedly born
DOES THE BRAIN ACTUALLY STORE AND PROCESS INFORMATION?
“No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain — or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does — not even simple things such as ‘memories’.
We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.
Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organized into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
Forgive me for this introduction to computing, but I need to be clear: computers really do operate on symbolic representations of the world. They really store and retrieve. They really process. They really have physical memories. They really are guided in everything they do, without exception, by algorithms.
Humans, on the other hand, do not – never did, never will. Given this reality, why do so many scientists talk about our mental life as if we were computers?
In his book In Our Own Image (2015), the artificial intelligence expert George Zarkadakis describes six different metaphors people have employed over the past 2,000 years to try to explain human intelligence.
In the earliest one, eventually preserved in the Bible, humans were formed from clay or dirt, which an intelligent god then infused with its spirit. That spirit ‘explained’ our intelligence – grammatically, at least.
The invention of hydraulic engineering in the 3rd century BCE led to the popularity of a hydraulic model of human intelligence, the idea that the flow of different fluids in the body – the ‘humours’ – accounted for both our physical and mental functioning. The hydraulic metaphor persisted for more than 1,600 years, handicapping medical practice all the while.
By the 1500s, automata powered by springs and gears had been devised, eventually inspiring leading thinkers such as RenĂ© Descartes to assert that humans are complex machines. In the 1600s, the British philosopher Thomas Hobbes suggested that thinking arose from small mechanical motions in the brain. By the 1700s, discoveries about electricity and chemistry led to new theories of human intelligence – again, largely metaphorical in nature. In the mid-1800s, inspired by recent advances in communications, the German physicist Hermann von Helmholtz compared the brain to a telegraph.
Each metaphor reflected the most advanced thinking of the era that spawned it. Predictably, just a few years after the dawn of computer technology in the 1940s, the brain was said to operate like a computer, with the role of physical hardware played by the brain itself and our thoughts serving as software. The landmark event that launched what is now broadly called ‘cognitive science’ was the publication of Language and Communication (1951) by the psychologist George Miller. Miller proposed that the mental world could be studied rigorously using concepts from information theory, computation and linguistics.
The information processing (IP) metaphor of human intelligence now dominates human thinking, both on the street and in the sciences. There is virtually no form of discourse about intelligent human behavior that proceeds without employing this metaphor, just as no form of discourse about intelligent human behavior could proceed in certain eras and cultures without reference to a spirit or deity. The validity of the IP metaphor in today’s world is generally assumed without question.
No one really has the slightest idea how the brain changes after we have learned to sing a song or recite a poem. But neither the song nor the poem has been ‘stored’ in it. The brain has simply changed in an orderly way that now allows us to sing the song or recite the poem under certain conditions. When called on to perform, neither the song nor the poem is in any sense ‘retrieved’ from anywhere in the brain, any more than my finger movements are ‘retrieved’ when I tap my finger on my desk. We simply sing or recite — no retrieval necessary.
Because neither ‘memory banks’ nor ‘representations’ of stimuli exist in the brain, and because all that is required for us to function in the world is for the brain to change in an orderly way as a result of our experiences, there is no reason to believe that any two of us are changed the same way by the same experience. If you and I attend the same concert, the changes that occur in my brain when I listen to Beethoven’s 5th will almost certainly be completely different from the changes that occur in your brain. Those changes, whatever they are, are built on the unique neural structure that already exists, each structure having developed over a lifetime of unique experiences.
This is why, as Sir Frederic Bartlett demonstrated in his book Remembering (1932), no two people will repeat a story they have heard the same way and why, over time, their recitations of the story will diverge more and more. No ‘copy’ of the story is ever made; rather, each individual, upon hearing the story, changes to some extent — enough so that when asked about the story later (in some cases, days, months or even years after Bartlett first read them the story), they can re-experience hearing the story to some extent, although not very well.
Even if we had the ability to take a snapshot of all of the brain’s 86 billion neurons and then to simulate the state of those neurons in a computer, that vast pattern would mean nothing outside the body of the brain that produced it. This is perhaps the most egregious way in which the IP metaphor has distorted our thinking about human functioning. Whereas computers do store exact copies of data – copies that can persist unchanged for long periods of time, even if the power has been turned off – the brain maintains our intellect only as long as it remains alive. There is no on-off switch. Either the brain keeps functioning, or we disappear.
We are organisms, not computers. Get over it. Let’s get on with the business of trying to understand ourselves, but without being encumbered by unnecessary intellectual baggage. The IP metaphor has had a half-century run, producing few, if any, insights along the way. The time has come to hit the DELETE key."
https://aeon.co/essays/your-brain-does-not-process-information-and-it-is-not-a-computer
Oriana:
I agree that the computer analogy has become cumbersome — especially the idea of an exact digital transcript. But I like the idea of "emergent phenomena" — or perhaps “emergent patterns” is a better term. Perhaps the firing of neurons is more like the flight of migrating birds -- an emergent pattern. Once something starts it, it just unfolds. I like what Courtney Hilton wrote in her comment: "Yes it is obvious that Beethoven's Fifth isn't stored in the brain in any objective sense. But indeed there are emergent patterns in the brain, which gives us the capacity, for example, to play it. The important bit is that these patterns have no meaning outside of the brain, i.e. one couldn't meaningfully 'decode' these patterns into the sheet music for Beethoven's Fifth.”
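The bird-flight analogy can be made concrete with a toy simulation. The sketch below is mine, not Hilton's, and the function names and parameters are invented for illustration: each "bird" follows only a local rule (nudge your heading a little toward the current average), yet a coherent group heading emerges, even though no flight plan is "stored" in any individual.

```python
import random

def simulate_flock(n=30, steps=200, align=0.05, seed=1):
    """Toy emergent pattern: each agent only nudges its heading toward
    the flock's current average heading. No global plan exists anywhere,
    yet a shared direction emerges from the local rule."""
    rng = random.Random(seed)
    headings = [rng.uniform(-3.14, 3.14) for _ in range(n)]
    for _ in range(steps):
        mean = sum(headings) / n
        # local rule: drift a small fraction of the way toward the mean
        headings = [h + align * (mean - h) for h in headings]
    return headings

def spread(headings):
    """Variance of headings: large means disorder, near zero means alignment."""
    mean = sum(headings) / len(headings)
    return sum((h - mean) ** 2 for h in headings) / len(headings)
```

Run it and the spread of headings collapses toward zero: the "pattern" exists only in the running system, just as the essay argues that Beethoven's Fifth is not decodable from any snapshot of neurons.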
Perhaps what is stored is an “activator.” I'm not prepared to speculate about just how that would translate: certain proteins, or the ways they are folded? So many fascinating questions.
ending on beauty
A little farther
we will see the almond trees blossoming
the marble gleaming in the sun
the sea breaking into waves
a little farther,
let us rise a little higher
~ Giorgos Seferis
the “palm at the end of the mind”?
THE LUCKIEST THING
Explain that you live between two great darks, the first
With an ending, the second without one, that the luckiest
Thing is having been born, that you live in a blur
Of hours and days, months and years, and believe
It has meaning, despite the occasional fear
You are slipping away with nothing completed, nothing
To prove you existed.
~ Mark Strand, from "The Continuous Life”
How badly we need to undo the saying that the best thing is to have never been born — the so-called “wisdom of Silenus” (a minor rustic god of wine). And here at last is the antidote to the original anti-life venom: the contrary statement we need to hear again and again in order to fully love life: “the luckiest thing is having been born.” Yes. Given the chances against it, it’s a miracle to be celebrated every day.
“The luckiest thing is having been born” — it took over 20 centuries for a poet to finally admit this. True, we also have Rilke’s “Just to be here is magnificent.” And in Mario Puzo’s novel The Godfather, the dying Vito Corleone’s last words are: “Life is so beautiful.” (Oddly enough, that’s not in the movie — was the statement regarded as too controversial?)
In a similar vein:
“The day before singer Lou Reed died, according to his wife composer/performance-artist Laurie Anderson, he was floating in his swimming pool at home, and said, “You know, I am just so susceptible to beauty.”
Anderson said of it: “I think of that every day. How to open yourself to the world. And really appreciate it. Because boy, this is it. This is all we have. Right here. So you better pay attention. After Lou died, I was not expecting that at all—this feeling of being dazzled by life.” ~ Gregg Levoy
But we’ve been waiting for someone simply to contradict the original statement, so famous and so poisonous.
The great task of poetry is to praise, to undo the great load of the hatred of this world, this life, the body, being human (especially being a woman), the stages of life, each one an adventure . . .
Silenus, marble sarcophagus, 2nd century c.e.
COPERNICUS, THE FOREFATHER OF THE “DEATH OF GOD” (the end of the era of the “small universe”)
From the moment when Copernicus announced that the earth, which had been the footstool of God, was but a minimal fraction of the universe, the old tribal deity began to die. ~ Will Durant, “Fallen Leaves”
Durant shows a keen insight here. We can indeed trace the beginning of the “death of god” back to Copernicus. The old gods belong to the “small universe” era. They lived on mountain tops, or in castles in the clouds. Even the earth was imagined as much smaller than we now know it to be. The stars were just little lights high up, but not too high, lights that went on so it wouldn’t be too dark when the sun went somewhere for the night.
It was also not yet quite as absurd to imagine that all that existed was merely a stage for the drama of the human life.
Durant also comments, ahead of his time, on how religion first begins to die “in the educated classes,” while thriving in poverty and hardship:
“Historically ‘underprivileged’ nations and classes have sought consolation in supernatural beliefs, dignifying themselves by association with mystic powers, and tempering the sting of poverty with hopes of a better fortune in another world. Chronic illness, deformity, or grief may serve like poverty to generate such creeds.”
At the same time Durant, who didn’t believe in a personal god, greatly admired the ethics of Christ.
“[I hope] that the love which radiated from Christ will overcome the fearful intolerance of empowered creeds.” ~ “Fallen Leaves”
Though my atheism is rooted in literature (via mythology) more so than science, my love of paleontology and cosmology also provoked disquieting thoughts. The universe seemed to be much too large and too ancient to be only about humans. Also, it didn't seem to need god -- for instance, it could be posited that the universe always existed, going through cycles, energy into matter, etc.
For me the notion that religion — any religion — was man-made mythology was sufficient to invalidate it as any kind of absolute truth. But I realize that for the average educated person the scientific worldview is a more powerful argument. One of the purposes of religion was to explain the universe. Without the development of science, atheism would be rare.
And to think that the gigantic shift in human thinking began with the idea that the earth moved around the sun, and not the other way.
Copernicus House, Toruń (I was born in a charming ancient river-port town near Toruń)
THE QUESTION THAT SETTLES IT: IS THE WORLD GETTING BETTER OR WORSE?
Imagine that you are about to be born, but you do not know what gender, ethnicity, or socio-economic status you will have when born. Nor do you know whether you will have a physical or mental disability. With that in mind, would you rather be born in North America in 1800, 1900, or 2000?
My students invariably get the point. They’d rather be born in our current day. And the reason is clear: western society on the whole is a far kinder place–and as Rawls would say, a more just place–today than it was 100 or 200 years ago. To put it another way, attitudes which were commonly held on gender, ethnicity, socio-economic status, and disability a century ago would shock most people today.
Consider one example. I have a friend who has severe scarring on his face from burns he suffered in a fire when he was a child. Up until the 1970s several American cities had so-called “ugly laws” that would give business establishments the right to withhold service from “such people” under the noxious assumption that these folk might make other patrons uncomfortable.
Think about that. Forty years ago a waitress at Denny’s could refuse to serve a customer with scarring simply because the waitress found the customer too ugly.
The overall trajectory is clear: the world is becoming a more humane, civil place. And for that we should be thankful.
http://www.patheos.com/blogs/unfundamentalistchristians/2016/05/dear-christian-the-world-isnt-getting-worse-in-fact-its-probably-getting-better/
Oriana:
That fundamentalist Christians always assume the world is getting worse is not surprising: they rejoice in decline as a sign of the Last Days. They can barely wait for the Apocalypse. But that many liberals are stuck in the idea that everything is getting worse and worse -- that's just sheer ignorance. Not that we don't have huge problems — certainly. But just imagine being born in 1800, or even 1900, especially as a woman in a poor family — the horror, the horror.
Lake Powell, photo: Preston Roulette
THE CHARISMATIC INTROVERT
It can hardly be overemphasized that what constitutes the essence of the charismatic speaker or leader isn’t really their outgoingness at all. It’s their passion, their conviction, their sincere commitment to a belief, cause, or concern. And these enticing qualities have almost nothing to do with how temperamentally introverted or extroverted they might be.
When such individuals address others, it’s the warmth and strength of their emotion—or the power of their eloquence—that inspires and motivates the audience (whether it be 1 or 1,000). Moreover, since they tend to delve deeply into things, their studious intake of the subject they’re preparing to present may manifest vocally with quite as much intensity, even intimacy, as would be the case with an extrovert.
Although introverts are wired at birth to prefer solitude over socializing, and listening over talking, they’re not—by any reasonable definition—deficient in the art of self-expression or the power to influence others. For inherent in almost all of them is an “extroverted aptitude” altogether capable of charming and captivating those around them. As long as they have the knowledge, will, desire—and passion—to do so, introverts carry within them the same potential for charisma as do their more “out there” counterparts.
https://www.psychologytoday.com/blog/evolution-the-self/201405/the-charismatic-introvert?tr=HomeEssentialsTh
tourmaline crystal
IMAGINARY SAINTS: CATHERINE OF ALEXANDRIA
“Catherine of Alexandria was removed from the calendar of saints by the Catholic Church in 1969, along with 200 others, due to “insufficient evidence of historicity”— a phrase meaning “they were just pretend.” Catherine was at the bottom of the class, one of 46 saints on the list whose existence was even more strongly declared “seriously doubtful.” Others removed at the time were St. Christopher and St. Valentine.
Though the church itself had decided Catherine was almost certainly a myth, there was little felt need to make this known. From museums to Catholic schools and websites, her story is still dutifully recounted in elaborate historical detail.
Colleges and cathedrals named for known fictions continue to act as if nothing has changed. Millions still pray for the intercession of characters who the hierarchy knows are no more capable of hearing them than Daenerys Targaryen.
I asked several faculty colleagues [at St. Catherine College in Minnesota] if they knew about Catherine’s quiet demotion. None of the non-Catholics did, and all were properly floored by the news. Of the Catholic faculty I asked, some knew and some didn’t, but all of them shrugged. Not one had a problem with a known fiction being presented to the masses as true.
Here’s where it gets much worse.
Catherine’s story tells of a noble Christian woman steadfastly defending her beliefs, then being tortured and executed on a wheel by a pagan king for rejecting the pagan religion. But scholarship suggests that “Catherine’s” biography was most likely borrowed whole cloth from the actual philosopher, mathematician, and astronomer Hypatia of Alexandria, who, according to Socrates Scholasticus, “made such attainments in literature and science, as to far surpass all the philosophers of her own time.”
So why didn’t the early church just make Hypatia a saint? Because of a sore spot in her rĂ©sumĂ©: Hypatia was not a Christian.
So an imaginary double was created in her stead, christened Catherine, and martyred dramatically, if ironically, for the one attribute the real person did not possess: Christian faith.
The irony goes deeper still: Scholasticus and other credible contemporaries report that Hypatia was murdered by a group of Christian monks, an assassination later applauded in the Chronicle of John, Bishop of NikiĂ» for “destroy[ing] the last remains of idolatry in the city.”
This cements the demented irony of the eventual identity theft: a pagan woman murdered by Christians for her beliefs was transformed into a Christian woman murdered by pagans for her beliefs.
When the time came to teach the final section of my critical thinking course at the college, I included this question on the topic choices for group research: Did Saint Catherine of Alexandria exist?
The students were puzzled by the question. Not one had ever heard that their school might be named for a nonexistent person. But one group took the topic.
It took very little time for them to find the whole story, Hypatia and all. In the process, they learned something I hadn’t known—that “Saint Catherine” was returned to the church calendar in 2002, not as a result of new evidence, but in recognition of her “usefulness as a symbol,” an iconic figure to emulate and to admire.
In the Q&A after the presentation, the (mostly Catholic) students, to their credit, erupted in outrage.
If it were openly acknowledged that the college is named for a fictional character, one student said – if we were all gathered together behind the Wizard’s curtain – that would be different. Instead, they were asked to invoke her concretely, to literally plead for her guardianship of our college, her blessings on us all.
And what does it say about humanity, another asked, that we have to create fictional heroes? Is it even good to require perfection, virginity, and martyrdom before we can admire someone?
At the root of the discussion was a queasy feeling that either blithe incuriosity or willful patronizing was at work here, that the love of these stories had at some level mattered more than the truth. The truth certainly mattered to these students – not whether something was “culturally true,” or “that-which-is-true-but-never-happened,” or any of the other concepts that ought to go find themselves a word that isn’t already defining something else. These students wanted to know the truth, definition 1, about the name that would be on their diplomas.
Finally, someone asked: “And what about Hypatia?”
Yes. What about Hypatia? What does fiction do to the reality it supplants? What about this actual flesh-and-blood woman of actual accomplishments, cast aside in favor of a cardboard cutout? Isn’t there something especially vile about what the mythic Saint Catherine does to the real human person Hypatia?
http://www.patheos.com/blogs/secularspectrum/2016/03/when-saints-go-missing-do-they-make-a-sound/
The crowning irony: the imaginary saint, St. Catherine of Alexandria, was regarded as the patron of philosophers
KETOGENIC DIET SHOWS PROMISE IN THE TREATMENT OF NEUROLOGICAL DISORDERS
In Alzheimer’s disease, results from clinical studies have been inconclusive but promising. In one randomized double-blind study, Alzheimer’s patients on a ketogenic diet showed significant cognitive improvement compared to patients not following the diet. In cell cultures, ketone bodies have been shown to be effective against the toxic effects of beta-amyloid, a key pathological feature of the disease. The diet may also help reduce oxidative stress and enhance mitochondrial function.
Mitochondrial dysfunction is also thought to play a contributory role in Parkinson’s disease, with its characteristic movement and cognitive impairment. In one small clinical trial of five patients with Parkinson’s disease, patients on the diet reduced their scores on the Unified Parkinson’s Disease Rating Scale by 43.4%.
The diet may also prove helpful in the treatment of Amyotrophic Lateral Sclerosis, or ALS. Mitochondrial dysfunction is likely to play a role in this devastating disease of the motor neurons as well. Though human studies have not yet been performed, mouse models of the condition have yielded promising results. In these mouse models, animals given a ketogenic diet showed significant motor improvements compared to animals on a normal diet.
Researchers speculate that the diet may prove helpful in even more neurological conditions, such as recovery from stroke and brain injury. Though the diet is an accepted treatment for refractory epilepsy, in other neurological conditions more clinical trials are needed to see if the diet is truly efficacious. If borne out, the diet may open another therapeutic avenue for the treatment of these diseases.
http://brainblogger.com/2013/04/10/ketogenic-diet-for-epilepsy-and-other-neurological-disorders/
ending on beauty
In becoming a photographer I am only changing medium. The essential core of both verse and photography is poetry. ~ Minor Martin White (1908-1976)
CLOUDS
From the mountains thick-pelted
they rise; the prairies are marching,
the silver heads of bison bow.
The real heaven is passing by,
the wind on their backs,
along their flanks the sun’s burnish.
Myths surge, and animals yet uninvented,
darkening into existence —
Let me walk with them — may my eyes
be vast like that —
let me name them before the last sky
marbles in mid-flow, still far from the entrance.
~ Oriana © 2016
Minor Martin White: Barn and Clouds, 1955
Flamingo, photo by Paige Klee
I CAN BE A POET ONLY IN ENGLISH
Because the words might mean anything.
If you told me that table means
chair, I’d sink into a cushioned
table and lean back, on the deck
of The Titanic, which means
luxury before a fall —
Iceberg happens, but who could deny
that merde might mean
the highest grade of emerald?
In Polish “to cross yourself”
sounds almost like “to say goodbye.”
How could I write in a language
where you cross yourself before
you travel, step into water,
or commit suicide —
as if it’s not enough
to lose the future tense,
intended only for the young.
*
The Germans panicked when after the war
they got parcels from America marked
GIFT. In German, Gift means poison.
The Old Germanic root of English “gift”
is giftu, poison.
Did the frost-bound Anglo-Saxons
guess, like the marble Greeks
with their pharmakon,
that a little poison could be a cure?
— though Socrates may have gone
too far, toasting the gods with hemlock,
saying death is no misfortune.
*
When a friend tells me, I’ll drop
you off, it sounds — Splat! —
like a misfortune, but it is a gift. Even
when we pray, A gift is God in action,
we cannot know if it’s a gift or poison
until later, from the vanishing point.
As we step into the dangerous
waters of memory, having failed
to cross ourselves,
let’s remember the primordial
meaning of “gift” was bride-price.
That’s why we toast To Life,
that dazzling and expensive bride —
a cup of kindness or poison
to cure us of this constant vanishing.
~ Oriana © 2016
It’s the first part of the poem that explains why “I can be a poet only in English.” It’s about the gift of emotional distance conferred by a second language.
The first line used to be “because the words don’t mean anything.” A friend corrected me, pointing out that logically I can’t say that. What I meant was “the words don’t mean anything at the emotional level.” If the word for excrement changed its meaning to “great treasure,” I would have no trouble adapting. Only native speakers would.
But imagine a native speaker of English who now lives and functions in French. She knows what the word “merde” means and avoids tossing it around so as not to appear vulgar. But if instead it meant “compassion,” she’d have no trouble saying, “Merde is the supreme human value.”
I’ve often wondered what kind of poet I’d have become if I’d stayed in Poland — assuming I’d have become a poet at all (probably so, unless I’d never managed to dump Catholicism; then I’d have been too repressed to write). If I had stayed in Poland and written in Polish as a secular person, a reasonably free mind, then I suspect that yes, I would have become a poet, but an entirely different kind of poet.
I was always playing with the language — as a young teen I loved archaic Polish, for instance, and also loved inventing new words. I didn’t especially like traditional Polish poetry (we were not exposed to the great post-war minimalist poets), but somehow I managed to discover a language poet, Miron Bialoszewski. My guess is that I would have become a language poet, fascinated not by the meaning of words but by their sound and playful transmutations. I might also have tried to write “intellectual” poems, in imitation of Szymborska. Both language poetry and intellectual poetry rely on detachment, distance. My Polish poetry would probably be heavy on wit and irony.
But personal narratives? Would I be able to write personal narratives in Polish? I can't imagine it. Too intimate, in both positive and negative sense. In the negative sense, for me Polish was the language of politeness, and that involves considerable inhibition. Too much deference, too many taboos.
At the time I wrote “I Can Be a Poet Only in English,” I fully meant it: how could I ever write a love poem in Polish? Yes, it is for me the language of the heart — and I can’t imagine writing a love poem in Polish precisely because it’s the language of the heart. Poetry requires distance, so instead of being flooded with emotions, one can focus on how to arrange words and create art — which can then touch the reader’s heart. It’s a paradox. In poetry, emotion is best expressed indirectly, through images, symbols, metaphor, brief “narrow-slice” narratives. Only then is emotion mysterious and powerful.
Wordsworth said that poetry is emotion recollected in tranquillity. That tranquillity was attainable in English, because the words “didn’t mean anything.” They were arbitrary chunks of sound. In Polish the words were too conditioned to evoke emotions.
I have experimented with translating my poems into Polish, and in one case the result was much more powerful in Polish — it’s generally agreed that Polish is more emotionally expressive than English. But the poems were all born in English, since what I needed, more than emotional expressiveness, was emotional distance to speak about highly charged matters. The poems had to move away not only from the church, but, even more so, from mommy and daddy — and the Mommy and Daddy of the first culture.
And I have moved away. For me, even after decades of living in it, English remains the language of distance — and that’s gloriously liberating.
ENGLISH AS THE LANGUAGE OF EQUALITY
I’ve been listening to Malcolm Gladwell’s “Outliers.” It's better than “Blink,” even if less relentlessly documented. But it was the chapter on aviation safety that really hit home. Gladwell discusses the usual factors involved in crashes, such as bad weather and engine failure, but then he focuses on the degree to which a culture is hierarchical and deferential — a pattern reflected in speech. With special attention to Korean Airlines’ Guam crash in 1997 and the tragic black-box recording, he documents how the crew, speaking in Korean, simply did not dare correct the sleepy, dysfunctional captain. It would have been impolite.
Today the same airline has won awards for its safety. The biggest change? An American advisor was called in. The crew was forced to speak English — after being given more English lessons. Only those who became fluent in “the language of aviation” remained employed (at least according to Gladwell).
As someone who grew up in a deferential culture (at least as a “girl from a good home”) and learned to speak in a polite manner and not question authority, I understand this. When I arrived in the US, I was shocked by the bad manners, the impolite speech. At the same time, my polite expressions (I was trying to translate from Polish, prefacing requests with “Would you be so kind as to . . .”) confused people and even freaked them out. They expected short commands, skipping even “please.” This created clarity: “Open here”; “Exit.” What worked best in a crowded elevator was shouting “Four!” (I discovered this after first trying, “Would you be so kind as to press number 4?”)
But deep inside, I also felt more and more liberated. It wasn’t just that my peers were always saying fuck this and fuck that, those little blasphemies I loved. It was the egalitarianism and assertiveness of the language. Obviously an American co-pilot wouldn’t merely politely hint to the captain that he needed to pull up fast to avoid crashing.
When forced to speak English, the Korean crew had no trouble communicating with clarity, no matter how “rude” it would have sounded in Korean. And they felt comfortable being assertive.
People who speak more than one language often say that they have a different personality depending on the language. I am softer and more polite in Polish, even if I try to avoid the formal address. A friend of mine says his mother sounded harsh and bossy in English; with her Polish friends, she was soft and charming. In her case, it’s possible to argue that she wasn’t sufficiently at home in English to be able to be charming in her English-language persona. But I am very much at home in English, and yet I too sense a difference.
**
How did English become the least aristocratic language, indeed probably the closest thing we have to a language of equality? I suspect the answer lies mainly in the diverse origins of modern English. It took in so many other languages that it was forced to simplify forms of address, the singular “you” becoming standard usage for everyone. True, it may have originally been the more formal plural you, like the formal French “vous,” but because English has no separate grammar to differentiate the singular from the plural (that grammar fell from use not long after the time of Shakespeare), the “you” became direct, singular, and egalitarian.
In German, the respectful form of address is “Sie” — “they” — with verbs that follow reflecting the plural. “Are they satisfied with my essay?” you might address your teacher. I know it sounds funny in English, and practically insane. In Polish, it would be, “Is Mr. Professor satisfied with my essay?” — or simply an equivalent of “gentleman” or “lady.” Up to a certain age — meaning throughout high school — the teacher would reply using the singular “you,” thus maintaining the hierarchy. On my return trips I found the formal address ludicrous, stilted, and simply unbearable.
Here is the view of a Hindi speaker that I found online:
“My native language Hindi and the language I am learning, Japanese, have different pronouns and titles for giving respect to elders, authority and so on.
I hate it with a passion and do away with them as much as possible.
This is a large reason as to why I like English more than my native language.”
Another person on the Internet also commented how English felt a lot more egalitarian to her, and she also felt that her English-speaking teachers treated her in a more egalitarian manner.
To people who speak exclusively English it may not seem like much that you address everyone as simply “you,” and the person — regardless of social status, age, gender, or race — replies to you in the same manner. A five-year-old says “you” to the teacher and the teacher says “you” to the child. But in fact this is revolutionary.
Does an egalitarian language automatically create an egalitarian culture? The question makes us instantly aware that there are many factors that create various degrees of hierarchy in a society. So I am forced to give a qualified answer: an egalitarian language does not by itself create an egalitarian culture — but it helps create a more egalitarian culture.
I strongly suspect that a culture can be helped by its language(s), or it can be stunted by the wrong kind of language, e.g. if you have to worry about five degrees of deference, or if it’s not clear whether the action has been performed or is only intended. Gladwell gives the example of how Chinese helps children master mathematical operations — the number words are already a kind of mathematics, so less translation is needed.
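Gladwell's point about Chinese number words can be made concrete. Mandarin builds numbers on a transparent base-ten pattern: 11 is literally "ten-one," 24 is "two-ten-four," so the arithmetic is audible in the word itself, unlike English irregulars such as "eleven" or "twelve." The little function below is a purely illustrative sketch of that regularity (the name and design are mine, and it glosses the pattern in English rather than rendering actual Mandarin):

```python
def regular_number_word(n):
    """Render 0-99 the way Mandarin number words are built: a regular
    base-ten pattern ('two-ten-four' for 24) rather than English
    irregulars like 'eleven' or 'twenty-four'."""
    units = ["zero", "one", "two", "three", "four",
             "five", "six", "seven", "eight", "nine"]
    if not 0 <= n <= 99:
        raise ValueError("illustration covers 0-99 only")
    if n < 10:
        return units[n]
    tens, rest = divmod(n, 10)
    # 10-19 start with bare 'ten'; 20-99 prefix the tens digit
    word = "ten" if tens == 1 else units[tens] + "-ten"
    return word if rest == 0 else word + "-" + units[rest]
```

A child hearing "two-ten-four" is already being told that 24 = 2 × 10 + 4; a child hearing "twenty-four" has to learn that decomposition separately.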
ARE SOME LANGUAGES BETTER SUITED TO FRANKNESS, OTHERS TO LYING AND PRETENDING?
But in some other respects, it’s English that may be liberating to the Chinese:
“In another study, Chinese-English bilinguals were asked to describe themselves in each respective language and surprisingly their self-esteem and self-description differed depending on the language they were using. In English they reported higher self-esteem and described themselves in more individualistic terms, while in Chinese they perceived themselves mainly as members of groups they belonged to.”
Yet another study mentioned that Chinese women admitted to feeling pain in English, but not in Chinese. Thus English is more likely to be a language of frankness and truthfulness, of less social anxiety. And I can easily imagine an Asian woman saying in her polite native language that she has a happy marriage. But in English it’s likely to be another story.
Again, it’s hard to disentangle language from culture, but it seems to me that highly deferential languages are the languages of lying. These may be only people-pleasing white lies, but if you’re forced to say them every day, then your mind-set is that of lying rather than telling the truth. If the language tends to vagueness and suggestiveness rather than direct expressions, then it’s almost impossible to tell the truth.
Speaking a language that’s primarily a language of politeness rather than a language of equality is like having a faint smile always smeared on your face. Language too is a mask; the languages of politeness practically demand constant pretending. The subtle message is: “You are inferior. It’s forbidden to have thoughts of your own, to think your thoughts and feelings have value.” The constant collectivist message is: “Don’t be yourself.”
Social class and gender are of course heavily involved here. Yet pondering how to render "Religion is the greatest bullshit ever told" into Polish and coming up blank (except for a much weaker "nonsense"), I realized that working-class Polish men would probably not have this problem. Yet in English no one, no matter how sheltered and refined, is unfamiliar with "impolite" terms. Profanity is used freely. There is an ugliness to it, yes — but profanity is an act of liberation, defying authority and breaking taboos.
Profanity announces that there are no sacred cows. Anything can be questioned — and that means we can grasp more of the complexity of life. For instance, we no longer have to pretend that motherhood is all sunshine and rainbows. It was in English that I first heard a mother admit that at least once a day she wishes she could flush her kids down the toilet. Now imagine that the word “toilet” is taboo and a woman is forced to use silly terms like “powder room.” Euphemism has its positive uses, but being confined to it is stultifying. Languages that “call a spade a spade” — languages with a wealth of curses rather than terms of politeness — make communication of ideas and feelings easier and more truthful.
http://blog.internations.org/2011/07/different-language-different-personality/
“I AM A DIFFERENT PERSON WHEN I SPEAK ENGLISH” — DIFFERENT LANGUAGE, DIFFERENT PERSONALITY?
~ “In English, my speech is very polite, with a relaxed tone, always saying "please" and "excuse me." When I speak Greek, I start talking more rapidly, with a tone of anxiety and in a kind of rude way . . .”
~ “I find when I'm speaking Russian I feel like a much more gentle, ‘softer’ person. In English, I feel more ‘harsh,’ ‘businesslike.’”
Susan Ervin-Tripp conducted a study in which she asked Japanese-American women to complete sentences she gave them in both Japanese and English. She found that they proposed very different endings depending on the language used. Thus, for the sentence beginning, "When my wishes conflict with my family . . ." one participant's Japanese ending was, ". . . it is a time of great unhappiness," whereas the English ending was, ". . . I do what I want.”
More than forty years later, Baruch College Professor David Luna and his colleagues asked Hispanic American bilingual women students to interpret target advertisements picturing women, first in one language and, six months later, in the other. They found that in the Spanish sessions, the bilinguals perceived the women in the ads as more self-sufficient as well as more extroverted. In the English sessions, however, they expressed more traditional, other-dependent and family-oriented views of the women.
However, the author of this article suggests that it’s not the language itself so much as the cultural context that brings out different personality traits:
“Although divided on the personality issue, most respondents agreed with the fact that different contexts, domains of life and interlocutors — which in turn induce different languages — trigger different impressions, attitudes and behaviors. Thus, as bicultural bilinguals we adapt to the situation or the person we are talking to, and change our language when we need to, without actually changing our personality. One respondent put it very nicely: ‘. . . it is not a personality change but simply the expression of another part of our personality that is not shown as strongly in our other language(s).’
Future research will hopefully use both explicit and implicit tests of attitudes and self-concept as suggested by yet another respondent. This is all the more important as it could be that not everyone is equally apt at judging that they "feel different" when they change language. In a recent study, researcher Katarzyna Ożańska-Ponikwia examined why some people report feeling different while others do not. She asked some 100 bilinguals made up of people who had grown up speaking two languages, immigrants who acquired their second language later on in life, as well as students who had stayed in a foreign country for an extended period of time, to give answers to two personality questionnaires and to give scale values to statements such as, "I feel I'm someone else while speaking English", or "Friends say that I'm a different person when I speak English".
What she found was that only people who are emotionally and socially skilled are able to notice feeling different. According to her, some people do not report changes in their behavior or in their perception or expression of emotions when changing language, not because they do not exist, but because they are unable to notice them. She speculates that it is people with above-average levels of social and emotional skills who can notice that they adapt aspects of their personality and behavior when using another language.”
https://www.psychologytoday.com/blog/life-bilingual/201111/change-language-change-personality
https://www.psychologytoday.com/blog/life-bilingual/201212/change-language-change-personality-part-ii
This "medieval Yoda" was found in a 14th-century illuminated manuscript known as the Smithfield Decretals. It illustrates the story of Samson (don't ask). Oh lovable little demons! Perhaps the human imagination is not infinite after all, but keeps circling around certain patterns.
Oriana:
I don’t have a definite answer here, but I think this much can be safely said: there is a lot of learning and adaptation that goes along with a language. To me the speakers of English (whether British or American) were dramatically different, and that no doubt brought out different aspects of my personality.
Emotional distance was much easier for me in English — which may come as a surprise to my American friends, who see me as quite emotionally expressive. But words in English don’t provoke a strong emotional reaction in me (simple Pavlovian conditioning seems to be at work here), so I can say (or write) virtually anything in English — including words whose equivalents would never pass my lips in Polish (don’t forget that I was a “girl from a good home”). This emotional detachment from the words in English gives me a feeling of greater freedom to say anything — but because I am also aware that there could be consequences, I can be more controlled and calculating, depending on the context.
Here are more responses from the Internet:
~ “I find that when I'm talking in Arabic, even when I'm not in an Arabic country, I'm a lot shyer and a lot more reserved. I'm also more deferential.
On the other hand, in English I'm a lot more talkative and assertive.”
~ “I am equally bilingual in English and Malay yet I have extreme difficulty translating between the two languages. English is a very direct language, while Malay is prone to poetry, analogy, indirect or double meanings.” (Oriana: A language so ideally suited to poetry is the opposite of what you want for aviation safety.)
~ I’ve been living in Japan for 8 years now, and I really noticed a big difference when some Japanese women very fluent in French, and who have lived in France for a while, speak in French or in Japanese.
It might look or sound like a difference of personality or behavior but I think it’s just a cultural adaptation.
When these two women (who don’t know each other) speak in French, they’re more prone to irony and speak more frankly; they give their opinion in a more direct way and get onto topics they wouldn’t get onto when speaking in Japanese.
I should also add that the tone of their voices is really different when they are speaking French versus speaking Japanese (lower in French, high-pitched in Japanese), which gives even more the impression that they are two different persons depending on the language they speak.
It's less obvious with a young Japanese boy I know when he is switching from Japanese to French (well, he uses much more slang and many more swearwords in French!), or with a really extroverted Japanese girl I knew when she was switching from Japanese to French. Maybe because the way they behave naturally is already closer to the way the French behave.
~ When I speak Indonesian, I find myself acting much more humble and shy/embarrassed (malu), and much more conscious of others' relative status, since it's very important to use polite language forms depending on status. I find myself becoming much more indirect and nuanced in my expression, yet cheerful, nosy and sociable in the manner I experience while living in Indonesia (where are you going? what are you doing? etc.) I literally feel more emotional in Indonesian.
~ Going with my native Slavic languages, I'm the proud carrier of the tradition of dark humor (not too dark but just enough to scare people with ethnic jokes). I am relatively conservative in my thoughts and my language/phrases, polite and overly shy.
When it comes to my French, I turn into the light and airy being, full of light and love, and perceive the world to be the same. I bathe in the sounds of the language, the tonality and the melody of its prose.
My English makes me an outgoing and positive person, a rather entertaining conversationalist, and almost every other phrase that comes out of my mouth is just too witty not to share.
~ When I speak English, I feel more open because the country of origin always reminds me of a more straight-forward approach. When I speak Chinese or Indonesian, I become more reserved because the culture requires me to read between lines in conversation.
Oriana: Some people say that even their voice changes depending on the language. Others, though, insist there is no change in anything.
It’s obvious that culture and language can’t really be considered separately. I feel, however, that certain effects of learning a language later in life are real. If the second language reflects a more egalitarian culture (e.g. addressing everyone as “you” rather than using deferential forms and the special, indirect grammar that goes with them), the speaker is likely to feel more assertive and have higher self-esteem when using the egalitarian language. They become less timid, less passive, more frank (e.g. Chinese women will admit to feeling more pain if answering in English than in Chinese). If the second language stresses clarity rather than being vaguely poetic, the speaker is likely to feel more rational, down-to-earth and businesslike.
My favorite effect, though, is emotional detachment. When the second language is learned somewhat later in life (the critical factor is not growing up with it), the words don’t have conditioned emotional meanings. As I describe in my poem, curses might as well be endearments, so there is no big taboo against using this or that word. Consequently the speaker is less inhibited and more relaxed — anything can be discussed (think of the Japanese women joking and discussing a wide range of subjects — in French).
Of course it takes sufficient mastery of another language to reap these benefits. “With each new language, we gain a new soul,” my polyglot Aunt Henia told me. That’s because with a new language a new culture enters our psyche, enriching us. This may be particularly true of cultures that are in some ways more advanced than our culture of origin — for instance, being more egalitarian.
Always be sincere, even if you don't mean it. ~ Harry Truman
THERE IS NO “TRUE SELF” ANYWAY (so go ahead, create a more vital, happy self)
“I am in the eighth grade. I just won a prize in my Sunday school class for memorizing the most Bible verses. I am a committed Christian. The next week, I read Camus’s The Stranger. There is no God. Later, I score a couple of touchdowns in the big game. I’m a serious jock, and don’t need to waste my time thinking about metaphysics.
But by the time I reached middle age, I still hadn’t discovered my unwavering “I.” Was I a phony? Spineless? Neurotic?
Problem was, the more I tried to uncover this deep self, the more frustrated I became. I could talk all day about my memories, fantasies, dreams, and I could reach some conclusions about what I thought my real identity was. But once I left the therapist's couch, I found that my insights didn't translate into clarity and ease. When I faced the difficult issues of my everyday life, I was just as bewildered and tormented as I had always been.
Reluctantly, I changed psychotherapists. I say reluctantly because I was very drawn to my first psychotherapist's ideas, grounded in the depth psychology of Sigmund Freud and Carl Jung. I had long studied and admired these thinkers, and was enamored of the idea that rigorous introspection could reveal true identity.
My new psychotherapist practiced cognitive behavioral therapy, roughly based on the idea that a self is a collection of the habits that we choose to express. Our harmful habits cause our suffering; to ease the pain, create new habits. Making these habits is similar to fashioning a new narrative for ourselves, and acting out that narrative.
A philosophical school behind this kind of psychotherapy is pragmatism, as developed by William James at the turn of the twentieth century. James believed that there are no stable truths, but that truths “happen” (as Robert D. Richardson puts it in his biography of James) to those ideas that help us negotiate our world effectively, elegantly, aesthetically.
James also maintained that the habits we form to express these “truths” are what constitute a self. A psychotherapeutic corollary to this theory is that we won’t get happier by navel-gazing but simply by deciding to behave as a happy person might. Smile more, to put it crudely, and you will feel better.
Recent neuroscience bears out this idea that the “self” is a fabricated narrative. Michael Gazzaniga has shown how the left brain transforms the raw data of the right into meaningful stories. Daniel Dennett has demonstrated that the brain possesses no central cognitive unit but rather processes data in several regions. What gives our being a “center of gravity” is language, with which we construct a cogent “I” to which we attribute, as we would to a character in a novel, intention, agency, rationality.
The notion that our identities are novels in the making is exhilarating. It grants us freedom, especially if we are sad, to create a more vital self. And our fictions are in fact not relative. Some are “truer” than others, if by truer we mean those narratives that are most alive, that connect us to the wide world in ways that are surprising, diverse, complex, ironic.
Though the work [of creating lively habits] is arduous, often sorrowful and fraught with failure, it is the artist’s labor, ecstatic, the struggle to transform painful, chaotic experience into orders exuberant and astonishing.”
https://www.psychologytoday.com/blog/morbid-curiosities/201505/fake-your-way-happiness
Sphinx, Neo-Assyrian, 9th-8th century bc
Oriana:
The pragmatic philosophy of William James makes perfect sense. It was revolutionary for his time, and still is: Smile, and you will feel happy. To this I’d add that beta-blockers work better than any metaphysics: a lot of misery is due to the nasty effects of adrenaline.
As for forming new habits: Think Small. “Think Big” is the most disastrous slogan out there. It practically guarantees failure. The principles of Think Small: baby steps, mini-projects, and micro-ambitions. These reinforce focus on the work, not the outcome, and thus practically guarantee success — which builds on itself. “We manage best when we manage small.”
How to be happy? Smiling helps. Fake being happy? Why not — I'm not against it. But I say, do something small — even quite tiny — and do it very well. Do it with excellence, with artistry.
Also, “Remember you are loved” — not necessarily in the sense of romantic love. For me it's often enough to remember that no matter what happens, I have myself — not that I have to define that self. It’s enough that my “self” is happening, interacting with the world — and I'm never bored with that.
*
ON THE IMPOSSIBILITY OF KNOWING ONE “TRUE SELF” AND ONE’S “TRUE” LIFE STORY
One reason I've always hated to talk about my life has been the keen realization that everything I say is false — not a deliberate lie, but an unavoidably partial and false version. Of course that happened too in poems, especially childhood poems: a painful sensation of enormous but inescapable lying, only partly redeemed by artistic merit.
But I’ve grown easier on myself, knowing that absolute truth is neither knowable nor desirable, and art has to be selective and simplify. Rather than an accurate life story — aside from the important realization that I am not to blame for all the bad things that happened; circumstances played a huge part — it’s more important to have a life philosophy that serves the present, making it worth living. Besides, I can always treasure-hunt and polish the good things I produced in the past, those “inaccurate” poems and prose memories that I enjoy sharing with others.
ALFRED LOISY AND THE WRONG COMING: INSTEAD OF THE SECOND COMING, WE GOT THE CHURCH
“Jesus announced the coming of the Kingdom, but it was the Church that arrived.” ~ French theologian Alfred Loisy, 1902. He got excommunicated for this and other insights.
Jesus is never coming back. Never, never, never, never.
Not on the clouds of glory, nor even in a metaphorical way. Simply: never.
This doesn’t sadden me, but the thought that instead of the Kingdom we got the Church is depressing. It is said that Loisy made this remark with a note of regret. A false prophecy is commonplace; most predictions turn out to be wrong. But this was a colossal “wrong coming.” Loisy at least wasn’t burned at the stake, though only because he was born late enough in history.
Another controversial position taken by Loisy was his distinction between the pre-Moses period dominated by the religion of El (or Elohim), and the post-Moses period when Yahweh gradually took over. El was the chief god of the Canaanite pantheon.
The best insights are often those that are put in the simplest words: “Jesus announced the coming of the Kingdom, but it was the Church that arrived.” Instead of divine perfection, a flawed institution that almost instantly became oppressive and corrupt. “The Empire never ended.”
Furthermore, the entire message got completely derailed. The point was never dying and the afterlife in some disembodied state — that was never the Hebrew belief (the biblical belief was that life began with the first breath, and ended with the last breath — to exist, you had to be breathing — hence the clumsy notion of resurrection in the body).
No, the afterlife was never the point of the original Christian message. The point was supposed to be the future kingdom of god right here on earth. Deluded, yes, but at least we should take an honest look at what the message really was.
CAN LIPOSUCTION SLOW DOWN AGING?
Obesity, especially excessive belly fat, increases the risk of numerous diseases, including diabetes, atherosclerosis and cancer. In addition, the cytokines associated with the widespread inflammation related to excessive belly fat directly impair cognitive function. Worse, the effect of excessive belly fat ultimately increases the risk of developing Alzheimer’s disease.
What would happen if these harmful fat cells were simply removed? Exercise can shrink fat cells, but only liposuction can remove them from the body. A group of scientists at the Medical College of Georgia investigated this novel question by conducting three very clever experiments on obese and normal-weight mice (Journal of Neuroscience, 2014). First, a group of obese mice were forced to exercise on a treadmill. Unlike the millions of Americans who own treadmills, these mice had no choice but to run. As expected, the daily treadmill exercising reduced belly fat, reduced the level of inflammation in their bodies, and significantly restructured their brains’ function at the cellular level, leading to greatly improved memory.
In a parallel study, the scientists surgically removed fat pads from a similar group of obese mice, i.e. they underwent a standard liposuction procedure. *The results were identical to those produced by running on the treadmill: inflammation was reduced and the mice became significantly smarter.* These findings confirm many recent studies that have documented the ability of fat cells to impair brain function and accelerate aging.
Then the scientists did something truly astonishing; they transplanted fat pads into normal, healthy-weight mice. The impact of the fat cells was immediately obvious: the mice showed increased signs of brain and body inflammation, and they developed deleterious changes in brain structure and function that led to reduced memory performance, i.e. the mice became stupid.
The evidence of a connection between increasing levels of body fat and a decline in virtually every aspect of normal brain function is now quite overwhelming. The scientific literature is vast, including over 10,000 published articles that document the precise mechanisms underlying this connection. The relationship has been documented in humans, monkeys, rats and every organism that has been studied.
https://www.psychologytoday.com/blog/your-brain-food/201403/liposuction-can-make-you-smarter
Oriana:
Of course the gold-standard study would be to randomly assign people to liposuction and control groups, and track their aging. This simply can’t be done, so currently only the affluent undergo the removal of excess fat cells (by the way, surgery is not the only method; non-surgical techniques have recently been developed). And the affluent also live longer, but probably due to multiple factors, lower rates of obesity being only one of them.
ending on beauty:
What is of genuine importance is eternal vitality, not eternal life. ~ Nietzsche
This instantly reminded me of Blake’s “Energy is eternal delight.” And somehow that energy finds a venue for itself in ideas and new areas of growth. It goes both ways: when a goal seizes the imagination, the energy will be found; and when energy is abundant, a goal will be found. Like a mountain river, the eternal vitality rushes on.
The everlasting universe of things
Flows through the mind, and rolls its rapid waves,
Now dark—now glittering—now reflecting gloom—
Now lending splendor . . .
Where waterfalls around it leap for ever,
Where woods and winds contend, and a vast river
Over its rocks ceaselessly bursts and raves.
~ Shelley, Mont Blanc