*
THE DAY I LEARNED MORTALITY
The surgeon said in a calm, controlled
voice, “You should be able to lead
a normal life — ” he paused —
“for the rest of your life.”
I walked out of the arctic hospital.
I kept walking to the parking lot.
It was the fracture of that pause:
the silence rolled, uncontrolled —
I drove on the streets, the freeway.
Sunlight in streaks and spills
played tag along the tattered
eucalyptus groves. Wildfires
of bougainvilleas flickered,
flirting with the wind.
A fluent paradise on fault lines.
A death sentence, but normal.
The palm-tree in front of my apartment
stood quiet, not clapping
its fronds, but waiting.
Not a twig fidgeted, not a cloud.
I kept walking. I kept climbing
the echoing stairs.
But everything around me
had stopped.
Everything was staring,
waiting,
my shadow splayed in two
against the stucco wall.
~ Oriana
Yes, that was the first encounter. But I was only 28. My parents were alive and shockingly healthy. Only one friend, slightly younger than I, was dead of colon cancer. I was still far from a more complete understanding.
A shock, to be sure, but now it seems relatively minor. I wasn’t even thirty! Talk about the infancy of life.
Now it’s personal. The word “hospice,” for instance, is hair-raising. I realize that most people at that stage are “out of it” — but not always.
My recent medical apocalypse has been a much more devastating encounter than the one I describe in the poem. How terrible life is! The last act is sheer cruelty. Now I understand why so many have no interest in matters like evidence or the nature of reality, and embrace religious promises instead. Otherwise, we get this:
MEDICAL HISTORY
I’ve been pregnant. I’ve had sex with a man
who’s had sex with men. I can’t sleep.
My mother has, my mother’s mother had,
asthma. My father had a stroke.
My father’s mother has high blood pressure.
Both grandfathers died from diabetes.
I drink. I don’t smoke. Xanax for flying.
Propranolol for anxiety. My eyes are bad.
I’m spooked by wind. Cousin Lilly died
from an aneurysm. Aunt Hilda, a heart attack.
Uncle Ken, wise as he was, was hit
by a car as if to disprove whatever theory
toward which I write. And, I understand,
the stars in the sky are already dead.
~ Nicole Sealey
It’s unbearable.
The greatest consolation is the thought of being able to contribute even a tiny bit to the life of others — to share our psychic riches while we can.
And to say “I love you” often. The truth of emotions is complex, and irrelevant. People need to hear that they are loved, and we need to say it. It’s perhaps our foremost moral duty.
Zebras are like horses. I wonder if standing like this is horses’ equivalent of cuddling (I realize that it sounds like a strange word choice, but what else?)
*
HOW TO RECOVER FROM STRESS MORE QUICKLY
1. PREVENT RUMINATION
Replaying the memory of a stressful experience after it is over can activate pathways in the brain similar to those activated by the actual experience. This can keep the stress reaction “switched on” even after the stressor is gone, and can cause the experience to be perceived as more distressing than it actually was. Preventing people from ruminating lowers their blood pressure faster after acute stress. Chronic stress has been linked to hypertension, and in a small randomized trial, US researchers including Lynn Clemow at Columbia University Medical Center used stress management training (based on a cognitive-behavioral group workshop) to effectively lower systolic blood pressure in hypertensive patients. The decline in pressure correlated with a decline in depressive rumination.
2. MINDFULNESS MEDITATION MAY NOT BE FOR YOU, BUT GIVE YOGA A TRY
The perceptual element of stress may be the reason some mind-body interventions such as yoga, breathing techniques and focused-attention meditation can benefit stress management through effects on improving emotional regulation, reducing stress reactivity and speeding up recovery after stress. It may also explain why some techniques such as mindfulness meditation have shown mixed results in controlled studies. It is possible the technique of mindfulness meditation can invite rumination and repetitive negative thoughts in some individuals but not in others.
3. GET INTO NATURE
An apparently “soft” factor like exposure to nature can hasten recovery following stress and lower markers of stress.
If you walk outside in green spaces, or even look at pictures of nature scenes, you may be able to increase your resilience to stress. A recent study by Stanford researchers showed that walking in green campus parkland reduced anxiety and worry more than walking on a busy street and had cognitive benefits as well. In another study, students were stressed by having to take a math test and getting feedback (even if not accurate) that they were performing below average. After the stressor, researchers assigned participants to one of two groups that either saw pictures of empty pathways and trees or pictures of urban scenes with cars and people. Those who saw the pictures of trees had faster cardiovascular recovery from stress (e.g., heart rate slowed down faster).
4. AVOID BRIGHT OR BLUE LIGHT IN THE EVENING
Avoid bright or blue light exposure late in the evening from the use of LED screens — bright or blue light at night can delay the release of melatonin, a hormone that has been shown to reduce anxiety.
5. ENGAGE IN LIGHT EXERCISE (e.g. walking)
Low-intensity exercise reduces circulating levels of cortisol.
6. SMILE
A recent study by Tara Kraft and Sarah Pressman at the University of Kansas showed that smiling—even fake smiling—can help your body resist stress. In this clever study, the researchers used chopsticks to arrange subjects’ mouths into either (fake) smiles or neutral expressions. Half the subjects in the smile group did not know they were smiling. The other half were told to smile and therefore had genuine smiles (which involve moving both eye muscles and mouth muscles). Both smiling groups had lower heart rates than the neutral group after performing a stressful task. The group with genuine smiles had the lowest heart rate overall; the fake-smile group had less of a drop in positive mood during the stressor. The researchers suggest that moving your facial muscles sends a message to your brain that can influence your mood.
7. STAND UPRIGHT
It turns out that standing in an upright pose actually helps you perform better under stress, as compared to slouching. In another clever recent study, published in the journal Health Psychology, researchers assigned people to either stand upright or slouch. The researchers held the subjects in position with physiotherapy tape (after giving them a cover story). Both groups then had to do a stressful speech task. The upright group performed better and had less fear and more positive mood, compared to the slouchers. They were also less self-conscious. So the next time you’re under stress, remember to stand tall.
8. TRY TO SEE YOUR STRESS AS A CHALLENGE
A study by Harvard and Yale researchers shows that your attitude toward stress matters and that people can learn more positive attitudes. The researchers showed one of two brief video clips to managers at a large, multinational banking firm, then measured their mood and work performance in subsequent weeks. These managers had high-pressure jobs with quotas they had to meet. One group saw a clip showing the negative effects of stress while the other group saw a clip about seeing stress as a positive challenge. The group that saw the clip about the positive aspects of stress actually felt less stressed—they engaged more at work and were happier and healthier. They also reported a 23% decrease in stress-related physical symptoms (like backache) compared to the group whose members saw the negative video. So try to see your stressors as challenges that you can learn from (even if it’s just learning to tolerate stress).
Adapted from
http://www.bbc.com/future/story/20190813-burnout-anxiety-stress-proof-relief?utm_source=pocket-newtab and from https://www.psychologytoday.com/us/blog/the-mindful-self-express/201603/6-proven-ways-recover-stress
Statue of the Buddha, Toledo Museum of Art
*
“Digressions, incontestably, are the sunshine; —and they are the life, the soul of reading; — take them out of this book for instance, — you might as well take the book along with them.” ~ Laurence Sterne, The Life and Opinions of Tristram Shandy, Gentleman.
Both the author and his eponymous hero were dying of consumption. So long as he continued writing, he would go on living. ~ M. Iossel
John Guzlowski:
ENGLISH KEEPS GAINING DOMINANCE AS THE UNIVERSAL LANGUAGE
~ “De Swaan divides languages into four categories. Lowest on the pyramid are the “peripheral languages”, which make up 98% of all languages, but are spoken by less than 10% of mankind. These are largely oral, and rarely have any kind of official status. Next are the “central languages”, though a more apt term might be “national languages”. These are written, are taught in schools, and each has a territory to call its own: Lithuania for Lithuanian, North and South Korea for Korean, Paraguay for Guarani, and so on.
Following these are the 12 “supercentral languages”: Arabic, Chinese, English, French, German, Hindi, Japanese, Malay, Portuguese, Russian, Spanish and Swahili – each of which (except for Swahili) boasts 100 million speakers or more. These are languages you can travel with. They connect people across nations. They are commonly spoken as second languages, often (but not exclusively) as a result of their parent nation’s colonial past.
Then, finally, we come to the top of the pyramid, to the languages that connect the supercentral ones. There is only one: English, which De Swaan calls “the hypercentral language that holds the entire world language system together”. The Japanese novelist Minae Mizumura similarly describes English as a “universal language”. For Mizumura, what makes it universal is not that it has many native speakers – Mandarin and Spanish have more – but that it is “used by the greatest number of non-native speakers in the world”. She compares it to a currency used by more and more people until its utility hits a critical mass and it becomes a world currency. The literary critic Jonathan Arac is even more blunt, noting, in a critique of what he calls “Anglo-Globalism”, that “English in culture, like the dollar in economics, serves as the medium through which knowledge may be translated from the local to the global.”
In the last few decades, as globalization has accelerated and the US has remained the world’s most powerful country, the advance of English has taken on a new momentum. In 2008, Rwanda switched its education system from French to English, having already made English an official language 14 years earlier. Officially, this was part of the government’s effort to make Rwanda the tech hub of Africa. Unofficially, it’s widely believed to be an expression of disgust at France’s role in propping up the pre-1994 Hutu-dominant government, as well as a reflection of the fact that the country’s ruling elite mostly speaks English, having grown up as exiles in anglophone east Africa. When South Sudan became independent in 2011, it made English its official language despite having very few resources or qualified personnel with which to teach it in schools. The minister of higher education at the time justified the move as being aimed at making the country “different and modern”, while the news director of South Sudan Radio added that with English, South Sudan could “become one nation” and “communicate with the rest of the world” – understandable goals in a country home to more than 50 local languages.
The situation in east Asia is no less dramatic. China currently has more speakers of English as a second language than any other country. Some prominent English teachers have become celebrities, conducting mass lessons in stadiums seating thousands. In South Korea, meanwhile, according to the sociolinguist Joseph Sung-Yul Park, English is a “national religion”. Korean employers expect proficiency in English, even in positions where it offers no obvious advantage.
*
Aneta Pavlenko, an applied linguist at Temple University in Pennsylvania, who has spent her career studying the psychology of bilingual and multilingual speakers, has found that speakers of multiple languages frequently believe that each language conveys a “different self”. Languages, according to her respondents, come in a kaleidoscopic range of emotional tones. “I would inevitably talk to babies and animals in Welsh,” reports a Welsh-speaker. An informant from Finland counters: “Finnish emotions are rarely stated explicitly. Therefore it is easier to tell my children that I love them in English.” Several Japanese speakers say that it’s easier to express anger in English, especially by swearing.
Here is the memoirist Eva Hoffman on the experience of learning English in Vancouver while simultaneously feeling cut off from the Polish she had grown up speaking as a teenager in Kraków: “This radical disjointing between word and thing is a desiccating alchemy, draining the world not only of significance but of its colors, striations, nuances – its very existence. It is the loss of a living connection.” The Chinese writer Xiaolu Guo described something similar in her recent memoir, writing about how uncomfortable she felt, at first, with the way the English language encouraged speakers to use the first-person singular, rather than plural. “After all, how could someone who had grown up in a collective society get used to using the first-person singular all the time? … But here, in this foreign country, I had to build a world as a first-person singular – urgently.”
In the 1970s, Anna Wierzbicka, a linguist who found herself marooned in Australia after a long career in Polish academia, stood the Sapir-Whorf hypothesis on its head. Instead of trying to describe the worldviews of distant hunter-gatherers, she turned her sociolinguistic lens on the surrounding anglophones. For Wierzbicka, English shapes its speakers as powerfully as any other language. It’s just that in an anglophone world, that invisible baggage is harder to discern. In a series of books culminating in 2013’s evocatively named Imprisoned in English, she has attempted to analyse various assumptions – social, spatial, emotional and otherwise – latent in English spoken by the middle and upper classes in the US and UK.
Reading Wierzbicka’s work is like peeking through a magic mirror that inverts the old “how natives think” school of anthropology and turns it back on ourselves. Her English-speakers are a pragmatic people, cautious in their pronouncements and prone to downplaying their emotions. They endlessly qualify their remarks according to their stance towards what is being said. Hence their endless use of expressions such as “I think”, “I believe”, “I suppose”, “I understand”, “I suspect”. They prefer facts over theories, savor “control” and “space”, and cherish autonomy over intimacy. Their moral lives are governed by a tightly interwoven knot of culture-specific concepts called “right” and “wrong”, which they mysteriously believe to be universal.
Because English is increasingly the currency of the universal, it is difficult to express any opposition to its hegemony that doesn’t appear to be tainted by either nationalism or snobbery. When Minae Mizumura published The Fall of Language in the Age of English, in 2008, it was a surprise commercial success in Japan. But it provoked a storm of criticism, as Mizumura was accused of elitism, nationalism and being a “hopeless reactionary”. One representative online comment read: “Who does she think she is, a privileged bilingual preaching to the rest of us Japanese!” (Perhaps unsurprisingly, Mizumura’s broader argument, about the gradual erosion of Japanese literature – and especially, the legacy of the Japanese modernist novel – got lost in the scuffle.)
In California, where I live, most of the languages that were spoken before the arrival of Europeans are already extinct. On America’s eastern seaboard, thanks to long proximity to Anglo settlers, the situation is even worse. Most of what we know about many of these vanished languages comes in the form of brief word lists compiled by European settlers and traders before the 19th century. Stadaconan (or Laurentian) survives only from a glossary of 220 words jotted down by Jacques Cartier when he sailed up the St Lawrence River in Canada in 1535. Eastern Atakapa, from Louisiana’s Gulf Coast, is known from a list of only 287, gathered in 1802. The last fragments of Nansemond, once spoken in eastern Virginia, were collected from the last living speaker just before his death in 1902, by which time he could only recall six words: one, two, three, four, five and dog.
In this past century, the Earth has been steadily losing diversity at every level of biology and culture. Few deny this is a bad thing. Too often though, we forget that these crises of diversity depend, to a great extent, on our own decisions. Much of what has been done can also be undone, provided there is the will for it. Hebrew is the most famous case of a language brought back from the dead, but linguistic revitalization has been proven to be possible elsewhere as well. Czech became a viable national language thanks to the work of literary activists in the 19th century. On a much smaller scale, endangered languages such as Manx in the Isle of Man and Wampanoag in the US have been successfully pulled back from the brink.
Before the era of the nation-state, polyglot empires were the rule, rather than the exception. Polyglot individuals abounded, too. For most of history, people lived in small communities. But that did not mean that they were isolated from one another. Multilingualism must have been common. Today, we see traces of this polyglot past in linguistic hotspots such as the Mandara mountains of Cameroon, where children as young as 10 routinely juggle four or five languages in daily life, and learn several others in school.
A resident of another linguistic hotspot, the Sepik region of Papua New Guinea, once told Evans: “It wouldn’t be any good if we talked the same; we like to know where people come from.” It’s a vision of Babel in reverse. Instead of representing a fall from human perfection, as in the biblical story, having many languages is a gift. It’s something to remember before we let English swallow the globe.” ~
https://getpocket.com/explore/item/behemoth-bully-thief-how-the-english-language-is-taking-over-the-planet?utm_source=pocket-newtab
Oriana:
“Diversity” has multiple positive connotations, and Lorca in Spanish sounds like a gift beyond any translation.
At the same time, we have a pragmatic question: how is humanity to communicate? The scientific community needs a common language — we take that absolutely for granted. But what about the world of commerce? Of international aviation, and travel in general? And, as I’ve noted in another essay, the gift of English isn’t simply communication — English is a language of equality, and it offers a gain in time, both because none is wasted on formal shades of reverence and because of the gift of greater clarity. What may seem rude at first turns out to be, above all, useful.
By the way, there have always been international languages — until English took over, French was the language of the educated Europeans — and not exclusively Europeans either. Will English be replaced by yet another language? Or perhaps by a simplified version of English, without the current convolutions (note, for instance, that New-World Spanish has done away with the “th” sound ubiquitous in European Spanish).
All we can be sure of is that it won’t be Hopi or Nahuatl.
A view of Lisbon; C. Fishman
*
THE POWER OF MOVIES TO CHANGE HISTORY
~ “Research has shown that people learn very effectively from stories and narratives, which engage our brain in ways that are both pleasurable and incredibly complex, so movies (and not just documentaries) are often ways for people to learn about the past. Our imagination is ready for action, and movies can provide a tantalizing twist, often portraying World Wars, the Depression, slavery, the Holocaust, or space exploration. Actors can become incorporated into people’s imagery of the past, such as Jim Caviezel as Jesus in Mel Gibson's The Passion of the Christ (2004), or Daniel Day-Lewis as Abraham Lincoln in Lincoln (2012).
Quentin Tarantino’s Once Upon a Time in Hollywood provides an engaging story and background for the 1969 events that culminated in the Manson clan’s murder of Sharon Tate. Living up to its storybook title (“Once Upon A Time...”) (spoiler alert), the movie provides a much different ending, as Sharon Tate never meets her demise in this tale. Most people over the age of 60 know about the Manson-family murders and Sharon Tate.
The movie provides a much less horrific ending for Tate and an alternative tale—complete with Tarantino-style violence (it involves a flame thrower)—using fantasy instead of historical facts. The Manson clan has the tables turned on them. However, in a deviation from the truth of what happened 50 years ago, it becomes possible that people (especially young adults) will now know a different version of reality—they may not question the movie’s twist on truth, and may end up believing some of the fictitious events in the movie.
Research has shown that presenting people with misinformation—information or an event that is inconsistent with the truth of what happened earlier but is highly believable—can lead not only to some initial confusion; it can then alter memory. As a result of introducing misinformation in psychology experiments on this exact topic, people will claim to have been lost in a mall as a child after being told this story had happened to them, or that as a child they met Bugs Bunny at Disneyland (to refresh your memory, Bugs is a Warner Bros. character and thus couldn’t be seen at Disneyland).
People are prone to believe stories and whatever makes sense, often without questioning the events being suggested. Movies might provide just the right amount of entertaining and (sometimes subtle) misinformation that can lead to memories and history being altered in the process. Presenting tales and alternative endings in the context of a real event can make people think about what could have happened if only a few things were different—but these variations on the truth can also lead to implanted memories in people who have only a vague understanding of the past.
Quentin Tarantino is not intentionally trying to dupe people into thinking things were different 50 years ago; instead, he is allowing us to imagine how things could have been different if a few small or seemingly random events had happened, or if certain characters had made different choices. He took creative license to shed a brighter light (flame-thrower style) on a dark event. Movies can allow the mind to imagine, and it is then up to us to differentiate what we imagine from what actually happened in the past; but sleeper effects can make us reimagine the past in ways that have profound effects on our later memory, which can be modified each time we revisit events from the past.
Ideally, movies that provide variations on the past will make people research what actually happened, so as to gain a more complete understanding of the events; but they can also lead to some subtle changes in history from the younger viewers’ point of view.” ~
https://www.psychologytoday.com/us/blog/metacognition-and-the-mind/201908/can-hollywood-alter-history-how-film-modifies-memory
And this: “A lot of people are going to focus on the end of “Once Upon a Time ... in Hollywood.” The minute that we see that the film has jumped forward to August of 1969 and that Sharon Tate is very pregnant, anyone with even a passing knowledge of history knows what’s coming. Or at least they think they do. The final few scenes will be among the most divisive of the year, and I’m still rolling around their effectiveness in my own critical brain. Without spoiling anything, I’m haunted by the final image, taken from high above its characters, almost as if Tarantino himself is the puppet master saying goodbye to his creations, all co-existing in a vision of blurred reality and fiction. However, the violence that precedes it threatens to pull the entire film apart (and will for some people). Although that may be the point—the destruction of the Tinseltown dream that casts this blend of fictional and real characters back into Hollywood lore.”
https://www.rogerebert.com/reviews/once-upon-a-time-in--hollywood-2019
I wonder what Hannah Arendt would say about movies that seriously distort history — in this case for the sake of entertainment rather than devious propaganda. But it’s a fine line, especially when history is not pretty — if it were, it would probably never make it into the category of “history.”
All memory is pretty much false memory — but there is a question of degree. Movies are extremely "persuasive." You can get to hate or adore a certain group of people because of a movie. I noticed some of that after "Crazy Rich Asians" vs "The Farewell." Actually that's not the best contrast — it's not about the Chinese culture per se, though the theme of family is strong in both. Rather, "Crazy Rich" made me hate the rich a lot more effectively than all the anti-capitalist propaganda we were presented in school. Not the kind of hatred that would have real-world consequences, at least not in my case. But one can imagine different viewers . . .
BUT WHAT ABOUT BIOGRAPHICAL NOVELS?
~ “We live in an age when biographical novels have become hugely popular, some of them rising to a high level of artistry, as in Colm Tóibín’s The Master (Henry James), Michael Cunningham’s The Hours (Virginia Woolf), or Joyce Carol Oates’ Blonde (Marilyn Monroe). It’s not that fine biographical novels haven’t always been around (see Lotte in Weimar, Thomas Mann’s exhilarating 1939 novel about Goethe, or Marguerite Yourcenar’s Memoirs of Hadrian, a magisterial book published in 1951). But similar works—really good ones—have been coming at us thick and fast in the last few decades.
Traditional literary novels are in decline. The figures bear this out, as in the most recent NEA study of American reading habits. A student of mine recently said to me in frustration: “I just can’t get interested in ‘made-up’ lives.” And I must admit, my own tastes have shifted over the decades away from invented lives. I think I speak for many when I say that it’s biographical novels—which are centered on actual lives and circumstances—that have found a more secure place in my reading (and writing) life.
Philip Roth famously put forward in “Writing American Fiction” (1961) the notion that the clamorous world around us has overtaken fiction. He wondered how a novelist could compete, making a credible fictive reality in light of a world that repeatedly stupefies, sickens, and seems finally “an embarrassment to one’s own meager imagination.” With Trump in the White House, Roth’s commentary seems truer than ever: This tacky, bumptious, and thoroughly implausible creature would read as false in any novel. Nobody would believe it.
On the other hand, true stories hold our attention. Think how many films claim to be “based on actual events.” But “real” lives, so to speak, are difficult to access. I know, having written biographies of Steinbeck, Frost, Faulkner, Jesus and Gore Vidal as well as bio-fictional takes on Tolstoy, Walter Benjamin, Melville, and, most recently, Paul the Apostle in The Damascus Road. On reflection, I think I got far closer to the reality of the life at hand in the novels than in the biographies. The restrictions of straight biography frequently close out any effort to imagine the feelings of the figure at the center of the narrative. One has to rely on letters or journals or interviews for confirmation, and of course even those can be defective.
I would have to guess, for instance, how Steinbeck felt when his first wife cheated on him with a close friend or when Frost’s wife of many decades refused to allow him into the bedroom when she was dying. I wondered about Faulkner’s suicidal drinking habits but had only external evidence, as when Faulkner’s daughter told me her father would sometimes try to “rearrange her features when he was drunk,” as she put it. I was on safer ground with Gore Vidal because he was a close friend; but, even there, I had to limit myself to imagining his feelings if I was to avoid steering uncomfortably into fiction.
[on writing about St. Paul]: It’s for the novelist to imagine the contours of Paul’s inner world, to guess at his motives. I saw him as a repressed homosexual, a man of amazing visionary powers, a godly person who heard voices—including the voice of God. But no scholar writing about Paul would comfortably push into his sexual feelings, his neurotic self-doubts, his anxieties about his friends, his risky compulsion to move through the fraught and dangerous world of the Roman Empire in order to bring the Good News to the masses.
While writing this, I would often reread my favorite biographical novels for encouragement, and in recent years there have been so many to choose from: Hilary Mantel’s glittering trilogy about Thomas Cromwell, Paula McLain’s The Paris Wife, which centers movingly on Hemingway’s love affair with Hadley in Paris in the 1920s. I went back to The Secret Life of Emily Dickinson by Jerome Charyn and Ann Beattie’s implausible, arresting, and underrated Mrs. Nixon. I reread Tracy Chevalier’s Girl with a Pearl Earring and Gore Vidal’s Lincoln: these have become permanent fixtures in the pantheon of bio-fiction.
Any imagined life is both less and more than real. It’s less real in the sense that it’s not possible to resurrect the actual person. Even then, can one really know another person? Fiction offers the one and only way we have to get into the head of somebody not ourselves. If this person is someone of interest for one reason or another, there is all the more reason to want to know them and their world more deeply.
And there is a truthfulness in fiction that is simply unavailable to the academic biographer.
When I was writing The Last Station, a novel about Tolstoy’s final year, for example, I knew from biographical sources that Sofya Tolstoy had thrown herself into the pond on their property one day in 1910, moments after she discovered that her husband had left her for good. What I could not do was know what she was thinking and feeling as she dropped through those sheets of black water. What was the quality of her despair? This is the kind of thing only a novelist can tell us, or try to tell us. And—in increasing numbers—they’re giving it a whirl, often succeeding in ways that are changing the face of modern fiction.” ~
https://lithub.com/reading-in-a-boom-time-of-biographical-fiction/?fbclid=IwAR2V12o-XnyXrC-fTlTvAzzIryKMsnCZGAe_02T3A-Fe2iBq4ns0lTUgX9o
GOSPEL ACCORDING TO CRUDE OIL
~ “Historian Darren Dochuk argues in his new book, Anointed with Oil: How Christianity and Crude Made Modern America, the search for fossil fuels has itself long been overlaid with Christian commitment. Oil executives themselves historically have been among the most active and enthusiastic promoters of apocalyptic Christianity in the United States, their zeal to drill representing their religious passion as well as their quest for self-enrichment. Over the course of U.S. history, Dochuk writes, oil companies “openly embraced the theological imperatives that informed their chief executives, aligned their boardrooms with biblical logics, and sacralized their operations as modes of witness and outreach.” Because of the heavy investment of the industry in religious faith, oil, for Dochuk, has become more than just a commodity or an energy source. Its “grip on the human condition” is “total”; it has become “an imprint on America’s soul.”
For the sociologist Max Weber, capitalism was defined by the distinctive way that it taught people to approach their work. As he wrote in The Protestant Ethic and the Spirit of Capitalism (1905), people had to be taught to treat their labor—whatever it might be—with the seriousness of purpose devoted to a calling: to come to their jobs day after day on time, to labor with dedication, and to postpone a life of pleasure. The Calvinist creed—according to which worldly riches were to be sought not for their own pleasures but as evidence of God’s grace, a bulwark against the loneliness and powerlessness of each individual before the divine—taught people how to act in a capitalist order.
Oil, Dochuk suggests, at once underwrote and was fueled by a different system of religious belief. Oil is an industry of speculation, of rocky land that hides wonders unseen, of alchemic transformation of the raw materials of the earth into the fuel of industrial society. As Dochuk shows, many of the men (and they were mostly men) who spent their lives drilling for oil also subscribed to belief systems revolving around the notion that the world contained spiritual paradoxes not comprehensible through science alone. Their desire to become phenomenally wealthy was often inextricable from their longing to carry out what they saw as God’s work on earth.
Dochuk opens his story with Patillo Higgins, one of the first entrepreneurs to strike oil in Texas. One Sunday afternoon in 1891, Higgins had escorted a Sunday-school class of eight-year-old girls up to the top of Spindletop Hill to show them springs of water bubbling forth from the rocky ground. While instructing them in this “everyday application of religion,” he had chanced to notice gaseous clouds rising up from the earth as well—a sign that oil might be hiding underneath. Returning to his small town of Beaumont, Texas, he bought, together with his church elder, the plot of land that held the springs.
On New Year’s Day, 1901, a drill team finally struck oil. Three million barrels of oil poured forth in the thirty days that followed, and the population of Beaumont ballooned from 6,000 to 50,000. And a new pantheon of oil companies—Gulf, Texaco, and Sun Oil Company, smaller than Standard Oil but still substantial—rose out of the wells of Texas.
Much of Anointed with Oil is organized around the idea that the division in the oil industry between the independents and the majors (especially Standard Oil, which at its peak in the late nineteenth century controlled nearly 90 percent of all the oil refining in the country) was echoed in two competing versions of Christianity and capitalism: the “wildcat Christianity” of the independents and the “civil religion of crude” promoted by Rockefeller and his firm. On the one hand, Rockefeller was a severe and pious Baptist who believed in the responsibility of those with riches to improve the social order. “I believe the power to make money is a gift from God,” he argued. “Having been endowed with the gifts I possess, I believe it is my duty to make money and still more money, and to use the money I make for the good of my fellow man according to the dictates of my conscience.” Rockefeller taught Sunday school each week for sixty years, went to prayer before attending to crises at the oil fields, shuttered pubs and closed brothels in his oil towns. And he built the mammoth bureaucratic entity of Standard Oil, which he described as the “salvation of the oil business,” its executives “missionaries of light.” He was inspired by the conviction that his refining enterprise could provide “collective salvation” for the industry by introducing rationality where disastrous competition had prevailed. “The Standard was an angel of mercy,” he argued, “reaching down from the sky, and saying, ‘Get into the ark. Put in your old junk. We’ll take all the risks!’”
Opposing Rockefeller were the wildcatters, driven by their own mystical version of faith, one far more ragged and improvisational. They clung to “an absolute essence of pure capitalism” that safeguarded their ability to make money in whatever way they thought best and had nothing but contempt for Rockefeller’s efforts to rationalize the industry or to contain competition. Oil was a way for the past to speak to the present, a sign of God’s glory and of riches for the prophet who could see through the earth’s surface to glimpse another world beyond. Dochuk tells the story of one Montanan coal miner turned aspiring oilman whose correspondence with spiritualists and astrologists drove his quest for oil; they reassured him that by “taking minerals out of the earth” he was “allowing them to transmute into higher forms, synthesize with human need and desire, and serve as further reminders that the universe leaned toward unity.” Oil, the would-be driller came to believe, was the “Magic Wealth Producer!” (as one Texas town’s boosters advertised)—and finding it would allow him to contribute to the spiritualist cause.
Other oil independents, such as Lyman Stewart—one of the founders of Union Oil—subscribed to premillennialism, which held that end times were nigh and the arrival of Christ imminent. The world would soon descend into chaos, evil, and disorder, evidence of which could easily be found in the tumultuous society of the turn of the century. But all was well, for after a period of tribulations it would then be reborn. For Stewart, Dochuk suggests, the worldview of premillennialism rhymed with his life experiences seeking oil: the sense of powerlessness before supernatural, otherworldly forces, the pendulum of fortune swinging wildly to and fro. Stewart went on to imitate his arch-enemy, Rockefeller, in his philanthropic efforts—donating money to support religious education and to fund the publication of fundamentalist Christian texts.
Perhaps the greatest stronghold of “wildcat Christianity” was East Texas, where oil was discovered just as the Great Depression took hold. The oil boom that followed—the largest in American history—at once inspired and helped promote what Dochuk describes as “end-times urgency.” The denizens of East Texas believed they were blessed with oil, charged with using it to build God’s kingdom on earth, and pressed to do so quickly before the gifts that had been extended to them disappeared. Independent oil producers operated a majority of wells in the region throughout the 1930s. Church lots were littered with oil derricks as ambitious oilmen sought to drill wherever they could, while enterprising ministers dreamed of striking it rich; Dochuk describes a congregation gathering to pray over a new well. The “rush to obtain oil,” he writes, “always worked according to earth’s (and God’s) unknowable clock, with depletion (and Armageddon) an inevitability lingering on the horizon.” Their faith was undimmed even after the 1937 New London disaster, in which a gas explosion at a public school newly built for the children of oil workers killed about 300 students. As one religious leader put it in the aftermath, “These dear oil field people can set the world an example for consecration, and they will.” The intense melding of political and religious ideas with economic interest helped to make Texas one of the hotbeds of opposition to Roosevelt and to New Deal liberalism in the years that followed the Second World War.
Still, by the early twenty-first century, the old certitudes were running out. Oil independents with their strong ties to evangelical Christianity believed that their fortunes were rising with Ronald Reagan’s election to the White House—but a glut in world oil markets that led to falling prices in the 1980s put many out of business, never to recover. Meanwhile, the oil giants no longer seemed able to promise stable, peaceful economic development to the rest of the world. The fragmentation of the Rockefeller dynasty was the most dramatic example. “Most of the fourth Rockefeller generation have spent long years with psychiatrists in their efforts to grapple with the money and the family, the taint and the promise,” pronounced one 1976 exposé. By the end of the twentieth century, Steven Rockefeller, a professor of religion at Middlebury College, had started to steer his family foundation toward positions that would have horrified his great-great-great grandfather—especially advocacy for environmental conservation.” ~
https://bostonreview.net/philosophy-religion/kim-phillips-fein-gospel-oil?utm_source=Boston+Review+Email+Subscribers&utm_campaign=7f13a8fc77-MC_Newsletter_9_4_19&utm_medium=email&utm_term=0_2cb428c5ad-7f13a8fc77-40729829&mc_cid=7f13a8fc77&mc_eid=97e2edfae1
Here are the 10 states with the highest percentage of millionaires
10. California: 6.61 percent. (Cost of living: 33 percent above national average.)
9. Delaware: 6.62 percent. (Cost of living: 0.6 percent below national average.)
8. Virginia: 6.98 percent. (Cost of living: 1.7 percent below national average.)
7. New Hampshire: 7.36 percent. (Cost of living: 14.7 percent above national average.)
6. Massachusetts: 7.41 percent. (Cost of living: 20.7 percent above national average.)
5. Alaska: 7.50 percent. (Cost of living: 18.5 percent above national average.)
4. Hawaii: 7.57 percent. (Cost of living: 26.9 percent above national average.)
3. Connecticut: 7.75 percent. (Cost of living: 18.5 percent above national average.)
2. New Jersey: 7.86 percent. (Cost of living: 13.4 percent above national average.)
1. Maryland: 7.87 percent. (Cost of living: 21.4 percent above national average.)
Oriana:
THE LIFELONG JOURNEY OF RECOVERY
“I think I was about 15 when I conceived of myself as an atheist, but I think it was only very recently that I can really tell that there's nobody there with a copybook making marks against your name.” ~ Sharon Olds
This confirms what I’ve been saying for a while now: it can take a long, long time — half a lifetime or longer — let's face it, a lifetime — to recover from the “god of punishment” and rejoice in the knowledge that there is no punishment just for being human.
Since the existence of the God of Punishment isn’t so obvious, it turns out that all kinds of people are perfectly willing to take on the function: “I’ll be the God of Punishment.” I realize that people who had punitive (a nicer word than “abusive,” isn't it?) parents, who knew they had not been wanted children, who for any reason felt that they were "bad" when growing up face the same challenge of a lifelong recovery.
So there are all kinds of petty gods of punishment on parade, and we can’t avoid dealing with at least some of them. The older we get, the more we tend to find such people pathetic.
And sometimes life isn't long enough for full healing. Life rushes on whether or not you've reached clarity about the past, and (typically) understood it was not your fault. Recently I came upon this passage in a poem of mine, about blood drawing:
My veins are baby-fine
but my blood is dark red.
This worries me: my blood
so dark with the years,
but silent about
the shipwrecks of my life.
But of course it's a blessing that the blood is silent. The last thing I want is to be reminded of those shipwrecks — and lately they seem to have receded, without my really trying. Coming to terms with the past usually happens automatically — the brain is clever, and it's the unconscious that mysteriously, effortlessly takes care of such tasks — when it is ready. Before such readiness, the best solution can be simply and deliberately not to think about the shipwrecks of the past — aren’t the current ordeals ENOUGH?
Remember: rumination is a habit, a behavior, and a behavior can be changed. You are not helpless over it, but you do need a motivation (for me the idea that I didn’t want to waste what precious few good years remained was enough).
Introspection can be dangerous for the insufficiently healed. Crying fits and time wasted brooding can follow. That’s why during the immediate period of recovery from depression I used to have a no-think zone. “How can I be useful today?” is a more urgent matter.
This is a Hellenistic bronze of a Boxer at Rest, c. 330 BCE. I especially admire the face. How wonderful that the Greeks did not have a prohibition on making "graven images" (btw, the Catholic church tossed/falsified that commandment, for which I am infinitely grateful).
Ending on beauty:
God and I were walking in the woods
just down the road this morning.
He was quiet for a long time,
and I finally asked him what he was thinking.
He didn’t hesitate at all this time.
He said, I was thinking about Eden
and how happy we all were there.
I wish I’d given that young couple another chance.
~ John Guzlowski
Adam and Eve by an Iraqi artist