Picasso: Weeping Woman. "I always saw Dora Maar as a weeping woman, and one day she became that." To me, it’s one of his greatest masterpieces.
*
WEEPING WOMAN
in La Jolla, on Fay Avenue,
a non-virgin on the verge,
fighting tears, struggling to sustain
a quivery lipgloss smile.
Smile though your heart is breaking —
the first song I heard in America.
I shocked people, not knowing
what it meant, the American dream.
Now the tearful, middle-aged, no longer
beautiful woman in La Jolla,
California’s richest town,
reminded me of those first weeks —
how shocking I was in general.
A professor explained:
being small, and from Eastern
Europe, I wasn’t supposed to quote
Nietzsche and sound intelligent:
“People are uncomfortable
because they can’t put you
in a convenient slot.”
Slot and slut — I was learning
to be careful about vowels,
not to mention the crisp terminal
consonants as in ship.
In Milwaukee I was always asked,
“Did you come by plane or by boat?”
Boat, I used to think, was too small
to cross the Atlantic, as in row, row,
row your boat. Useless, the English
they teach you in school. Still,
I felt a quickening when I saw
the sticker Rocinante on someone's
non-luxury car — Don Quixote's
immortal nag. And again
yesterday: after smiling at the tearful
woman in La Jolla, in the thick of
the American Dream, I drove behind
a car with NIHILIST on its license.
For five ecstatic minutes,
I blew telepathic kisses
to the driver, even though
I am against nihilism,
and besides, every woman
is a weeping woman.
~ Oriana
Amazing, these momentary encounters that stay in memory. But would it have stayed in my memory if I hadn’t written this poem? No way of telling. But because the poem exists, I see that particular weeping woman now and then, and understand her pain (probably left by her spouse or partner — love causes so much grief when we are young).
*
“You made me confess the fears that I have. But I will tell you also what I do not fear. I do not fear to be alone or to be spurned for another or to leave whatever I have to leave. And I am not afraid to make a mistake, even a great mistake, a lifelong mistake and perhaps as long as eternity too.” ~ James Joyce, A Portrait of the Artist as a Young Man
Joyce at 20
HOW TO SIMPLIFY YOUR LIFE
~ “Your odds of success improve when you are forced to direct all of your energy and attention to fewer tasks.
If you want to master a skill—truly master it—you have to be selective with your time. You have to ruthlessly trim away good ideas to make room for great ones. You have to focus on a few essential tasks and ignore the distractions. You have to commit to working through 10 years of silence.
Simplify and Go All In
If you take a look around, you'll notice very few people actually go “all in” on a single skill or goal for an extended period of time.
Rather than researching carefully and pouring themselves into a goal for a year or two, most people “dip their toes in the water” and chase a new diet, a new college major, a new exercise routine, a new side business idea, or a new career path for a few weeks or months before jumping onto the next new thing.
In my experience, so few people display the persistence to practice one thing for an extended period of time that you can actually become very good in many areas—maybe even world-class—with just one year of focused work. If you view your life as a 20-slot punchcard and each slot is a period of focused work for a year or two, then you can see how you can enjoy significant returns on your invested time simply by going all in on a few things.
My point here is that everyone is holding a “life punchcard” and, if we are considering how many things we can master in a lifetime, there aren't many slots on that card. You only get so many punches during your time on this little planet. Unlike financial investments, your 20 “life slots” are going to get punched whether you like it or not. The time will pass either way.
Don't waste your next slot. Think carefully, make a decision, and go all in. Don't just kind of go for it. Go all in. Your final results are merely a reflection of your prior commitment.” ~
Warren Buffett's house in Omaha, Nebraska; one of several houses he owns. At least he's interested in the common good, and wants to contribute to the greater well-being of others.
A reader’s comment:
“When Warren lectures at business schools, he says, “I could improve your ultimate financial welfare by giving you a ticket with only 20 slots in it so that you had 20 punches—representing all the investments that you got to make in a lifetime. And once you’d punched through the card, you couldn’t make any more investments at all.
He says, “Under those rules, you’d really think carefully about what you did and you’d be forced to load up on what you’d really thought about. So you’d do so much better.”
https://getpocket.com/explore/item/warren-buffett-s-20-slot-rule-how-to-simplify-your-life-and-maximize-your-results?utm_source=pocket-newtab
Oriana:
I would add: Think carefully before you get into a complicated hobby. Turning your pool into a koi pond, for instance, means endless chores.
“First you get the koi, and then you get obsessed about them,” a koi enthusiast told me. You start going to koi conventions, pestering others with photos of your aloof, shimmering pets. Still, who are we to judge someone else’s obsessions? One of the things I learned thanks to undergoing my recent medical apocalypse is that you don’t judge — just gape with amazement at how people cope and go on, in spite of the various hardships that no one escapes, not even the rich (a spouse may turn out to be an alcoholic or a non-stop womanizer; a child may turn out to be a drug addict or be mentally ill).
I don’t think there is a contract with the universe that guarantees your life will be just peachy — never mind that you are a good person. Many bad things happen to good people. We preserve the inaccurate memories of beautiful moments and feel grateful to have had them — recently or in the remote past, it doesn’t matter. They seem like miracles.
*
ATLANTIS AND OTHER LOST CONTINENTS
~ “More than 2,000 years ago, Plato wrote about a land called Atlantis, where a mighty empire vanished beneath the waves after a series of “excessively violent earthquakes and floods.” His tale has inspired plenty of nonsense in the centuries since, but now it seems that Plato was on to something. New research shows that lost continents are a real thing, and they have had a big impact on human life — though not in the way Plato imagined.
Douwe van Hinsbergen, a geologist at Utrecht University in the Netherlands, has been exploring one of the most dramatic of these lost continents — known as Greater Adria. In a paper published in early September in the journal Gondwana Research, he and his colleagues studied rocks around and beneath the Mediterranean Sea to reveal the full extent of Greater Adria for the first time. “It’s enormous! About the size and rough shape as Greenland,” he says.
If you don’t recall seeing Greater Adria on a map, there’s a reason for that. It is completely buried — not under the ocean, but beneath southern Europe. About 140 million years ago, Greater Adria and Europe began to collide. Greater Adria got bulldozed and buried in the process and mostly sank beneath what is now Italy, Greece and the Balkans.
And Greater Adria is not unique. Emerging studies of Earth’s mantle show likely traces of past lost continents. Analysis of ancient rocks suggests that almost all of Earth’s earliest continents might have disappeared, taking with them much of the history of life on this planet. The evidence of how life first appeared may be lost somewhere down there in the depths.
But lost continents are not entirely lost. Like lost civilizations, they leave traces behind, if you know how to look for them. Van Hinsbergen notes that rocks from Greater Adria got scraped off and incorporated into the Alps, while whole chunks got embedded in southern Italy and Croatia. Even the parts of Greater Adria that got shoved dozens of miles down into the mantle, the layer below the crust, continue to influence modern Europe.
Under tremendous heat and pressure and over tens of millions of years, limestone rocks from Greater Adria turned into marble. Friction between Greater Adria and Europe then pulled the sunken rocks back to the surface, where people found them and mined them. “That’s where the marble came from that the Romans and the Greeks used for their temples,” van Hinsbergen says.
Plato was literally standing on the remains of a real Atlantis. He just had no idea.” ~
https://www.nbcnews.com/mach/science/real-life-atlantis-lost-continent-found-under-europe-revealing-earth-ncna1055856?fbclid=IwAR3BLH4K4p6UdeDumMLSkHXqBaL3exhFK5ffqsvE04EuAe8uR-sACMMnyFU
*
THE GHOSTS BEHIND US
“Behind every man now alive stand thirty ghosts, for that is the number by which the dead outnumber the living.” ~ Arthur C. Clarke, 2001: A Space Odyssey, 1968
That number is no longer accurate: the Population Reference Bureau estimates that around 107 billion people have lived on Earth to date, and there are close to 7 billion human beings currently alive — which means that behind each of us alive today stand fifteen ghosts.
A ghost is a soul (no? well, then, what is it?), and a soul weighs 21 grams, according to (eminently Google-able) Doctor Duncan "Om" MacDougall. Thus, the total weight of all the ghosts standing behind our backs is 224.7 million kilos, which would seem like a fairly heavy load, but when distributed equanimously between us, amounts to a mere 315 grams of ghost weight per single living person, or a little over 10 oz, or 63 nickels.
63 nickels: not a bad title for a short story about something perhaps mundanely realistic, like divorce or competitive eating.
63 nickels weighs our individual allotment of ghost matter — the extra load we all carry through our lives.” ~ M. Iossel
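Iossel's back-of-the-envelope arithmetic can be checked in a few lines of Python. This is only a sketch, using the figures quoted above (107 billion people ever born, 7 billion alive, a 21-gram soul) plus the standard 5-gram weight of a US nickel:

```python
# A quick check of the ghost arithmetic quoted above, using only the
# figures given in the passage.
EVER_BORN = 107e9   # people who have ever lived, per the quoted estimate
ALIVE = 7e9         # people currently alive, per the quoted estimate
SOUL_GRAMS = 21     # MacDougall's famous figure for the weight of a soul
NICKEL_GRAMS = 5.0  # a US nickel weighs exactly 5 grams

ghosts_per_person = EVER_BORN / ALIVE                     # ~15.3: "fifteen ghosts"
per_person_grams = round(ghosts_per_person) * SOUL_GRAMS  # 15 * 21 = 315 g
per_person_oz = per_person_grams / 28.3495                # ~11.1 oz
per_person_nickels = per_person_grams / NICKEL_GRAMS      # 315 / 5 = 63 nickels

# The grand total works out to about 2.2 billion kilos (315 g times
# 7 billion people); the quoted "224.7 million kilos" looks like it
# drops a factor of ten.
total_kilos = per_person_grams * ALIVE / 1000

print(round(ghosts_per_person), per_person_grams, per_person_nickels)
```

The per-person figures check out exactly: fifteen 21-gram ghosts come to 315 grams, which is 63 nickels on the nose.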
Oriana:
It’s interesting to ponder the lives of our ancestors. All those rich experiences, the drama. Each person’s life could make a riveting novel or movie — even if, on a daily basis, it seems like a lot of mundane activities at best, and dubious kinds of employment, e.g. grading Freshman Comp essays. In spite of that, everyone has done something interesting, if only we have the energy to listen, to imagine.
The problem is excess of fascinating tales, not their scarcity. Each life is fascinating, if we see deeply enough.
*
GOVERNMENTS LED BY NARCISSISTS AND PSYCHOPATHS
~ “After spending his early life suffering under the Nazis and then Stalin, the Polish psychologist Andrew Lobaczewski devoted his career to studying the relationship between psychological disorders and politics. He wanted to understand why psychopaths and narcissists are so strongly attracted to power as well as the processes by which they take over governments and countries.
He eventually came up with the term “pathocracy” to describe governments made up of people with these disorders — and the concept is by no means confined to regimes of the past.
It’s not really surprising that people with personality disorders are drawn to political power — narcissists crave attention and affirmation, and feel that they are superior to others and have the right to dominate them. They also lack empathy, which means that they are able to ruthlessly exploit and abuse people for the sake of power. Psychopaths feel a similar sense of superiority and lack of empathy, but without the same impulse for attention and adoration.
But pathocracy isn’t just about individuals. As Lobaczewski pointed out, pathological leaders tend to attract other people with psychological disorders. At the same time, empathetic and fair-minded people gradually fall away. They are either ostracised or step aside voluntarily, appalled by the growing pathology around them.
As a result, over time pathocracies become more entrenched and extreme. You can see this process in the Nazi takeover of the German government in the 1930s, when Germany moved from democracy to pathocracy in less than two years.
Democracy is an essential way of protecting people from pathological politicians, with principles and institutions that limit their power (the Bill of Rights in the US, which guarantees certain rights to citizens, is a good example).
This is why pathocrats hate democracy. Once they attain power they do their best to dismantle and discredit democratic institutions, including the freedom and legitimacy of the press. This is the first thing that Hitler did when he became German chancellor, and it is what autocrats such as Trump, Vladimir Putin and Hungarian prime minister Viktor Orbán have been attempting to do.
In the US, there has clearly been a movement towards pathocracy under Trump. As Lobaczewski’s theory predicts, the old guard of more moderate White House officials – the “adults in the room” – has fallen away. The president is now surrounded by individuals who share his authoritarian tendencies and lack of empathy and morality. Fortunately, to some extent, the democratic institutions of the US have managed to provide some pushback.
Britain too has been fairly fortunate, compared to other countries. Certainly there have been some pathocratic tendencies in some of our recent prime ministers (and other prominent ministers), including a lack of empathy and a narcissistic sense of self importance. But the UK’s parliamentary and electoral systems – and perhaps a cultural disposition towards fairness and social responsibility – have protected the UK from some of the worst excesses of pathocracy.
Pathocratic politics today
This is why recent political events seem so alarming. It seems as if the UK is closer to pathocracy than ever before. The recent exodus of moderate Conservatives is characteristic of the purges which occur as a democracy transitions into pathocracy.
The distrust and disregard for democratic processes shown by the UK prime minister, Boris Johnson, and his ministers and advisers — the prorogation of parliament, the insinuation that they may not follow laws they disagree with — is also characteristic of pathocracy.
As a psychologist, I would certainly not attempt to assess Johnson, having never met him. But in my view he is certainly surrounding himself with the most ruthless and unprincipled – and so most pathocratic — elements of his party. The former prime minister David Cameron even referred to Johnson’s chief adviser Dominic Cummings as a “career psychopath”.
At the same time, it is important to point out that not everyone who becomes part of a pathocratic government has a psychological disorder. Some people may simply be callous and non-empathic without a fully fledged psychological disorder.
Others may simply possess the kind of narcissism (based on a sense of superiority and entitlement) which arises from a certain style of upbringing. Some politicians may simply follow the party line through loyalty or in the belief that they will be able to rein in the pathocratic impulses of the people around them.
So far, thanks to the actions of parliament and the bravery of a small number of principled Conservative MPs, the potential pathocracy of Johnson’s government has been kept at bay.
But the danger of democracy transitioning into pathocracy is always real. It is always closer to us than we think, and once it has a foothold, will crush every obstacle in its way.” ~
http://theconversation.com/pathological-power-the-danger-of-governments-led-by-narcissists-and-psychopaths-123118?utm_medium=Social&utm_source=Facebook&fbclid=IwAR0V5GkONlfhWMJjmZrT-e5YXi_yGt0MrsjOWQFKm-XqSMV2GWlk8HqU5j0#Echobox=1568897070
Oxford, Balliol College. Johnson "read the classics" there — one wonders with what effect.
*
US BIRTHRATES LOWEST IN THIRTY YEARS
~ “Births in the US have dropped to their lowest rate in 30 years, marking a cultural shift as women delay motherhood, experts say.
The fertility rate has dropped to 1.76 births per woman — the lowest since 1978, the National Center for Health Statistics said in a report.
Some 3.85 million babies were born in the US in 2017, the fewest since 1987, as births among women in their teens and 20s decreased.
Both the birth rate — the number of births per thousand — and fertility — a lifetime average forecast — fell.
Declining birth rates are common as countries become more developed.
The US fertility rate is lower than the UK's but the US still has a higher fertility rate than many other countries.
While births decreased among younger women in the US last year, they rose in women aged between 40 and 44.
Donna Strobino of Johns Hopkins University put the change down to women choosing to delay motherhood in favor of work.
She told AFP news agency: "Women are becoming more educated, they are in the workforce, they are pursuing their careers.
"And in the absence of policies that really help women who are working to really take some time off post-partum you are probably going to see a continuation of this delay."
It does not mean the population will shrink. It may grow at a slower rate, but there will continue to be more and more people in the US.
William Frey of the Brookings Institution called for some perspective on the figures.
"The country isn't going to run out of people," he said.
https://www.bbc.com/news/world-us-canada-44151642
Oriana:
Again, no reason to panic. The American population is still going to grow, not shrink. So much for the dream of uncrowded freeways.
*
OUR SHRINKING SKULLS AND SLEEP DEPRIVATION
~ “Skeletal records show that for hundreds of thousands of years, people had beautiful skulls: straight teeth, wide jaws, forward faces, large airways. Robert Corruccini, an emeritus anthropology professor at Southern Illinois University, found perfectly straight teeth and wide jaws in children’s skulls from pre-Roman times among Etruscan remains in southern Italy.
Then, about 250 years ago, our faces began to change. Boyd argues that industrialization interrupted the ancestral patterns of weaning and feeding, with babies nursing on demand for years while also trying solid foods under adults’ watchful eyes. Boyd says that the widespread adoption of bottle feeding, pacifiers and soft processed food deprived toddlers of practice chewing and distorted the shapes of their mouths. (“In modern society you have Gerber’s baby food,” Corruccini told me. “Etruscan kids had to chew once they were getting off breast milk. Babies have remarkably powerful chewing capabilities.”) Just like diabetes and heart disease, malocclusion — the misalignment of jaws and teeth — followed industrialization around the globe. Meanwhile, people in societies that never industrialized enjoyed well-aligned teeth and jaws.
Other factors may have played a part too. Environmental pollutants and recirculated indoor air increased the strain on our bodies and worsened pregnant women’s health in regions that industrialized first. That can impact skull shape of babies in utero by affecting birth weight, jaw length, and size of sucking pads in the cheeks. Skeletal records of animals show similar differences in skull shape between animals raised in the wild and those raised in captivity — suggesting that humans’ modern diet and environment play an outsized role in our evolving faces.
One of the first to recognize the shrinking skull was Charles Darwin, who described “civilized” humans as having shorter jaws than the “savages” who lived in non-industrialized societies, in The Descent of Man. “This shortening may, I presume, be attributed to civilized men habitually feeding on soft, cooked food, and thus using their jaws less,” he wrote. “I am informed by Mr. Brace (the U.S. philanthropist Charles Loring Brace) that it is becoming quite a common practice in the United States to remove some of the molar teeth of children, as the jaw does not grow large enough for the perfect development of the normal number.”
In the 1920s and ’30s, a dentist named Weston Price traveled around the world, taking photographs of the teeth of indigenous people in Africa, Europe, North and South America, and Australia. His photos confirmed Darwin’s suspicions — he documented well-aligned teeth, high palates and forward jaws of non-industrialized populations. Price and Darwin may have been motivated by abhorrent beliefs — eugenics-informed ideas about “civilized” (Caucasian) races being an improvement on “savage” races (African and indigenous peoples) — but their research can still prove useful to modern researchers tracing the evolution of human skeletons.
“I have a CBCT of a five-year-old Caucasian who died 250 years ago, and a five-year-old who’s in my office referred from a pulmonary physician because the kid has sleep apnea. And I can show how that jaw compares to one who died 300 years ago, and how much smaller it is,” Boyd explained to me.
The implications of shrinking modern skulls are more than aesthetic. Our smaller faces do the most harm in one area crucial to physical and mental health: our ability to get a good night’s sleep.
In proper development, the tongue moves along the roof of the mouth to push nutrients toward the esophagus, gently expanding the palate and exercising the lower jaw, lengthening and widening it over time. When a child’s jaw is too short and palate too narrow, their tongue cannot rest against the roof of the mouth and instead rests against the lower teeth. This causes them to routinely breathe through the mouth, an unhealthy habit. Then, as they lie flat to sleep, the tongue may fall back to block the throat, causing apnea. This can worsen into a vicious cycle through overuse of bottles, pacifiers or sippy cups, misshaping the teeth and mouth. Malocclusion and its resultant sleep problems form part of the cluster known as diseases of civilization, including obesity, stress, and depression. These are all conditions largely caused by our modern lifestyle and environment.
Albert Einstein College of Medicine professor Karen Bonuck has documented the damage that sleep problems can cause without early intervention. Her 2012 paper for Pediatrics found that babies and toddlers who mouth-breathed, snored, and experienced apnea were more likely to demonstrate behavior problems at ages four and seven, including hyperactivity, poor conduct, peer problems, and emotional difficulties, based on analysis of more than 11,000 children in a longitudinal study in Avon, a U.K. county. Boyd and his colleagues seized on that paper, and continue to cite it as evidence of the urgent need to identify and solve breathing problems in young children.
Bonuck rattles off a series of research findings on the far-reaching impact of inadequate sleep, zipping from topic to topic like a pinball. This study found that restricting children’s sleep by 30 minutes a night for less than a week lowered their neuropsychological functioning by the equivalent of two years. One meta-analysis, summing up findings from 21 studies, discovered that young children with sleep disordered breathing on average earned grades 12% lower than their peers. Combined with a paper that found kids, on average, have been losing a minute of sleep per year for the last century, this doesn’t bode well for the human race.
“When children don’t sleep, they’re cranky, moody, their expressive language is all impaired. Not only their verbal word-learning. The communication skills are at risk,” Bonuck said. Roughly half of children two to five years old experience sleep problems, defined as difficulty falling or staying asleep; inadequate sleep duration; trouble breathing or interrupted sleep. Yet almost no early intervention screenings address sleep or nighttime breathing. To be sure, sleep comes under pressure from many other factors, including the growth of social media, smartphones, and intense academics earlier in childhood.
When parents bring their children for screening, a simple questionnaire can identify a risk factor for sleep problems, such as mouth-breathing. Solutions range from nasal steroids, speech or myofunctional therapy, and allergy treatment, all the way to adenotonsillectomy, and the early orthodontia that Boyd practices.
“They focus better, less hyper, fewer tantrums, get along with others… Kids do literally grow when they’re sleeping. They’ll be healthier. By the way, this ties into obesity,” Bonuck said, citing a 2011 Institute of Medicine report on preventing childhood obesity and her own 2015 paper that found children’s sleep-disordered breathing and behavioral sleep problems before age seven are both risk factors for obesity at 15.
When children mouth-breathe or snore, the air passing through their throat dries out tissues and raises the risk for infection and inflammation, which would further compress the airway. They miss the many benefits of nasal breathing and disuse causes the nasal airway to shrink, exacerbating the problem.
“They have a chronic sinus infection and congestion. They can’t smell,” explained Joy Moeller, a myofunctional therapist in Los Angeles. “They lose their appetite or become picky eaters, preferring pasta because it’s easier to chew. It may lead to obesity, sleep disorders, or crooked teeth.”
There is no easy way to turn back the evolution of our skulls. It’s unrealistic to advise parents to eschew processed food, breastfeed longer, move to open-air cabins in the country, or perhaps put children on the Paleo diet to prevent these changes taking hold in the skulls of the next generation. We are stuck with our smaller modern faces, but there are steps we can take to address the conditions that come with them.
At Northwestern University’s medical school, sleep medicine doctor Stephen Sheldon explained new techniques — or recovered techniques — that encourage the jaw to grow wider and more forward in order to align the teeth and enlarge the airway, and often enlist myofunctional therapy to create healthy tongue and mouth habits. Traditionally, orthodontists are most concerned with straightening teeth, rather than moving the mandible forward as a primary goal.
“We really don’t know yet which is better and we really need to pursue that question and answer it in a scientific method,” Sheldon said. “We have lots of anecdotes, but depending upon anecdotes is not science.”
Some orthodontists wait for braces. To Boyd, any time you see crowding or potential problems on the horizon, it’s a signal to expand the mouth now. Why delay? The sooner you intervene, the sooner the airway expands and kids start to develop good habits for nasal breathing and tongue position. He’s worked with children as young as two years old, and special needs kids whom some dentists find difficult to treat. Some of his colleagues use removable devices similar to those described in Boyd’s stack of early 20th century research papers.
The problem is getting worse, not better. “More babies are born with anatomy that makes nursing and breastfeeding difficult, raising the risk of developing dysfunctional feeding habits,” said feeding specialist Bahr. “More time on their backs than their tummies, processed foods, bottle feeding, and pacifiers contribute to the misshapen jaw, impairing breathing and sleep. Once their sleep suffers, a range of other problems begin to develop.”
“Often parents don’t even realize their child is having a sleep problem, because they don’t think snoring is a big deal,” he said. “They don’t realize how significant snoring can be to affect the child’s life. They’re waking frequently at night, can’t fall asleep.”
At my own children’s dental office, I found a 20-page parent education pamphlet from the American Dental Association that stresses the importance of teeth cleaning, a healthy diet, regular checkups, preventing injuries, and limits on sucking. Only one page addresses bite problems or teeth alignment, saying that orthodontic treatment usually begins between eight and 14 years of age. There’s no mention of sleep problems that could connect to a mouth and jaw issue, such as snoring, mouth-breathing, restless sleep, frequent nighttime wakings, effortful breathing, or difficulty waking children in the morning.
Humans draw 28,000 breaths each day. We sleep for about one-third of our life. Changing our sleeping and breathing habits can transform our physical and mental health. It all begins in our jaw, mouth, and throat anatomy, which shape the path of each breath.
Jerry Rose, a dental anthropologist and professor emeritus at the University of Arkansas, warned that a whole generation could be impacted if we don’t change course. “In evolution, there are winners and losers,” Rose said. We have to adapt, and adapt quickly, to our changing physiology — or risk the consequences.
“There are groups of people who simply went the wrong way,” Rose said. “And then they’re gone.”
https://onezero.medium.com/our-skulls-are-out-evolving-us-and-that-could-mean-a-public-health-crisis-f950faed696d
Oriana:
This seems developmental — prolonged thumb-sucking, the lack of chewing — rather than a true evolution, which has to involve genes.
Still scary. It's not normal for children to snore or mouth-breathe. But so many ideas of "normal" are changing.
Also: what about those other articles that state babies are being born larger and larger? It's hard to sort it all out.
*
ETHICS AND MORALITY SHOULD SERVE HUMAN NEEDS
“Ethics and morality should serve human needs, and choices should be based upon consideration of the consequences of actions rather than pre-ordained rules or commandments.” ~ from the description of Humanistic Judaism (also called “god-optional Judaism”)
A friend applauded the second part of the sentence: “choices should be based upon consideration of the consequences of actions rather than pre-ordained rules or commandments.” Of course I agree with that. True caring means considering the consequences rather than applying rigid rules. But it’s the first part that just stunned me: ETHICS AND MORALITY SHOULD SERVE HUMAN NEEDS. That the church never told me that is not surprising, since in its view, everything should serve the needs of the church. The dictatorial government certainly didn’t tell me, since everything should serve the needs of the government. Regardless, the first rule of survival was to keep quiet, so — no discussion. It was only later that I saw how certain rules had nothing to do with ethics, and everything to do with the power structure.
Actually, a seed of human need-based ethics is in Mark 2:27: “The Sabbath was made for man, not man for the Sabbath.” I liked that saying very much, but didn’t dare generalize from it. Oh sure, it was OK to consider the needs of others as an opportunity for self-sacrifice. But if you had any personal needs, you were SELFISH and needed to go to confession.
It used to be much worse, I know — the “no divorce” rule, for instance. Or “spare the rod and spoil the child.” Or “donating” a child to a convent or monastery — why, that was seen as wonderful parental ethics. The bad old times — let’s face it, the past was mostly horrible, precisely because individual needs did not count.
Sometimes people’s needs collide, and it gets complicated. But some sort of solution can be worked out so that neither party is miserable. I think we are moving away from “one size fits all” toward a more individualist view.
Since I’ve been asking myself what of Christianity is still of value to me, I’ve decided to add “the Sabbath was made for man, not man for the Sabbath” as a seed of individualist, need-based ethics. Previously, I saw just two things, with all their implications: the story of the woman taken in adultery and “the kingdom of heaven is within you.” I’m adding the Sabbath saying. I’m not concerned with historical authenticity: let scholars argue about that. I’m concerned with extracting practical wisdom.
I never denied that there is wisdom in the Gospels. One nugget that I already cited: the Sabbath was created for man, not man for the Sabbath. Rituals are important, but not if they become oppressive and worsen our lives, already difficult. Simplify, simplify!
*
Rubens: Presentation of the portrait of Marie de Medicis to Henry IV, 1625. What interests me is Jupiter and Juno up in the clouds — note the wonderful peacock (and the rather inadequate eagle, Jupiter’s bird). In order to glorify and elevate the royal marriage, Rubens (who was said to be a devout Catholic, going to mass every day before beginning to paint voluptuous plus-size naked women) had to reach for pagan gods — both the asexual Yahweh and Jesus present an anti-marriage ideal, and Mary remains a perpetual virgin.
Finally, it's perhaps a little bizarre that the moment of presentation of the portrait of the future spouse was thought worth recording. But the world belonged to the aristocracy, and they thought everything about them was extremely important.
Ending on beauty:
“A shower fell in the night and now dark clouds drift across the sky, occasionally sprinkling a fine film of rain. I stand under an apple tree in blossom and I breathe. Not only the apple tree but the grass round it glistens with moisture; words cannot describe the sweet fragrance that pervades the air. I inhale as deeply as I can, and the aroma invades my whole being; I breathe with my eyes open, I breathe with my eyes closed — I cannot say which gives me the greater pleasure.
This, I believe, is the single most precious freedom that prison takes away from us: the freedom to breathe freely, as I now can. No food on earth, no wine, not even a woman's kiss is sweeter to me than this air steeped in the fragrance of flowers, of moisture and freshness.
No matter that this is only a tiny garden, hemmed in by five-story houses like cages in a zoo. I cease to hear the motorcycles backfiring, radios whining, the burble of loudspeakers. As long as there is fresh air to breathe under an apple tree after a shower, we may survive a little longer.” ~ Aleksandr Solzhenitsyn
Wheat by Thomas Hart Benton, 1967
*
THE DAY I LEARNED MORTALITY
The surgeon said in a calm, controlled
voice, “You should be able to lead
a normal life — ” he paused —
“for the rest of your life.”
I walked out of the arctic hospital.
I kept walking to the parking lot.
It was the fracture of that pause:
the silence rolled, uncontrolled —
I drove on the streets, the freeway.
Sunlight in streaks and spills
played tag along the tattered
eucalyptus groves. Wildfires
of bougainvilleas flickered,
flirting with the wind.
A fluent paradise on fault lines.
A death sentence, but normal.
The palm-tree in front of my apartment
stood quiet, not clapping
its fronds, but waiting.
Not a twig fidgeted, not a cloud.
I kept walking. I kept climbing
the echoing stairs.
But everything around me
had stopped.
Everything was staring,
waiting,
my shadow splayed in two
against the stucco wall.
~ Oriana
Yes, that was the first encounter. But I was only 28. My parents were alive and shockingly healthy. Only one friend, slightly younger than I, had died of colon cancer. I was still far from a more complete understanding.
A shock, to be sure, but now it seems relatively minor. I wasn’t even thirty! Talk about the infancy of life.
Now it’s personal. The word “hospice,” for instance, is hair-raising. I realize that most people at that stage are “out of it” — but not always.
My recent medical apocalypse has been a much more devastating encounter than the one I describe in the poem. How terrible life is! The last act is sheer cruelty. Now I understand why so many have no interest in matters like evidence or the nature of reality, and embrace religious promises instead. Otherwise, we get this:
MEDICAL HISTORY
I’ve been pregnant. I’ve had sex with a man
who’s had sex with men. I can’t sleep.
My mother has, my mother’s mother had,
asthma. My father had a stroke.
My father’s
mother has high blood pressure.
Both grandfathers died from diabetes.
I drink. I don’t smoke. Xanax for flying.
Propranolol for anxiety. My eyes are bad.
I’m spooked by wind. Cousin Lilly died
from an aneurysm. Aunt Hilda, a heart attack.
Uncle Ken, wise as he was, was hit
by a car as if to disprove whatever theory
toward which I write. And, I understand,
the stars in the sky are already dead.
~ Nicole Sealey
It’s unbearable.
The greatest consolation is the thought of being able to contribute even a tiny bit to the life of others — to share our psychic riches while we can.
And to say “I love you” often. The truth of emotions is complex, and irrelevant. People need to hear that they are loved, and we need to say it. It’s perhaps our foremost moral duty.
Zebras are like horses. I wonder if standing like this is the horses’ equivalent of cuddling (I realize it sounds like a strange word choice, but what else?).
*
HOW TO RECOVER FROM STRESS MORE QUICKLY
1. PREVENT RUMINATION
Replaying the memory of a stressful experience after it is over can activate some of the same pathways in the brain as the experience itself. This can keep the stress reaction “switched on” even after the stressor is gone, and cause the experience to be perceived as more distressing than it actually was. Preventing people from ruminating lowers their blood pressure faster after acute stress. Chronic stress has been linked to hypertension, and in a small randomized trial, US researchers, including Lynn Clemow at Columbia University Medical Center, used stress management training (based on a cognitive-behavioral group workshop) to effectively lower systolic blood pressure in patients with hypertension. The decline in pressure correlated with a decline in depressive rumination.
2. MINDFULNESS MEDITATION MAY NOT BE FOR YOU, BUT GIVE YOGA A TRY
The perceptual element of stress may be the reason some mind-body interventions such as yoga, breathing techniques and focused-attention meditation can benefit stress management through effects on improving emotional regulation, reducing stress reactivity and speeding up recovery after stress. It may also explain why some techniques such as mindfulness meditation have shown mixed results in controlled studies. It is possible the technique of mindfulness meditation can invite rumination and repetitive negative thoughts in some individuals but not in others.
3. GET INTO NATURE
An apparently “soft” factor like exposure to nature can hasten recovery following stress and lower markers of stress.
If you walk outside in green spaces, or even look at pictures of nature scenes, you may be able to increase your resilience to stress. A recent study by Stanford researchers showed that walking in green campus parkland reduced anxiety and worry more than walking on a busy street and had cognitive benefits as well. In another study, students were stressed by having to take a math test and getting feedback (even if not accurate) that they were performing below average. After the stressor, researchers assigned participants to one of two groups that either saw pictures of empty pathways and trees or pictures of urban scenes with cars and people. Those who saw the pictures of trees had faster cardiovascular recovery from stress (e.g., heart rate slowed down faster).
4. AVOID BRIGHT OR BLUE LIGHT IN THE EVENING
Avoid bright light or blue-light exposure late in the evening from the use of LED screens — bright or blue light at night can delay the release of melatonin, a hormone that has been shown to reduce anxiety.
5. ENGAGE IN LIGHT EXERCISE (e.g. walking)
Low-intensity exercise reduces circulating levels of cortisol.
6. SMILE
A recent study by Tara Kraft and Sarah Pressman at the University of Kansas showed that smiling — even fake smiling — can help your body resist stress. In this clever study, the researchers used chopsticks to arrange subjects’ mouths into either (fake) smiles or neutral expressions. Half the subjects in the smile group did not know they were smiling. The other half were told to smile and therefore had genuine smiles (which involve moving both eye muscles and mouth muscles). But both smiling groups had a lower heart rate than the neutral group after performing a stressful task. The group with genuine smiles had the lowest heart rate overall; the fake smile group had less of a drop in positive mood during the stressor. The researchers suggest that moving your facial muscles sends a message to your brain that can influence your mood.
7. STAND UPRIGHT
It turns out that standing in an upright pose actually helps you perform better under stress, as compared to slouching. In another clever recent study, published in the journal Health Psychology, researchers assigned people to either stand upright or slouch. The researchers held the subjects in position with physiotherapy tape (after giving them a cover story). Both groups then had to do a stressful speech task. The upright group performed better and had less fear and more positive mood, compared to the slouchers. They were also less self-conscious. So the next time you’re under stress, remember to stand tall.
8. TRY TO SEE YOUR STRESS AS A CHALLENGE
A study by Harvard and Yale researchers shows that your attitude toward stress matters and that people can learn more positive attitudes. The researchers showed one of two brief video clips to managers at a large, multinational banking firm, then measured their mood and work performance in subsequent weeks. These managers had high-pressure jobs with quotas they had to meet. One group saw a clip showing the negative effects of stress while the other group saw a clip about seeing stress as a positive challenge. The group that saw the clip about the positive aspects of stress actually felt less stressed—they engaged more at work and were happier and healthier. They also reported a 23% decrease in stress-related physical symptoms (like backache) compared to the group whose members saw the negative video. So try to see your stressors as challenges that you can learn from (even if it’s just learning to tolerate stress).
Adapted from
http://www.bbc.com/future/story/20190813-burnout-anxiety-stress-proof-relief?utm_source=pocket-newtab and from https://www.psychologytoday.com/us/blog/the-mindful-self-express/201603/6-proven-ways-recover-stress
Statue of the Buddha, Toledo Museum of Art
*
“Digressions, incontestably, are the sunshine; —and they are the life, the soul of reading; — take them out of this book for instance, — you might as well take the book along with them.”
~ Laurence Sterne, The Life and Opinions of Tristram Shandy, Gentleman.
Both the author and his eponymous hero were dying of consumption. So long as he continued writing, he would go on living. ~ M. Iossel
John Guzlowski:
Digressions that seem like digressions but the reader reassembles them to see the heart of the writer in full. I had a great Shakespeare prof who believed every word in a Shakespeare play or poem contained the entire poem. I once wrote a paper for him on the importance of the word “‘tis” in the play Hamlet.
ENGLISH KEEPS GAINING DOMINANCE AS THE UNIVERSAL LANGUAGE
~ “De Swaan divides languages into four categories. Lowest on the pyramid are the “peripheral languages”, which make up 98% of all languages, but are spoken by less than 10% of mankind. These are largely oral, and rarely have any kind of official status. Next are the “central languages”, though a more apt term might be “national languages”. These are written, are taught in schools, and each has a territory to call its own: Lithuania for Lithuanian, North and South Korea for Korean, Paraguay for Guarani, and so on.
Following these are the 12 “supercentral languages”: Arabic, Chinese, English, French, German, Hindi, Japanese, Malay, Portuguese, Russian, Spanish and Swahili – each of which (except for Swahili) boasts 100 million speakers or more. These are languages you can travel with. They connect people across nations. They are commonly spoken as second languages, often (but not exclusively) as a result of their parent nation’s colonial past.
Then, finally, we come to the top of the pyramid, to the languages that connect the supercentral ones. There is only one: English, which De Swaan calls “the hypercentral language that holds the entire world language system together”. The Japanese novelist Minae Mizumura similarly describes English as a “universal language”. For Mizumura, what makes it universal is not that it has many native speakers – Mandarin and Spanish have more – but that it is “used by the greatest number of non-native speakers in the world”. She compares it to a currency used by more and more people until its utility hits a critical mass and it becomes a world currency. The literary critic Jonathan Arac is even more blunt, noting, in a critique of what he calls “Anglo-Globalism”, that “English in culture, like the dollar in economics, serves as the medium through which knowledge may be translated from the local to the global.”
In the last few decades, as globalization has accelerated and the US has remained the world’s most powerful country, the advance of English has taken on a new momentum. In 2008, Rwanda switched its education system from French to English, having already made English an official language 14 years earlier. Officially, this was part of the government’s effort to make Rwanda the tech hub of Africa. Unofficially, it’s widely believed to be an expression of disgust at France’s role in propping up the pre-1994 Hutu-dominant government, as well as a reflection that the country’s ruling elite mostly speaks English, having grown up as exiles in anglophone east Africa. When South Sudan became independent in 2011, it made English its official language despite having very few resources or qualified personnel with which to teach it in schools. The minister of higher education at the time justified the move as being aimed at making the country “different and modern”, while the news director of South Sudan Radio added that with English, South Sudan could “become one nation” and “communicate with the rest of the world” – understandable goals in a country home to more than 50 local languages.
The situation in east Asia is no less dramatic. China currently has more speakers of English as a second language than any other country. Some prominent English teachers have become celebrities, conducting mass lessons in stadiums seating thousands. In South Korea, meanwhile, according to the sociolinguist Joseph Sung-Yul Park, English is a “national religion”. Korean employers expect proficiency in English, even in positions where it offers no obvious advantage.
*
Aneta Pavlenko, an applied linguist at Temple University in Pennsylvania, who has spent her career studying the psychology of bilingual and multilingual speakers, has found that speakers of multiple languages frequently believe that each language conveys a “different self”. Languages, according to her respondents, come in a kaleidoscopic range of emotional tones. “I would inevitably talk to babies and animals in Welsh,” reports a Welsh-speaker. An informant from Finland counters: “Finnish emotions are rarely stated explicitly. Therefore it is easier to tell my children that I love them in English.” Several Japanese speakers say that it’s easier to express anger in English, especially by swearing.
Here is the memoirist Eva Hoffman on the experience of learning English in Vancouver as a teenager, while simultaneously feeling cut off from the Polish she had grown up speaking in Kraków: “This radical disjointing between word and thing is a desiccating alchemy, draining the world not only of significance but of its colors, striations, nuances – its very existence. It is the loss of a living connection.” The Chinese writer Xiaolu Guo described something similar in her recent memoir, writing about how uncomfortable she felt, at first, with the way the English language encouraged speakers to use the first-person singular, rather than plural. “After all, how could someone who had grown up in a collective society get used to using the first-person singular all the time? … But here, in this foreign country, I had to build a world as a first-person singular – urgently.”
In the 1970s, Anna Wierzbicka, a linguist who found herself marooned in Australia after a long career in Polish academia, stood the Sapir-Whorf hypothesis on its head. Instead of trying to describe the worldviews of distant hunter-gatherers, she turned her sociolinguistic lens on the surrounding anglophones. For Wierzbicka, English shapes its speakers as powerfully as any other language. It’s just that in an anglophone world, that invisible baggage is harder to discern. In a series of books culminating in 2013’s evocatively named Imprisoned in English, she has attempted to analyse various assumptions – social, spatial, emotional and otherwise – latent in English spoken by the middle and upper classes in the US and UK.
Reading Wierzbicka’s work is like peeking through a magic mirror that inverts the old “how natives think” school of anthropology and turns it back on ourselves. Her English-speakers are a pragmatic people, cautious in their pronouncements and prone to downplaying their emotions. They endlessly qualify their remarks according to their stance towards what is being said. Hence their endless use of expressions such as “I think”, “I believe”, “I suppose”, “I understand”, “I suspect”. They prefer fact over theories, savor “control” and “space”, and cherish autonomy over intimacy. Their moral lives are governed by a tightly interwoven knot of culture-specific concepts called “right” and “wrong”, which they mysteriously believe to be universal.
Because English is increasingly the currency of the universal, it is difficult to express any opposition to its hegemony that doesn’t appear to be tainted by either nationalism or snobbery. When Minae Mizumura published The Fall of Language in the Age of English in 2008, it was a surprise commercial success in Japan. But it provoked a storm of criticism, as Mizumura was accused of elitism, nationalism and being a “hopeless reactionary”. One representative online comment read: “Who does she think she is, a privileged bilingual preaching to the rest of us Japanese!” (Perhaps unsurprisingly, Mizumura’s broader argument, about the gradual erosion of Japanese literature – and especially, the legacy of the Japanese modernist novel – got lost in the scuffle.)
In California, where I live, most of the languages that were spoken before the arrival of Europeans are already extinct. On America’s eastern seaboard, thanks to long proximity to Anglo settlers, the situation is even worse. Most of what we know about many of these vanished languages comes in the form of brief word lists compiled by European settlers and traders before the 19th century. Stadaconan (or Laurentian) survives only from a glossary of 220 words jotted down by Jacques Cartier when he sailed up the St Lawrence River in Canada in 1535. Eastern Atakapa, from Louisiana’s Gulf Coast, is known from a list of only 287, gathered in 1802. The last fragments of Nansemond, once spoken in eastern Virginia, were collected from the last living speaker just before his death in 1902, by which time he could only recall six words: one, two, three, four, five and dog.
Over the past century, the Earth has been steadily losing diversity at every level of biology and culture. Few deny this is a bad thing. Too often, though, we forget that these crises of diversity depend, to a great extent, on our own decisions. Much of what has been done can also be undone, provided there is the will for it. Hebrew is the most famous case of a language brought back from the dead, but linguistic revitalization has proven possible elsewhere as well. Czech became a viable national language thanks to the work of literary activists in the 19th century. On a much smaller scale, endangered languages such as Manx in the Isle of Man and Wampanoag in the US have been successfully pulled back from the brink.
Before the era of the nation-state, polyglot empires were the rule, rather than the exception. Polyglot individuals abounded, too. For most of history, people lived in small communities. But that did not mean that they were isolated from one another. Multilingualism must have been common. Today, we see traces of this polyglot past in linguistic hotspots such as the Mandara mountains of Cameroon, where children as young as 10 routinely juggle four or five languages in daily life, and learn several others in school.
A resident of another linguistic hotspot, the Sepik region of Papua New Guinea, once told Evans: “It wouldn’t be any good if we talked the same; we like to know where people come from.” It’s a vision of Babel in reverse. Instead of representing a fall from human perfection, as in the biblical story, having many languages is a gift. It’s something to remember before we let English swallow the globe.” ~
https://getpocket.com/explore/item/behemoth-bully-thief-how-the-english-language-is-taking-over-the-planet?utm_source=pocket-newtab
Oriana:
“Diversity” has multiple positive connotations, and Lorca in Spanish sounds like a gift beyond any translation.
At the same time, we have a pragmatic question: how is humanity to communicate? The scientific community needs a common language — we take that absolutely for granted. But what about the world of commerce? Of international aviation, and travel in general? And, as I’ve noted in another essay, the gift of English isn’t simply communication — English is a language of equality. There is a gain in time, thanks both to not wasting it on formal shades of reverence and to greater clarity. What may seem rude at first turns out to be, above all, useful.
By the way, there have always been international languages — until English took over, French was the language of educated Europeans, and not exclusively Europeans, either. Will English be replaced by yet another language? Or perhaps by a simplified version of English, without the current convolutions? (Note, for instance, that New-World Spanish has done away with the “th” sound ubiquitous in European Spanish.)
All we can be sure of is that it won’t be Hopi or Nahuatl.
A view of Lisbon; C. Fishman
*
THE POWER OF MOVIES TO CHANGE HISTORY
~ “Research has shown that people learn very effectively from stories and narratives, which engage our brains in ways that are both pleasurable and incredibly complex, so movies (and not just documentaries) are often ways for people to learn about the past. Our imagination is ready for action, and movies can provide a tantalizing twist, often portraying the World Wars, the Depression, slavery, the Holocaust, or space exploration. Actors can become incorporated into people’s imagery of the past, such as Jim Caviezel as Jesus in Mel Gibson's The Passion of the Christ (2004), or Daniel Day-Lewis as Abraham Lincoln in Lincoln (2012).
Quentin Tarantino’s Once Upon a Time in Hollywood provides an engaging story and background for the 1969 events that led to the murder of Sharon Tate by the Manson clan. Living up to its storybook title (“Once Upon a Time...”), the movie provides (spoiler alert) a much different ending, as Sharon Tate never meets her demise in this tale. Most people over the age of 60 know about the Manson-family murders and Sharon Tate.
The movie provides a much less horrific ending for Tate and an alternative tale — complete with Tarantino-style violence (it involves a flame thrower) — using fantasy instead of historical fact. The Manson clan has the tables turned on them. However, because this deviates from the truth of what happened 50 years ago, it becomes possible that people (especially young adults) will now know a different version of reality — they may not question the movie’s twist on the truth, and may end up believing some of its fictitious events.
Research has shown that presenting people with misinformation — information or an event that is inconsistent with what actually happened but is highly believable — can lead not only to some initial confusion but can then alter memory. As a result of misinformation introduced in psychology experiments on exactly this topic, people will claim to have been lost in a mall as a child after being told this story happened to them, or that as a child they met Bugs Bunny at Disneyland (to refresh your memory, Bugs is a Warner Bros. character and thus couldn’t be seen at Disneyland).
People are prone to believe stories, and whatever makes sense, often without questioning the events being suggested. Movies might provide just the right amount of entertaining and (sometimes subtle) misinformation to alter memories and history in the process. Presenting tales and alternative endings in the context of a real event can make people think about what could have happened if only a few things were different — but these variations on the truth can also lead to implanted memories in people who have only a vague understanding of the past.
Quentin Tarantino is not intentionally trying to dupe people into thinking things were different 50 years ago; instead, he is allowing us to imagine how things could have been different if a few small or seemingly random events had happened, or if certain characters had made different choices. He took creative license to shed a brighter light (flame-thrower style) on a dark event. Movies can allow the mind to imagine, and it is then up to us to differentiate what we imagine from what actually happened in the past; but sleeper effects can make us reimagine the past in ways that have profound effects on our later memory, which can be modified each time we revisit events from the past.
Ideally, movies that provide variations on the past will prompt people to research what actually happened, to gain a more complete understanding of the events; but such movies can also introduce subtle changes in history from the younger viewers’ point of view.” ~
https://www.psychologytoday.com/us/blog/metacognition-and-the-mind/201908/can-hollywood-alter-history-how-film-modifies-memory
And this: “A lot of people are going to focus on the end of “Once Upon a Time ... in Hollywood.” The minute that we see that the film has jumped forward to August of 1969 and that Sharon Tate is very pregnant, anyone with even a passing knowledge of history knows what’s coming. Or at least they think they do. The final few scenes will be among the most divisive of the year, and I’m still rolling around their effectiveness in my own critical brain. Without spoiling anything, I’m haunted by the final image, taken from high above its characters, almost as if Tarantino himself is the puppet master saying goodbye to his creations, all co-existing in a vision of blurred reality and fiction. However, the violence that precedes it threatens to pull the entire film apart (and will for some people). Although that may be the point—the destruction of the Tinseltown dream that casts this blend of fictional and real characters back into Hollywood lore.”
https://www.rogerebert.com/reviews/once-upon-a-time-in--hollywood-2019
Oriana:
I wonder what Hannah Arendt would say about movies that seriously distort history — in this case for the sake of entertainment rather than devious propaganda. But it’s a fine line, especially when history is not pretty — if it were, it would probably never make it into the category of “history.”
All memory is pretty much false memory — but there is a question of degree. Movies are extremely "persuasive." You can get to hate or adore a certain group of people because of a movie. I noticed some of that after "Crazy Rich Asians" vs "The Farewell." Actually that's not the best contrast — it's not about the Chinese culture per se, though the theme of family is strong in both. Rather, "Crazy Rich" made me hate the rich a lot more effectively than all the anti-capitalist propaganda we were presented in school. Not the kind of hatred that would have real-world consequences, at least not in my case. But one can imagine different viewers . . .
BUT WHAT ABOUT BIOGRAPHICAL NOVELS?
~ “We live in an age when biographical novels have become hugely popular, some of them rising to a high level of artistry, as in Colm Tóibín’s The Master (Henry James), Michael Cunningham’s The Hours (Virginia Woolf), or Joyce Carol Oates’ Blonde (Marilyn Monroe). It’s not that fine biographical novels haven’t always been around (see Lotte in Weimar, Thomas Mann’s exhilarating 1939 novel about Goethe, or Marguerite Yourcenar’s Memoirs of Hadrian, a magisterial book published in 1951). But similar works—really good ones—have been coming at us thick and fast in the last few decades.
Traditional literary novels are in decline. The figures bear this out, as in the most recent NEA study of American reading habits. A student of mine recently said to me in frustration: “I just can’t get interested in ‘made-up’ lives.” And I must admit, my own tastes have shifted over the decades away from invented lives. I think I speak for many when I say that it’s biographical novels—which are centered on actual lives and circumstances—that have found a more secure place in my reading (and writing) life.
Philip Roth famously put forward in “Writing American Fiction” (1961) the notion that the clamorous world around us has overtaken fiction. He wondered how a novelist could compete, making a credible fictive reality in light of a world that repeatedly stupefies, sickens, and seems finally “an embarrassment to one’s own meager imagination.” With Trump in the White House, Roth’s commentary seems truer than ever: This tacky, bumptious, and thoroughly implausible creature would read as false in any novel. Nobody would believe it.
On the other hand, true stories hold our attention. Think how many films claim to be “based on actual events.” But “real” lives, so to speak, are difficult to access. I know, having written biographies of Steinbeck, Frost, Faulkner, Jesus and Gore Vidal as well as bio-fictional takes on Tolstoy, Walter Benjamin, Melville, and, most recently, Paul the Apostle in The Damascus Road. On reflection, I think I got far closer to the reality of the life at hand in the novels than in the biographies. The restrictions of straight biography frequently close out any effort to imagine the feelings of the figure at the center of the narrative. One has to rely on letters or journals or interviews for confirmation, and of course even those can be defective.
I would have to guess, for instance, how Steinbeck felt when his first wife cheated on him with a close friend or when Frost’s wife of many decades refused to allow him into the bedroom when she was dying. I wondered about Faulkner’s suicidal drinking habits but had only external evidence, as when Faulkner’s daughter told me her father would sometimes try to “rearrange her features when he was drunk,” as she put it. I was on safer ground with Gore Vidal because he was a close friend; but, even there, I had to limit myself to imagining his feelings if I was to avoid steering uncomfortably into fiction.
[on writing about St. Paul]: It’s for the novelist to imagine the contours of Paul’s inner world, to guess at his motives. I saw him as a repressed homosexual, a man of amazing visionary powers, a godly person who heard voices—including the voice of God. But no scholar writing about Paul would comfortably push into his sexual feelings, his neurotic self-doubts, his anxieties about his friends, his risky compulsion to move through the fraught and dangerous world of the Roman Empire in order to bring the Good News to the masses.
While writing this, I would often reread my favorite biographical novels for encouragement, and in recent years there have been so many to choose from: Hilary Mantel’s glittering trilogy about Thomas Cromwell, or Paula McLain’s The Paris Wife, which centers movingly on Hemingway’s love affair with Hadley in Paris in the 1920s. I went back to The Secret Life of Emily Dickinson by Jerome Charyn and Ann Beattie’s implausible, arresting, and underrated Mrs. Nixon. I reread Tracy Chevalier’s Girl with a Pearl Earring and Gore Vidal’s Lincoln: these have become permanent fixtures in the pantheon of bio-fiction.
Any imagined life is both less and more than real. It’s less real in the sense that it’s not possible to resurrect the actual person. Even then, can one really know another person? Fiction offers the one and only way we have to get into the head of somebody not ourselves. If this person is someone of interest for one reason or another, there is all the more reason to want to know them and their world more deeply.
And there is a truthfulness in fiction that is simply unavailable to the academic biographer.
When I was writing The Last Station, a novel about Tolstoy’s final year, for example, I knew from biographical sources that Sofya Tolstoy had thrown herself into the pond on their property one day in 1910, moments after she discovered that her husband had left her for good. What I could not do was know what she was thinking and feeling as she dropped through those sheets of black water. What was the quality of her despair? This is the kind of thing only a novelist can tell us, or try to tell us. And—in increasing numbers—they’re giving it a whirl, often succeeding in ways that are changing the face of modern fiction.” ~
https://lithub.com/reading-in-a-boom-time-of-biographical-fiction/
GOSPEL ACCORDING TO CRUDE OIL
~ “Historian Darren Dochuk argues in his new book, Anointed with Oil: How Christianity and Crude Made Modern America, that the search for fossil fuels has itself long been overlaid with Christian commitment. Oil executives themselves historically have been among the most active and enthusiastic promoters of apocalyptic Christianity in the United States, their zeal to drill representing their religious passion as well as their quest for self-enrichment. Over the course of U.S. history, Dochuk writes, oil companies “openly embraced the theological imperatives that informed their chief executives, aligned their boardrooms with biblical logics, and sacralized their operations as modes of witness and outreach.” Because of the heavy investment of the industry in religious faith, oil, for Dochuk, has become more than just a commodity or an energy source. Its “grip on the human condition” is “total”; it has become “an imprint on America’s soul.”
For the sociologist Max Weber, capitalism was defined by the distinctive way that it taught people to approach their work. As he wrote in The Protestant Ethic and the Spirit of Capitalism (1905), people had to be taught to treat their labor—whatever it might be—with the seriousness of purpose devoted to a calling: to come to their jobs day after day on time, to labor with dedication, and to postpone a life of pleasure. The Calvinist creed—according to which worldly riches were to be sought not for their own pleasures but as evidence of God’s grace, a bulwark against the loneliness and powerlessness of each individual before the divine—taught people how to act in a capitalist order.
Oil, Dochuk suggests, at once underwrote and was fueled by a different system of religious belief. Oil is an industry of speculation, of rocky land that hides wonders unseen, of alchemic transformation of the raw materials of the earth into the fuel of industrial society. As Dochuk shows, many of the men (and they were mostly men) who spent their lives drilling for oil also subscribed to belief systems revolving around the notion that the world contained spiritual paradoxes not comprehensible through science alone. Their desire to become phenomenally wealthy was often inextricable from their longing to carry out what they saw as God’s work on earth.
Dochuk opens his story with Patillo Higgins, one of the first entrepreneurs to strike oil in Texas. One Sunday afternoon in 1891, Higgins had escorted a Sunday-school class of eight-year-old girls up to the top of Spindletop Hill to show them springs of water bubbling forth from the rocky ground. While instructing them in this “everyday application of religion,” he had chanced to notice gaseous clouds rising up from the earth as well—a sign that oil might be hiding underneath. Returning to his small town of Beaumont, Texas, he bought the plot of land that held the springs with his church elder.
On New Year’s Day, 1901, a drill team finally struck oil. Three million barrels of oil poured forth in the thirty days that followed, and the population of Beaumont ballooned from 6,000 to 50,000. And a new pantheon of oil companies—Gulf, Texaco, and Sun Oil Company, smaller than Standard Oil but still substantial—rose out of the wells of Texas.
Much of Anointed with Oil is organized around the idea that the division in the oil industry between the independents and the majors (especially Standard Oil, which at its peak in the late nineteenth century controlled nearly 90 percent of all the oil refining in the country) was echoed in two competing versions of Christianity and capitalism: the “wildcat Christianity” of the independents and the “civil religion of crude” promoted by Rockefeller and his firm. On the one hand, Rockefeller was a severe and pious Baptist who believed in the responsibility of those with riches to improve the social order. “I believe the power to make money is a gift from God,” he argued. “Having been endowed with the gifts I possess, I believe it is my duty to make money and still more money, and to use the money I make for the good of my fellow man according to the dictates of my conscience.” Rockefeller taught Sunday school each week for sixty years, went to prayer before attending to crises at the oil fields, shuttered pubs and closed brothels in his oil towns. And he built the mammoth bureaucratic entity of Standard Oil, which he described as the “salvation of the oil business,” its executives “missionaries of light.” He was inspired by the conviction that his refining enterprise could provide “collective salvation” for the industry by introducing rationality where disastrous competition had prevailed. “The Standard was an angel of mercy,” he argued, “reaching down from the sky, and saying, ‘Get into the ark. Put in your old junk. We’ll take all the risks!’”
Opposing Rockefeller were the wildcatters, driven by their own mystical version of faith, one far more ragged and improvisational. They clung to “an absolute essence of pure capitalism” that safeguarded their ability to make money in whatever way they thought best, and had nothing but contempt for Rockefeller’s efforts to rationalize the industry or to contain competition. Oil was a way for the past to speak to the present, a sign of God’s glory and of riches for the prophet who could see through the earth’s surface to glimpse another world beyond. Dochuk tells the story of one Montana coal miner turned aspiring oilman whose correspondence with spiritualists and astrologists drove his quest for oil; they reassured him that by “taking minerals out of the earth” he was “allowing them to transmute into higher forms, synthesize with human need and desire, and serve as further reminders that the universe leaned toward unity.” Oil, the would-be driller came to believe, was the “Magic Wealth Producer!” (as one Texas town’s boosters advertised) — and finding it would allow him to contribute to the spiritualist cause.
Other oil independents, such as Lyman Stewart—one of the founders of Union Oil—subscribed to premillennialism, which held that end times were nigh and the arrival of Christ imminent. The world would soon descend into chaos, evil, and disorder, evidence of which could easily be found in the tumultuous society of the turn of the century. But all was well, for after a period of tribulations it would then be reborn. For Stewart, Dochuk suggests, the worldview of premillennialism rhymed with his life experiences seeking oil: the sense of powerlessness before supernatural, otherworldly forces, the pendulum of fortune swinging wildly to and fro. Stewart went on to imitate his arch-enemy, Rockefeller, in his philanthropic efforts—donating money to support religious education and to fund the publication of fundamentalist Christian texts.
Perhaps the greatest stronghold of “wildcat Christianity” was East Texas, where oil was discovered just as the Great Depression took hold. The oil boom that followed — the largest in American history — at once inspired and helped promote what Dochuk describes as “end-times urgency.” The denizens of East Texas believed they were blessed with oil, charged with using it to build God’s kingdom on earth, and pressed to do so quickly before the gifts that had been extended to them disappeared. Independent oil producers operated a majority of wells in the region throughout the 1930s. Church lots were littered with oil derricks as ambitious oilmen sought to drill wherever they could, while enterprising ministers dreamed of striking it rich; Dochuk describes a congregation gathering to pray over a new well. The “rush to obtain oil,” he writes, “always worked according to earth’s (and God’s) unknowable clock, with depletion (and Armageddon) an inevitability lingering on the horizon.” Their faith was undimmed even after the 1937 New London disaster, in which a gas explosion at a public school newly built for the children of oil workers killed about 300 students. As one religious leader put it in the aftermath, “These dear oil field people can set the world an example for consecration, and they will.” The intense melding of political and religious ideas with economic interest helped to make Texas one of the hotbeds of opposition to Roosevelt and to New Deal liberalism in the years that followed the Second World War.
Still, by the early twenty-first century, the old certitudes were running out. Oil independents with their strong ties to evangelical Christianity believed that their fortunes were rising with Ronald Reagan’s election to the White House—but a glut in world oil markets that led to falling prices in the 1980s put many out of business, never to recover. Meanwhile, the oil giants no longer seemed able to promise stable, peaceful economic development to the rest of the world. The fragmentation of the Rockefeller dynasty was the most dramatic example. “Most of the fourth Rockefeller generation have spent long years with psychiatrists in their efforts to grapple with the money and the family, the taint and the promise,” pronounced one 1976 exposé. By the end of the twentieth century, Steven Rockefeller, a professor of religion at Middlebury College, had started to steer his family foundation toward positions that would have horrified his great-great-great grandfather—especially advocacy for environmental conservation.” ~
https://bostonreview.net/philosophy-religion/kim-phillips-fein-gospel-oil
Here are the 10 states with the highest percentage of millionaires
10. California: 6.61 percent. (Cost of living: 33 percent above national average.)
9. Delaware: 6.62 percent. (Cost of living: 0.6 percent below national average.)
8. Virginia: 6.98 percent. (Cost of living: 1.7 percent below national average.)
7. New Hampshire: 7.36 percent. (Cost of living: 14.7 percent above national average.)
6. Massachusetts: 7.41 percent. (Cost of living: 20.7 percent above national average.)
5. Alaska: 7.50 percent. (Cost of living: 18.5 percent above national average.)
4. Hawaii: 7.57 percent. (Cost of living: 26.9 percent above national average.)
3. Connecticut: 7.75 percent. (Cost of living: 18.5 percent above national average.)
2. New Jersey: 7.86 percent. (Cost of living: 13.4 percent above national average.)
1. Maryland: 7.87 percent. (Cost of living: 21.4 percent above national average.)
Oriana:
Well, I can see a surgeon "earning" his income, etc. — a few top professions. But even those people pretty soon go into investments, since money makes money more effectively than hard work does, and investment income is taxed at a lower rate. (Note that Warren Buffett was astonished, and indignant, that his secretary paid a higher tax rate than he did.)
THE LIFELONG JOURNEY OF RECOVERY
“I think I was about 15 when I conceived of myself as an atheist, but I think it was only very recently that I can really tell that there's nobody there with a copybook making marks against your name.” ~ Sharon Olds
This confirms what I’ve been saying for a while now: it can take a long, long time — half a lifetime or longer — let's face it, a lifetime — to recover from the “god of punishment” and rejoice in the knowledge that there is no punishment just for being human.
Since the existence of the God of Punishment isn’t so obvious, it turns out that all kinds of people are perfectly willing to take on the function: “I’ll be the God of Punishment.” I realize that people who had punitive (a nicer word than “abusive,” isn't it?) parents, who knew they had not been wanted children, or who for any reason felt that they were "bad" when growing up, face the same challenge of a lifelong recovery.
So there are all kinds of petty gods of punishment on parade, and we can’t avoid dealing with at least some of them. The older we get, the more we tend to find such people pathetic.
And sometimes life isn't long enough for full healing. Life rushes on whether or not you've reached clarity about the past and understood that (typically) it was not your fault. Recently I came upon this passage in a poem of mine, about blood drawing:
My veins are baby-fine
but my blood is dark red.
This worries me: my blood
so dark with the years,
but silent about
the shipwrecks of my life.
But of course it's a blessing that the blood is silent. The last thing I want is to be reminded of those shipwrecks — and lately they seem to have receded, without my really trying. Coming to terms with the past usually happens automatically — the brain is clever, and it's the unconscious that mysteriously, effortlessly takes care of such tasks — when it is ready. Before such readiness, the best solution can be simply to not think, deliberately, about the shipwrecks of the past — aren’t the current ordeals ENOUGH?
Remember: rumination is a habit, a behavior, and a behavior can be changed. You are not helpless over it, but you do need a motivation (for me the idea that I didn’t want to waste what precious few good years remained was enough).
Introspection can be dangerous for the insufficiently healed. Crying fits and time wasted brooding can follow. That’s why, during the immediate period of recovery from depression, I used to keep a “no-think zone.” “How can I be useful today?” is a more urgent matter.
This is a Hellenistic bronze, the Boxer at Rest, c. 330 BCE. I especially admire the face. How wonderful that the Greeks did not have a prohibition on making "graven images" (btw, the Catholic Church tossed/falsified that commandment, for which I am infinitely grateful).
Ending on beauty:
God and I were walking in the woods
just down the road this morning.
He was quiet for a long time,
and I finally asked him what he was thinking.
He didn’t hesitate at all this time.
He said, I was thinking about Eden
and how happy we all were there.
I wish I’d given that young couple
another chance.
~ John Guzlowski
Adam and Eve by an Iraqi artist