Saturday, January 11, 2020

BORGES: IMMORTALITY IS OTHER PEOPLE; JOE BIDEN AND THE IRISHNESS OF LIFE; JANE AUSTEN, THE SECRET RADICAL; THE THE; THE 3.5% RULE OF REVOLUTIONARY CHANGE; CHRISTIANITY WITHOUT HELL?

Borges in Palermo
*
INSCRIPTION ON ANY TOMB

Let not the rash marble risk
garrulous breach of omnipotent oblivion,
in many words recalling
name, renown, events, birthplace.
All those glass beads are best left in the dark.
Let not the marble say what men don’t speak about.

The essentials of the dead man’s life —
the trembling hope,
the implacable miracle of pain,
the wonder of sensual delight —
will abide forever.

Blindly the willful soul asks for length of days
when its survival is assured by the lives of others.
You yourself are the embodied continuance*
of those who did not live into your time
and others will be (and are) your immortality on earth.

~ Jorge Luis Borges, tr. W.S. Merwin

* Borges: tu mismo eres el espejo y la réplica

= “you yourself are the mirror and the reply”

I don’t know why Merwin chose to “explicate” the poetry of that line in the original. He made the meaning clearer, but at the cost of abstraction (“embodied continuance”).

What Merwin renders as “willful soul” is in Borges “arbitrary soul” — which carries the connotation of “accidental.”

*

This poem presents a profound answer so often missing in poetry, much of which deals with mortality, but not in a very satisfying way. We'd like to be immortal, but the old religious answers are not credible anymore. In any case, it's high time to be rid of the cruel archaic god whose power rests on the fear of hell, and whose heaven, let's face it, is utterly unattractive (a reminder: no internet there).

Borges provides a secular answer, and it's not the cosmic union. He does not warble that “we become stardust again.” After the richness of being human, that's not terribly attractive either. Borges says that our immortality is the lives of those who come after us, just as our ancestors continue in us. It may not be exactly what we'd want, but that's what we get. There is a sense of honesty here that's exhilarating in itself.

Borges was such a “singular” man (I mean it in the sense of unusual, exceptional — but the word insists on its most common meaning) that it’s striking how he doesn’t buy “individualism.” He does not insist on his “exceptionalism.” He does not announce that his verse will be immortal, or will make anyone immortal. No, we are all destined for “omnipotent oblivion.” But . . . simply because we are human, we are not isolated individuals; we are humanity. We pass as the water in the river passes, but the river remains.

Forget the particulars of a person’s life; they will be forgotten anyway. They are the worthless “glass beads.” What matters is the treasure of continuity, of being part of humanity. No one is just an isolated person, but a portion of humanity.

This realization may have come to Borges in part from his life among books. He realized that his mind was a mosaic of the endless volumes he’d read, the influences he’d absorbed. From there it’s only a step to seeing oneself as part of the larger human community across time, and of the human continuum.

His acceptance of the collective mind set Borges apart from those writers in his generation who insisted on the cult of the artist as “separate, different, and superior” — alienated from the culture at large, and basically from all others. But Borges communed not only with his peers, but also, to a great extent, with great writers of the past, and knew he was part of a continuum.

It reminds me of Rilke’s idea that we love not just a single person, but, within that person, multitudes of others who’ve gone before — mothers and fathers, those crumbled mountains and dry riverbeds who shaped the landscape of the beloved. 



This is not to deny the uniqueness of each of us, something we bring to the universe only once: “there will never be another you.” That’s astonishing too; we tend to celebrate the uniqueness of an individual, those aspects that stand out as different. This individualism can make us forget how typical we are in most ways, children of our age and culture. In the West in particular, almost everyone has had at least moments of feeling so different from others that loneliness threatens to overwhelm: no one really knows me, so how can they love the “real me”? Never mind that the “real me” is so elusive, so . . . non-existent. Even our memories are not fully ours, but a collage of what we absorbed in all kinds of ways, including books and movies and the stories we heard, eventually conflating them with our own.

If we were words, each person would be an oxymoron. We are both individual and collective. A single individual has no meaning apart from his social context. As Christian Wiman said, “Experience means nothing if it does not mean beyond itself: we mean nothing unless and until our hard-won meanings are internalized and catalyzed within the lives of others.”

Pondering how deeply I agree with this reminds me that it wasn’t always that way. At some point in my late teens and early twenties I felt a deep alienation and wanted to be as independent and unconnected as possible, not needing others. That is of course not infrequent in early youth, this wanting to be a lone hero — I’ve definitely seen it in others, especially young men. The culture enshrines the apparently self-sufficient frontiersman who builds his own house, grows his own food, hunts, and so on. It may seem humbling to realize how dependent on others we are. Even the lone frontiersman has the benefit of tools designed and produced by others.

And experience quickly shows us that a lot more can be done by cooperating with others. As for writers, they may say it’s between them and the language, but that’s true only during the creative process. Writers need readers, and they need their peers. And even the most reclusive of them (think of Emily Dickinson) rely on the collective psyche, that “greater mind” that resides in the language and culture. And that greater mind is so rich that even a recluse has enough to write about — especially an educated recluse surrounded by books.

So, in the most radical sense, there should be no inscription upon any tomb. This was once a human being; that is enough. And yet, and yet . . . I confess I love to read inscriptions on old tombs, the little that remains of that “individual” part. Now, however, cremation followed by the scattering of ashes is becoming standard, so the dead live only in our memory, as well as in the anonymous ways in which the words they said, the pain and delight they felt, are now said and felt by others. We are both mortal and immortal, individual and collective, Borges says. In this poem, his emphasis is on the collective. This is quite striking coming from Borges, who seems deeply “individuated,” as Jungians would say. Perhaps that’s why he feels completely at ease insisting on the collective dimension as the one that’s more essential.

And indeed, treasured reader (I say this without a grain of irony; I'm touched that people want to read my posts), the very fact that you are reading these words on an electronic screen is testimony to the collective human genius that created the great collective mind known as the Internet. And before the Internet could come into being, countless developments in science, mathematics, and civilization as a whole had to happen.

*

“Eternity is a long time, especially towards the end.” ~ Stephen Hawking 


*

JANE AUSTEN, THE SECRET RADICAL

~ “Jane was born five years after the poet William Wordsworth, the year before the American Revolution began. When the French Revolution started, she was thirteen. For almost all of her life, Britain was at war. Two of her brothers were in the navy; one joined the militia. For several years she lived in Southampton, a major naval base. It was a time of clashing armies and warring ideas, a time of censorship and state surveillance. Enclosures were remaking the landscape; European empire building was changing the world; science and technology were opening up a whole universe of new possibilities.

We’re perfectly willing to accept that writers like Wordsworth were fully engaged with everything that was happening and to find the references in their work, even when they’re veiled or allusive. But we haven’t been willing to do it with Jane’s work. We know Jane; we know that however delicate her touch she’s essentially writing variations of the same plot, a plot that wouldn’t be out of place in any romantic comedy of the last two centuries.
We know wrong.


*


Think of Jane’s landowners, of her soldiers, her clergymen, her aristocrats. In Sense and Sensibility, John Dashwood feels that generosity to his fatherless, impoverished sisters would demean him; in Mansfield Park, Henry Crawford elopes with a married woman, the cousin of the very woman he has proposed marriage to. In Pride and Prejudice, the militia officers quartered in the heroine’s hometown spend their time socializing, flirting, and—on one occasion—cross-dressing, rather than defending the realm. The Reverend Mr. Collins is laughable. None of Jane’s clergymen characters have vocations, or even seem to care very much about the well-being—spiritual or physical—of their parishioners. Does Mr. Darcy’s arrogant, interfering aunt Lady Catherine de Bourgh look like a character designed to justify the aristocracy? Or Persuasion’s vain Sir Walter Elliot, who spends his time keeping up appearances with money that he doesn’t have?

Think, too, about the fact that Jane was the only novelist of this period to write novels that were set more or less in the present day and more or less in the real world—or, at any rate, a world recognizable to her readers as the one in which they actually lived.

The novels as they were printed are what bring us as close to Jane as we’re ever going to get, closer than any memoir or biography could—closer not necessarily to what she might have done or felt but to what she thought. It’s impossible for anyone to write thousands upon thousands of words and reveal nothing of how she thinks or what she believes. And, contrary to popular opinion, Jane did reveal her beliefs, not just about domestic life and relationships, but about the wider political and social issues of the day.

Jane’s novels were produced in a state that was, essentially, totalitarian. She had to write with that in mind. The trick was never to be too explicit, too obvious, never to have a sentence or a paragraph to which someone could point and say, “Look, there—it’s there you criticize the state, it’s there you say that marriage traps women, that the Church is crammed with hypocrites, that you promote breaking society’s rules.” Jane did fail, once, to err on the side of caution. Mansfield Park, alone of all her books, wasn’t reviewed on publication. This, as I will show, is because it was an inescapably political novel, from the title onward—a “fanatical novel” that continually forced its readers to confront the Church of England’s complicity in slavery.

Jane talks in one letter about wanting readers who have “a great deal of ingenuity,” who will read her carefully. In wartime, in a totalitarian regime, and in a culture that took the written word far more seriously than we do, she could have expected to find them. Jane expected to be read slowly—perhaps aloud, in the evenings, or over a period of weeks as each volume was borrowed in turn from the circulating library. She expected that her readers would think about what she wrote, would even discuss it with each other.


She never expected to be read the way we read her, gulped down as escapist historical fiction, fodder for romantic fantasies. Yes, she wanted to be enjoyed; she wanted people to feel as strongly about her characters as she did herself. But for Jane a story about love and marriage wasn’t ever a light and frothy confection. Generally speaking, we view sex as an enjoyable recreational activity; we have access to reliable contraception; we have very low rates of maternal and infant mortality. None of these things were true for the society in which Jane lived. The four of her brothers who became fathers produced, between them, 33 children. Three of those brothers lost a wife to complications of pregnancy and childbirth. Another of Jane’s sisters-in-law collapsed and died suddenly at the age of 36; it sounds very much as if the cause might have been the rupturing of an ectopic pregnancy, which was, then, impossible to treat. 


Marriage as Jane knew it involved a woman giving up everything to her husband—her money, her body, her very existence as a legal adult. Husbands could beat their wives, rape them, imprison them, take their children away, all within the bounds of the law. Avowedly feminist writers such as Mary Wollstonecraft and the novelist Charlotte Smith were beginning to explore these injustices during Jane’s lifetime. Understand what a serious subject marriage was then, how important it was, and all of a sudden courtship plots start to seem like a more suitable vehicle for discussing other serious things.


No more than a handful of the marriages Jane depicts in her novels are happy ones. And with the possible exception of Pride and Prejudice, even the relationships between Jane’s central characters are less than ideal—certainly not love’s young dream. Marriage mattered because it was the defining action of a woman’s life; to accept or refuse a proposal was almost the only decision that a woman could make for herself, the only sort of control she could exert in a world that must very often have seemed as if it were spiraling into turmoil. 


Jane’s novels aren’t romantic. But it’s become increasingly difficult for readers to see this.

*
Jane’s world is a world in which parents and guardians can be stupid and selfish; in which the Church ignores the needs of the faithful; in which landowners and magistrates—the people with local power—are eager to enrich themselves even when that means driving the poorest into criminality. Jane’s novels, in truth, are as revolutionary, at their heart, as anything that Wollstonecraft or Tom Paine wrote. But by and large, they’re so cleverly crafted that unless readers are looking in the right places—reading them in the right way—they simply won’t understand.

https://lithub.com/the-many-ways-in-which-we-are-wrong-about-jane-austen/?fbclid=IwAR0e_TpNfc7kKLw2NilPqpYrrYVuh_6orWlR0DTb5ko4BQUDwV48BZQJKvU
 

Mary:

The numbers of children and dead wives in her family alone illustrate the cogency of the writer's point. Marriage and childbirth were serious, life and death matters that women had little choice in or control over.



TIGHT BREECHES, LOOSE DRESSES, FAKE PREGNANCY PADS; THE FASHION WORLD OF JANE AUSTEN 

~ “The Regency is defined by strong clothing narratives. The French embraced Anglomania, waistlines rose and dresses turned white and flimsy, ornamented with fripperies borrowed from other times and cultures. Women’s heads retreated into bonnets; their bosoms were newly defined and uplifted. Men transformed Classicism into a focus on the athletic body. Their muscular thighs sprang into pale, defined relief by contrast with their broad, wool-clad shoulders. Disrupted from the Continent by war and blockade, British fashion embraced French style, after Napoleon’s final defeat in 1815, to succumb to a new tide of romantic influence.

How Austen’s contemporaries saw people wearing clothes is not the same as how we see them retrospectively. To the Regency observer in London’s streets a Frenchman stood out immediately, as did an English miss strolling in the Tuileries to Parisians. The dandy seeking perfect fit found it in a tighter jacket than any gentleman now would tolerate, while visible shoulder blades and upper arms could constitute scandalous female nakedness. Throughout, I have sought what was “entirely taken for granted” in dress during Austen’s lifetime and re-read her writings in the context of how she and her audience would have understood the clothed Regency body.

“In any search for Jane Austen,” as Emily Auerbach cautions, “we must break free of dear Aunt Jane . . . We must strip off those ruffles and ringlets added to her portrait, restore the deleted fleas and bad breath to her letters, and meet Jane Austen’s sharp, uncompromising gaze head on.” In dress terms the “ruffles and ringlets” are the fashion-plate, screen idealizations of Regency dress, and the “fleas and bad breath” are prosaic flannel underwear, stockings darned into lumps, and muddy, manure-coated streets. However, filmed Austen can suggest the lived effect of clothes in her lifetime, “of interest . . . as objects of desire in their own right.” If readers, re-enactors, curators, collectors, writers and designers now desire Regency clothing, the screen has shaped their vision.

https://lithub.com/tight-breeches-and-loose-gowns-going-deep-on-the-fashion-of-jane-austen/?fbclid=IwAR1qV9eLmnySOujcNb3VAAvjGdj-KPXphGP6qO946eX27D1bGqQB8F7Abw0



~ “With the Classical style came the willingness to expose the breast. With the new iconography of the Revolution as well as a change in emphasis on maternal breast-feeding, the chemise dress became a sign of the new egalitarian society. The style was simple and appropriate for the comfort of a pregnant or nursing woman as the breasts were emphasized and their availability was heightened. Maternity became fashionable and it was not uncommon for women to walk around with their breasts exposed. Some women took the "fashionable maternity" a step further and wore a "six-month pad" under their dress to appear pregnant.” ~ Wiki

*
“I am too intelligent, too demanding, and too resourceful for anyone to be able to take charge of me entirely. No one knows me or loves me completely. I have only myself.” ~ Simone de Beauvoir

She was insightful as a feminist, though not particularly in her politics.
 
*

JOE BIDEN AND THE IRISHNESS OF LIFE
 
~ “In Promise Me, Dad, Biden quotes one of the grand figures of Irish-American politics, Daniel Patrick Moynihan: “To fail to understand that life is going to knock you down is to fail to understand the Irishness of life.” So while an African-American choral group was chosen to play “joyful music” at Beau’s memorial service, Biden notes that there were also “bagpipers to add the mournful, plaintive wail of Irishness.” In his address at the service, Barack Obama quoted a line from a song by the Irish poet Patrick Kavanagh, in which mourning is as inevitable as the passing of seasons: “And I said, let grief be a fallen leaf at the dawning of the day.”

It scarcely matters here whether there’s much truth in the notion that the Irish have a particularly familiar relationship with grief. What does matter is that the “mournful, plaintive wail of Irishness” is the soundtrack for both the Kennedy and the Biden stories, in which triumph is always shadowed by calamity. There is in this structure of feeling no easy opposition of hubris and nemesis. There is just, as Obama said to Biden when Beau was dying, the awareness that “life is so difficult to discern”—difficult because it does not offer itself in the easy forms of the wonderful and the terrible but confuses the two by conjoining them as twins. The political manifestation of this awareness is not the upbeat rhetoric of the American Dream; it is a politics of empathy in which the leader shares the pain of the citizen. While Biden seems hollow when he deploys the former, he has been a forceful practitioner of the latter. “We had to speak for those who felt left behind,” he writes in Promise Me, Dad. “They had to know we got their despair.” Biden has always been better at getting despair than at giving concrete, programmatic form to hope.

With Biden, fellow feeling is literal—he feels you. He is astonishingly, overwhelmingly hands-on. He extended the backslapping of the old Irish pol into whole new areas of the body—hugging, embracing, rubbing. In his foreword to Steven Levingston’s engaging account of the Biden–Obama relationship, Barack and Joe, Michael Eric Dyson writes of the vice president’s “reinforcing his sublimely subordinate position by occasionally massaging the boss’s shoulders.” But Cramer noted Biden doing the same thing to an anonymous woman at a campaign stop in 1987: “Gently, but decidedly, he put his hands on her. In Council Bluffs, Iowa! He got both hands onto her shoulders, while he talked to the crowd over her head, like it was her and him, through thick and thin.” So not really a gesture of submission or of domination, perhaps, but a desperate hunger to connect, to touch and be touched, to both console and be consoled. “The act of consoling,” Biden writes, “had always made me feel a little better, and I was hungry to feel better.”

But the Master Clock has moved too far forward. The Kennedys are too long dead. “Irish Catholic” no longer carries that old underdog voltage of resistance to oppression. The center of gravity of Irish-American politics now gathers around Trump: Mick Mulvaney, Kellyanne Conway, Brett Kavanaugh. A politics of white resentment has drowned out the plaintive wail of common sorrow. The valley of tears has been annexed as a bastion of privileged white, male suffering. Biden, who once promised to turn back time, is an increasingly poignant embodiment of its pitilessness.” ~

https://www.nybooks.com/articles/2020/01/16/joe-biden-designated-mourner/?utm_medium=email&utm_campaign=NYR%20Joe%20Biden%20Alma%20Mahler%20Paul%20Bowles%20and%20Cats&utm_content=NYR%20Joe%20Biden%20Alma%20Mahler%20Paul%20Bowles%20and%20Cats+CID_a1178783584349e376e29c1c7dcfb064&utm_source=Newsletter

Oriana:

This is a beautifully written article. I love both its style and insights.

By the way, I think that the “Polishness of life” would stand for the same thing, but Polish history, unlike Irish history, is not much known outside of Poland.

*


Edward: "Charles I at his trial. Charles' barber was dismissed and Charles let his beard grow because there was no one in Parliament whom he trusted handling a razor. Alas, he was found guilty at his trial, and was beheaded."
*

“Against eternal injustice, man must assert justice, and to protest against the universe of grief, he must create happiness.” ~ Albert Camus

*

THE ‘THE’


~ “It’s omnipresent; we can’t imagine English without it. But it’s not much to look at. It isn’t descriptive, evocative or inspiring. Technically, it’s meaningless. And yet this bland and innocuous-seeming word could be one of the most potent in the English language.


‘The’ tops the league tables of most frequently used words in English, accounting for 5% of every 100 words used. “‘The’ really is miles above everything else,” says Jonathan Culpeper, professor of linguistics at Lancaster University. But why is this? The answer is two-fold, according to the BBC Radio 4 programme Word of Mouth. George Zipf, a 20th-Century US linguist and philologist, expounded the principle of least effort. He predicted that short and simple words would be the most frequent – and he was right.

The second reason is that ‘the’ lies at the heart of English grammar, having a function rather than a meaning. Words are split into two categories: expressions with a semantic meaning and functional words like ‘the’, ‘to’, ‘for’, with a job to do. ‘The’ can function in multiple ways. This is typical, explains Gary Thoms, assistant professor in linguistics at New York University: “a super high-usage word will often develop a real flexibility”, with different subtle uses that make it hard to define. Helping us understand what is being referred to, ‘the’ makes sense of nouns as a subject or an object. So even someone with a rudimentary grasp of English can tell the difference between ‘I ate an apple’ and ‘I ate the apple’.

But although ‘the’ has no meaning in itself, “it seems to be able to do things in subtle and miraculous ways,” says Michael Rosen, poet and author. Consider the difference between ‘he scored a goal’ and ‘he scored the goal’. The inclusion of ‘the’ immediately signals something important about that goal. Perhaps it was the only one of the match? Or maybe it was the clincher that won the league? Context very often determines sense.


There are many exceptions regarding the use of the definite article, for example in relation to proper nouns. We wouldn’t expect someone to say ‘the Jonathan’ but it’s not incorrect to say ‘you’re not the Jonathan I thought you were’. And a football commentator might deliberately create a generic vibe by saying, ‘you’ve got the Lampards in midfield’ to mean players like Lampard.


The use of ‘the’ could have increased as trade and manufacture grew in the run-up to the industrial revolution, when we needed to be referential about things and processes. ‘The’ helped distinguish clearly and could act as a quantifier, for example, ‘the slab of butter’.

This could lead to a belief that ‘the’ is a workhorse of English; functional but boring. Yet Rosen rejects that view. While primary school children are taught to use ‘wow’ words, choosing ‘exclaimed’ rather than ‘said’, he doesn’t think any word has more or less ‘wow’ factor than any other; it all depends on how it’s used. “Power in language comes from context... ‘the’ can be a wow word,” he says.


This simplest of words can be used for dramatic effect. At the start of Hamlet, a guard’s utterance of ‘Long live the King’ is soon followed by the apparition of the ghost: ‘Looks it not like the King?’ Who, the audience wonders, does ‘the’ refer to? The living King or a dead King? This kind of ambiguity is the kind of ‘hook’ that writers use to make us quizzical, a bit uneasy even. “‘The’ is doing a lot of work here,” says Rosen.


Deeper meaning


‘The’ can even have philosophical implications. The Austrian philosopher Alexius Meinong said a denoting phrase like ‘the round square’ introduced that object; there was now such a thing. According to Meinong, the word itself created non-existent objects, arguing that there are objects that exist and ones that don’t – but they are all created by language. “‘The’ has a kind of magical property in philosophy,” says Barry C Smith, director of the Institute of Philosophy, University of London.


The British philosopher Bertrand Russell wrote a paper in 1905 called On Denoting, all about the definite article. Russell put forward a theory of definite descriptions. He thought it intolerable that phrases like ‘the man in the Moon’ were used as though they actually existed. He wanted to revise the surface grammar of English, as it was misleading and “not a good guide to the logic of the language”, explains Smith. This topic has been argued about, in a philosophical context, ever since. “Despite the simplicity of the word,” observes Thoms, “it’s been evading definition in a very precise way for a long time.”

Scandinavian languages such as Danish or Norwegian and some Semitic languages like Hebrew or Arabic use an affix (or a short addition to the end of a word) to determine whether the speaker is referring to a particular object or using a more general term. Latvian or Indonesian deploy a demonstrative – words like ‘this’ and ‘that’ – to do the job of ‘the’. There’s another group of languages that don’t use any of those resources, such as Urdu or Japanese.

Function words are very specific to each language. So, someone who is a native Hindi or Russian speaker is going to have to think very differently when constructing a sentence in English. Murphy says that she has noticed, for instance, that sometimes her Chinese students hedge their bets and include ‘the’ where it is not required. Conversely, Smith describes Russian friends who are so unsure when to use ‘the’ that they sometimes leave a little pause: ‘I went into... bank. I picked up... pen.’ English speakers learning a language with no equivalent of ‘the’ also struggle and might overcompensate by using words like ‘this’ and ‘that’ instead.

Atlantic divide


Even within the language, there are subtle differences in how ‘the’ is used in British and American English, such as when talking about playing a musical instrument. An American might be more likely to say ‘I play guitar’ whereas a British person might opt for ‘I play the guitar’. But there are some instruments where both nationalities might happily omit ‘the’, such as ‘I play drums’. Equally the same person might interchangeably refer to their playing of any given instrument with or without the definite article – because both are correct and both make sense.


And yet, keeping with the musical vibe, there’s a subtle difference in meaning of ‘the’ in the phrases ‘I play the piano’ and ‘I clean the piano’. We instinctively understand the former to mean the piano playing is general and not restricted to one instrument, and yet in the latter we know that it is one specific piano that is being rendered spick and span.

Culpeper says ‘the’ occurs about a third less in spoken language. Though of course whether it is used more frequently in text or speech depends on the subject in question. A more personal, emotional topic might have fewer instances of ‘the’ than something more formal. ‘The’ appears most frequently in academic prose, offering a useful word when imparting information – whether it’s scientific papers, legal contracts or the news. Novels use ‘the’ least, partly because they have conversation embedded in them.

According to Culpeper, men say ‘the’ significantly more frequently. Deborah Tannen, an American linguist, has a hypothesis that men deal more in report and women more in rapport – this could explain why men use ‘the’ more often. Depending on context and background, in more traditional power structures, a woman may also have been socialized not to take the voice of authority so might use ‘the’ less frequently. Though any such gender-based generalizations also depend on the nature of the topic being studied.

Those in higher status positions also use ‘the’ more – it can be a signal of their prestige and (self) importance. And when we talk about ‘the prime minister’ or ‘the president’ it gives more power and authority to that role. It can also give a concept credibility or push an agenda. Talking about ‘the greenhouse effect’ or ‘the migration problem’ makes those ideas definite and presupposes their existence.


 ‘The’ can be a “very volatile” word, says Murphy. Someone who refers to ‘the Americans’ versus simply ‘Americans’ is more likely to be critical of that particular nationality in some capacity. When people referred to ‘the Jews’ in the build-up to the Holocaust, it became othering and objectifying. According to Murphy, “‘The’ makes the group seem like it’s a large, uniform mass, rather than a diverse group of individuals.” It’s why Trump was criticized for using the word in that context during a 2016 US presidential debate.

Origins


We don’t know exactly where ‘the’ comes from – it doesn’t have a precise ancestor in Old English grammar. The Anglo Saxons didn’t say ‘the’, but had their own versions. These haven’t completely died out, according to historical linguist Laura Wright. In parts of Yorkshire, Lancashire and Cumberland there is a remnant of Old English inflective forms of the definite article – t’ (as in “going t’ pub”).


The letter y in terms like ‘ye olde tea shop’ is from the old rune Thorn, part of a writing system used across northern Europe for centuries. It’s only relatively recently, with the introduction of the Roman alphabet, that ‘th’ has come into being.


‘The’ deserves to be celebrated. The three-letter word punches well above its weight in terms of impact and breadth of contextual meaning. It can be political, it can be dramatic – it can even bring non-existent concepts into being.


http://www.bbc.com/culture/story/20200109-is-this-the-most-powerful-word-in-the-english-language

Ancestral home of James Joyce ("The the.")

*
TINY HABITS AND THE POWER OF CELEBRATION

~ “Break down big change into tiny actions, find where they fit naturally into your life, and then you feel good by celebrating. That’s it. People are shocked at how easy and fast this system works when it comes to forming habits.

In my own research, I found that habits can form very quickly, often in just a few days, as long as people have a strong positive emotion connected to the new behavior. In fact, some habits seem to get wired in immediately: You do the behavior once, and then you don’t consider other options again. You’ve created an instant habit.

When I teach people about human behavior, I boil it down to three words to make the point crystal clear: Emotions create habits. Not repetition. Not frequency. Not fairy dust. 


Emotions.

When you are designing for habit formation — for yourself or for someone else — you are really designing for emotions.


What happens in your brain when you experience positive reinforcement isn’t magic — it’s neurochemical. Good feelings spur the production of a neurotransmitter (a chemical messenger in the brain) called dopamine that controls the brain’s “reward system” and helps us remember what behavior led to feeling good so we will do it again. With the help of dopamine, the brain encodes the cause-and-effect relationship, and this creates expectations for the future.


You can hack into this reward system by creating an event in your brain that neuroscientists call a “reward prediction error.” Here’s how it works: Your brain is constantly assessing and reassessing the experiences, sights, sounds, smells, and movements in the world around you. Based on previous experiences, your brain has formed predictions about what you will experience in any given situation.


Let’s say your brain doesn’t expect the cauliflower-crust pizza to taste good. Your previous experiences with cauliflower have been negative. But you take a bite of the new pizza. And wow! You find it’s delicious. That’s when you get a “reward prediction error,” and neurons in your brain adjust the release of dopamine in order to encode an updated expectation.


The good news is that we are not helpless when it comes to our brain chemistry. Using what we know about how the brain functions, we can help our brains help us.


How?


By intentionally creating feelings to wire in the habits that we actually want in our lives. When we hack into the ancient behavioral pathways in our brains, we gain access to the amazing human potential for learning and change. We have an opportunity to use the brain machinery we already have to feel good and change behaviors.


You can use many types of positive reinforcement to wire in a habit – this includes pleasure, a sense of relief, and more – but in my research and teaching, I’ve found that the real winner is creating a feeling of success.


To make this super practical, I’ve studied and developed a technique people can use to spark a feeling of success at any moment they want. Part of the Tiny Habits method, this technique is called “celebration.”


Celebration is the best way to create a positive feeling that wires in your new habits. It’s free, fast, and available to people of every color, shape, size, income, and personality. In addition, celebration teaches us how to be nice to ourselves — a skill that pays out the biggest dividends of all.


The definition of a reward in behavior science is an experience directly tied to a behavior that makes that behavior more likely to happen again. The timing of the reward matters. Scientists learned decades ago that rewards need to happen either during the behavior or milliseconds afterward. Dopamine is released and processed by the brain very quickly. That means you’ve got to cue up those good feelings fast to form a habit.


Incentives like a sales bonus or a monthly massage can motivate you, but they don’t rewire your brain. Incentives are way too far in the future to give you that all-important shot of dopamine that encodes the new habit. Doing three squats in the morning and rewarding yourself with a movie that evening won’t work to rewire your brain. The squats and the good feelings you get from the movie are too far apart for dopamine to build a bridge between the two.


A real reward — something that will actually create a habit — is a much narrower target to hit than most people think. Here’s how to help a habit root quickly and easily in your brain: Perform the behavior sequence that you want to become a habit (“After I turn on the coffeemaker, I will get out my to-do list”) and then celebrate immediately.

When I say that you need to celebrate immediately after the behavior, I do mean immediately. Immediacy is one piece of what informs the speed of your habit formation.


The other piece is the intensity of the emotion you feel when you celebrate. This is a one-two punch: you’ve got to celebrate right after the behavior (immediacy), and you need your celebration to feel real (intensity).


Your brain has a built-in system for encoding new habits, and by celebrating you can hack this system. When you get good at celebrating, you will have a superpower for creating habits.

+ Say, “Yes!” or “Yay!”
+ Do a fist pump
+ Smile big
+ Imagine a child clapping for you
+ Give yourself a thumbs-up

When you find a celebration that works for you, and you do it immediately after a new behavior (or while you are doing the behavior), your brain repatterns to make that behavior more automatic in the future. But once you’ve created a habit, celebration is optional. You don’t need to keep celebrating the same habit forever. That said, some people keep going with the celebration part of their habits because it feels good and has lots of positive side effects.

Celebration might not feel natural to you, and that’s okay, but practicing this skill will help you to get comfortable. If celebrating the small stuff is hard for you, the go-big-or-go-home mentality is probably sneaking up on you. Shut it down. It’s a trap. Celebrating a win — no matter how tiny — will quickly lead to more wins.

Your confidence grows when you celebrate not only because you are now a habit-creating machine but also because you are getting better and better at being nice to yourself. You start looking for opportunities to celebrate yourself instead of berating yourself. And over the course of weeks and months, these tiny, simple habits that you’ve woven into your life change the fabric of your world entirely.

https://time.com/5756833/better-control-emotions-better-habits/

 
*
THE EARTH’S POPULATION GROWTH IS SLOWING DOWN

The natural increase in the U.S. population, which factors in the number of births and deaths, was below one million this year (2019), the lowest figure in decades, according to the Census Bureau estimates. There has been a sharp decline in the number of new immigrants, along with fewer births and more deaths due to the “graying of America.”

~ “The global fertility rate is expected to be 1.9 births per woman by 2100, down from 2.5 today. The rate is projected to fall below the replacement fertility rate (2.1 births per woman) by 2070. The replacement fertility rate is the number of births per woman needed to maintain a population’s size.

The world’s median age is expected to increase to 42 in 2100, up from the current 31 – and from 24 in 1950. Between 2020 and 2100, the number of people ages 80 and older is expected to increase from 146 million to 881 million. Starting in 2073, there are projected to be more people ages 65 and older than under age 15 – the first time this will be the case. Contributing factors to the rise in the median age are the increase in life expectancy and falling fertility rates.

Africa is the only world region projected to have strong population growth for the rest of this century. Between 2020 and 2100, Africa’s population is expected to increase from 1.3 billion to 4.3 billion. Projections show these gains will come mostly in sub-Saharan Africa, which is expected to more than triple in population by 2100. The regions that include the United States and Canada (Northern America) and Australia and New Zealand (Oceania) are projected to grow throughout the rest of the century, too, but at slower rates than Africa.

Europe and Latin America are both expected to have declining populations by 2100. Europe’s population is projected to peak at 748 million in 2021. The Latin America and Caribbean region is expected to surpass Europe in population by 2037 before peaking at 768 million in 2058.

The population of Asia is expected to increase from 4.6 billion in 2020 to 5.3 billion in 2055, then start to decline. China’s population is expected to peak in 2031, while the populations of Japan and South Korea are projected to decline after 2020. India’s population is expected to grow until 2059, when it will reach 1.7 billion. Meanwhile, Indonesia – the most populous country in Southeastern Asia – is projected to reach its peak population in 2067.

In the Northern America region, migration from the rest of the world is expected to be the primary driver of continued population growth. The immigrant population in the United States is expected to see a net increase of 85 million over the next 80 years (2020 to 2100) according to the UN projections, roughly equal to the total of the next nine highest countries combined. In Canada, migration is likely to be a key driver of growth, as Canadian deaths are expected to outnumber births.

Six countries are projected to account for more than half of the world’s population growth through the end of this century, and five are in Africa. The global population is expected to grow by about 3.1 billion people between 2020 and 2100. More than half of this increase is projected to come from Nigeria, the Democratic Republic of the Congo, Tanzania, Ethiopia and Angola, along with one non-African country (Pakistan). Five African countries are projected to be in the world’s top 10 countries by population by 2100.


India is projected to surpass China as the world’s most populous country by 2027. Meanwhile, Nigeria will surpass the U.S. as the third-largest country in the world in 2047.


Between 2020 and 2100, 90 countries are expected to lose population. Two-thirds of all countries and territories in Europe (32 of 48) are expected to lose population by 2100. In Latin America and the Caribbean, half of the region’s 50 countries’ populations are expected to shrink. Between 1950 and 2020, by contrast, only six countries in the world lost population, due to much higher fertility rates and a relatively younger population in past decades.

Africa is projected to overtake Asia in births by 2060. Half of babies born worldwide are expected to be born in Africa by 2100, up from three-in-ten today. Nigeria is expected to have 864 million births between 2020 and 2100, the most of any African country. The number of births in Nigeria is projected to exceed those in China by 2070.

Meanwhile, roughly a third of the world’s babies are projected to be born in Asia by the end of this century, down from about half today and from a peak of 65% in the 1965-70 period.

The Latin America and Caribbean region is expected to have the oldest population of any world region by 2100, a reversal from the 20th century. In 1950, the region’s median age was just 20 years. That figure is projected to more than double to 49 years by 2100.


This pattern is evident when looking at individual countries in the region. For example, in 2020, the median ages of Brazil (33), Argentina (32) and Mexico (29) are all expected to be lower than the median age in the U.S. (38). However, by 2100, all three of these Latin American nations are projected to be older than the U.S. The median age will be 51 in Brazil, 49 in Mexico and 47 in Argentina, compared with a median age of 45 in the U.S. Colombia is expected to undergo a particularly stark transition, with its median age more than tripling between 1965 and 2100 – from 16 to 52.


Japan is projected to have the highest median age of any country in the world in 2020, at 48 years old. Japan’s median age is expected to continue to rise until it peaks at 55 in 2065. It is expected to be lower in 2100 (54). By that time, the country with the highest median age is expected to be Albania, with a median age of 61.

https://www.pewresearch.org/fact-tank/2019/06/17/worlds-population-is-projected-to-nearly-stop-growing-by-the-end-of-the-century/



*
HOW THE U.S. POPULATION WILL CHANGE OVER THE NEXT DECADE

 
1. THERE WILL BE MORE OF US


The U.S. population today, at the start of 2020, numbers just over 331 million people.
The U.S. is the third largest country in the world, outnumbered only by the two demographic billionaires, China and India, at just over 1.4 billion and just under 1.4 billion, respectively.
Ten years from now, the U.S. population will be almost 350 million. China and India will still be bigger, but India, with 1.5 billion people, will then be larger than China, with 1.46 billion.


2. THE POPULATION WILL GET OLDER


The U.S. is getting older and it’s going to keep getting older.


Today, there are over 74.1 million people under age 18 in the U.S. There are 56.4 million people age 65 and older.


Ten years from now, there will be almost as many old folks as young ones. The number of young people will have grown just a little, to 76.3 million, but the number of old people will have increased a lot – to 74.1 million. Many of these new elderly will be baby boomers.


For example, take the really old folks – people over the age of 100. How many centenarians are in the U.S. population today, and how many are there likely to be 10 years from now?

According to demographers at the U.S. Census Bureau, the number of centenarians in the U.S. grew from over 53,000 in 2010 to over 90,000 in 2020. By 2030, there will most likely be over 130,000 centenarians in the U.S.


But this increase in centenarians by 2030 is only a small indication of their growth in later decades. In 2046, the first surviving baby boomers will reach age 100, and that’s when the number of U.S. centenarians will really start to climb. By 2060 there will be over 603,000. That’s a lot of really old people.


3. RACIAL SHIFT 


What will the country look like racially in 2030? Whites will have dropped to 55.8% of the population, and Hispanics will have grown to 21.1%. The percentage of black and Asian Americans will also grow significantly.


So between now and 2030, whites as a proportion of the population will get smaller, and the minority race groups will all keep getting bigger. 


Eventually, whites will become a minority, dropping below 50% of the U.S. population around the year 2045.


However, on the first day of 2020, whites under age 18 were already in the minority. Among all the young people now in the U.S., there are more minority young people than there are white young people.


Among old people age 65 and over, whites are still in the majority. Indeed, among the old, whites will remain the majority until some years after 2060.


Hispanics and the other racial minorities will be the country’s main demographic engine of population change in future years; this is the most significant demographic change Americans will see.” ~ 


*
THE 3.5% RULE OF REVOLUTIONARY CHANGE

~ “Nonviolent protests are twice as likely to succeed as armed conflicts – and those engaging a threshold of 3.5% of the population have never failed to bring about change.

In 1986, millions of Filipinos took to the streets of Manila in peaceful protest and prayer in the People Power movement. The Marcos regime folded on the fourth day.


In 2003, the people of Georgia ousted Eduard Shevardnadze through the bloodless Rose Revolution, in which protestors stormed the parliament building holding flowers in their hands.


Earlier this year, the presidents of Sudan and Algeria both announced they would step aside after decades in office, thanks to peaceful campaigns of resistance.  


In each case, civil resistance by ordinary members of the public trumped the political elite to achieve radical change.


There are, of course, many ethical reasons to use nonviolent strategies. But compelling research by Erica Chenoweth, a political scientist at Harvard University, confirms that civil disobedience is not only the moral choice; it is also the most powerful way of shaping world politics – by a long way.


Looking at hundreds of campaigns over the last century, Chenoweth found that nonviolent campaigns are twice as likely to achieve their goals as violent campaigns. And although the exact dynamics will depend on many factors, she has shown it takes around 3.5% of the population actively participating in the protests to ensure serious political change.


Chenoweth’s influence can be seen in the recent Extinction Rebellion protests, whose founders say they have been directly inspired by her findings. So just how did she come to these conclusions?


Needless to say, Chenoweth’s research builds on the philosophies of many influential figures throughout history. The African-American abolitionist Sojourner Truth, the suffrage campaigner Susan B Anthony, the Indian independence activist Mahatma Gandhi and the US civil rights campaigner Martin Luther King have all convincingly argued for the power of peaceful protest.


“We were trying to apply a pretty hard test to nonviolent resistance as a strategy,” Chenoweth says. (The criteria were so strict that India’s independence movement was not considered as evidence in favor of nonviolent protest in Chenoweth and Stephan’s analysis – since Britain’s dwindling military resources were considered to have been a deciding factor, even if the protests themselves were also a huge influence.)

By the end of this process, they had collected data from 323 violent and nonviolent campaigns. And their results – which were published in their book Why Civil Resistance Works: The Strategic Logic of Nonviolent Conflict – were striking.


Strength in numbers


Overall, nonviolent campaigns were twice as likely to succeed as violent campaigns: they led to political change 53% of the time, compared to 26% for the violent protests.

This was partly the result of strength in numbers. Chenoweth argues that nonviolent campaigns are more likely to succeed because they can recruit many more participants from a much broader demographic, which can cause severe disruption that paralyses normal urban life and the functioning of society.


In fact, of the 25 largest campaigns that they studied, 20 were nonviolent, and 14 of these were outright successes. Overall, the nonviolent campaigns attracted around four times as many participants (200,000) as the average violent campaign (50,000).


The People Power campaign against the Marcos regime in the Philippines, for instance, attracted two million participants at its height, while the Brazilian uprising in 1984 and 1985 attracted one million, and the Velvet Revolution in Czechoslovakia in 1989 attracted 500,000 participants.


Once around 3.5% of the whole population has begun to participate actively, success appears to be inevitable.


“There weren’t any campaigns that had failed after they had achieved 3.5% participation during a peak event,” says Chenoweth – a phenomenon she has called the “3.5% rule”. Besides the People Power movement, that included the Singing Revolution in Estonia in the late 1980s and the Rose Revolution in Georgia in 2003.


Chenoweth admits that she was initially surprised by her results. But she now cites many reasons that nonviolent protests can garner such high levels of support. Perhaps most obviously, violent protests necessarily exclude people who abhor and fear bloodshed, whereas peaceful protesters maintain the moral high ground.

Chenoweth points out that nonviolent protests also have fewer physical barriers to participation. You do not need to be fit and healthy to engage in a strike, whereas violent campaigns tend to lean on the support of physically fit young men. And while many forms of nonviolent protests also carry serious risks – just think of China’s response in Tiananmen Square in 1989 – Chenoweth argues that nonviolent campaigns are generally easier to discuss openly, which means that news of their occurrence can reach a wider audience. Violent movements, on the other hand, require a supply of weapons, and tend to rely on more secretive underground operations that might struggle to reach the general population.

By engaging broad support across the population, nonviolent campaigns are also more likely to win support among the police and the military – the very groups that the government should be leaning on to bring about order.


During a peaceful street protest of millions of people, the members of the security forces may also be more likely to fear that their family members or friends are in the crowd – meaning that they fail to crack down on the movement. “Or when they’re looking at the [sheer] numbers of people involved, they may just come to the conclusion the ship has sailed, and they don’t want to go down with the ship,” Chenoweth says.


In terms of the specific strategies that are used, general strikes “are probably one of the most powerful, if not the most powerful, single method of nonviolent resistance”, Chenoweth says. But they do come at a personal cost, whereas other forms of protest can be completely anonymous. She points to the consumer boycotts in apartheid-era South Africa, in which many black citizens refused to buy products from companies with white owners. The result was an economic crisis among the country’s white elite that contributed to the end of segregation in the early 1990s.

“There are more options for engaging in nonviolent resistance that don’t place people in as much physical danger, particularly as the numbers grow, compared to armed activity,” Chenoweth says. “And the techniques of nonviolent resistance are often more visible, so that it’s easier for people to find out how to participate directly, and how to coordinate their activities for maximum disruption.”


A magic number?


These are very general patterns, of course, and despite being twice as successful as the violent conflicts, peaceful resistance still failed 47% of the time. As Chenoweth and Stephan pointed out in their book, that’s sometimes because they never really gained enough support or momentum to “erode the power base of the adversary and maintain resilience in the face of repression”. But some relatively large nonviolent protests also failed, such as the protests against the communist party in East Germany in the 1950s, which attracted 400,000 members (around 2% of the population) at their peak, but still failed to bring about change.


In Chenoweth’s data set, it was only once the nonviolent protests had achieved that 3.5% threshold of active engagement that success seemed to be guaranteed – and raising even that level of support is no mean feat. In the UK it would amount to 2.3 million people actively engaging in a movement (roughly twice the size of Birmingham, the UK’s second largest city); in the US, it would involve 11 million citizens – more than the total population of New York City.
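The arithmetic behind those figures is simply 3.5% of each national population. A minimal sketch (the round population totals are my assumptions, chosen to match the article's era, not figures from the article itself):

```python
# Rough illustration of the 3.5% rule: multiply a national population
# by 0.035 to get the approximate number of active participants the
# rule requires. Population totals below are round approximations.

def active_threshold(population: int, rate: float = 0.035) -> int:
    """Participants needed at the given participation rate."""
    return round(population * rate)

populations = {"UK": 66_000_000, "US": 327_000_000}

for country, pop in populations.items():
    print(f"{country}: {active_threshold(pop):,} participants")
    # → UK: 2,310,000 participants
    # → US: 11,445,000 participants
```

Those totals line up with the article's "2.3 million" for the UK and roughly "11 million" for the US.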


The fact remains, however, that nonviolent campaigns are the only reliable way of maintaining that kind of engagement.


Isabel Bramsen, who studies international conflict at the University of Copenhagen, agrees that Chenoweth and Stephan’s results are compelling. “It’s [now] an established truth within the field that the nonviolent approaches are much more likely to succeed than violent ones,” she says.


Regarding the “3.5% rule”, she points out that while 3.5% is a small minority, such a level of active participation probably means many more people tacitly agree with the cause.

These researchers are now looking to further untangle the factors that may lead to a movement’s success or failure. Bramsen and Chandler, for instance, both emphasize the importance of unity among demonstrators.


“Ordinary people, all the time, are engaging in pretty heroic activities that are actually changing the world – and those deserve some notice and celebration as well.” ~

https://www.bbc.com/future/article/20190513-it-only-takes-35-of-people-to-change-the-world


 
*

CHRISTIANITY WITHOUT HELL?

~ “That so many young people in the US identify as ‘spiritual but not religious’ at least partly results from their impression of organized religion – particularly the Protestantism that has long dominated the US religious landscape — as judgmental, exclusive, and punishing. There is a longing for a feel-good faith with a friendly deity. 

But the longing for a hell-less faith cannot be attributed to a contemporary generational shift alone. Time and again in the history of western Christianity, this longing has surfaced, only to be subdued and hell reaffirmed as not just scripturally but also morally necessary.” ~

Christian ideas about the afterlife drew from and expanded on ancient traditions that conceived of the afterlife as a single, neutral zone where everyone ended up, regardless of their behavior in this life. The ancient Jews had no concept of ‘heaven’ as a place of rewards, or ‘hell’ as a place of punishment, but instead held that all humans went to a shadowy and monotonous afterlife after death: Sheol. Rewards and punishments accrued to people in this life, not in the life to come. Similarly, the ancient Greeks believed that everyone went to the lethargic and gloomy underworld of Hades.


The contingent realities of human existence – that the righteous can suffer and the wicked can prosper – spurred the emergence of rewards and punishments from the undifferentiated Sheol and Hades. The concepts of heaven and hell recognized moral gradations between individuals and promised the righting of wrongs in a future life. In other words, while some today think of hell as a morally unsophisticated, pre-modern doctrine that has survived long past its prime, the emergence of hell could be seen as offering, rather than obstructing, ethical nuance.


Charon crossing the river Styx; Joachim Patenier (d. 1524). Note that here we see the Christian heaven and hell. A fusion of mythologies was not unusual in the Middle Ages and beyond.
 
*
And yet the idea of hell did not go uncontested. People argued over its duration, with some advocating a temporary instead of eternal hell. They debated the purpose of its punishments, whether corrective and purifying, or vengeful and vindictive. And they have bickered over its nature, with some arguing for hell as a metaphorical mental state as opposed to a physical and literal place.


As early as the second to the third centuries AD, at a time when the Church’s doctrines were still being hotly debated, the scholar Origen of Alexandria (c 185-254 AD) argued against a concept of eternal hell in favor of apokatastasis, or ‘restoration’. Origen taught that God creates everything in love and, through that love, ultimately brings all of creation back to him. In Origen’s scheme, eternal souls would be punished for wrongdoings, but punishment would occur as the soul inhabited successive bodies – whether demonic, human, or angelic – instead of in a permanent and everlasting hell of fire and brimstone. ‘For if… souls had no pre-existence,’ Origen asked in On First Principles, ‘why do we find some new-born babes to be blind, when they have committed no sin, while others are born with no defect at all?’ Over time, souls would learn from their mistakes and eventually be reunited with their perfect creator.


Some have wondered whether Origen might have been influenced by the concept of reincarnation in Eastern traditions. The idea of karma explains the status of every being – divinity, human, animal, ghost, or inhabitant of hell – as a consequence of its own earlier actions. As in the ancient Mediterranean, so in India, the concept of hell, as a region to which the wicked could be reborn, emerged to offer ethical nuance. Karma once referred primarily to a sacrificial system. The living could offer sacrifices to benefit the dead, who all went to the same netherworld, presided over by Yama, king of the deceased. Under the influence of ascetics who emphasized ethical behavior, the netherworld evolved into regions of reward and punishment, and Yama became king of hell. But unlike the Christian God, Yama did not condemn people to hell: they were reborn there as a result of their own bad karma, and could be reborn out of hell as well. In the Buddhist tradition, textual discussions of hell as punishment have been dated to at least the third century BC, if not earlier, predating Origen’s views by centuries.

Origen’s views did not prevail as Christian doctrine became standardized. Instead, the ideas of another early church father, Augustine of Hippo (354-430 AD), carried the day. In contrast to Origen’s dynamic afterlife where souls could rise and fall until they eventually reached their creator, Augustine said that humans had only one life, and only in that life could they choose their actions and beliefs. On the basis of their choices, humans’ eternal status would be decided at the moment of death, when they were swept up into heaven’s endless bliss or hell’s ceaseless suffering.


In addition to offering what would become accepted orthodoxy on the fixed nature of heaven and hell, Augustine also introduced elements that eventually coalesced into the doctrine of purgatory. For Augustine, the flames of purgatory were not intended to punish or save those who’d already made bad choices on earth. Instead, their purpose was to purify those already destined for the perfection of heaven.


Over time, the Catholic Church warmed to the idea that purgatory was an actual place, akin to heaven and hell. Just as the bifurcation of the afterlife seemed to offer more moral nuance than a single shadowy underworld where everyone ended up, so the emergence of purgatory seemed to offer more moral gradation than the stark either/or of heaven and hell. By the time of the Protestant Reformation, most people assumed that they would end up in purgatory after death, since few were good enough for immediate entry to heaven or bad enough for automatic consignment to hell. People’s fates were still decided at the moment of death, but at least they had time to make amends for earthly transgressions if death struck prematurely. Despite purgatory’s problems – the allegation that the rich could afford more masses and alms to shorten their stay – the notion that the living could assist the dead nevertheless offered a modicum of comfort.


While purgatory’s punishments – both in pain and in duration – could be daunting, they were also different from hell’s in that they were only temporary (even if they lasted for thousands of years) and ultimately purifying (even if excruciating). Purgatory addressed some of the questions surrounding the western Christian hell by reserving its terrifying eternity for the worst of the worst alone.

Some scholars have suggested that the biggest impact of the Reformation for ordinary people was the ‘death of purgatory’. Once reformers had pared back the afterlife to the two destinations of heaven and hell, Protestant laypeople were back to the terrifying prospect of eternal damnation on the basis of this life alone, without the ability to atone after death and without the possibility of assistance from the living. Protestants, of course, argued that purgatory was an unscriptural concept that placed a burdensome and impossible responsibility on the sinner alone to atone for sins. Only Christ’s sacrifice was sufficient to save, they said, and in any case this switch should lighten, not increase, the burden. As long as people repented and accepted Christ as their savior, they could rest assured that they would end up in heaven.


But this was easier said than done. The agonizing uncertainty of whether they were truly saved haunted the Puritans, who in the early 17th century left their native England for America due to concerns that it wasn’t reformed enough. The Puritans’ God was an absolute sovereign so perfect that even one sin was sufficiently odious as to merit eternal torment. But this God also became an easy target for Enlightenment intellectuals who increasingly emphasized human ability and perfectibility over innate depravity. A God who could consign his own creatures to eternal torture for seemingly minor misdeeds struck them as despotic and unjust.


By the time of the American Revolution in the late 18th century, colonists were arguing not just over the wisdom of waging war against England, but also over the justness of eternal punishment. Attracted by Enlightenment ideas, some members of the founding generation critiqued the British monarchy and the Calvinist God as tyrannical dictators both. As Jefferson put it: ‘It would be more pardonable to believe in no god at all than to blaspheme him by the atrocious attributes of Calvin.’ Some freethinkers departed from the concept of hell as literal and eternal fire and brimstone in favor of a temporary hell where individuals would be punished in proportion to their crimes before being admitted to heaven. Others abandoned hell entirely, arguing that a loving and merciful God would save all of creation for heavenly bliss.


And yet, a hell of fire and brimstone still had staunch defenders, who brought back the ghost of purgatory to accuse critics of being closet Catholics. A temporary hell, they argued, was nothing but purgatory all over again. It made Christ’s sacrifice meaningless, putting the onus squarely on humans to redeem themselves through suffering after death. Those in favor of universal salvation were nothing more than ‘Origenists’, a denunciation that, by the 18th century, denoted dangerous heresy.


More importantly in the new, monarchless US, defenders of hell argued that the threat of eternal punishment was necessary to ensure the morality of citizens. Even a temporary hell, they claimed, would give humans leave to commit socially harmful transgressions, from lying to cheating to murder, since they would still eventually end up in heaven after paying for their crimes. Indeed, the social argument in favor of eternal hell anticipated the arguments we hear today in favor of the death penalty. Both are supposed to serve as ultimate deterrents against crime.


Even European intellectuals, who had been questioning hell since at least the 17th century, recognized its social utility for the masses. Voltaire, favorite of American rationalists and bane of evangelicals, acknowledged in his Philosophical Dictionary (1764) that: ‘We are obliged to hold intercourse and transact business, and mix up in life with … vast numbers of persons addicted to brutality, intoxication, and rapine. You may, if you please, preach to them that there is no hell, and that the soul of man is mortal. As for myself, I will be sure to thunder in their ears, that if they rob me they will inevitably be damned.’


Debates over the scriptural basis and social utility of hell would continue to fester over the course of the 19th century, even as new voices entered the conversation. New religious groups such as the Mormons, Spiritualists and Adventists offered their own views on what hell might entail, if it existed at all. Mormons offered a multi-tiered afterlife. Just as the bifurcation of Sheol and Hades and the addition of purgatory added moral nuance to the afterlife, so the Mormon conception of the ‘sons of perdition’ and the telestial, terrestrial and celestial spheres offered shades of grey to accommodate circumstances ranging from true evil, to those who ‘died without the law’, to the righteous and the just.

But the orthodox hell of literal, eternal punishment has continued to hold strong to this day. So strong that when the US evangelical minister Rob Bell made an argument much less radical than Origen’s and hardly even new in the second millennium, he was met with an outcry of epic proportions. The bespectacled and charismatic Bell, founder of the Michigan megachurch Mars Hill, had begun to question the justness of an eternal hell and a theology where even Gandhi would end up there. In his book Love Wins (2011), Bell claimed that:


~ A staggering number of people have been taught that a select few Christians will spend forever in a peaceful, joyous place called heaven, while the rest of humanity spends forever in torment and punishment in hell with no chance for anything better … This is misguided and toxic and ultimately subverts the contagious spread of Jesus’s message of love, peace, forgiveness, and joy that our world desperately needs to hear. ~


To judge by the reactions to Bell’s book, it was as if no one had ever questioned hell before or emphasized God’s love over his wrath. Many evangelicals were appalled. The viral effects of social media magnified the outcry, with supporters and opponents jumping in to offer tweets of praise or condemnation.

The outcry over Bell’s book was perhaps all the more surprising given recent poll numbers in the US. A 2013 Harris Poll found that while 74 per cent of US adults believe in God and 68 per cent believe in heaven, only 58 per cent believe in the devil and in hell, down four percentage points from 2005. One might think that, with supporters of hell on the decline, defenders of Bell might have easily silenced the opposition. Yet only 25 per cent of US adults polled actively do not believe in hell, while another 18 per cent are unsure.

And numbers can hardly tell the whole story, anyhow. Believers in hell thrive on a sense of opposition and injustice – to affirm the stark either/or of heaven or hell requires it. Where Bell sees the violence humans enact against each other on earth as already a kind of hell, those who support eternal hell argue that it alone can make up for the world’s violence and suffering, and act as a deterrent against future forms of human-on-human brutality. Others say that there has to be a hell, if only for Hitler, or Stalin, or Mao, or Saddam, or Osama bin Laden.

These kinds of arguments have sustained the idea of eternal punishment for generations. Supporters of Hell have always claimed to have morality and justice on their side, even as its opponents have said the same. As much as some people might thirst for a hell-less faith and a hell-denying Pope, others eagerly participate in hell and judgment houses designed to frighten and convert attendees into belief. Poll numbers might fluctuate, but one thing’s for certain: in the US, hell isn’t going up in flames anytime soon.” ~

https://aeon.co/essays/why-has-the-idea-of-hell-survived-so-long


Giovanni di Paolo: Paradise, 1445. “Heaven and hell seem out of proportion to me: the actions of men do not deserve so much.” ~ Jorge Luis Borges

*
"Nancy Mitford disliked the sign of the cross because she thought it a symbol of cruelty. So her sister Pamela, also buried in Swinbrook churchyard, chose for her the mole."

*


“How do we generate the energy of compassion? 
Compassion is born from understanding, 
and understanding is born from looking deeply. 
We need to take time to look deeply into our situation, 
and the situation of the other person or other people. 
Let us learn to look at our suffering, 
and the suffering of our world, 
as a kind of compost. 
From that compost, from that mud, we can create 
beautiful lotuses of understanding and compassion.” ~ Thich Nhat Hanh 


Waterlilies and koi in Balboa Park, San Diego; photo by Charles Sherman

*
ending on beauty:

I was a winged obsessive, my moonlit
feathers were paper. I lived hardly at all among men and women;
I spoke only to angels.

~ Louise Glück, Ancient Text

 

