Saturday, December 18, 2021

IS DUNE FASCIST? DUNE AND ISLAMIC FUTURISM; SIMPSONS’ LIFESTYLE NO LONGER ATTAINABLE; CRIMES OF FASHION; LETTUCE IS AN ANTI-DEPRESSANT; HOW TO PRAY TO A DEAD GOD

*
GRIPPER

Mother and I pried open the bronze,
divided his ashes between us.
I took my portion upstairs,
quickly closed the door.    

I felt queasy. I had never seen
human ashes before —
would they still look human,
with sharp pieces of offended bone?

But my father’s ashes
looked just like ashes:
gray, speckled with white.
Then I glimpsed something

round and hard: a metal button.
On the disk, legible still,
the word Gripper twice —
two serpents made of letters,

smudged but not charred —
not returning dust to dust.
Soon I found more Grippers,
from his cremated hospital gown.

With half the ashes, half
the Grippers were later laid
in the family crypt
in my father’s hometown —

blessed by the priest,
consigned to everlasting mercy.
As if a sprinkling of holy water
could extinguish such persistent flame.

During the service, in my mind
I heard my father ask: “Is that
what is left of me? Buttons?
That’s the treasure you found?”

“Not worth a button”
was his favorite saying.
Laughter was his grip on life.
Only days before the end,

he said, with his widest grin,
“When you’re lying
in the coffin, you should suddenly
sit up and say Hah-hah! ”

That was of course too ambitious.  
From eternity I have only
these buttons. Still able
to grip. Not giving up.

~ Oriana

~ I think of the past less often than I had feared. The past is an immense album whose images are blurred, elusive — protean in their inconstancy and therefore embarrassing. Memory consoles with its balancing of gains and losses, because not all is on the debit side; the passage of years bestows a sense of architectonics, and the purity of arch, the crystalline contour can compensate for the fading of warm colors. It also teaches futility, because we know now that the distance between the word and the world, contrary to all our previous expectations, remains unbridgeable. ~ Czeslaw Milosz, The Land of Ulro

In Blake's mythology, "Ulro" stands for the land of suffering.


Funny, in this case the blurry image fits the blurry nature of memory, as Milosz asserts. (I think memory is like dreams: most of the time blurred and fragmentary, but now and then frightfully vivid.)

Oriana:

I rarely look at old photographs. My “album of memory” is my poems. This is obvious when it comes to personal narratives. But all poems, even mythological ones, remain in part autobiographical for me. I remember their context in my life. I remember the impulse that inspired them, my thinking and attitudes toward various challenges of life in the past. Having the poems has pretty much resolved my fear of remembering the past (on the whole, I don't think of my past with pleasure).

The unbridgeable gap? In absolute terms, I have to agree. Words can only render a slice of reality, and a poem is at its best when it focuses on the narrow slice: a specific event and detail that then leads to a larger perspective. When a poem tries to cover too much, it’s almost bound to fail. But describe just one thing, and you birth an immensity.

I think that “Gripper” succeeds by focusing on one event: finding the metal buttons in my father’s ashes. And I'm reminded of one of Una’s poems, “Geode.”

GEODE

I bought a rock
at a souvenir shop in the desert,
not guaranteed to be a geode
but the seller hinted
at crystal inside.

It looked ordinary, small,
warm in my hand from its stay
at the window. I sensed
movement as if it leaned
like a living thing toward light.

Often I have picked up a hammer
tempted to smash my rock to see
if a violet excitement exists
inside. Something always
stays my hand.

Instead it rests on the shelf
next to the Book of Luminous Things.
When I die it will be tossed
in the trash to continue
its journey, the promise still intact.

~ Una Hynum

I foolishly tried to make Una end the poem with the geode resting next to Milosz's anthology, The Book of Luminous Things, “the promise still intact.” She wisely and courageously stood by what she saw as the truth — and even so, the poem ends with “the promise still intact.”



*
IS DUNE FASCIST?

~ Popular SF narratives like Dune play a central role in white nationalist propaganda. The alt-right now regularly denounces or promotes science fiction films as part of its recruiting strategy: fascist Twitter popularized the “white genocide” hashtag during a boycott campaign against inclusive casting in Star Wars: The Force Awakens. But Villeneuve’s film seemed to provoke greater outrage than normal because Herbert’s book is such a key text for the alt-right.

Dune was initially received as a countercultural parable warning against ecological devastation and autocratic rule, but geek fascists see the novel as a blueprint for the future. Dune is set thousands of years from now in an interstellar neofeudal society that forestalled the rise of dangerous artificial intelligences by banning computers and replacing them with human beings conditioned with parapsychological disciplines that allow them to perform at the same level as thinking machines [Oriana: Paul himself is a Mentat, a kind of human computer who can put aside emotion in favor of logic]. Spaceships navigate through space using the superhuman abilities of psychics whose powers are derived from a mind-enhancing drug known as melange ["spice"], a substance found only on the desert planet of Arrakis [Iraq?].

The narrative follows the rise of Paul Atreides, a prince who reconquers Arrakis, controls the spice, and eventually becomes the messianic emperor of the Known Universe. Dune was first published in serial form in John W. Campbell’s Analog Science Fiction and Fact and, like many protagonists in Campbell-edited stories, Paul is a mutant übermensch whose potential sets him apart from everyone else. He turns out to be the product of a eugenics program that imbues him with immense precognitive abilities that allow him to bend the galaxy to his will. Paul’s army also turns out to be selected for greatness: the harsh desert environment of Arrakis culls the weak, evolving a race of battle-hardened warriors.

In the fascist reading of the novel, space colonization has scattered the human species, but what Herbert calls a “race consciousness” moves them to unite under Paul, who sweeps away all opposition in a jihad that kills 60,000,000,000. For the alt-right, Paul stands as the ideal of a sovereign ruler who violently overthrows a decadent regime to bring together “Europid” peoples into a single imperium or ethnostate.

Herbert’s worlds represent impossible attempts to square the circle of fusing the destructive dynamism of capitalist modernization with the stable order prized by traditionalism. Beyond a shared affinity for space-age aristocrats, [the French far-right writer Guillaume] Faye and Herbert see the sovereign as one who is capable of disciplined foresight. Drawing on the Austrian School economist Hans-Hermann Hoppe, many thinkers on the alt-right believe that only men from genetically superior populations are capable of delaying gratification and working toward long-term goals. The alt-right asserts that white men hold an exclusive claim over the future. According to these white nationalists, science fiction is in their blood.

The Bene Gesserit sisterhood who bred Paul’s bloodline for prescience subject him to a kind of deadly marshmallow test to determine if he is fully human. One threatens to kill him with a poisoned needle (the gom jabbar) if he removes his hand from a device that produces the sensation of burning pain. Restraining his immediate impulses is only the first step toward using his precognitive abilities to choose between all the possible timelines. As in fascist doctrine, Paul’s ability to envision the future is a biogenetic trait possessed only by the worthy few.

Even the alt-right’s favorite novel does not seem to support their misreadings. Herbert’s book is often deeply conservative, but by the fascists’ own admission it presents a syncretic vision of the future in which cultures and populations have clearly intermingled over time. Paul’s army of desert guerillas, the Fremen, clearly owe something to Arabic and Islamic cultures, and Paul’s own genealogy defies the fascist demand for racial purity. The alt-right has tried to wrest Islamophobic and antisemitic messages from the book, but they are stymied by its refusal to map existing ethnic categories onto the characters.

Fascists seek to tame class struggle and humanize capitalism by grounding it in a shared racial destiny, but they only end up enacting a program that leads to a more barbarous form of inhumanity.

Distorted as these fascist readings of Dune may be, Herbert’s novel will remain a persistent feature of alt-right culture as long as they fight to conquer the future. ~

https://www.lareviewofbooks.org/article/race-consciousness-fascism-and-frank-herberts-dune/

from Haaretz:

WHAT THE FAR RIGHT LIKES ABOUT “DUNE”; ISLAMIC FUTURISM

~ So what does the radical right like about “Dune”? The answer is clear. It may depict a futuristic world, but it’s one governed according to a feudal order, with houses of nobility battling one another. At the same time, the world portrayed in the film is also capitalist: Its economy is based entirely on the production and manufacture of “spice,” a kind of psychedelic version of petroleum, produced in distant desert realms.

In addition, the mythology created by Herbert has racial foundations: The imperial Bene Gesserit sisterhood seeks to produce the Messiah through planned racial crossbreeding of rival dynasties. All that is enough to turn Paul Atreides, “Dune’s” leading character, played in the new film by Timothee Chalamet, into a hero of the real-life fascist right wing.

An article in the extreme right-wing Daily Stormer describes Atreides as “the leader of the religious and nationalist rebellion against an intergalactic empire.” It even compares him to Hungarian Prime Minister Viktor Orban in his battle against the European Union.

So is “Dune,” the book and the film, really a fascist or reactionary work? There are those who would argue that the question itself is irrelevant. Since it’s a blockbuster and a product of mass consumption, a movie like this purportedly belongs to the field of entertainment, and there’s no point in looking for deep political meaning in it.

As a result, the vast majority of moviegoers will be satisfied watching the space battles and marveling at the sandworms bursting forth from the dunes. But even if relating to a film as entertainment and nothing more is appropriate when it comes to “Star Wars” – “Dune” is something else.


This epic of huge dimensions is based on one of the most serious and complex science-fiction works ever written. It’s not a superficial story about spaceships and swords, but rather a rich, multilayered work in which Herbert developed the theology, ecology, technology and economy of the universe that he created. And even more than that, in the 21st century, we cannot discount the political importance of science fiction.

To a great extent, the post-modern mythologies of fantasy and science fiction currently play the role that national epics played in the late 19th and early 20th centuries. Works such as “Lord of the Rings” and “Dune” can be considered the contemporary counterparts of the German national epic “Song of the Nibelungs,” which inspired Richard Wagner’s opera cycle.

From that standpoint, one can view production of the new film version of “Dune” as another expression of the conservative fantasies of our time, which go well beyond right-wing American extremist circles. People of all ages are easily swept up in works depicting kings and barons and royal dynasties, or in celebrating the racial differences among elves, dwarfs and humans.

But Herbert’s work contains a fundamental element that sets it apart from most other fantasy and science-fiction works: Islam.

Anyone unfamiliar with Herbert’s books will be surprised to come across Arabic, Persian, Turkish and even Hebrew words in the movie. The protagonist Paul Atreides may not be fundamentally different from King Arthur, or from fantasy heroes like Harry Potter or Frodo Baggins (the protagonist of “Lord of the Rings”), but unlike them he bears the title “Mahdi.” This is a concept that originated in the Koran, as a name for the Messiah, especially in Shi’ite Islam. Atreides is also referred to as “Muad’dib,” “Usal” or “Lisan al-Gaib” – titles given here to someone destined to lead a galactic jihad. And if that is not enough, he is also called “Kwisatz Haderech” (the Hebrew term for “the leap forward”), a concept with origins in the Babylonian Talmud.

Herbert cast the future world of “Dune” in the form of a kind of Middle Eastern, Islamic mythology. With the mass-culture reception of this latest rendering of the work, it is conceivable that “Dune” fans will begin to learn Arabic and Persian in order to trace the theological roots of the work. But this space jihadist fantasy also has limitations. The Arab and Islamic characteristics in the work are mostly associated with the Fremen, the desert dwellers on Dune, who are characterized as a kind of rather ignorant and primitive Bedouin.

In his book “Orientalism,” the Palestinian-born intellectual Edward Said criticized the stereotypical representations of the Middle East that are accepted in European and American culture. Indeed, “Dune” is perhaps the most “Orientalist” work in the science-fiction genre. The way in which Arab and Islamic culture are represented is saturated with clichés. The transliteration of Arabic-language words is incorrect. Moreover: As is common in the realm of “white” fantasy, the Fremen Bedouin expect a white savior to lead them to jihad. Herbert seems to have been influenced in this regard by the figure of T.E. Lawrence (of Arabia), the British Orientalist military officer who led some of the battles of the Arab Revolt during World War I.

Muslim futurism

But even if Herbert represented Islam stereotypically, he deserves credit for at least representing it. Would it have been better for the story of “Dune” – like so many other fictional works – to be set in a Nordic world, with gleaming blond heroes? Dune’s techno-orientalism expresses at least curiosity and fascination with the Islamic world, which is far from self-evident. This curiosity has a context: The books were written in the 1960s, before the era of the “war on terror.” Since the Islamic Revolution in Iran, the rise of Al-Qaida, and the Islamic State, it is hard to imagine that a popular work would be centered around what is referred to as a galactic jihad.

In an article published online in Al-Jazeera last year, Islamic scholar Ali Karjoo-Ravary noted that “Dune” granted Islam a central place in its futuristic world, in an extraordinary way. This future world does not resemble a California-based IT company transplanted to a different world, but a different, non-Western culture with Islamic characteristics.

In recent years, an interesting movement of Muslim futurism has emerged – works of science fiction written by Muslims from around the world, imagining a futuristic Muslim world. It is easier to imagine such a world in the present era, where Middle Eastern urban centers like Dubai and Doha are among the most futuristic cities in the world; moreover, the UAE has sent a probe to Mars. The distant future will not necessarily look like a Facebook board meeting, nor like a gathering of the Knights of the Round Table – but rather like an Islamic caliphate. Only Muslims will save civilization. ~

https://www.haaretz.com/israel-news/dune-may-be-fascist-but-its-focus-on-islam-is-groundbreaking-1.10357745

A reader’s comment:

Just because Neo-Nazis like something, does not make it 'fascist.' That word is so completely overused that it has become meaningless. Current white supremacists (or any supremacist) are not attracted to the centralized autocratic government so much as they are to the racial hierarchy that places them on top.

Another comment:

Denis Villeneuve's Dune suffers from being an umpteenth iteration of a Star Wars-type arc story and its dubious futuristic-medieval construct of space combatants duking it out with knives, but its "orientalist" slant can provide a saving grace.

Dune and Star Wars share the same arc story of the Chosen One reluctant at first to embrace his mission against a galactic oppressor, etc., and Star Wars had more sequels, prequels, and whatnot than I care to remember.

Another:

I think it's a bias of the audience (and the article writer) to assume that Paul is white. He's not described like that in the book at all, but he's *cast* like that in movie adaptations. You could have a majority cast with people of color, that wouldn't matter or change the themes Frank Herbert was writing about at length in his novels. Again, this is a projection of an already biased audience. That's not Herbert's fault, and he would be tearing his hair out knowing that neo-Nazi trash is lauding his work.

Oriana:

Paul is the heir of the House of Atreides, which makes the educated viewer think back all the way to the Trojan war and the House of Atreus (Agamemnon). Does that make Paul Greek, especially given that thousands of years have passed since then? That’s overthinking it. He’s rather a stereotype of the lone hero. Can any literary work entirely escape from convention and be “original”?  

The dunes near Florence, Oregon, thought to have inspired Herbert's work

*
HOW ISLAMIC IS DUNE?

“Dune” is a multilayered allegory for subjects including T.E. Lawrence’s Bedouin exploits (which Herbert critiqued, following Suleiman Mousa’s “T.E. Lawrence: An Arab View”), Caucasian Muslim resistance to Russian imperialism, OPEC, and Indigenous struggles in the United States and Latin America. It is also thoroughly Muslim, exploring how Islam will develop 20,000 years into the future. While drawing on other religions, Herbert saw Islam as “a very strong element” of “Dune’s” entire universe, much as algebra or tabula rasa pervades our own — from Koranic aphorisms spoken by the Bene Gesserit missionary order to the Moorishness of a warrior-poet character (played in the movie by Josh Brolin) to the Shiism of the universe’s bible.

Rather than building and improving upon the novel’s audacious — and yes, Orientalist — engagement with these cultures and experiences, Villeneuve waters down the novel’s specificity. Trying to avoid Herbert’s apparent insensitivity, the filmmakers actively subdued most elements of Islam, the Middle East and North Africa (MENA). The new movie treats religion, ecology, capitalism and colonialism as broad abstractions, stripped of particularity.

Screenwriter Jon Spaihts claimed the book’s influences are “exotic” costumery, which “doesn’t work today,” when, in his words, “Islam is a part of our world.” This flies in the face of Herbert’s explicit aim to counter what he saw as a bias “not to study Islam, not to recognize how much it has contributed to our culture.” The film’s approach backfires: In justifying the film’s exclusion of Muslim and MENA creatives, it truly relegates “Dune’s” Muslimness to exotic aesthetics. The resulting film is both more Orientalist than the novel and less daring.

Take, for example, the languages in “Dune.” To create the Fremen language — often identified simply as Chakobsa, the Caucasian hunting language — Herbert mixed in “colloquial Arabic,” since he reasoned it “would be likely to survive for centuries in a desert environment,” wrote his son, Brian, in “Dreamer of Dune.” Herbert employed what he called an “elision process,” modifying “Arabic roots” to show how “languages change.” Herbert used Arabic throughout his universe, within and beyond the Fremen culture.

The film, however, dispenses with all that. Its Fremen language seems to be a futuristic take on Chakobsa, erasing Herbert’s elided Arabic. The film employs only the minimum Arabic necessary to tell the story, such as “Shai-Hulud” (the planet’s giant sandworms) and “Mahdi” (Paul’s messianic title, also a major figure in Islamic eschatology). The Arabic and Persian that does appear is pronounced poorly. When the Fremen speak English, their accents are a hodgepodge. Maybe people pronounce words differently in the future — but why do the Fremen sound like a 21st-century, Americanized caricature of a generic foreign accent? 

The film’s conlanger, David Peterson, wrote that “Dune” was set so far in the future that “it would be completely (and I mean COMPLETELY) impossible” for so much “recognizable Arabic” to survive. Unaware of Herbert’s inspiration, he also claimed “there’s nothing of the Caucasus in Dune.” For some unexplained reason, the movie’s characters do speak modern English and Mandarin (a fact widely advertised).

Similarly, the film employs “holy war,” not “jihad” — an attempt to avoid the conventional association of jihad with Islamic terrorism. In the book, Herbert’s jihad (which he sometimes calls “crusade”) is a positive description of anti-colonial resistance — but it also describes the colonial violence of the Atreides and the Bene Gesserit. The novel disrupts conventional understandings of the word “jihad”: If popular audiences see jihad as terrorism, then the imperialists, too, are terrorists.

The cinematic “Dune” skirts the novel’s subversive ideas, more black-and-white than its literary parent. Where Herbert challenged fixed, Orientalist categories such as “East” and “West,” the film opts for binaries: It codes obliquely Christian whiteness as imperialist and non-whiteness as anti-imperialist. The obvious Ottoman inspiration behind the Padishah Emperor’s Janissary-like military force, the Sardaukar, is absent. Instead, the imperial troops (who speak what is perhaps meant to be modified Turkish or Mongolian) are depicted with Christian imagery, bloodletting crucified victims. Meanwhile, the Bene Gesserit wear headscarves that look European Christian (with the exception of a beaded Orientalist veil).

The film dilutes Herbert’s anti-imperialist vision in other ways, too. One of the novel’s essential scenes involves a banquet where stakeholders debate the ecological treatment of Fremen. It was the only scene Herbert requested (unsuccessfully) for the David Lynch adaptation. Disgusted with McCarthyism’s bipartisanship, Herbert wrote the scene to expose corruption across political aisles: Liberals, too, are colonizers. One of them is the “Imperial Ecologist,” a half-Fremen named Liet Kynes who “goes native,” reforms the Fremen, and controls their environment. Herbert considered his death the “turning point” of the book: Swallowed by a sand formation even he cannot control, the ecologist realizes his hubris as the archetypal “Western man.”

There is no banquet in the movie; Kynes, played by a Black woman, dies in an act of triumphant defiance. The casting choice presented an incredible opportunity to explore how even subjugated people can participate in the oppression of others — a core theme of Herbert’s saga. Instead, the movie both inverts and reduces the ecologist’s character, simplifying Herbert’s critique of empire and cultural appropriation. It rests on an implicit premise: All dark-skinned people necessarily fit into an anti-colonial narrative, and racial identity easily deflects a character’s relationship to empire. The novel didn’t rely on such easy binaries: It interrogated the layered, particular ways that race, religion and empire can relate to each other.

Kynes’s depiction reflects the film’s broader worldview. It paints the Fremen as generic people of color, who are also generically spiritual. It sprinkles Brown and Black faces throughout the rest of the cast, with sparse attention to cultural or religious detail. The film does accentuate the novel’s critique of Paul as White savior, opening with the question, posed by a Fremen: “Who will our next oppressors be?” But the film fails to connect its abstract critique of messianism to anything resembling the novel’s deep cultural roots. It wants its audience to love the Atreides family and the ecologist — those banquet liberals — while keeping the Muslimness of “Dune” to a low whine.

Hans Zimmer’s score heightens the film’s cultural aimlessness. The music is vaguely religious, with primitive drums. (One hears the influence of Zimmer’s collaborator Edie Boddicker, who also worked with him on “The Lion King.”) The vocals sound like the “Lord of the Rings” hymns. The only distinctly Arab notes, during Paul’s education about Dune, are of “Aladdin” fare. These musical choices are particularly disappointing, given Villeneuve’s previous work. Over a decade ago, he made “Incendies,” a “Dune”-inspired movie that carefully explored MENA politics. Using Radiohead instead of “authentic” Arab music, Villeneuve aimed to interrogate the “westerner’s point of view” as an “impostor’s.” Imagine if Paul, Herbert’s impostor-savior, walked the desert to Zimmer’s cover of Pink Floyd?

This all feels like a missed opportunity. The film could have hired Muslim and MENA talent to lean into these influences, elevating the good and improving the bad. These artists could have developed Fremen custom further (which Herbert sometimes depicts as stereotypically rigid). What if they crafted language, dress and music, modifying traditional songs or prayers, improving Herbert’s “elisions” — or advanced this universe’s pervasive Islamic theology and eschatology?

On the planet Dune, it takes risk and creativity to cross the desert without attracting a worm’s notice: Fremen alter their regular gait to avoid being engulfed. Herbert was unafraid to explore the rich sands of Islamic and MENA histories, even if he made missteps. He put in the work.  

But the film usurps the ideas that shaped the novel. Seeking to save Muslim and MENA peoples from taking offense, Villeneuve — as Paul does to the Fremen — colonizes and appropriates their experiences. He becomes the White savior of “Dune.” Where Herbert danced unconventionally, the filmmakers avoid the desert entirely. But is it so hard to walk without rhythm?

https://www.washingtonpost.com/outlook/2021/10/28/dune-muslim-influences-erased/


 

from Aljazeera:

A quick look at Frank Herbert’s appendix to Dune, “the Religion of Dune”, reveals that of the “ten ancient teachings”, half are overtly Islamic. And outside of the religious realm, he filled the terminology of Dune’s universe with words related to Islamic sovereignty. The Emperors are called “Padishahs”, from Persian, their audience chamber is called the “selamlik”, Turkish for the Ottoman court’s reception hall and their troops have titles with Turco-Persian or Arabic roots, such as “Sardaukar”, “caid”, and “bashar”. Herbert’s future is one where “Islam” is not a separate unchanging element belonging to the past, but a part of the future universe at every level. The world of Dune cannot be separated from its language, and as reactions on Twitter have shown, the absence of that language in the movie’s promotional material is a disappointment. Even jihad, a complex, foundational principle of Herbert’s universe, is flattened – and Christianised – to crusade.

To be sure, Herbert himself defines jihad using the term “crusade”, twice in the narrative as a synonym for jihad and once in the glossary as part of his definition of jihad, perhaps reaching for a simple conceptual parallel that may have been familiar to his readership. But while he clearly subsumed crusade under jihad, much of his readership did the reverse.

One can understand why. Even before the War on Terror, jihad was what the bad guys do. Yet as Herbert understood, the term is a complicated one in the Muslim tradition; at root, it means to struggle or exert oneself. It can take many forms: internally against one’s own evil, externally against oppression, or even intellectually in the search for beneficial knowledge. And in the 14 centuries of Islam’s history, like any aspect of human history, the term jihad has been used and abused. Having studied Frank Herbert’s notes and papers in the archives of California State University, Fullerton, I have found that Herbert’s understanding of Islam, jihad, and humanity’s future is much more complex than that of his interpreters. His use of jihad grapples with this complicated tradition, both as a power to fight against the odds (whether against sentient AI or against the Empire itself), but also something that defies any attempt at control.

Herbert’s nuanced understanding of jihad shows in his narrative. He did not aim to present jihad as simply a “bad” or “good” thing. Instead, he uses it to show how the messianic impulse, together with the apocalyptic violence that sometimes accompanies it, changes the world in uncontrollable and unpredictable ways. And, of course, writing in the 1950s and 1960s, the jihad of Frank Herbert’s imagination was not the same as ours, but drawn from the Sufi-led jihads against French, Russian, and English imperialism in the 19th and mid-20th century. The narrative exhibits this influence of Sufism and its reading of jihad, where, unlike in a crusade, a leader’s spiritual transformation determined the legitimacy of his war.

In Dune, Paul must drink the “water of life”, to enter (to quote Dune) the “alam al-mithal, the world of similitudes, the metaphysical realm where all physical limitations are removed,” and unlock a part of his consciousness to become the Mahdi, the messianic figure who will guide the jihad. The language of every aspect of this process is the technical language of Sufism.

Perhaps the trailer’s use of “crusade” is just an issue of marketing. Perhaps the film will embrace the characteristically Islam-inspired language and aesthetics of Frank Herbert’s universe. But if we trace the reception of “the strong Muslim flavor” in Dune, to echo an editor on one of Herbert’s early drafts, we are confronted with Islam’s unfavorable place in America’s popular imagination. In fact, many desire to interpret Dune through the past, hungering for a historic parallel to these future events because, in their minds, Islam belongs to the past. Yet who exists in the future tells us who matters in our present. NK Jemisin, the three-time Hugo award-winning author, writes: “The myth that Star Trek planted in my mind: people like me exist in the future, but there are only a few of us. Something’s obviously going to kill off a few billion people of color and the majority of women in the next few centuries.”

Jemisin alerts us to the question: “Who gets to be a part of the future?”

Unlike many of his, or our, contemporaries, Herbert was willing to imagine a world that was not based on Western, Christian mythology. This was not just his own niche interest. Even in the middle of the 20th century, it was obvious that the future would be colored by Islam based on demographics alone. This is clearer today as the global Muslim population nears a quarter of humanity.

While this sounds like an alt-right nightmare/fantasy, Herbert did not think of Islam as the “borg”, an alien hive mind that allows for no dissent. Herbert’s Islam was the great, capacious, and often contradictory discourse recently expounded by Shahab Ahmed in his monumental book, What is Islam? Herbert understood that religions do not act. People act. Their religions change like their languages, slowly over time in response to the new challenges of time and place. Tens of thousands of years into the future, Herbert’s whole universe is full of future Islams, similar but different from the Islams of present and past.

Herbert countered a one-dimensional reading of Islam because he disavowed absolutes. In an essay titled “Science Fiction and a World in Crisis,” he identified the belief in absolutes as a “characteristic of the West” that negatively influenced its approach to crisis. He wrote that it led the “Western tradition” to face problems “with the concept of absolute control”. This desire for absolute control is what leads to the hero-worship (or “messiah-building”) that defines our contemporary world. It is this impulse that he sought to tear down in Dune.

In another essay, “Men on Other Planets,” Herbert cautions against reproducing cliches, reminding writers to question their underlying assumptions about time, society, and religion. He encourages them to be subversive, because science fiction “permits you to go beyond those cultural norms that are prohibited by your society and enforced by conscious (and unconscious) literary censorship in the prestigious arenas of publication”.

https://www.aljazeera.com/opinions/2020/10/11/paul-atreides-led-a-jihad-not-a-crusade-heres-why-that-matters


One of the absurdities of Dune: if the Fremen are a relatively simple tribe, modeled on the Bedouin [who at least had camels; the Fremen have no vehicles of any sort], how is it that they wear technologically advanced, moisture-conserving stillsuits?

 
*
STALIN'S PERSISTENT POPULARITY IN RUSSIA (March 2019)

~ “Burn in hell, executioner of the people and murderer of women and children!” shouted Yevgeny Suchkov, before snapping a red carnation and hurling it at a granite bust of Stalin. Police and Kremlin security officers reacted instantly, seizing him in a neck lock and dragging him away.

Stalin’s reputation has soared in Russia since Vladimir Putin, a former KGB officer, came to power in 2000. Busts and portraits of the Soviet dictator, once taboo, have reappeared across the country in recent years.

The decommunization movement, of which Suchkov is a member, by contrast wants to see reminders of the Soviet era removed from Russia’s streets, and the state archives relating to Stalin’s campaign of political terror opened to the public.

While all the attention was on Suchkov, his fellow activist, Olga Savchenko, stepped up to Stalin’s grave and calmly said: “Shame on the executioner.” She was also detained.

Suchkov, 21, said he had felt obliged to stage his protest because standing by and doing nothing in the face of an “homage to evil” would make him an accessory.

“And I have no intention of becoming an accessory to the evil that was Stalin and Stalinism,” he told the Guardian after his release from police custody. Savchenko, 25, said her great-grandfather was executed by the Soviet secret police in 1937 at the height of Stalin’s purges.

Both activists were ordered to pay a small fine, although they said police had not specified in their report exactly what offense they had been charged with.

Almost three decades on from the collapse of the communist system in Russia, thousands of metro stations, streets and squares across the country continue to bear the name of Soviet leaders and officials, while almost every town or city has a statue of Vladimir Lenin. Opinion polls indicate around 25% of Russians believe Stalin’s campaign of political terror, estimated to have killed some 20 million people, was “historically justified”.

Critics accused the decommunization movement of inflaming dangerous social tensions. “We must clamp down hard on this or tomorrow blood will flow across the whole country,” said Alexander Yushchenko, a Communist Party MP.

Knowledge about the Stalin era is patchy among young Russians. There is nothing in the official school curriculum about Stalinist terror, and children can go through their entire school years without hearing anything about the topic. Unsurprisingly, almost half of all Russians aged 18-24 know nothing at all about Stalin’s purges, according to an opinion poll published last year.

Ukraine implemented a decommunization program after protests overthrew the country’s pro-Moscow president in 2014. Hundreds of Soviet-era statues and monuments have since been toppled in the parts of the country under government control.

“There is no other method of overcoming the Soviet legacy,” said Alexander Polozun, a 20-year-old decommunization activist. ~

https://www.theguardian.com/world/2019/mar/08/homage-to-evil-russians-activists-detained-over-stalin-protest


Oriana:

Even though this is not a new article, on December 18 (Stalin’s birthday) I thought again of how Stalin, a mass murderer, is still adored in Russia.

I remember the term “de-Stalinization” from my childhood, but then Poland was not the Soviet Union. The average Pole hated Stalin. Not so in Russia.

Though there are still neo-Nazis in Germany and elsewhere, there are no statues of Hitler anywhere in the country. It is illegal to display the swastika, so German Nazis use the Confederate flag instead.

I know I’ve used this image before, perhaps more than once. To me it’s iconic: a reminder that if indoctrination is intense enough and lasts long enough, it persists.

*
LIFE OF THE SIMPSONS NO LONGER ATTAINABLE

~ The most famous dysfunctional family of 1990s television enjoyed, by today’s standards, an almost dreamily secure existence that now seems out of reach for all too many Americans. I refer, of course, to the Simpsons. Homer, a high-school graduate whose union job at the nuclear-power plant required little technical skill, supported a family of five. A home, a car, food, regular doctor’s appointments, and enough left over for plenty of beer at the local bar were all attainable on a single working-class salary. Bart might have had to find $1,000 for the family to go to England, but he didn’t have to worry that his parents would lose their home.

This lifestyle was not fantastical in the slightest—nothing, for example, like the ridiculously large Manhattan apartments in Friends. On the contrary, the Simpsons used to be quite ordinary—they were a lot like my Michigan working-class family in the 1990s.

The 1996 episode “Much Apu About Nothing” shows Homer’s paycheck. He grosses $479.60 per week, making his annual income about $25,000. My parents’ paychecks in the mid-’90s were similar. So were their educational backgrounds. My father had a two-year degree from the local community college, which he paid for while working nights; my mother had no education beyond high school. Until my parents’ divorce, we were a family of three living primarily on my mother’s salary as a physician’s receptionist, a working-class job like Homer’s.

By 1990—the year my father turned 36 and my mother 34—they were divorced. And significantly, they were both homeowners—an enormous feat for two newly single people.

Neither place was particularly fancy. I’d estimate that the combined square footage of both roughly equaled that of the Simpsons’ home. Their houses were their only source of debt; my parents have never carried a credit-card balance. Within 10 years, they had both paid off their mortgage.

Neither of my parents had much wiggle room in the budget. I remember Christmases that, in hindsight, looked a lot like the one portrayed in the first episode of The Simpsons, which aired in December 1989: handmade decorations, burned-out light bulbs, and only a handful of gifts. My parents had no Christmas bonus or savings, so the best gifts usually came from people outside our immediate family.

Most of my friends and classmates lived the way we did—that is, the way the Simpsons did. Some families had more secure budgets, with room for annual family vacations to Disney World. Others lived closer to the edge, with fathers taking second jobs as mall Santas or plow-truck drivers to bridge financial gaps. But we all believed that the ends could meet, with just an average amount of hustle.

Over the years, Homer and his wife, Marge, also face their share of struggles. In the first episode, Homer becomes a mall Santa to bring in some extra cash after he learns that he won’t receive a Christmas bonus and the family spends all its Christmas savings to get Bart’s new tattoo removed. They also occasionally get a peek into a different kind of life. In Season 2, Homer buys the hair-restoration product “Dimoxinil.” His full head of hair gets him promoted to the executive level, but he is demoted after Bart accidentally spills the tonic on the floor and Homer loses all of his new hair. Marge finds a vintage Chanel suit at a discount store, and wearing it grants her entrée into the upper echelons of society.

The Simpsons started its 32nd season this past fall. Homer is still the family’s breadwinner. Although he’s had many jobs throughout the show’s run—he was even briefly a roadie for the Rolling Stones—he’s back at the power plant. Marge is still a stay-at-home parent, taking point on raising Bart, Lisa, and Maggie and maintaining the family’s suburban home. But their life no longer resembles reality for many American middle-class families.

Adjusted for inflation, Homer’s 1996 income of $25,000 would be roughly $42,000 today, about 60 percent of the 2019 median U.S. income. But salary aside, the world for someone like Homer Simpson is far less secure. Union membership, which protects wages and benefits for millions of workers in positions like Homer’s, dropped from 14.5 percent in 1996 to 10.3 percent today. With that decline came the loss of income security and many guaranteed benefits, including health insurance and pension plans. In 1993’s episode “Last Exit to Springfield,” Lisa needs braces at the same time that Homer’s dental plan evaporates. Unable to afford Lisa’s orthodontia without that insurance, Homer leads a strike. Mr. Burns, the boss, eventually capitulates to the union’s demand for dental coverage, resulting in shiny new braces for Lisa and one fewer financial headache for her parents. What would Homer have done today without the support of his union?

The purchasing power of Homer’s paycheck, moreover, has shrunk dramatically. The median house costs 2.4 times what it did in the mid-’90s. Health-care expenses for one person are three times what they were 25 years ago. The median tuition for a four-year college is 1.8 times what it was then. In today’s world, Marge would have to get a job too. But even then, they would struggle. Inflation and stagnant wages have led to a rise in two-income households, but to an erosion of economic stability for the people who occupy them.

Last year, my gross income was about $42,000—the amount Homer would be making today. It was the second-highest-earning year of my career. I wanted to buy a home, but no bank was willing to finance a mortgage, especially since I had less than $5,000 to make a down payment. However, my father offered me a zero-down, no-interest contract. Without him, I would not have been able to buy the house. (In one episode, Homer's dad helps him with a down payment on his home.)

I finally paid off my medical debt. But after taking into account all of my expenses, my adjusted gross income was only $19. And with the capitalized interest on my student loans adding thousands to the balance, my net worth is still negative.

I don’t have Bart, Lisa, and Maggie to feed or clothe or buy Christmas presents for. I’m not sure how I’d make it if I did.

Someone I follow on Twitter, Erika Chappell, recently encapsulated my feelings about The Simpsons in a tweet: “That a show which was originally about a dysfunctional mess of a family barely clinging to middle class life in the aftermath of the Reagan administration has now become aspirational is frankly the most on the nose manifestations [sic] of capitalist American decline I can think of.”

For many, a life of constant economic uncertainty—in which some of us are one emergency away from losing everything, no matter how much we work—is normal. Second jobs are no longer for extra cash; they are for survival. It wasn’t always this way. When The Simpsons first aired, few would have predicted that Americans would eventually find the family’s life out of reach. But for too many of us now, it is. ~

https://www.theatlantic.com/ideas/archive/2020/12/life-simpsons-no-longer-attainable/617499/?utm_campaign=the-atlantic&utm_medium=social&utm_source=facebook&fbclid=IwAR2MilAn7Ht0OzcSOurJw22KfZTqTXogy9CVN8rtNd5CK6SH1xmUQ30vB2w

Mary:

In my family in the 1950s my father was the wage earner, my mother was at home, and there were, by 1963, seven children. We had nothing fancy, but everything we needed. No one went hungry or without clothes or shoes, and sacrifices were made to pay our tuition at Catholic schools. Parish schools, not private schools; the tuition was reasonable. When the first three of us went to high school we continued at Catholic school, but we worked to pay our tuition.

This all gradually changed, to the point that my mother did go out to work, even as we were older and moving into our adult lives. Five of us went to college, but all on combinations of scholarships, grants, loans and work study, our parents didn’t, and couldn’t have, paid those bills.

Everything is vastly different now. Less secure, more expensive, and many things now much less available. Few jobs are secure enough to last most of a lifetime; few families can afford so many kids and get them educated so well. Families that can live on one worker's wages are rare; in most, both partners work outside the home...and even with two salaries, expenses are such that most families carry quite a burden of debt. The only debt my parents carried was their mortgage; they never used credit. When I graduated from university I owed $2,000. This is laughable now, as people are graduating owing hundreds of thousands of dollars — debts so huge they have to postpone many things, like buying a home and starting a family. So many things are just simply out of reach.

Oriana:

One of the things that most impressed me about the US soon after my arrival was the ability of a factory worker to support his entire family with his salary. In Milwaukee I had a simple but filling hamburger lunch at the home of one such worker. He belonged to a union, of course. I forget now how many kids there were, 2 or maybe 3. The house was modest but the family owned it, the husband had a car while his wife drove an obviously cheaper car, perhaps bought used. Still, given where I was coming from, this seemed like a fairy tale. And to think that officially it was the Soviet Union that advertised itself as “the worker’s paradise.”

The first three decades after the end of WWII are regarded as a time of unprecedented prosperity in the US (and in the West in general). And it seemed like the standard of living would just keep on rising. Instead it was the cost of living that began to soar. Union busting became standard practice. One bit of hope I see is that unions seem to be coming back. But then most manufacturing is done abroad, and those good union wages were tied mainly to the manufacturing industry, once the dominant kind of employment.

*
CRIMES OF FASHION: CLOTHES USED TO BE VERY EXPENSIVE

~ Could something as mundane as a shirt ever be the motive for murder? What if clothing were more expensive than rent or a mortgage? In 1636 a maidservant, Joan Burs, went out to buy mercury. A toxic heavy metal, mercury damages the nervous system and can make it feel as if insects are crawling beneath the skin. Burs baked the poison into a milk posset (which often contained spices and alcohol that might have masked the bitter taste), planning to kill her mistress. She believed that if the lady of the house were dead, she herself might get better clothing.

The simplest kind of coat cost £1, which was 20 days’ labor for a skilled tradesman. Clothes were sometimes mentioned first in a will, since they could be more expensive than a house. Even the well-off, such as Samuel Pepys, remade and refashioned existing garments as much as they could rather than buying new.

It is no wonder, therefore, that there was a thriving black market for second-hand clothing of dubious provenance; much of the clothing worn by the middling and working classes essentially ‘fell off a cart’. The web of how such things were acquired could become extremely complex, as tinkers hawked both new and second-hand wares, and items were passed on or exchanged – not to mention the markets that thrived on the clothing trade. 

To supply the country’s insatiable demand for new clothes, thieves might strip drunk people on their way home from a night out, force doors, or even tear down walls. In urban areas in 17th-century England stolen clothes accounted for the most prosecutions of any crime. It was rare for anyone to commit (or attempt to commit) murder over an item of clothing, but the motivations for stealing were broad. Often, they were crimes of opportunity: freshly washed linen hung out to dry on hedges, awaiting capture from any passer-by.

Some thefts, however, were more complicated, involving acting and the tricks of the con-artist’s trade. One cold winter’s night (since it was the Little Ice Age, every winter’s night was cold), a teenage boy was sent on a simple errand. All he had to do was take some clothes – valued at about £4, no small sum – and deliver them to a gentleman across the city. Passing along Watling Street, a woman stopped him and demanded his name, his mother’s name, where he lived and what his errand was. He answered her questions and continued on his journey. Meanwhile, the woman passed all this information on to her partner-in-crime, who set off after the boy, hailing him by name and speaking of his mother. She asked him to buy a shoulder of mutton for her while she waited with the clothes. The boy did so, but returned to find no woman and no clothes. Such operations would have been immensely profitable and difficult to trace, as the stolen goods would have been sold on to the second-hand clothes dealers who supplied the whole country.

No member of society was safe from the theft of clothes. Perhaps the best-loved, and certainly one of the best-known, celebrities of the Elizabethan period (as well as being Elizabeth I’s personal jester) was the clown Richard Tarlton, known for his witty comebacks and cheeky persona. One night, while Tarlton was downstairs at an inn, wearing only his shirt and nightgown, drinking with some musician friends, a thief crept into his room and stole all his clothes. The story traveled around London to great hilarity and the clown was publicly mocked when he next performed onstage. However, Tarlton had the somewhat macabre last laugh, responding to the crowd with one of the impromptu verses that made him famous. He declared,

When that the theefe shall Pine and lacke,
Then shall I have cloathes to my backe:
And I, together with my fellowes,
May see them ride to Tiborne Gallowes.

Those caught stealing clothes were frequently hanged at Tyburn, known as ‘Tyburn tree’. (Executions were supposed to deter thieving.) Spending their last night at Newgate prison, they would be paraded through the streets in a horse and cart before a boisterous crowd, all jostling for the best view of the condemned and hanging on the thief’s last words. Ironically, the events were prime sites for pickpockets.

While clothing could be the motive for theft or murder because it was so difficult to come by, an accurate description by a witness of the perpetrator’s clothing could secure a conviction. For example, after Francis Terry stole wheat from a barn in 1626 he left a distinctive footprint that made identifying him easy. The print showed three indentations mapped to three nails on the sole of Terry’s right boot.

After other crimes, witnesses recalled a man in a red coat, wearing a hat with a hole in it, or dressed in grey clothes. Since many people only had one or two outfits, this was seen as positive proof and helped secure a conviction. Finally, in close communities where word of mouth was paramount, any change in clothing could arouse suspicion. Mary Watts gave the game away after allegedly stealing a silver bowl and some clothing, since she bought herself new clothes with the profits, to the shock of the community around her.

People in the 16th and 17th centuries had a relationship with clothing that is difficult to comprehend in an age of fast fashion, where clothes change with the seasons and any change in identity is instantly worn on the body. But, for early modern people, fashion was just as connected to identity. Most could not afford to change their clothes often, but their outfits became part of how they were seen and how they saw themselves. A change of clothing could provoke anger, hilarity, or even thoughts of murder.

https://www.historytoday.com/archive/history-matters/crimes-fashion?utm_source=pocket-newtab

Mary:
 
It is hard for us to imagine how hard clothes were to come by in the past, how very few most people had, how tempting they were to thieves as valuable objects. Old houses, both old workers' row houses and old Victorian city houses like the one my parents eventually bought, had cupboards for clothing. These cupboards were, however, very shallow, and could only accommodate a few garments.
 
That was because as a norm, people only had a few. I grew up this way through grade and high school. We wore uniforms every day, had some clothes to wear for after school play or work, and something for Sunday or special occasions. We didn't need deep closets.

There were advantages to this. You didn't have to spend time deciding what to wear, for one, and it made everyone more or less equal: the rich kids wore the same uniform as the poorer kids. No one was shamed for their clothes, for the lack of finery or the latest trend. Now clothes are generally cheap and part of our throwaway culture. They are poorly made and don't last long. Haute couture, good and well-made clothes, is as far from the ordinary person as the fineries of the upper class were in the days of nobles and peasants.

I think my upbringing stays with me in this. I usually have one purse I use every day until it wears out, and one or two dress ones for special occasions. Three or four pairs of shoes. When I see the quantities of these some women have I am somewhat taken aback, thinking what a bother it must be to always be choosing and changing -- especially shifting things from purse to purse all the time. I don't spend much time thinking about any of it.

Maybe my experience of these things is unusual, but I don't think it's far from most. The world of our parents is not the world we have now, and many of my generation and those after, found themselves with a standard of living lower than their parents could aspire to. 

Oriana:

When I came to this country, I was startled by the low price of clothes relative to food, for instance. Clothes were expensive in Poland, and having just two pairs of slacks, say, was not regarded as poverty. Having lots and lots of clothes wasn't even imaginable. 

I remember a guest scientist from Poland who stayed at my mother's house for a while. One day my mother pointed out that there was a sale at Sears, two pairs of women's pants for ten dollars (or whatever it was). The Polish woman said, "But I already have two pairs: one gray, and one black." 

My mother and I used to chuckle over this, having somehow already forgotten that there is no point accumulating clothes (generally cheap clothes that don't last) -- while not so long ago, in Poland, we wouldn't have understood why anyone needed more than two pairs of anything. 

This is a culture of excess. The voices of protest are few but becoming louder, it seems to me. There is, for instance, the Buy Nothing movement, and consumerism is increasingly condemned as a threat to the environment. Recently I came across two articles that suggested that adults stop buying Christmas gifts for other adults -- gifts should be just for children. It makes sense. What a relief it would be! 

Of course we don’t want a return of the world in which most people had only one outfit, with one coat for winter if they were lucky. Oh yes, let me not forget “Sunday clothes.” There must have been some people who didn’t go to church precisely because they couldn’t afford Sunday clothes. And the sight of children wearing rags didn’t seem to offend anyone. 

No, we certainly don’t want that world, but perhaps we need to imagine one in which clothing is less abundant but of good quality.

A public washing ground

*
HOW TO PRAY TO A DEAD GOD

~ On an evening in 1851, a mutton-chopped 28-year-old English poet and critic looked out at the English Channel with his new bride. Walking along the white chalk cliffs of Dover, jagged and streaked black with flint as if the coast had just been ripped from the Continent, he would recall that:

The sea is calm to-night.
The tide is full, the moon lies fair
Upon the straits; on the French coast, the light
Gleams, and is gone; the cliffs of England stand,
Glimmering and vast, out in the tranquil bay.

Matthew Arnold’s poem ‘Dover Beach’ then turns in a more forlorn direction. While listening to pebbles thrown upon Kent’s rocky strand, brought in and out with the night tides, the cadence brings an ‘eternal note of sadness in’. That sound, he thinks, is a metaphor for the receding of religious belief, as

The Sea of Faith
Was once, too, at the full, and round earth’s shore …
But now I only hear
Its melancholy, long, withdrawing roar,
Retreating, to the breath
Of the night-wind, down the vast edges drear
And naked shingles of the world.


Eight years before Charles Darwin’s On the Origin of Species (1859) and three decades before Friedrich Nietzsche’s Thus Spoke Zarathustra (1883-5) – with its thunderclap pronouncement that ‘God is dead’ – Arnold already heard religion’s retreat. 

Darwin’s theory was only one of many challenges to traditional faith, including the radical philosophies of the previous century, the discoveries of geology, and the Higher Criticism of German scholars who proved that scripture was composed by multiple, fallible people over several centuries. While in previous eras a full-throated scepticism concerning religion was an impossibility, even among freethinkers, by the 19th century it suddenly became intellectually possible to countenance agnosticism or atheism. The tide going out in Arnold’s ‘sea of faith’ was a paradigm shift in human consciousness.

What ‘Dover Beach’ expresses is a cultural narrative of disenchantment. Depending on which historian you think authoritative, disenchantment could begin with the 19th-century industrial revolution, the 18th-century Enlightenment, the 17th-century scientific revolution, the 16th-century Reformation, or even when medieval Scholastic philosophers embraced nominalism, which denied that words had any connection to ultimate reality. Regardless, there is broad consensus on the course of the narrative. At one point in Western history, people at all stations of society could access the sacred, which permeated all aspects of life, giving both purpose and meaning. During this premodern age, existence was charged with significance. 

At some point, the gates to this Eden were sutured shut. The condition of modernity is defined by the irrevocable loss of easy access to transcendence. The German sociologist Max Weber wrote in his essay ‘Science as a Vocation’ (1917) that the ‘ultimate and most sublime values have retreated from public life either into the transcendental realm of mystic life or into the brotherliness of direct and personal human relations,’ the result of this retraction being that the ‘fate of our times is characterized by rationalization and intellectualization and, above all, by the “disenchantment of the world”.’

A cognoscente of the splendors of modern technology and of the wonders of scientific research, Arnold still felt the loss of the transcendent, the numinous, and the sacred. Writing in his book God and the Bible (1875), Arnold admitted that the ‘personages of the Christian heaven and their conversations are no more matter of fact than the personages of the Greek Olympus’ and yet he mourned for faith’s ‘long, withdrawing roar’.

Some associated the demise of the supernatural with the elimination of superstition and all oppressive religious hierarchies, while others couldn’t help but mourn the loss of transcendence, of life endowed with mystery and holiness. Regardless of whether modernity was welcomed or not, this was our condition now. Even those who embraced orthodoxy, to the extremes of fundamentalism, were still working within the template set by disenchantment, as thoroughly modern as the rest of us. Thomas Hardy, another English poet, imagined a surreal funeral for God in a 1912 lyric, with his narrator grieving that

. . . toward our myth’s oblivion,
Darkling, and languid-lipped, we creep and grope
Sadlier than those who wept in Babylon,
Whose Zion was a still abiding hope.

How people are to grapple with disenchantment remains the great religious question of modernity. ‘And who or what shall fill his place?’ Hardy asks. How do you pray to a dead God?

The question was a central one not just in the 19th century, but among philosophers in the subsequent century, though not everyone was equally concerned. When it came to where, or how, to whom, or even why somebody should direct their prayers, Thomas Huxley didn’t see an issue.

A stout, pugnacious, bulldog of a man, the zoologist and anatomist didn’t become famous until 1860, when he appeared to debate Darwinism with the unctuous Anglican Bishop of Winchester, Samuel Wilberforce, at the University of Oxford. Huxley was the ever-modern man of science and a recipient of a number of prestigious awards – the Royal Medal, the Wollaston Medal, the Clarke Medal, the Copley Medal, and the Linnean Medal – all garnered in recognition of his contributions to science. By contrast, Wilberforce was the decorated High Church cleric, bishop of Oxford and dean of Westminster. The former represented rationalism, empiricism and progress; the latter the supernatural, traditionalism and the archaic. 

Unfortunately for Wilberforce, Huxley was on the side of demonstrable data. In a room of dark wood and taxidermied animals, before an audience of a thousand, Wilberforce asked Huxley which side of the esteemed biologist’s family a gorilla was on – his grandmother’s or his grandfather’s? Huxley reportedly responded that he ‘would rather be the offspring of two apes than be a man and afraid to face the truth.’ The debate was a rout.

Of course, evolution had implications for any literal account of creation, but critics like Wilberforce really feared the moral implications of Huxley’s views. Huxley had a rejoinder. Writing in his study Evolution and Ethics (1893), he held that ‘Astronomy, Physics, Chemistry, have all had to pass through similar phases, before they reached the stage at which their influence became an important factor in human affairs’ and so too would ethics ‘submit to the same ordeal’.

Rather than relying on ossified commandments, Huxley believed that reason ‘will work as great a revolution in the sphere of practice’. Such a belief in progress was common among the 19th-century intelligentsia, the doctrine that scientific knowledge would improve not just humanity’s material circumstances but their moral ones as well. 

What, then, of transcendence? Inheritors of a classic, English education, both Huxley and Wilberforce (not to mention Arnold) were familiar with that couplet of the poet Alexander Pope, rhapsodizing Isaac Newton in 1730: ‘Nature, and Nature’s laws lay hid in night. / God said, Let Newton be! and all was light!’ For some, the answer to what shall fill God’s place was obvious: science.

The glories of natural science were manifold. Darwin comprehended the ways in which moths and monkeys alike were subject to the law of adaptation. From Newton onward, physicists could predict the parabola of a planet or a cricket ball with equal precision, and the revolution of Antoine Lavoisier transformed the alchemy of the Middle Ages into rigorous chemistry. By the 19th century, empirical science had led to attendant technological wonders; the thermodynamics of James Clerk Maxwell and Lord Kelvin gave us the steam engine, while the electrodynamics of Michael Faraday would forever (literally) illuminate the world. Meanwhile, advances in medicine from experimentalists such as Louis Pasteur ensured a rise in life expectancy.

Yet some were still troubled by disenchantment. Those like Arnold had neither the optimism of Huxley nor the grandiosity of Pope. Many despaired at the reduction of the Universe to a cold mechanization – even when they assented to the accuracy of those theories. Huxley might see ingenuity in the connection of joint to ligament, the way that skin and fur cover bone, but somebody else might simply see meat and murder. Even Darwin would write that the ‘view now held by most physicists, namely, that the Sun with all the planets will in time grow too cold for life … is an intolerable thought.’ Such an impasse was a difficulty for those convinced by science but unable to find meaning in its theories. For many, purpose wasn’t an attribute of the physical world, but rather something that humanity could construct.

Art was the way out of the impasse. Our prayers weren’t to be oriented towards science, but rather towards art and poetry. In Literature and Dogma (1873), Arnold wrote that the ‘word “God” is … by no means a term of science or exact knowledge, but a term of poetry and eloquence … a literary term, in short.’ Since the Romantics, intellectuals affirmed that in artistic creation enchantment could be resurrected. Liberal Christians, who affirmed contemporary science, didn’t abandon liturgy, rituals and scripture, but rather reinterpreted them as culturally contingent. 

In Germany, the Reformed theologian Friedrich Schleiermacher rejected both Enlightenment rationalism and orthodox Christianity, positing that an aesthetic sense defined faith, while still concluding in a 1799 address that ‘belief in God, and in personal immortality, are not necessarily a part of religion.’  

Like Arnold, Schleiermacher saw ‘God’ as an allegorical device for introspection, understanding worship as being ‘pure contemplation of the Universe’. Such a position was influential throughout the 19th century, particularly among American Transcendentalists such as Henry Ward Beecher and Ralph Waldo Emerson.

Lyman Stewart, the Pennsylvania tycoon and co-founder of the Union Oil Company of California, had a different solution to the so-called problem of the ‘death of God’. Between 1910 and 1915, Stewart convened conservative Protestant ministers across denominations, including Presbyterians, Baptists and Methodists, to compile a 12-volume set of books of 90 essays entitled The Fundamentals: A Testimony to the Truth, writing in 1907 that his intent was to send ‘some kind of warning and testimony to the English-speaking ministers, theological teachers, and students, and English-speaking missionaries of the world … which would put them on their guard and bring them into right lines again.’

Considering miracles of scripture, the inerrancy of the Bible, and the relationship of Christianity to contemporary culture, the set was intended to be a ‘new statement of the fundamentals of Christianity’. Targets included not just liberal Christianity, Darwinism and secular Bible scholarship, but also socialism, feminism and spiritualism. Writing about the ‘natural view of the Scriptures’, which is to say a secular interpretation, the contributor Franklin Johnson oddly echoed Arnold’s oceanic metaphor, writing that liberalism is a ‘sea that has been rising higher for three-quarters of a century … It is already a cataract, uprooting, destroying, and slaying.’

Like many radicals, Stewart’s ministers – such as Louis Meyer, James Orr and C I Scofield – saw themselves as returning to first principles, hence their ultimate designation as being ‘fundamentalists’. But they were as firmly of modernity as Arnold, Huxley or Schleiermacher.

Despite their revanchism, the fundamentalists posited theological positions that would have been nonsensical before the Reformation, and their own anxious jousting with secularism – especially their valorization of rational argumentation – served only to belie their project.

Praying towards science, art or an idol – all responses to disenchantment, but not honest ones. Looking with a clear eye, Nietzsche formulated an exact diagnosis. In The Gay Science (1882), he wrote:

God is dead. God remains dead. And we have killed him … What was holiest and mightiest of all that the world has yet owned has bled to death under our knives: who will wipe this blood off us?


Nietzsche is sometimes misinterpreted as a triumphalist atheist. Though he denied the existence of a personal creator, he wasn’t in the mold of bourgeois secularists such as Huxley, since the German philosopher understood the terrifying implications of disenchantment. There are metaphysical and ethical ramifications to the death of God, and if Nietzsche’s prescription remains suspect – ‘Must we ourselves not become gods simply to appear worthy of it?’ – his appraisal of our spiritual predicament is foundational. Morning star of 20th-century existentialism, Nietzsche shared an honest acceptance of the absurdity of reality, asking how it is that we’re able to keep living after God is dead.

Another forerunner of existentialism was the Russian novelist Fyodor Dostoevsky, who had a different solution. The Brothers Karamazov (1879) enacts a debate about faith far more nuanced than the bloviating between Huxley and Wilberforce. Two brothers – Ivan and Alyosha – discuss belief; the former is a materialist who rejects God, and the latter is an Orthodox novice.

Monotheistic theology has always wrestled with the question of how an omnibenevolent and omnipotent God could allow for evil. Theodicy has proffered solutions, but all have ultimately proven unsatisfying. To imagine a God who either isn’t all good or isn’t all powerful is to not imagine God at all; to rationalize the suffering of the innocent is ethically monstrous. And so, as Ivan tells his brother, God himself is ‘not worth the tears of that one tortured child’. Finally, Alyosha kisses his brother and departs. Such an enigmatic action is neither condescension nor concession, even though the monk agrees with all of Ivan’s reasoning. Rather, it’s an embrace of the absurd, what the Danish philosopher Søren Kierkegaard would call a ‘leap of faith’. It is a commitment to pray even though you know that God is dead.

Shūsaku Endō, in his novel Silence (1966), about the 17th-century persecution of Japanese Christians, asks: ‘Lord, why are you silent? Why are you always silent?’ Following the barbarity of the Holocaust and Hiroshima, all subsequent authentic theology has been an attempt to answer Endō. In the wars Nietzsche predicted, people confronted the new gods of progress and rationality, as the technocratic impulse made industrial slaughter possible. If disenchantment marked the anxieties of Romantics and Victorians, then the 20th-century dreams of a more fair, wise, just and rational world were dissipated by the smoke at Auschwitz and Nagasaki. Huxley’s fantasy was spectacularly disproven in the catastrophic splitting of the atom.

These matters were not ignored in seminaries, for as the journalist John T Elson wrote in Time magazine in 1966: ‘Even within Christianity … a small band of radical theologians has seriously argued that the churches must accept the fact of God’s death, and get along without him.’ That article was in one of Time’s most controversial – and bestselling – issues. Elson popularized an evocative movement that approached the death of God seriously, and asked how enchantment was possible during our age of meaninglessness. Thinkers who were profiled included Gabriel Vahanian, William Hamilton, Paul van Buren and Thomas J J Altizer, all of whom believed that ‘God is indeed absolutely dead, but [propose] to carry on and write a theology … without God.’ Working at progressive Protestant seminaries, the death of God movement, to varying degrees, promulgated a ‘Christian atheism’.

Radical theology is able to take religion seriously – and to challenge religion. Vahanian, a French Armenian Presbyterian who taught at Syracuse University in New York, hewed towards a more traditional vision, nonetheless writing in Wait Without Idols (1964) that ‘God is not necessary; that is to say, he cannot be taken for granted. He cannot be used merely as a hypothesis, whether epistemological, scientific, or existential, unless we should draw the degrading conclusion that “God is reasons”.’ 

Altizer, who worked at the Methodist seminary of Emory University in Atlanta, had a different approach, writing in The Gospel of Christian Atheism (1966) that ‘Every man today who is open to experience knows that God is absent, but only the Christian knows that God is dead, that the death of God is a final and irrevocable event and that God’s death has actualized in our history a new and liberated humanity.’ What unified disparate approaches is a claim from the German Lutheran Paul Tillich, who in his Systematic Theology, Volume 1 (1951) would skirt paradox when he provocatively claimed that ‘God does not exist. He is being-itself beyond essence and existence. Therefore, to argue that God exists is to deny him.’

What does any of this mean practically? Radical theology is unsparing; none of it comes easily. It demands an intensity, focus and seriousness, and more importantly a strange faith. It has unleashed a range of reactions in the contemporary era, ranging from an embrace of the cultural life of faith absent any supernatural claims, to a rigorous course of mysticism and contemplation that moves beyond traditional belief. For some, like Vahanian, it meant a critical awareness that the rituals of religion must enter into a ‘post-Christian’ moment, whereby the lack of meaning would be matched by a countercultural embrace of Jesus as a moral guide. Others embraced an aesthetic model and a literary interpretation of religion, an approach known as ‘theopoetics’.

Altizer meanwhile understood the death of God as a transformative revolutionary incident, interpreting the ruptures caused by secularism as a way to reorient our perspective on divinity.
In Beyond God the Father: Toward a Philosophy of Women’s Liberation (1973), the philosopher Mary Daly at Boston College deconstructed the traditional – and oppressive – masculine symbols of divinity, calling for an ‘ontological, spiritual revolution’ that would point ‘beyond the idolatries of sexist society’ and spark ‘creative action in and toward transcendence’. 

Daly’s use of such a venerable, even scriptural, word as ‘idolatries’ highlights how radical theology has drawn from tradition, finding energy in antecedents that go back millennia. Rabbi Richard Rubenstein, in his writing on the Holocaust, borrowed from the mysticism of Kabbalah to imagine a silent God. ‘The best interests of theology lie not in God in the highest,’ writes John Caputo in The Folly of God: A Theology of the Unconditional (2015), but in something ‘deeper than God, and for that very same reason, deep within us, we and God always being intertwined.’

Challenges to uncomplicated faith – or uncomplicated lack of faith – have always been within religion. It is a dialectic at the heart of spiritual experience. Perhaps the greatest scandal of disenchantment is that the answer of how to pray to a dead God precedes God’s death. Within Christianity there is a tradition known as ‘apophatic theology’, often associated with Greek Orthodoxy. 

Apophatic theology emphasizes that God – the divine, the sacred, the transcendent, the noumenal – can’t be expressed in language. God is not something – God is the very ground of being. Those who practiced apophatic theology – 2nd-century Clement of Alexandria, 4th-century Gregory of Nyssa, and 6th-century Pseudo-Dionysius the Areopagite – promulgated a method that has come to be known as the via negativa. According to this approach, nothing positive can be said about God that is true, not even that He exists. ‘We do not know what God is,’ the 9th-century Irish theologian John Scotus Eriugena wrote. ‘God Himself does not know what He is because He is not anything. Literally God is not’.

How these apophatic theologians approached the transcendent in the centuries before Nietzsche’s infamous theocide was to understand that God is found not in descriptions, dogmas, creeds, theologies or anything else. Even belief in God tells us nothing about God, this abyss, this void, this being beyond all comprehension. Far from being simple atheists, the apophatic theologians had God at the forefront of their thoughts, in a place closer than their hearts even if unutterable. This is the answer of how to pray to a ‘dead God’: by understanding that neither the word ‘dead’ nor ‘God’ means anything at all.

Eleven centuries before Arnold heard the roar of faith’s tide and Nietzsche declared that God was dead, the Hindu sage Adi Shankara recounted a parable in his commentary to the Brahma Sutras, a text that was already a millennium old. Shankara writes that the great teacher Bhadva was asked by a student what Brahman – the ground of all Being – actually was. According to Shankara, Bhadva was silent. Thinking that perhaps he had not been heard, the student asked again, but still Bhadva was quiet. Again, the student repeated his question – ‘What is God?’ – and, again, Bhadva would not answer. Finally, exasperated, the young man demanded to know why Bhadva would not respond to the question. ‘I am teaching you,’ Bhadva replied. ~

https://aeon.co/essays/how-to-fulfil-the-need-for-transcendence-after-the-death-of-god?utm_source=Aeon+Newsletter&utm_medium=email&utm_campaign=december_drive_2021&utm_content=newsletter_banner

Oriana: THAT WHICH IS THE HIGHEST

As you can imagine, this provoked many lengthy comments . . . words, words, words. I agree that at least some individuals have a deep need for enchantment, and I count myself among them. But enchantment does not require supernaturalism. The feeling of awe can be inspired by beauty, including the beauty of a scientific discovery and of a poetic masterpiece such as Dover Beach.

So yes, it all depends on the definition of “God.” And if “god” can be ten thousand different things, perhaps it would be logical to stop using the term.

That said, I like Ayn Rand’s definition: “That which is the highest.” Rand’s moral philosophy appalls me, but I admit that this definition of the divine makes sense to me. Again, though, why use the term “god” when we could be more precise and specify that we mean “that which is the highest” (to be followed by a specific individual answer: to a scientist, it might be science)? It’s more words, but the clarity is worth it.

*
Another issue is that prayer is one possible response to danger. People want a big and mighty protector “up there.” If not god, then at least their mother in heaven, praying that they drive without having an accident. Hence also the rosary dangling from the rear-view mirror, and similar “amulets” — much as our remote ancestors tried to protect themselves in shamanic religions.

Life is fragile, and it can get terrible. In the end, no one is spared from suffering, aging, and mortality. As Stephen Dunn put it in his Ars Poetica,

Maybe from the beginning
the issue was how to live
in a world so extravagant
it had a sky,
in bodies so breakable
we had to pray.

I can hardly bear the thought of how difficult life used to be in past centuries — and how helpless people felt. The more fear, the more religiosity. But technology and medicine kept advancing, so that today people feel a lot more secure. People keep attributing the decline of religion to science, but it’s really technology that has decreased our helplessness (as Milosz pointed out in one of his essays). Less fear, more empowering technology and effective medicine — that’s the recipe for less religiosity.

To be sure, we face new apocalyptic dangers, such as climate catastrophe. But at least in the developed countries, people realize that the way to deal with it is not by praying, but by switching to clean energy. When effective secular solutions exist, or at least can be developed, people don’t generally turn to prayer instead.

How to pray to the dead god? Those who have an emotional need for it may turn to the Universe as a kind of responsive and all-accepting deity. Whatever works.


Thomas JJ Altizer. 
His 2006 memoir is called Living the Death of God.

*

HOW COVID CAN AFFECT THE BRAIN

~ Months after a bout with COVID-19, many people are still struggling with memory problems, mental fog and mood changes. One reason is that the disease can cause long-term harm to the brain.

"A lot of people are suffering," says Jennifer Frontera, a neurology professor at the NYU Grossman School of Medicine.

Frontera led a study that found that more than 13% of hospitalized COVID-19 patients had developed a new neurological disorder soon after being infected. A follow-up study found that six months later, about half of the patients in that group who survived were still experiencing cognitive problems.

The current catalog of COVID-related threats to the brain includes bleeding, blood clots, inflammation, oxygen deprivation, and disruption of the protective blood-brain barrier. And there's new evidence in monkeys that the virus may also directly infect and kill certain brain cells.

Studies of brain tissue suggest that COVID-related changes tend to be subtle, rather than dramatic, says Geidy Serrano, director of the laboratory of neuropathology at Banner Sun Health Research Institute. Even so, she says: "Anything that affects the brain, any minor insult, could be significant in cognition."

Some of the latest insights into how COVID-19 affects the brain have come from a team of scientists at the California National Primate Research Center at UC Davis.

When COVID-19 arrived in the U.S. in early 2020, the team set out to understand how the SARS-CoV-2 virus was infecting the animals' lungs and body tissues, says John Morrison, a neurology professor who directs the research center.

But Morrison suspected the virus might also be infecting an organ that hadn't yet received much attention.

"Early on I said, 'let's take the brains,'" he says. "So we have this collection of brains from these various experiments and we've just started to look at them."

One early result of that research has generated a lot of interest among scientists.

"It's very clear in our monkey model that neurons are infected," says Morrison, who presented some of the research at the Society for Neuroscience meeting in November.

The monkey brains offer an opportunity to learn more because they come from a close relative of humans, are easier to study, and scientists know precisely how and when each animal brain was infected.

The monkey model isn't perfect, though. For example, COVID-19 tends to produce milder symptoms in these animals than in people.

Even so, Morrison says scientists are likely to find infected human neurons if they look closely enough.

"We're looking at individual neurons at very high resolution," he says, "so we can see evidence of infection."

The infection was especially widespread in older monkeys with diabetes, he says, suggesting that the animals share some important COVID-19 risk factors with people.

In the monkeys, the infection appeared to start with neurons connected to the nose. But Morrison says within a week, the virus had spread to other areas in the brain. 

"This is where you get into some of the neurologic symptoms that we see in humans," he says — symptoms such as cognitive impairment, brain fog, memory issues, and changes in mood. "I suspect that the virus is in the regions that mediate those behaviors."

That hasn't been confirmed in people. But other researchers have found evidence that the virus can infect human brain cells.

A draft of a study of brains from 20 people who died of COVID-19 found that four contained genetic material indicating infection in at least one of 16 areas studied.

And, similar to monkeys, the virus seemed to have entered through the nose, says Serrano, the study's lead author.

"There's a nerve that is located right on top of your nose that is called the olfactory bulb," she says. That provides a potential route for virus to get from the respiratory system to the brain, she says.

Serrano says the virus appears able to infect and kill nerve cells in the olfactory bulb, which may explain why many COVID patients lose their sense of smell — and some never regain it.

In other brain areas, though, the team found less evidence of infection.

That could mean that the virus is acting in other ways to injure these areas of the brain.

For example, studies show that the virus can infect the cells that line blood vessels, including those that travel through the brain. So when the immune system goes after these infected cells, it could inadvertently kill nearby neurons and cause neurological problems, Serrano says.

COVID-19 can also damage the brain by causing blood clots or bleeding that result in a stroke. It can damage the protective cells that create what's known as the blood-brain barrier, allowing entry to harmful substances, including viruses. And the disease can impair a person's lungs so severely that their brain is no longer getting enough oxygen.

These indirect effects appear to be a much bigger problem than any direct infection of neurons, Frontera says.

"People have seen the virus inside of brain tissue," she says. "However, the viral particles in the brain tissue are not next to where there is injury or damage."

Frontera suspects that's because the virus is a "bystander" that doesn't have much effect on brain cells. But other scientists say the virus may be cleared from brain areas after it has caused lasting damage.

Researchers agree that, regardless of the mechanism, COVID-19 presents a serious threat to the brain.

Frontera was part of a team that studied levels of toxic substances associated with Alzheimer's and other brain diseases in older COVID patients who were hospitalized.

"The levels were really high, higher than what we see in patients that have Alzheimer's disease," Frontera says, "indicating a very severe level of brain injury that's happening at that time." 


It's not clear how long the levels remain high, Frontera says. But she, like many researchers, is concerned that COVID-19 may be causing brain injuries that increase the risk of developing Alzheimer's later in life.

Even COVID-19 patients who experience severe neurological problems tend to improve over time, Frontera says, citing unpublished research that measured mental function six and 12 months after a hospital stay.

"Patients did have improvement in their cognitive scores, which is really encouraging," she says. 

But half of the patients in one study still weren't back to normal after a year. So scientists need to "speed up our processes to offer some kind of therapeutics for these people," Frontera says. 

Also, it's probably important to "treat that person early in the disease rather than when the disease has advanced so much that it has created damage that cannot be reversed," Serrano says. 

All of the researchers mentioned that the best way to prevent COVID-related brain damage is to get vaccinated.

https://www.npr.org/sections/health-shots/2021/12/16/1064594686/how-covid-threatens-the-brain


*

LETTUCE IS AN ANTI-DEPRESSANT: FOOD THAT HELPS THE BRAIN

~ The full list of foods with purported mental-health benefits is expansive, but vegetables, organ meats (like liver), fruits, and seafood took the top four categories.

No single food has magical powers, however. “We want to shift [the conversation away] from singular foods and diets and into talking about food categories,” says Ramsey. His study, for example, found that spinach, Swiss chard, kale, and lettuce contain the highest antidepressant nutrients per serving, but that it didn’t really matter which leafy green you ate—what matters is that leafy greens are a regular part of your food intake.

“As a clinical psychiatrist, it’s intriguing to think about food interventions and how they could shift an entire organism,” says Ramsey. “What happens if I get someone using food for a more diverse microbiome, lower overall inflammation, and more connection to a sense of self-care? Those are all great things for someone struggling with mental and brain health.”

These findings could have a big impact. Worldwide, 4 percent of men and 7 percent of women suffer from depression, and the disorder can affect all facets of life, including productivity and athletic performance. Nutrition is just one piece of the mental-health puzzle, but it has researchers excited. “I really am a big fan of responsibly using medications and effective talk therapy to treat depression,” says Ramsey. “But [focusing on] diet allows us to empower patients to think about their mental health as tied to nutrition.” ~

https://getpocket.com/explore/item/how-your-diet-affects-your-mental-health?utm_source=pocket-newtab

Oriana:

Sure, everyone's heard that "fish is brain food." But what about spinach and other leafy greens, including, yes, lettuce? I find that the very act of preparing healthy food is mood-enhancing. It means that you have value, and are worth the time and the expense it takes to provide the best food. Likewise, you are too precious to be consuming junk food.


*

ending on beauty:

If the universe is—this is the latest—
bouncing between inflation
and shrinkage, as if on a trillion-year
pendulum, why wouldn’t

an infant’s sobbing, on the exhale,
have a prosody
as on the inhale have the chemistry
of tears and seas

~ Ange Mlinko, “This Is the Latest”

