Saturday, October 30, 2021


Lee Sangsoo, Sculpture, 2021


You said: “I’ll go to another country, go to another shore,
find another city better than this one.
Whatever I try to do is fated to turn out wrong
and my heart lies buried like something dead.
How long can I let my mind molder in this place?
Wherever I turn, wherever I look,
I see the black ruins of my life, here,
where I’ve spent so many years, wasted them, destroyed them totally.”

You won’t find a new country, won’t find another shore.
This city will always pursue you.
You’ll walk the same streets, grow old
in the same neighborhoods, turn gray in these same houses.
You’ll always end up in this city. Don’t hope for things elsewhere:
there’s no ship for you, there’s no road.
Now that you’ve wasted your life here, in this small corner,
you’ve destroyed it everywhere in the world.

~ C.P. Kavafy, tr. Edmund Keeley



Yes and no. I can think of one particular city connected with the most disastrous years of my life (though those were also years of intellectual and creative growth and development): 

Wherever I turn, wherever I look,
I see the black ruins of my life

The last time I drove through that city on my way to Los Angeles, I burst out crying. The names of street exits were enough. 

It felt good to move elsewhere, where the streets did not remind me of anything, a blank place that felt like a new beginning. 

And different cities have different things to offer. Exploring the new is a special kind of joy that helps one recover from the memories of the place where one was unhappy.

A snowman in Istanbul (note the nipples and navel). This year's Halloween decorations seem to fuse Halloween and Christmas (e.g. a "snowman" made of carved pumpkins).


~ James Joyce, who wrote of his worsening vision in 1931 that "I deserve all this on account of my many iniquities", was trying to confess that he was suffering from syphilis, according to new evidence uncovered by a Harvard scholar, which could upset current perspectives on the author's life and fiction.

Kevin Birmingham, a lecturer in history and literature at Harvard University, claims in his forthcoming history of Joyce's Ulysses, The Most Dangerous Book, that Joyce was going blind because he was suffering from syphilis – "his eye attacks were recurrent because syphilis advances in waves of bacterial growth and dormancy". The array of symptoms Joyce described in detail to his correspondents, "the abscesses that ravaged his mouth and the large 'boil' on his shoulder", were probably syphilitic, writes Birmingham. "Syphilis 'disabled' his right arm in 1907", and the psychological toll of the disease "likely caused Joyce's periodic fainting spells, his insomnia and his 'nervous collapses'", according to the scholar.

Rumors that Joyce had syphilis were circulating during the author's lifetime, but first hit print in 1975, when a biography of Joyce claimed he had congenital, rather than acquired, syphilis. In 1995, Kathleen Ferris's James Joyce and the Burden of Disease also made the claim – but she "makes a lot of assumptions and counts literary evidence as biographical evidence: if one of Joyce's characters had syphilis, that meant Joyce had syphilis", said Birmingham. Her work was "openly ridiculed" by the medical doctor JB Lyons (who knew Joyce), and the issue "faded away", according to Birmingham, who added that "to this day there are prominent Joyceans who haven't even heard of the debate”.

He himself stumbled across the pieces of the jigsaw which led him to his diagnosis while reading of Joyce's symptoms in Richard Ellmann's biography of the author. "There are very few ailments that cause decades of recurrent anterior uveitis (the current term for 'iritis'). Syphilis was by far the most common at the time, and yet Ellmann doesn't even mention the possibility that syphilis caused it," said Birmingham, whose book is published later this week by Head of Zeus, in time for Bloomsday on 16 June, the annual celebration of the day on which Ulysses' protagonist Leopold Bloom wandered the streets of Dublin.

The Harvard scholar decided to "turn over every stone" to find out what might have caused Joyce's deteriorating vision, compiling references to every symptom and treatment the author had. One item in particular sparked his curiosity: Joyce's reference in two separate 1928 letters to the injections of arsenic and phosphorous he was receiving.

"It wasn't too long before I found a medication that fit: galyl, a compound of arsenic and phosphorus that doctors injected multiple times. Galyl was only used to treat syphilis," said Birmingham.

The drug is obscure, and Birmingham believes Joyce opted for this treatment, rather than the more effective drug salvarsan, because one of salvarsan's side effects was that it could further damage his eyesight – and Joyce hated the idea of having to dictate his work. Today, syphilis is treated with a shot of penicillin.

"The more challenging part was making sure that galyl was the only injectable medication of arsenic and phosphorus. I had to prove a negative. Early 20th-century medicine isn't my area of expertise, so I contacted a couple of librarians at Harvard's Library of Medicine and the Centre of the History of Medicine, and they helped me search. I went through various pharmacopeias and national formularies (French, British and American) and couldn't find any other example of injectable medication of arsenic and phosphorus," said Birmingham.

Joyce's doctor, Louis Borsch, whom he had been seeing regularly for years, was treating him – ineffectively – for syphilis, according to the academic. Borsch "was reputable, and he knew what he was doing," said the scholar. So "the only way Joyce didn't have syphilis is if his examining doctor was somehow wrong. Syphilis is not difficult to diagnose, and you don't give someone three weeks of injections of an anti-syphilitic medication unless you're confident in your diagnosis.

"Add to Joyce's treatment (and his penchant for prostitutes) the fact that syphilis is virtually the only reasonable explanation for Joyce's decades of symptoms, and it seems rather difficult to refute.”

The disease clearly preoccupied the author – in his story The Sisters, part of Dubliners, he writes of the death of a priest whose illness, which "affected his mind", is sometimes diagnosed as the final stages of syphilis. "Every night as I gazed up at the window I said softly to myself the word paralysis," says Joyce's boy narrator. "But now it sounded to me like the name of some maleficent and sinful being. It filled me with fear, and yet I longed to be nearer to it and to look upon its deadly work." In Ulysses, he writes: "Thrust syphilis down to hell and with him those other licensed spirits", and warns that Dublin's "nighttown" and its "women of ill fame" is "a regular deathtrap for young fellows”.

Birmingham believes his diagnosis "gives us a very different vision" of the author. "Without the diagnosis, Joyce's letters make him seem as if he's a grousing hypochondriac or someone who just isn't particularly healthy. The truth is that he was in serious pain," said the academic. "He suffered deeply and privately, and the chasm between his private affliction and his public life helped to shape the way he wrote … How the news of a syphilitic Joyce changes the way we understand his life and his fiction is just beginning. This is what makes scholarship so exciting.”

Mark Traynor, manager of the James Joyce Centre in Dublin, said that he had "heard mutterings" of the theory before, including in Ferris's work. "To be honest I've always categorized such theories as half-baked retrospective diagnosis. But in this case it does seem plausible," said Traynor. "My understanding is that pre-penicillin the disease was really prevalent but so taboo that no one talked about it. So it is certainly possible Joyce 'suffered in silence' … If this proves to be the case it certainly would add to our understanding of how deeply he suffered. Whether or not this is some new prism through which we read his work is doubtful, however.”

Professor Derek Attridge at the University of York said that Deborah Hayden had also supported the idea that Joyce had syphilis in her recent book Pox. "The two well-known letters Birmingham refers to are about the treatment for Joyce's problems with his eyesight, and the way they are phrased don't sound like someone admitting he has syphilis, but the identification of arsenic and phosphorus with galyl is interesting, and as far as I know, new," said Attridge. ~

I suspect this is everyone's favorite photo of Joyce. I had no idea that syphilis could affect the eyes.

from another source: 

~ Joyce was no stranger to seedy red-light slums. He spent much of his formative teenage years in the brothels of the infamous Monto district in Dublin, seemingly developing his carnal sensibilities. Many Joycean academics agree that even after he met Nora and moved to Trieste, the red-light romps continued. And living within spitting distance of a brothel? Talk about a dangling carrot.

This environment not only influenced Joyce’s future published writings, but likely his personal correspondence as well. Anyone who has glanced even briefly at the dirty letters Joyce wrote to Nora in 1909, while he was away from Trieste on business, would know that the man’s mind was a bona fide breeding ground for imaginings that are, even by today’s standards, shockingly obscene and perverse.

Most of [the prostitutes of Trieste] were down and out, unable to find employment elsewhere. They drank and smoked heavily. Archival records make mention of whores during Joyce’s time who drank seven liters of beer and one liter of wine per day and were “voiceless due to smoking.” Not to mention the sexually transmitted diseases that ran rampant despite the city’s two venereal-disease-dedicated hospital wards. Since the widespread use of antibiotics was still almost two decades away, there was simply no definitive cure for any sexually transmitted diseases.

In fact, some Joyce historians, Erik included, suspect that Joyce’s 1907 bout with rheumatic fever was actually something much more ominous. Based on archival medical records, they believe Joyce was long-suffering from a case of syphilis (that was perhaps contracted even before arriving in Trieste). This theory met fierce resistance from Joyce’s living relatives. Twelve years after his hospitalization in 1907, Joyce was referred to Dr. Grünbaum, who had set up a new medical studio on Piazza Unità d’Italia not far from Joyce’s flat on Via della Sanità. Dr. Grünbaum said later, “I knew him [Joyce] and had to treat him for a venereal disease. Infected both front and back. Disgusting… but he was a genius and revolutionized literature.”

Of course, anything resembling a real Joycean itinerary is long gone. Gone are the working-man dive bars where Joyce would binge drink and ponder his literary hardship. Gone are the pharmacies that would dole out cocaine and heroin like Tylenol. Gone are the dank bordellos and painted whores of the old Jewish ghetto where Joyce would roam. Gone is the ghetto itself. In fact, gone, almost in its entirety, is the tangled, old Città Vecchia where Joyce did some of his best sinning.

The fact is, all that is left in Trieste of James Augustine Aloysius Joyce are the warm and fuzzy bits: a museum, some commemorative plaques embedded into various buildings, a couple of shiny, overpriced cafes (like the San Marco), and a bronze statue by the grand canal. All that has been immortalized here is the literary genius, not the perversion. Erik mentioned something interesting the night I met him. He said that Trieste was the last place that Joyce lived as a real person; that once he moved on to Paris, and became famous, he became a myth. ~

~ In 1917, while walking down a street in Zurich, James Joyce suffered an “eye attack” and remained frozen in agony for twenty minutes. Lingering pain left him unable to read or write for weeks. Joyce had endured at least two previous attacks, and after the third he allowed a surgeon to cut away a piece of his right iris in order to relieve ocular pressure. Nora Barnacle, Joyce’s partner, wrote to Ezra Pound that following the procedure Joyce’s right eye bled for days.

Joyce was suffering from a case of glaucoma brought on by acute anterior uveitis, an inflammation of his iris. It was, unfortunately, nothing new. Joyce’s first recorded bout of uveitis was in 1907, when he was twenty-five years old, and the attacks recurred for more than twenty years. To save his vision, Joyce had about a dozen eye surgeries (iridectomies, sphincterectomies, capsulectomies) — every one of them performed without general anesthetic.

He lay in dark rooms for days or weeks at a time, and his post-surgical eye patches became his trademark. Doctors applied leeches to siphon blood from his eyes. They gave him atropine and scopolamine, which cause hallucinations and anxiety, to dilate his pupils. They administered vapor baths, sweating powders, cold and hot compresses, endocrine treatment and iodine injections. They prescribed special diets (oatmeal and leafy vegetables) and warmer climates. They disinfected his eyes with silver nitrate, salicylic acid, and boric acid; instilled them with dionine to dissipate nebulae; and doused them with cocaine to numb the pain. Nothing really helped.

Uveitis raises intraocular pressure and produces a sticky exudate, which caused Joyce’s irises to attach to the lenses behind them. Sometimes the exudate was so thick that it congealed and blocked his pupil altogether — his future publisher Sylvia Beach remembered seeing his eye “covered by a sort of opaque curtain.” The increased pressure caused glaucoma, which eroded his optic nerve over the years, making his vision spotted, narrow, and dim.

By the age of forty-eight, Joyce’s left eye functioned at only one-eight-hundredth the normal capacity and his “good” eye at one-thirtieth. His eyeglasses prescription was +17 in both eyes — severely farsighted. One of the twentieth century’s great novelists often required a magnifying glass to read anyone’s writing, including his own. Each new attack brought him a step closer to blindness, and the consequent threat to his literary career contributed to a series of nervous breakdowns.

Joyce lived a thoroughly documented life, but the cause of his lifelong battle with uveitis has never been definitively named. Before the introduction of penicillin in the 1940s, the most common cause was syphilis, and because Joyce had begun visiting prostitutes at the age of fourteen, rumors began circulating that his chronic problems had been sexually transmitted. But the image of a syphilitic Joyce is one that few scholars have wanted to conjure in print. 

Richard Ellmann, Joyce’s preeminent biographer, had access to extensive biographical materials and didn’t even mention the possibility of syphilis — and yet he had no qualms diagnosing Oscar Wilde with syphilis despite questionable evidence. Joyce’s patron Harriet Shaw Weaver, his grandson Stephen, and Nora Barnacle all destroyed letters from Joyce, raising the question of whether allies were protecting Joyce’s reputation from the stigma of a dreadful disease.

Joyce’s medical history — which I’ve pieced together from decades of published and unpublished letters and documents — appears to be a painful journey through all of syphilis’s stages, beginning with his initial contraction in the red-light district of Dublin or Trieste, where Joyce lived for over a decade. In a diary, Joyce’s brother Stanislaus described him in 1907 as having not only inflamed eyes but also stomach problems and various “rheumatic” pains.  He was bedridden for weeks, and at the end of an illness lasting nearly three months, he walked around at an “invalid’s pace.”

After the first month of illness, Stannie wrote in his diary that his brother’s right arm had become “disabled,” and that it had remained that way for about a month while Joyce received electrotherapy treatment. What exactly Stannie meant by “disabled” has been a vexing question for Joyceans. If the joints in Joyce’s right arm were stiff and inflamed, he may have had a form of rheumatism. But if his arm was paralyzed or “paretic” (partially paralyzed), then it may have been a symptom of neurosyphilis. This would mean that the spirochete had begun to attack Joyce’s nervous system — presumably nerves in his right shoulder or arm.

As it turns out, Joyce’s right shoulder had a curious little medical history all its own. Years later, Joyce complained of pain in that shoulder and claimed that his right deltoid muscle had atrophied. In the midst of more eye troubles in 1928, his right shoulder had what he called a “large boil.” For anyone hunting for signs of syphilis, the boil sounds like a late-syphilitic lesion: they occur asymmetrically on the body, are typically large, and sometimes merge to form a single wound.

By 1922, Joyce and his friends had begun subscribing to a now-discredited theory of infection postulating that various ailments were caused by infections migrating outward from just a few bodily sources, particularly the oral cavity. So they started blaming Joyce’s uveitis on his bad teeth, and in two harrowing visits in 1923 a dentist extracted seventeen teeth, seven abscesses, and a cyst from Joyce’s mouth. His eye problems nevertheless continued, suggesting that syphilis may also have caused Joyce’s dental calamity. The disease, after all, frequently causes oral ulcers as well as periodontitis (a reason to extract affected teeth). A syphilitic Joyce would presumably have had colonies of Treponema in his weak eyes, his right shoulder, and his wretched mouth.

In 1980, a comparative literature Ph.D. named Vernon Hall and a medical doctor named Burton Waisbren reread Ulysses with a syphilitic author in mind, and they found syphilis everywhere — in Stephen Dedalus’s “somewhat troubled” sight, in Leopold Bloom’s verbal lapses, in the death of Bloom’s infant son. They list apparent references to syphilitic symptoms throughout Ulysses, among them “Stephen grimaces,” “Bloom and bowel problems,” “Bloom blunders stiff-legged.”

In 1995, Kathleen Ferris, then an assistant professor at Lincoln Memorial University, published James Joyce and the Burden of Disease, the first book laying out a case for a syphilitic Joyce. Ferris swept through Joyce’s biography and works, venturing much further into the issue than anyone before, and her conclusions were ambitious. She argued that Nora Barnacle had serious syphilitic complications, that their daughter Lucia suffered from insanity brought on by neurosyphilis, and that Joyce developed a form of advanced neurosyphilis — tabes dorsalis — which causes a distinctive doddering gait called locomotor ataxia.

Hall and Waisbren had tabes in mind when they noted the staggering Stephen and the stiff-legged Bloom, and Ferris used tabes to explain Joyce’s peculiar habit of walking around with an ashplant cane. She went on to suggest that tabes left Joyce impotent and incontinent before giving him the intestinal ulcer that killed him at the age of fifty-eight.

Estimates of syphilis rates in European cities were roughly 10 percent in Joyce’s day. ~


If Joyce was an alcoholic and a syphilitic, that makes him no exception among the creative crowd of the pre-penicillin era. Nor does it mean we should think less of his literary achievement. While Ulysses and, even more so, Finnegans Wake are regarded either as masterpieces or as unreadable, his short stories and Portrait of the Artist as a Young Man still enjoy wide readership. Even those who don't consider him a genius agree that he was a wonderful writer in his earlier work, and a master stylist. And ultimately it's the work that counts. The same could be said of Baudelaire, who remains one of my favorite poets.

By the way, Joyce's fear of blindness was pretty much fulfilled. He lost the use of one eye, and toward the end had only 10% vision in the other one.


I'm thinking there are some diseases so prevalent during certain historical periods that they shape everything, from art and literature to fashion. In the nineteenth century in Europe and the US that disease was tuberculosis. Certainly before the advent of antibiotics syphilis was not only widespread; it was a disease that affected whole families and generations, and since there were no effective remedies it would persist as a chronic illness through all its stages to the final stage of neurological destruction, general paresis. The mores of the times, where the wife was the "angel of the house" and sexual passions were a matter for whores and courtesans, left many wives infected by their poxy mates, and babies born with congenital syphilis. A social tragedy.

People seem to shy away from thinking a great artist may have been so afflicted, and there is a moralization in that reluctance. STDs continue to incur the stigma of sin, even now. Joyce's doctor voiced that attitude, saying Joyce was "Infected both front and back. Disgusting… but he was a genius.”

Such a disease, both widespread and taboo, hidden and unacknowledged, certainly complicates our impression of the man, his life and work. I would agree it makes his suffering more plausible and understandable, but I would caution against seeing his work through this prism alone. It's part of the picture, but not the only part, nor the most important one.

[Paresis — inflammation of the brain in the later stages of syphilis, causing progressive dementia and paralysis]


Yes, tuberculosis and syphilis — interesting how certain centuries can be thought of in terms of their most widespread diseases, which then take on a certain symbolic, metaphorical quality. Tuberculosis was regarded as a sign of sensitivity; there was something romantic and artistic about having TB. Syphilis, on the other hand, obviously pointed to indulging one’s “animal instincts” with prostitutes. And there’s nothing positive to be said about slowly going blind, paralyzed, and demented. 

As for TB, the other great plague of the 19th century, every year it killed off 1% of the British population. All the Brontë sisters died of it. Branwell may have had it as well, though his alcoholism became his primary killer.

That’s a very acute perception — that certain diseases practically define an era. Can anyone think of the Middle Ages without bringing the Black Death and similar recurrent epidemics into it? And if people are terrified of the plague, it’s not really that strange that they flee the cities or participate in the processions of flagellants, trying to do penance for the sins they imagined brought on the disease. And syphilis was indeed spread chiefly by prostitutes, whom Tolstoy regarded as absolutely necessary to preserve “the purity of marriage.” 

Today the tragedy is that while syphilis can be treated and completely cured, a portion of the population is so deeply distrustful of medicine that they won’t go to a doctor, and we are experiencing a strange surge of syphilis again (though nothing compared with the nineteenth century).
Imagine, a treatment exists, but some people refuse it! Certainly parallels with Covid are hard to resist . . . 


~ Scotland has the highest proportion of ginger-haired people in the world, with 13% of the general population endowed with red hair. Ireland, by the way, has the second-highest incidence of redheads, at 10%. In both Ireland and Scotland, over 40% of the population carry the recessive gene for red hair, meaning that Irish and Scots are significantly more likely to be red-haired than the average of 2 to 6% for other people of northern and western European ancestry.

But red hair is not limited to the northwest corner of Europe. It occurs regularly, albeit generally with lesser frequency, among Polynesians, Ashkenazi Jews, the Berbers of North Africa, and among the various peoples of the Middle East. Still, with an average occurrence of 1 to 2% across the whole of humanity, ginger is the rarest of hair colors. Which translates to between 70 and 140 million redheads the world over.

Because gingers are such a small, not to mention very visible minority, they have often been an easy target for taunts, discrimination and worse. Historical examples of ‘gingerism’ date as far back as ancient Egypt, where red-headed men were sacrificed to Osiris. Judas, who betrayed Jesus to the Romans, is often depicted as having red hair. In medieval Europe, red hair was frequently considered the mark of a witch, a werewolf or a vampire.

Even in more recent times, redheads were considered behavioral outliers – more temperamental and libidinous than ‘normal-haired’ people. A 19th-century survey ‘proved’ that 48% of so-called ‘criminal women’ (i.e. prostitutes) had red hair, to name but one now discredited example.

That is not to say there aren’t any demonstrable peculiarities about redheads. While the average adult has 120,000 hairs on their head, redheads only have about 90,000. Strangely enough, redheads have a different sensitivity to pain than non-gingers: they are more sensitive to thermal pain (heat and cold), but less sensitive to certain other sources of pain (including electrical current). They also require a dose of anesthetic up to 20% higher than others. And according to some sources, bees sting redheads more than non-redheads – a claim oft repeated but not corroborated anywhere.

So why does red hair even exist? Some scientists speculate that ginger hair (and the often accompanying lighter skin) evolved in dimly sunlit northern regions in order to enhance the body’s heat retention and vitamin D production. But other scientists prefer to think of gingers as the result of ‘genetic drift’: red hair (and lighter skin) occurs in sunnier climes as well, but the reduced tolerance to UV rays means that gingers are less likely to survive and thrive there.

‘Gingerism’ hasn’t stopped redheads from achieving great things. The list of famous gingers throughout history includes Cleopatra, Rurik (the Viking who founded a state the name of which refers to his red hair: Russia), Queen Elizabeth I, Emperor Frederick Barbarossa (hence the name – meaning redbeard), Genghis Khan, George Washington (and at least half a dozen other US presidents), Mark Twain, Vladimir Lenin, Malcolm X (a.k.a. Big Red), Sylvia Plath, Winston Churchill and Woody Allen.

Some of these honorary members of the Red-Headed League are strange bedfellows indeed. It’s hard to come up with any other club that would count both Lenin and Churchill among its members – or Genghis Khan and Woody Allen for that matter.

The geographic distribution of redheads across Europe is equally puzzling, as this map demonstrates. There are two ginger hotspots: the Celtic fringe of the British Isles (i.e. Scotland, Ireland and Wales), and an area deep inside Russia, somewhere between Yaroslavl and Kirov.

Is this a sign that Europe was once dominated by redheads, but that they were pushed aside by a migration wave coming from the Middle East – as the other zones of lesser redheadedness seem to indicate? Was it the milk-drinkers who did this? Perhaps the habitual taunting of redheads is a distant echo of the ancient victory over the gingerfolk by the blond and brown-haired invaders, comparable to the seemingly instinctive English reflex to ridicule the Welsh.

So how about a Redhead Nation – a Gingeristan? Scotland would seem like the natural gathering place for the world’s redheads; but with a second referendum on independence from Britain a possible consequence of Brexit, the Scots probably don’t want to complicate the issue for fear of losing their argument. Ireland as the world’s official ginger nation? It would only cause a gingerer than thou fight with Northern Ireland.

Perhaps the most appropriate place to found a land for redheads would be the most mysterious patch of red on this map – the gingerest part of Russia – itself named after a redhead.

Considering the endlessness of the Russian steppes, red-headedness is probably the most distinguishing feature this bit of Russia has got going for itself. So, to echo the slogans of two quite different but equally aspirational philosophical traditions: Gingers of the world, unite! Next year in Yaroslavl! ~


The highest concentration of redheads is in Scotland (13%) followed by Ireland (10%). Worldwide, only 2% of people have red hair.

People with red hair are likely more sensitive to pain. This is because the gene mutation (MC1R) that causes red hair also affects receptors involved in sensing pain. It also means redheads usually need more anesthesia for dental and medical procedures.

Having red hair isn't the only thing that makes some redheads unique. They are also more likely to be left handed. Both characteristics come from recessive genes, which like to come in pairs.

Redheads probably won't go grey. That's because the pigment just fades over time. So they will probably go blond and even white, but not grey.

Rumor says Hitler banned marriage between redheads. Apparently he thought it would lead to "deviant offspring."

Redheads most commonly have brown eyes. The least common eye color: blue.
Bees have been proven to be more attracted to redheads.

Being a redheaded man may have health benefits. A study published by the British Journal of Cancer suggested that men with red hair are 54% less likely to develop prostate cancer than their brown and blond-haired counterparts.

Redheads actually have less hair than most other people. On average they only have 90,000 strands of hair while blonds, for example, have 140,000. However, red hair is typically thicker so it still looks just as full.



~ Research shows red hair usually results from a mutation in a gene called MC1R, which codes for the melanocortin-1 receptor. The pigment found in red hair that makes it red is called pheomelanin.

But redheads as a group have more in common than their hair color: certain health conditions appear to be more common among people with red hair.


Redheads appear to be more sensitive to pain, and less sensitive to the kinds of local anesthesia used by dentists, recent research suggests.

A 2004 study found that redheads required significantly more anesthetic in order to block pain from an unpleasant electric stimulation.

Another study found that redheads are more sensitive to sensations of cold and hot, and that the dental anesthetic lidocaine is less effective for redheads.

The MC1R gene that can cause red hair codes for a receptor that is related to a family of receptors involved in perceiving pain, which may explain why mutations in MC1R would increase pain perception.


A 2009 study of more than 130,000 people who were followed for 16 years found that those with lighter hair colors were at increased risk for Parkinson's disease compared to those with black hair.

Redheads had the highest risk — they were nearly twice as likely to develop Parkinson's, compared to people with black hair.


A new study finds that mutations in the MC1R gene — which cause red hair, fair skin and poor tanning ability — also set up skin cells for an increased risk of cancer upon exposure to ultraviolet (UV) radiation.

The mutation prevents MC1R from properly binding to a gene called PTEN, which helps protect against cellular changes that promote cancer. As a result, after exposure to UV rays, PTEN is destroyed at a higher rate, and growth of pigment producing cells (called melanocytes) is accelerated as it is in cancer, the researchers said.

Because the study was conducted on mice and cells in a lab dish, more research is needed to see if the same mechanism occurs in people.

~ Carrying just one copy of the recessive MC1R variant appeared to be tied to a bump in the number of mutations linked to melanoma, the deadliest type of skin cancer. ~


Some women with red hair may be at increased risk for endometriosis, a condition in which tissue from the uterus grows outside the uterus, often resulting in pain.

A 2006 study of more than 90,000 women ages 25 to 42 found that those who had red hair and were fertile were 30 percent more likely to develop endometriosis compared to women with any other hair color.

However, redheads who were infertile had a reduced risk of endometriosis compared to those of any other hair color.


A 2012 study found children with rare birthmarks called Congenital Melanocytic Naevi were more likely to have the MC1R mutation that causes red hair than children without the birthmarks.

Congenital Melanocytic Naevi are brown or black birthmarks that can cover up to 80 percent of the body. About 1 in 20,000 children have large or multiple CMN.

Study researcher Dr. Veronica Kinsler, of Great Ormond Street Hospital in London, said: "If you have red hair in your family, these findings should not worry you, as changes in the red hair gene are common, but large CMN are very rare. So the changes do not cause the CMN to happen, but just increase the risk.”


Redheads are also more likely to be deficient in folate and folic acid. These nutrients are degraded by ultraviolet light, so pale skin, unless protected, increases the risk of folate deficiency.

Folate is important for the production of red blood cells and it participates in the production of DNA. Liver, eggs, and seafood are among good sources of folate. If you prefer a supplement, methyl-folate is the recommended form.

Redheads produce little eumelanin, the dark, protective form of melanin. The pigment that gives their hair its alluring hue is pheomelanin. Redheads also have pale skin because of those low levels of protective eumelanin. They don’t tan; some react to UV light by developing freckles. They are also more susceptible to skin cancer, and to UV-caused folate deficiency.

Historically, red hair was often associated with Jews; worse still, Judas was typically portrayed as a redhead. Mary Magdalene was also often depicted with luxuriant red hair, giving rise to the false perception that red hair was more common among prostitutes. Red-haired women were also more likely to be suspected of witchcraft, especially if they had green eyes.

Jacob’s twin brother, Esau, and King David are described as fiery redheads.

That Russia is named after a red-haired Viking king is a delightful historical irony.

“Redheads aren’t always fair-skinned. There are native redheads born in places like Papua New Guinea and Morocco who have darker skin. There’s even a Hawaiian word for Polynesians with red hair — ehu — who they believe are the descendants of fire gods.”

As for the crazy stereotypes associated with redheads, we need to remember that outside of the Celtic countries, red hair is rare, so a redhead stands out and becomes the “other.” Anyone who stands out, who is different, is likely to be subject to some completely unproven notions. 


On redheads: when I was working with surgical patients, it was generally assumed that redheads were "different" in their reactions to anesthesia, and might have more difficulty recovering from anesthesia post-op. I was never sure if this was more myth than fact, since many people can have sensitivities to anesthesia... but I guess that bit of lore had some basis in reality.


~ When I became a mother, I read parenting books obsessively because I wanted to do right by my children. But they made me feel like a failure and I wondered why. Delving into the lives and times of the experts, I decided that most of the advice they dish out flows from their personalities, their culture, and the limitations and biases of their eras and lives. I have no doubt that some of them were and are brilliant thinkers who have worked hard to improve the wellbeing of children. But too often, these parenting authors are treated with a reverence that defies criticism.

John Bowlby, the father of attachment theory, is a case in point. Bowlby had a typically British upbringing. For the upper middle classes at the start of the 20th century, that meant nannies and boarding school starting at age seven. His parents, Sir Anthony and Lady Mary Bowlby, subscribed to the then-common set of attitudes about children: too much attention and care could spoil them and turn them into egoistical and self-obsessed adults. When John’s beloved Nanny Minnie – a key attachment figure – left, he took the blow to heart.

Between his parents’ remote childrearing style and losing his nanny, Bowlby felt himself drawn to children who were somehow separated from their mothers. Attachment theory would be an important step towards understanding the way human beings related to one another, based on how their parents or caregivers related to them.

Drawing from psychology, ethology, psychotherapy and other disciplines, Bowlby’s ideas about human nature were monumental in scope. My point here is not to dispel attachment theory. It is simply to show that Bowlby’s experience of being parented contributed to his ideas on what an ideal relationship between mothers and children should look like.

Bowlby left the care of his own four children almost completely to their mother, Ursula, while he lost himself in his work. Thus, in an ironic twist of fate, Bowlby, having a distant father, became a distant father himself.

Then there’s Benjamin Spock, an athletic anti-war hero and the rare parenting expert who told mothers: ‘Trust yourself. You know more than you think you do.’ At first glance, Dr Spock seemed the perfect expert. An advocate for ‘permissive’ parenting – that is, following your baby’s cues and trying to fulfill their needs as much as possible – Spock broke with earlier traditions of rigid rules and feeding by the clock. In the 1940s, when his Common Sense Book of Baby and Child Care came out, he was considered an outsider in the parenting world, but he soon took over as an expert in the US and around the world.

As formidable as Spock’s persona was, he was not without fault. First of all, there’s the irony of telling mothers that they should trust their instincts – and then writing hundreds of pages giving them detailed advice. ‘Everywhere you turn there are experts telling you what to do,’ Spock says, conveniently forgetting that he is one of them. If mothers can follow their gut feelings, do they really need an expert, and a man at that, to tell them what to do?

Second, Spock subscribed to highly paternalistic and patriarchal ideals of motherhood. In early editions of his Common Sense book, the parent was addressed as ‘she’ and the child was always ‘he’. By 1976, Spock had changed pronouns throughout the book, using sometimes ‘he’ and sometimes ‘she’. The newest version went out of its way to include fathers as well as gay couples. Spock’s willingness to constantly revise his book and adapt it to the times is certainly inspiring. But if mothers had been trusted from the start, his advice wouldn’t have been necessary at all.

Attachment parenting, advocated by Dr William Sears and his wife Martha, a registered nurse, was one of the parenting methods I was drawn to myself. I gave birth naturally without pain relief the first two times around, breastfed, tried co-sleeping and babywearing – having your baby against your body in a sling throughout the day. But I found it impossible to follow all its tenets, and quickly burned out.

The influence of Sears was so big that in 2012, Time magazine devoted a whole issue to him asking: ‘Are You Mom Enough?’ What is less known is that the Searses were actually fundamentalist Christians. While the newest version of their tome, Attachment Parenting (2001), avoids any mention of religion or God, another book of theirs, The Complete Book of Christian Parenting and Child Care (1997), specifically targets Christian parents. This is where the Searses explain their belief that ‘attachment parenting’ is ‘God’s design for the father-mother-child relationship’. And, while Attachment Parenting claimed that this style of raising children was perfect for working mothers, in The Complete Book of Christian Parenting, the authors say that the best thing for women is to work from home, go part-time or borrow money, rather than go back to work full-time.

With all this in mind, attachment parenting seems less like a parenting strategy and more like the Searses’ desire to proselytize their beliefs to other parents.

My own rule, after reading several books on parenting, now is: ‘The more understanding the expert is towards children, the more brutal they will be on the parents.’ And there is no better example of that than the US author and lecturer Alfie Kohn. His attempt to understand children is formidable but he shows very little of that compassion for parents whom he accuses of being ‘controlling’.

In Punished by Rewards (1993), Kohn claims that the use of punishments and rewards should be limited, especially in relationships that are unequal, such as between parents and children, or between teachers and students. But he conveniently leaves out another unequal relationship: the one between parents (predominantly female) and experts, such as himself (predominantly male, at least in the US). And when it comes to parents, he has no problems using punitive tactics such as shaming: ‘If you’re unwilling to give up any of your free time, if you want your house to stay quiet and clean, you might consider raising tropical fish instead,’ he writes in Unconditional Parenting (2005).

In the UK, meanwhile, female experts such as Gina Ford, Penelope Leach, Tracy Hogg and lately, Philippa Perry, have come to the fore. Sadly, they can be just as patronizing toward parents as their male counterparts. For example, in The Book You Wish Your Parents Had Read (2019), Perry, a psychotherapist, encourages parents to ‘turn our shame into pride’; thus shame – an emotion she finds unacceptable for children – has its place for parents. But research shows that feeling shame can make people more depressed, no matter their age.

Caregivers in the 21st century don’t have one expert to follow, they have several, and they all espouse intensive, child-centered practices that focus on the parent-child relationship. In theory, this is not a bad thing. But in a world where parents get little to no support, and where caring for children is seen as a personal rather than a communal issue, intensive parenting practices can be impossible to follow. In fact, beliefs such as that mothering should be child-centered, and that children should be sacred and fulfilling to parents, can be associated with lower wellbeing in mothers.

Experts are sometimes surprised or even defensive when parents report feelings of immense guilt or shame after unsuccessfully trying to follow their advice. In an article about maternal guilt for the British newspaper The Independent in 2011, Leach, a psychologist, is quoted as saying: ‘If [my book Baby and Child (1977)] made you feel like that, why didn’t you bin it? You do rely on people to decide for themselves whether a book is useful to them or not.’

But if a parenting book has impacted whole generations of parents or can negatively affect their wellbeing, experts have a responsibility towards the parents they’re advising. What’s more, just like the dieting and wellness industries, the parenting industry relies on parents feeling inadequate so that they’ll keep looking for the next book that will solve all their problems. But, as I’ve found out the hard way, such a book does not exist.

This is not to say that all books for parents are problematic. Anthropologists such as David F Lancy, developmental psychologists such as Alison Gopnik, writers such as Janelle Hanchett or Sarah Menkedick, or economists such as Matthias Doepke, were also led by their personal experiences and expertise but, in their books on the history, culture and psychology of parenting, they offer a refreshing take that can make us feel a bit better.

Parenting books are not really for us, parents. They are for the people writing them. Perry’s book admits as much in its title, and repeats it in the text: ‘I wrote the book I wish I had read as a new parent, and I really wish my parents had read it.’ It has nothing to do with us as readers but everything to do with her feelings and experiences.

The idea is not to disparage parenting books as a category. If I were to write a book on parenting, it would certainly flow from my passion for languages, and my own experiences of raising children within several cultures. The solution is not to write or read fewer parenting books, or quit reading them altogether.

In fact, we might need more parenting books that offer a wider variety of perspectives and expertise and that include the real-life experiences of parents. We need more books written by experts who are not white, male, cisgender, neurotypical, able-bodied and Anglo-Saxon. And we need books that shift the blame and burden from parents to where it really belongs: to systems, governments and institutions.

Existing parenting books can be read in a subversive, critical way that is all about asking questions, such as: who wrote this book, when and why? What ideas about parents and children must this person have held? And what contributed to them thinking that way?

Ultimately, it’s about seeing the person behind the guru. Maybe in some cases, we’ll see that the emperor is naked and that the all-powerful Wizard of Oz is really just an old man behind the green curtain. ~

from another source:


~ “Americans have no script,” says Jennifer Senior (TED Talk: For parents, happiness is a very high bar), author of All Joy and No Fun: The Paradox of Modern Parenthood. “We believe we get to invent our future, our opportunities and who our children are going to be. Which is wonderful, but also very troubling.”

In reporting her book, says Senior, when she asked mothers who they went to for parenting advice, they named friends, websites and books. None named their own mothers. Only the most current child-rearing strategies were desired, in order to best position their children for achievement in the future.

In other words, that which is most American about us — our belief that the future is unwrit — is what is driving us mad as parents. Senior paraphrases Margaret Mead, who wrote this in 1942: In America, there are only this year’s children.

“You don’t see the handwringing in other places around the world,” says Christine Gross-Loh, author of Parenting Without Borders: Surprising Lessons Parents Around the World Can Teach Us. “People understand that there is a way to do things.”

In Norway, childhood is strongly institutionalized, says Norwegian sociologist and economist Margunn Bjornholt. Indeed, most children enter state-sponsored daycare at 1 year old (parents first get almost a full year of state-sponsored leave from work), then enter school and organized activities.

Norwegians believe that it is better for children to be in daycare as toddlers. At daycare, methods reflect the country’s fetishistic dedication to fresh air. So even in Oslo, where arguably the indoor air quality is fresher, and even in Scandinavian winters, children are bundled up and taken outside to nap in their strollers.

Craziness? Culture. In Japan, where Gross-Loh lives part of the year, she lets her 4-year-old daughter run errands with her 7-year-old sister and 11-year-old brother — without parental supervision. Her kids don’t hesitate to take the Tokyo subways by themselves and walk on busy streets alone, just like their Japanese peers. But when she comes back to the States, Gross-Loh doesn’t allow the same.

“If I let them out on their own like that in the U.S., I wouldn’t just get strange looks,” she says. “Somebody would call Child Protective Services.”

Both in Japan and Norway, parents are focused on cultivating independence. Children do things alone early, whether it’s walking to school or to the movies. The frames, however, are different. In Scandinavia, there is an emphasis on a democratic relationship between parents and children. In Sweden especially, the “rights” of a child are important. For example, a child has the “right” to access their parents’ bodies for comfort, and therefore should be allowed into their parents’ bed with them in the middle of the night. If a parent doesn’t allow them, they are both denying them their rights and being a neglectful parent. In parts of Asia, meanwhile, co-sleeping with a family member through late childhood is common. Korean parents spend more time holding their babies and having physical contact than most. But within a family, obedience is key — not democracy.

In Jewish tradition, says Wendy Mogel, a clinical psychologist and author of The Blessing of a B Minus: Using Jewish Teachings to Raise Resilient Teenagers, there’s a teaching in the Talmud that every parent has an obligation to teach their child how to swim.

“We’re supposed to be raising our children to leave us,” she says. “They must develop self-reliance and resourcefulness and resilience, which is a challenge, because we must allow our children to make mistakes.”

This is enormously hard for American parents to do. “Parents are genuinely anxious about really big things like the melting ice caps and collapsing economy and the unending stories about violence and predators and college admissions,” says Mogel. “They displace all of these fears of things they can’t control onto the one thing they believe they can control, which is children.” 

American parents are highly focused on making sure that their children’s talents are groomed for success. Sara Harkness, a professor in the Department of Human Development and Family Studies at the University of Connecticut and a pioneering researcher on parenting and culture, found that nearly 25 percent of all of the descriptors used by American parents were a derivation of “smart,” “gifted” or “advanced.” “Our sense of needing to push children to maximize potential is partly driven by fear of the child failing in an increasingly competitive world where you can’t count on the things that our parents could count on,” Harkness suggests.

This is not unlike many Asian nations, where parenting, from a very early age, is focused highly on academics and college acceptance. One Korean mother who Harkness interviewed played English tapes to her 2-month-old baby “because it’s never too early to start,” she says. The parent’s primary role is as an educator, and the child’s role is to respect the parent and repay them with sacrifices.

In the Netherlands, meanwhile, parents used “smart” to describe their children only 10 percent of the time. Dutch parents believe strongly in not pushing their children too hard. “People would talk about a cousin who got a PhD and was very unhappy because there were no jobs at universities, and said that you shouldn’t teach your child to read before they got to school, because then your child would be bored at school and not have any friends,” says Harkness.
Instead, regularly scheduled rest, food and a pleasant environment are the top priorities for Dutch parents.

But in Spain, where families are focused on the social and interpersonal aspects of child development, parents are shocked at the idea of a child going to bed at 6:30pm and sleeping uninterrupted until the next day, instead of interacting and participating in family life in the evenings. “They were horrified at the concept,” says Harkness. “Their kids were going to bed at 10 p.m.”

The diversity of ideas should be liberating, not stress-inducing, agrees Gross-Loh. “It was incredibly freeing to realize that there was no single way to do things and it’s totally okay to make mistakes as a parent,” says Gross-Loh of her research. “It gave me space to let my children be who they are, and let them grow into that.”

The U.S., home to immigrants who bring their own traditions from around the world, is uniquely positioned to both learn and let go. American parents can recast their scriptlessness as they see fit, drawing on both global tradition and present theory. Will they? Tomorrow’s children may decide. ~

With seven of us, my mother had no time or use for "helicopter parenting," which is in any case a much more recent phenomenon. We did all attend a "nursery school" that was part of our city's university and of its research in child development. I guess we were sort of guinea pigs for the students and professors to study. Dr. Spock visited there, and we have a picture of him sitting in a circle of children, including my sister. I think they were playing drums. The school had summer programs for the kids, and would interview us from time to time. They did longitudinal studies of families, up to our early teens, I think.

My mother was considered more restrictive and overprotective than most, yet we had much more freedom than children seem to have now. No play dates or arranged activities, enriching or otherwise. Our childhoods were not all scheduled up; they were more free-form. And we had lots of responsibilities as siblings, from cooking and cleaning to caring for the younger ones. We were not pushed and groomed to achieve, to reach any particular social status or prestigious career. The upper middle class was not only not a goal; its pretensions and values were scorned as ridiculous and inauthentic.

I think one great good thing that happened was that someone came up with the phrase "a good-enough mother," and the pressure on women (at least educated women) to be super-moms lessened. They realized they were doing the best they could, trying to manage what remains the most difficult (and perhaps also the most fulfilling) job there is.

~ It’s easy to place the blame for America’s economic woes on the 0.1 percent. They hoard a disproportionate amount of wealth and take an increasingly, unacceptably large share of the country’s economic growth. To quote Bernie Sanders, the “billionaire class” is thriving while many more people are struggling. Or to channel Elizabeth Warren, the top 0.1 percent holds a similar amount of wealth as the bottom 90 percent — a staggering figure.

There’s a space between that 0.1 percent and the 90 percent that’s often overlooked: the 9.9 percent that resides between them. They’re the group in focus in a new book by philosopher Matthew Stewart (no relation), The 9.9 Percent: The New Aristocracy That Is Entrenching Inequality and Warping Our Culture.

There are some defining characteristics of today’s American upper-middle class, per Stewart’s telling. They are hyper-focused on getting their kids into great schools and themselves into great jobs, at which they’re willing to work super-long hours. They want to live in great neighborhoods, even if that means keeping others out, and will pay what it takes to ensure their families’ fitness and health. They believe in meritocracy, that they’ve gained their positions in society by talent and hard work. They believe in markets. They’re rich, but they don’t feel like it — they’re always looking at someone else who’s richer.

They’re also terrified. While this 9.9 percent drives inequality — they want to lock in their positions for themselves and their families — they’re also driven by inequality. They recognize that American society is increasingly one of have-nots, and they’re determined not to be one of them.

I recently spoke with Stewart about America’s 9.9 percent — the people who are semi-rich but don’t necessarily feel it. We talked about fear, meritocracy, and why the 9.9 percent are so obsessed with nannies. Our conversation, edited for length and clarity, is below:

So, to start out, you write about the 9.9 percent and a “new aristocracy” in America. Who are these 9.9 percent?

The statistical side of it is very imprecise. I don’t think of the 9.9 percent as just everybody who has more than a certain amount of money and less than another amount of money. I see it more as a culture, and it’s a culture that tends to lead people into the 9.9 percent of the wealth distribution. It’s a cultural construct that is defined by attitudes toward family, toward identity issues about gender and race, by education and educational status and the idea of what constitutes a good career, which is mainly professional and managerial.

What does the culture look like? How do these people separate themselves out?

The guiding ideology is essentially that of a meritocracy. The driving idea is that people get where they are in society through a combination of talent and work and study. The main measures of that are educational attainment and material well-being, and anything that we provide to society or other people is on top or on the side of that and is a reflection of our own virtue and not in any way necessary for social functioning or part of a good life. It’s always, essentially, a sacrifice.

The obvious place to look for it is the whole college admissions game. But I think that’s kind of limited, too. I put a lot of emphasis on the family aspect because I think that’s a place where you really see in operation the attitudes and practices that go into child rearing and family formation. 

You have at least two very different groups emerging in American society. At a high level, you have people who have their kids late in life after getting a lot of education, have fewer kids, and invest massively in them. And then you have a large group that is much closer to the traditional style of having kids early and not investing as heavily in them, although many of them, of course, try to emulate the practices of the upper-middle class.

One of the things you write about in the book is how much this 9.9 percent are willing to invest in their children — in nannies, in schools, in extracurriculars. Where does this pressure come from, this urge people have to make their kids the best?

I think the driving motivation is fear, and I think that fear is well-grounded. People intuit that in this meritocratic game, the odds are getting increasingly long of succeeding. They work very hard to stack the odds in their kids’ favor, but they know as the odds get longer, they may not succeed.

That’s coupled with another one of the traits of this class, which is a lack of imagination. The source of the fear is also this inability to imagine a life that doesn’t involve getting these high-status credentials and having a high-status occupation. This life plan looks good, and it certainly looked good in the past when the odds were more sensible. But it’s not a great deal. It’s something that isn’t just harmful to the people who don’t make it, it’s also harmful to the people who get involved and do make it, in some sense.

In what way is it harmful to the people who do make it to the 9.9 percent and the people who don’t?

I’m not suggesting it’s equally harmful. The psychological damage to the upper-middle class is kind of trivial compared to the substantive damages other people face. But it is, nonetheless, pretty real.

I would point to the sociological and psychological evidence that you have significant increases in anxiety-related disorders and other forms of unhappiness even among people who are fairly well off. It’s a trade-off that all or most of them are willing to make. But it’s not a free lunch. 

Well, even if people are on paper wealthy, they often don’t feel wealthy. They’re always looking at someone who has a little bit more than them. How does that play out here?

That’s almost the defining aspect of life in a high-inequality world. And the important thing is that it affects people all the way up.

I know people who are in the top 1 percentile of the wealth distribution who just feel incredibly poor and stretched because they’re looking around and see other people who have got just that much more and can do that much better. That insecurity is what runs throughout the system. Just because you’re in the top decile, or 9.9 percent, that doesn’t mean you escape it. In some ways, you’re more subject to that insecurity. That drives people to do crazy things to stay where they are and to avoid falling.

To what extent does the upper-middle class drive inequality, and to what extent are they driven by inequality?

Most of this culture of the 9.9 percent is an effect and a consequence of inequality. That said, it’s one of those effects that becomes a contributing cause; it’s part of a feedback loop.

Most of the root source of inequality is structural, and I think much of it goes to an economy that’s no longer as competitive, where you have oligopolies rising without significant challenge. The balance of power between what we call workers and what we call capitalists is out of whack, and that’s a fundamental source of inequality. Race and gender can also play into inequality.

That inequality does have these fundamental sources, and once it’s in place, other mechanisms come in to lock it in and to exacerbate it. That’s where the culture of the 9.9 percent comes in. This culture that focuses on meritocracy becomes a way to justify a professional credentialing game where certain categories of workers are able to carve out high rents for themselves. It’s where certainly families — because they have excess resources — are able to over-invest and lock in benefits.

Those are mostly consequences of rising inequality, but then they feed back into it in obvious ways. They lock people in place, they tend to make it harder for large numbers of people to do well, they exacerbate the irrationalities in society.

It all sounds very gloomy, but I’m not actually that gloomy. I just think this is the way human societies work. There’s nothing in human nature that says we’re particularly good at forming large, complex societies that make everybody better off. These are sort of the forces of entropy at work in human society. I don’t want to be some sort of misanthrope condemning all of humanity. My point is that we are imperfect at forming reasonable societies, and we need to understand those imperfections if we’re to do better, which we can.

We’ve talked a lot about the culture of the 9.9 percent so far, but what does that culture mean for everybody else? The people who can’t afford to super credential their kids and send them to Harvard?

I think the underemphasized concern here is the extent to which the other 90 percent end up buying into this value system to some degree. I’ve been in the child-rearing game, and I see a lot of the madness firsthand — parents freaking out when their child takes a sip of soda out of the refrigerator because they somehow imagine this is really going to make it impossible for them to demonstrate enough virtue to get into the right college. They will curate every experience for their kids — every travel experience, every friendship.

I mostly see it among members of the upper-middle class who can afford it. But increasingly, the same sets of values and practices are clearly spreading to where people can’t afford it and where it doesn’t make sense. They’re also buying into this idea that kids have to be absolutely optimized, maximized so they can get onto the narrow path that leads to a stable upper-middle-class life, and otherwise it’s Starbucks until the end of time.

It basically takes away a potential countervailing mechanism. If society were such that you produce this one noxious class but then that gives rise to a reaction of people angry with this class and then acting out, you might have some conflict. Hopefully, it’s not violent but can be mediated through political institutions, but you have at least a mechanism that might lead to a solution. But when the ideology starts to spread, it effectively removes the basis for that conflict, it neutralizes the opposition in a way, and that’s a problem. It means that the system just continues further down the road toward greater instability.

Why is there such a focus on the nanny? On child rearing?

Nannies cost a lot; you basically have to hire another full-time individual. And that is not something that most people can do. It’s creating a definition of success that will define most people out of the running even before they start.

[The 9.9 percent] all have internalized this idea that child rearing is meritocratic breeding, and the measure of your success is how well you optimize your child as a future member of the meritocracy.

That means that to the extent that you can’t yourself spend all of your time raising your child, you need to get somebody else to do it. And that person’s task is not child-rearing as it used to be understood, which was feeding them and preventing them from harming themselves. It’s about optimizing them, and there’s no limit to what you can do to optimize them. And so that’s why you’re going to go for a nanny who’s college-educated, preferably with a degree in child psychology, and who’s capable of organizing all sorts of enriching experiences for the child. The logic is pretty ironclad.

Generally, I don’t think it’s terrible for the kids. It’s just a model of parenting that a) is insane and b) cannot conceivably be emulated by most of the population.

What’s the role of the idea of meritocracy here?

I think that meritocracy mostly gets invented after the fact. You have significant inequality, and then you get people reimagining how the economy works. They first make the false assumption that individual merit or individual talent and effort is the main factor in production, and it isn’t. Most human economic activities depend far more importantly on the degree of cooperation that people are able to establish between themselves — cooperation within firms, cooperation between firms in a marketplace, and cooperation in a society at large in terms of having standards of trust, reasonable laws, and so on. All those things are far more important in determining economic output than mere merit or merely allocating rewards to merit.

People make this false assumption precisely because the inequality is already there, and they’re looking for a justification. Then, they make the further false assumption that the variation in human merit is tremendous — it’s astonishing that some people are literally a million times smarter than other people. You have to qualify a little bit because whenever you criticize meritocracy, someone will come back and say, “Well, people are unequal, some people are smarter.” I have no problem with that, there are differences among people, and those have to be recognized. But it’s completely false to think that those differences are great enough to explain the kind of variation that we see in the economy.

Nonetheless, all of this rhetoric around meritocracy tends to grow and becomes more convincing precisely as inequality grows. In this respect, I don’t think our meritocracy is all that different from previous aristocracy. The definition of aristocracy is just the rule of the best, and people who have merit are also by definition the best. It’s the same kind of rhetoric. Yes, aristocracy usually relied more on birth, but that’s just a mechanism for identifying the people who are going to be perceived to be the best.

And we work more in order to have this merit, to be perceived as the best. That’s one of the things that struck me about your book — how many hours the upper-middle class, the managerial class, is working now to maintain their spot.

There’s no question that workloads have gone up where people are earning the most. There again, there’s this ideology of merit because we think it’s because these people are so incredibly productive. The hour of that corporate lawyer is just worth so much money that of course they’re going to work those extra two hours just to cash in on that. And it’s just so ridiculous, it’s wrong. 

Those people are working hard because they intuit precisely that merit isn’t deciding who’s getting to claim these rents. They’ve got to do something to distinguish themselves from the competition, and the way to do that is just to demonstrate a greater willingness to sacrifice, a greater willingness to submit one’s own identity, and a greater willingness to obey. I see this manic work trend as some of the clearest evidence we have that the meritocracy is out of whack and inequality is far too great.

So, ultimately, what are some solutions here? How do we tamp down this pressure people feel to hang on so tightly to their status and this sense that there’s a smaller and smaller piece of the pie they’re fighting for, even among those who are quite well-off?

The solutions mainly have to do with the fundamental sources of inequality, and I don’t think those are that hard to see. Attacking the trusts and the oligopolies, that’s a very clear avenue to pursue; breaking apart some of the professional guilds that strangle the economy. Health care is an obvious place to look on both ends — on how much we spend on it and how access to it is distributed. We need to provide more public support for child care. Another avenue that’s very clear and very difficult to do is housing — we have a tremendous amount of land, and there isn’t really an excuse for the kind of housing affordability issues that we have.

This isn’t a kind of game where you need a 100 percent solution. You can get pretty far with moves that just reestablish equality on a firmer foundation. This isn’t an unsolvable problem — especially if you’re willing to aim for what’s good and not necessarily what’s perfect.

The other thing that concerns me in this debate is understanding the role of the 9.9 percent in this. There’s a tendency for members of the meritocratic class to say, “Oh, the problem is that we’re hoarding these spots. We’re hoarding spots at the elite universities and certain professions, and what we need to do is to make sure that we’re more representative in how we let people in.” Well, that’s really wonderful for people to do, but that is not going to be the solution to much of anything. It takes for granted that the hierarchy itself is justified and is economically productive, and it’s just a matter of making sure that everyone has a fair shot of getting in. Let’s say you have a society in which you have serfs and lords and you say you’re going to have a lottery where one out of every 100 serfs will become a lord every year, and every year or every generation you’ll rotate. That’s not going to make a just society, that’s going to make a perverse society. That’s a false line of solution.

So what is the role of the 9.9 percent in making this better?

The key contribution of the 9.9 percent, the culture of the 9.9 percent, is going to be to return to the actual original values of America’s upper-middle class. If you get rid of the false idea of meritocracy that everyone earns what they deserve and substitute the idea that meritocracy means holding power accountable to rational standards of public scrutiny, you have a class that can actively contribute in a positive way toward equality. There’s nothing more dangerous to inequality than a society where people and activities are held up to rational standards. There are some core values in what we call meritocracy — of holding power accountable to reason, of treating people as equals under the law, of making deliberations public, and professionalism. All of those core values are intrinsically good things. What’s happened is that inequality perverts and distorts them. The contribution of the 9.9 percent would be to pursue those.

I don’t think the answer is to put the 9.9 percent on a boat, send them out to sea, and sink it, though that would probably make for better sales on a book like this. But I do think the issue is basically a class that has allowed itself to delude itself about the sources of its own privilege, and its main contribution would be in opening its eyes and then living and working more in accordance with what I think was the original inspiration of the class.

What follows when people recognize the actual sources of their privilege is they become a little more humble and they are more willing to help other people, more willing to invest in the future. For me, one of the most distressing statistics is that the richer people get, the less they believe in publicly supported child care. It’s not that they don’t want their taxes to go to pay for child care, it’s that they’ve internalized this idea that everyone can do this, everyone can raise their own child or just hire a nanny. “Let them hire a nanny” is the new “let them eat cake.” It just shows how this incredibly virtuous, super-well-educated class becomes oblivious to the basis of its own existence. ~


~ Most people aren’t preoccupied with Satan, demons, or the end of the world. But like it or not, many Americans are greatly concerned with these topics. A 2010 poll from the Pew Research Center found that 58 percent of white evangelical Christian adults surveyed believed that Jesus Christ would return to the earth within the next 40 years. That was significantly higher than any other religious group. Just 32 percent of Catholic respondents agreed, for instance.

Speculating about how the world ends is probably as old as humanity itself. It was pivotal to the early formation and growth of Christianity, and early leaders of the faith frequently suggested [the end of the world] was just around the corner, in their own lifetimes. But over time, as those hundreds of predictions failed to come true, End Times literature and that tradition faded away in Christianity.

But it came back with a vengeance in the mid 20th century after the development of nuclear weapons, especially when white evangelicals began to emerge as a political movement in the 1970s.

Talking about all of this with me in this episode is Christopher Douglas, a professor of English at the University of Victoria in British Columbia, Canada. He is also the author of “If God Meant to Interfere: American Literature and the Rise of the Christian Right.”

SHEFFIELD: So before we get into the details here, let’s discuss briefly the idea of apocalypse. What does apocalypse mean? People oftentimes associate it with the end of the world, but that’s not entirely true necessarily.

DOUGLAS: Yeah, that’s right. Apocalypse is actually a literary genre that develops in the third century BCE and has another sequence of developments in the second century BCE.

Some of the early apocalypses, from the third century BCE, actually don’t appear in the Bible; they are called, for instance, the Book of the Watchers and the Astronomical Book. And they take up this very strange passage in Genesis 6, which you probably remember, about how the “Sons of God” came down and basically mated with human women and birthed a race of giants prone to wickedness. The early apocalypses took up that theme and wondered about the cosmic sphere.

These apocalypses are attributed to Enoch, who was in the seventh generation of humans. I think this is Genesis 4; it just says Enoch walked with God and then he was no more. So he, in a sense, gets whisked up into heaven. That’s a proto-Rapture. And in these apocalypses Enoch has a heavenly guide, because he can’t interpret all the fascinating things he sees in that cosmic sphere.

So the second stage of development in apocalypse was in the second century BCE. And there we get, for instance, the Book of Daniel, which is in the Bible (the second half of the Book of Daniel was probably written around 164 BCE), but there’s also another pair of extra-Biblical apocalypses, the Apocalypse of Weeks and the Animal Apocalypse.

DOUGLAS: So apocalypse, in a sense, is a kind of national theodicy, because it’s trying to answer the question of how all these things could be going wrong for God’s people if they’re trying to do the right thing. Sometimes in the Hebrew Bible the answer to why things are going so wrong is that God is punishing us because we’ve started worshiping foreign gods or something like that. But if you’re trying to do the right thing, if you’re trying to circumcise your boys, and the worldly ruler is preventing you, that can’t be attributed to God’s discipline anymore.

So the problem that apocalypse tries to answer is: if this isn’t God’s punishment, then why are things going so terribly wrong? It’s in that sense that it’s a theodicy of national suffering, one that now attributes the problems the Judeans are facing not to God, but to God’s cosmic enemies and to their worldly rulers.

SHEFFIELD: And that idea got transferred to, and accepted within, Christianity when it came along later. There are several books in that tradition, the apocalypse tradition, the most famous of course being the Apocalypse of John, often called the Book of Revelation. And there’s a similar type of theme in that book, as well as in these other Christian apocalypses.

DOUGLAS: So Christianity basically starts out as an apocalyptic theological tradition. Revelation is the most famous kind of Christian version of apocalypse. We again get a sort of vision of the divine realm by a human writer who can’t understand what he’s seeing. And therefore things have to be explained to him by a kind of angelic being, and especially for the symbolism.

Revelation is filled with all this strange and wonderful symbolism, there’s the beast and Babylon and all that stuff.

And Revelation is the best known Christian version of apocalypse. Jesus’s final words in the Book of Revelation are ‘I am coming,’ so it ends by encapsulating the sense that this is all really going to happen. I think the early Jesus followers believed that they were years or decades away from seeing the return of Jesus in power to defeat his cosmic enemies, but also the Roman Empire, which at that point, it seems from the Book of Revelation, was basically persecuting Christians.

SHEFFIELD: Jesus didn’t come back. The world didn’t end. But in the Americas later, you began to see the emergence of some Protestant faiths that were apocalyptic.

So like the Jehovah’s Witnesses and Mormonism, which calls itself the Church of Jesus Christ of Latter-day Saints, meaning they see themselves as being near the end of the world. And so there was a renewed interest in such speculations in the 19th century. One of the other interesting developments that took place around this time, and maybe somewhat earlier, was the re-emergence of Satan as a spiritual foe, much more prominent than he had been in early Christianity, and certainly within Judaism.

So in some ways, the figure of Satan kind of fits into that pantheon as sort of a rival.

And we still see the traces of polytheism in the Bible. So for instance, in Deuteronomy 32, this figure called Elyon distributes the nations to the bene-elohim, the Sons of God, and Yahweh is one of the gods who seems to get Israel. There’s also that kind of imagery that’s retained in Psalm 82.

DOUGLAS: Yeah. So the idea is that apocalypse actually reaches back to some of the figures and agents of this ancient history of Israelite polytheism. In some sense, apocalypse proliferates these cosmic beings, some of whom are enemies of God, while some of them, Gabriel and Michael, are actually agents of God.

It’s reaching back to this sort of polytheism to repopulate the cosmos. Gabriel, the angel, tries to come to Daniel to explain the vision that he’s had. He’s prevented for 21 days by the prince of Persia until Michael comes along. And Michael is referred to as the champion of your people.

So he’s almost imagined as a kind of patron angel. In older centuries, he would have been the patron god of Israel. In a sense, Michael almost inhabits the niche, the role, that Yahweh used to play as the national God of Israel.

And later on in Daniel we get that vision: Daniel sees a throne room in which there’s a throne, on which there’s an Ancient of Days. And another being, one like a son of man, rides on the clouds, and all power and authority is given over to this son of man. And of course, that is the phrase that the gospel writers would attribute to Jesus, as this sort of second power in heaven.

When we see the son of man arriving on the clouds in Daniel, imagined as a kind of cosmic warrior, this is, in a sense, the old Ba’al imagery that Yahweh had inherited. So that relationship between El and Yahweh has, in a sense, been transmuted onto the Ancient of Days and the Son of Man.

William Blake: The Ancient of Days, 1794

And in the Christian tradition, these would end up being the first two persons of the Christian Trinity, God the Father and God the Son. Jesus has power over the waters: he walks on water and he’s able to calm the storm. He’s also unmatched in Revelation as the warrior who comes at the end to destroy his opposition.

So there’s a definite set of echoes between that and ancient Israelite polytheism, and then the way in which apocalypse redeploys that and renames some of the agents to imagine a world in which God seems to be more distantly controlling.

SHEFFIELD: And so this tradition of spiritual, unseen warfare percolates as a thread within the tapestry of Christianity and Judaism for a long time.

And then over time, in the 19th century, there’s a return to this idea: the world is going to end, and we’re going to be there for it, and we’re going to be God’s chosen ones. So the Jehovah’s Witnesses have this thing they take from the Book of Revelation, where there are said to be people who were predestined to be saved by God. And of course their religion is the one that is. And so there’s this proliferation of books in the 19th-century United States of people saying, ‘I had a revelation, and this is how the world’s going to end,’ et cetera, et cetera.

And you have several religions that are built out of that: Seventh-day Adventism, Jehovah’s Witnesses, Mormonism. But over time, the literature itself changes from prophecies and revelations to novels. Around that time, for instance, we had the emergence of the first modern Christian novel, “Ben-Hur,” which came out in 1880. Jesus was a character in the book, and that’s the first time, that we know of, that Jesus was a character in a novel. It was enormously successful, and so Christian literature began, to some degree, to gravitate in that direction.

And then we had other books that featured more symbolic interpretations of the end of the world, like “The Lord of the Rings” or the Narnia series by C.S. Lewis. But those were still metaphorical in that sense.

And there was this reimagination of Christianity in a novelistic sense, rather than as revelation. But then you have the emergence of the idea of global nuclear holocaust after the bombings of Hiroshima and Nagasaki, and things reoriented a little bit after that.

The prospect of the world, of everyone dying in a nuclear holocaust, that was hugely important throughout the world. Every culture began grappling with that idea.

And that filtered into the Christian tradition, the Christian fundamentalist tradition that was emerging. In that context, we had the emergence of the Jesus People movement, which was a sort of left-wing Christian fundamentalism. And out of that came a book called “The Late, Great Planet Earth.” Can you talk a little bit about what that book was, and who wrote it?

DOUGLAS: Sure. I think about apocalypse as a kind of phenomenology of disorder. It’s been baked into the Christian tradition from the beginnings, but I think there’s renewed attention to apocalypse in those moments where people feel that the world has become crazy and disordered, and they want to understand or see a vision of order within the chaos that they’re experiencing.

I’m interested in evangelical fiction, and one important milestone in terms of the work that I do is John Nelson Darby in the 1840s, who would re-imagine apocalypse for the modern period, for the contemporary period. And he’s the guy who came up with this notion of pre-millennial dispensationalism—

SHEFFIELD: Which means?

DOUGLAS: Dispensation, meaning that world history is divided into certain specific eras, chapters organized by God. And pre-millennial means that we are in the last chapter, right before the thousand-year reign of Christ that’s imagined in the Book of Revelation.

So in other words, our current period is right before the End Times. So that’s still very much within this sort of apocalyptic vision, but John Nelson Darby preached widely in the United States and his vision of pre-millennial dispensationalism, that we’re in the sort of End Times, and that one of the things you needed to do was look for the signs of the End Times, became popular in U.S. evangelicalism especially. And it made its way into, for instance, the Scofield Reference Bible, which I think sold something like 50 million copies.

Revelation is no longer about the Roman Empire disguised through symbolism as Babylon. But rather everything becomes about the 20th century, or the early 21st century for folks who are still doing it. So all the books of the Bible in a sense become historically decontextualized.

So “The Late, Great Planet Earth” takes this mash of quotations, and it looks at nuclear proliferation, yes, but it’s also especially worried about the Chinese. I was just opening this up again yesterday, and I think there’s actually a chapter in “The Late, Great Planet Earth” called “The Yellow Peril.” He’s reaching back to a bunch of old racist tropes and re-imagining them for what things looked like in 1970. Everything points toward the End Times: the events of Revelation, which the author of Revelation imagined happening within decades and concerning the Roman Empire, are now redeployed by Lindsey to be about how the End Times would occur after the state of Israel was formed in 1948.

What is the sequence of dominoes that is going to lead to the coming of the Antichrist and the imposition of a one-world religion? And then God’s final conflict, in which Jesus emerges riding on his white horse.

DOUGLAS: The premise is the Rapture, which is the doctrine that, just as the End Times begin, real, true Christians will be whisked away by God into the air.

This is based on a quotation from Paul in Thessalonians: they’ll be gone. And that will inaugurate, according to this pre-millennial dispensationalism, a kind of seven years of tribulation, during which we see the rise of the Antichrist, who establishes a sort of one-world religion and one-world government.

SHEFFIELD: Yeah. And then I think the other influential idea, different from the Hal Lindsey idea, is casting things perceived to be culturally or theologically liberal as the opposition, with the opposition also coming in the form of the United Nations.

And that came out of an earlier far-right opposition to the idea of the United Nations.

And it wasn’t really present in “The Late, Great Planet Earth,” but it becomes absorbed into this Christian literary far-right tradition. And the idea that New Age beliefs are evil, that they are resurrected paganism, becomes very integral.

The book sold 28 million copies.

And “The Exorcist” really put Satan on the map, in a sense, and people became a lot more concerned about him. There were attempts to ban the film from being seen in different areas, and people claimed all sorts of effects: epileptic fits, or feeling demonic presences in the theater.

DOUGLAS: I think one of the things we could emphasize is that this is in the apocalyptic tradition: the world is full of cosmic beings, invisible ones, and they need to be opposed by God’s servants on earth, who, in the sort of supersessionist view [supersessionist theology maintains that Christians have replaced (superseded) Jews as God’s chosen people], are Christians, white Christians, white American Christians. God’s chosen people are being opposed not just by their mundane political foes, but by other invisible, cosmic, spiritual beings who are actually the enemies of God.

So domestic political foes are re-imagined as having a demonic sponsorship in this apocalyptic worldview that I think, in a sense, characterizes the Christian Right today.

DOUGLAS: The demons are going to establish a one-world religion. And so the idea here is that any sign of secularism or religious pluralism in the United States, including the presence of strange New Age ideas, is something deployed against Christianity.

And so the New Age ideas become a threat to Christianity. It’s a way of imagining that religious pluralism is actually a form of oppression or persecution of fundamentalist Christians in the U.S. today. And that, I think, is really the strain that runs throughout the apocalyptic literature. Looking ahead, the Left Behind series especially carries that kind of idea forward.

But I think that in lots of Christian Right politics today, being compelled to share power culturally and politically has been experienced as a form of persecution, which brings these supersessionist Christians back to the original context of apocalypse: persecution under empire by foreign powers.

So the fact that they have to bake cakes for gay weddings, for example, is in some sense anticipatory of the way the Antichrist is going to try to overrule Christianity and establish this one-world religion in the future. So there’s the sense of imminence, the sense of combat, an existential struggle, the imminent defeat that’s happening, this extreme moral dualism, but also this expectation that they’re going to be saved. Like the poll numbers you gave us right at the beginning: we’re in the End Times.

SHEFFIELD: This is a different version of Christianity. In the early days of Christianity, and for a long time, the end of the world, it was something that was God’s doing. God was going to do this. And you didn’t know when it was going to happen.

DOUGLAS: You didn’t participate in it.

SHEFFIELD: Yeah, this was entirely God’s thing. But then there was this re-imagining of it: that Jesus could not return to Earth until his people had taken dominion. And so we’ve seen a corollary of this idea, often literally called by scholars Dominion Theology.

Even the Rapture itself is not a traditional Christian doctrine. And just as the founding of Israel was seen as an important milestone for some End Times folks, some proponents of the Rapture and pre-millennial dispensationalism, like Falwell, were thinking about the role that nuclear weapons might play in the battles that are described in the Book of Revelation.

SHEFFIELD: You could basically argue that the Christian Right has transitioned from the traditional interpretation of ‘Save me from this cruel world, Jesus’ over to ‘I want to burn the sinners for you Jesus.’

And that Christian militarism was certainly visible on January 6th, where you had many, many protestors with signs proclaiming their devotion to Jesus. They had flags for Jesus, Christian flags, Jesus flags, Jesus Trump flags, and they were there to overthrow this government that, they believed, had been imposed upon them by the servants of Satan.

DOUGLAS: I think your implication of Dominionism is exactly correct. This form of apocalypticism, seeing your political foes as the enemies of God, or as sponsored by the enemies of God, can tip over into, or be aligned with, the notion that we need to retake the nation almost as a precursor for the End Times to occur, for the return of Jesus and the hastening of God’s kingdom.

So I think there can be that kind of alliance between a politics of trying to retake the nation and establish proper rules according to God as you understand them, and the notion that thereby you might be hastening the coming rule of the kingdom of God.

SHEFFIELD: Where do you see all this heading?

DOUGLAS: I’m super pessimistic for your country, actually. I think that the Christianized Republican Party has become extremist, partly because of this apocalyptic theology.

I think one of the things apocalypse does is it means that your democratically elected opponents are not legitimate. We see that actually in the Left Behind series: the Antichrist comes to power at the UN partly through democratic votes, but the authors and readers understand that it is not a legitimate victory, because there are actually satanic, demonic events happening behind it.

I think there is a kind of sense that Democratic voters and Democratic politicians are not legitimate. They do not hold legitimate authority, even when elected legitimately, what we would think of as legitimately.

So I think that’s the explanation for a lot of the sort of anti-democratic legislation that’s currently being put in place to challenge the 2022 elections and the 2024 elections.

SHEFFIELD: I’ve found that for people who haven’t been steeped in this tradition themselves, it’s hard to believe that it’s real: that there are tens of millions of people out there who literally think that they are living in a Bible story.

And when you think that you are God’s personal servant, everything is permissible. And so it’s important for people who are aware that this tradition exists to educate others who are not aware of it. Because if you don’t do that, people won’t know what they’re fully up against, and they can’t really oppose it effectively.

SHEFFIELD: And I think also that you could say that many moderate or liberal Christians, they’re not aware that this alternative tradition has developed, and really grown as big as it is. And they’re also not aware that that tradition is coming for them. And that it has a power that is very compelling to a lot of people because it’s totalizing.

It’s a worldview that encompasses politics, that encompasses religion, that encompasses schooling, that encompasses family. It literally can run your life for you. It can make the decisions. It can make your identity. You can finally be a part of something bigger than yourself.

DOUGLAS: I think for lots of progressive and thoughtful and intellectual Christians, to engage with fundamentalist theology and politics is to experience shame. Because it’s not like yours: it’s simplistic, locked into this sort of Manichean binary of good and evil.

It’s not as sophisticated as your own religious tradition. So I think that for moderate and liberal/progressive Christians, there’s often an experience of shame. And sometimes, on the other hand, an attempt to argue that those people are not really Christian at all: they’re Christian nationalists, who aren’t really in the proper Christian tradition, like we’re practicing it. But that’s a different conversation. ~


The article on the history of evangelical activism in the United States does not shed light on the growth of brutality in conservative evangelicalism over the last 60 years. This type of evangelicalism combined the anti-communist movement with the antisemitism of the group’s leaders, Bishop Sheen and Billy Graham.

When the Vietnam antiwar movement exploded in the 1960s, the anti-communist movement began to be absorbed by the religious right of the John Birch Society. “The Late, Great Planet Earth” became the prime motivator for the modern-day Conservative Evangelical movement. In the back of the book, the author, Hal Lindsey, states prophecies that became a list of religious conspiracies.

His predictions were not about a probable future but a portrayal of Christianity under attack by university professors, scientists, and the Democratic party. Because his book demonstrated the profitability of his conspiracies, it led younger preachers to change their message from salvation to militancy.

After Lindsey’s book became a best seller in 1970, Billy Graham and the Silent Majority used Lindsey’s format as a blueprint, and they changed the evangelical message from anti-communism to anti-American university and science.

In the 1990s, John Piper published a book, The Pleasures of God: Meditations on God’s Delight in Being God.

In his book, God delights in bruising the Son for the good of the Son, a practice that Evangelicals call ‘tough love.’ He delights in the fame His name receives when Christians bash the anti-Christian forces, i.e., teachers and scientists. Piper’s last chapters promote evangelical militarism.

Today this militarism is practiced as bullying and is a common occurrence in universities. On a recent panel shown on CSPAN2, one panelist taught religion and economics. She said Evangelical aggression has increased two to three times from what it was in the 90s. Her class covers the economic influences of five major religions: Christianity, Judaism, Islam, Hinduism, and Buddhism.

When she first started teaching, an evangelical would question why her syllabus included false religions. That occurred once or twice a semester. Since 2016, every semester two or three evangelicals have ganged up on her because she discusses the non-Christian religions. She believes those students take her class to stop what they see as a validation of the other religions.

An evangelical man in the audience said that the students had a right to defend their religion. She replied, “My class is an elective, and I discuss religious influences on a region’s economy.” He asked her, “Why do university teachers disregard the beliefs of Christians?” It seems, she said, her evangelical pupils came not to learn but to fight for a cause.


Thank you, Joe, for this update. When I went to college, we were more likely to refer to god as “she” — that was oh, so radical for those times. My Bible as Literature and Comparative Religion classes, hugely popular, were delivered without any disruption. Sad to hear — and witness on the news — the growth of militancy among the Evangelicals. And it makes sense that the very existence of other religions is a threat to them, since that implies that their religion might be just one of many, as made-up as they believe other religions to be. 

My exposure to various mythologies (chiefly classical mythology, of course) was crucial in my insight that Judeo-Christianity was just another mythology.



For instance, the German soldiers fighting in World War One and World War Two were said to be very hard to fight. They showed courage and discipline, and other virtues that are excellent in a soldier. Alas, in a war it all depends on which side you are on, “on what’s being fought for.”



~ Green tea has long been known to have health benefits. In particular, it contains catechins called ECG and EGCG that are said to prolong life.

Until now, researchers have assumed that the catechins neutralize these free radicals and thus prevent damage to cells or DNA. One source of oxygen free radicals is metabolism; for example, when the mitochondria—the powerhouses of the cell—are working to produce energy.

In the new study in the journal Aging, the researchers show that these polyphenols from green tea initially increase oxidative stress in the short term, but that this has the subsequent effect of increasing the defensive capabilities of the cells and the organism. As a result, the catechins in green tea that researchers fed to nematodes led to longer life and greater fitness.

“That means green tea polyphenols, or catechins, aren’t in fact antioxidants, but rather pro-oxidants that improve the organism’s ability to defend itself, similar to a vaccination,” says study leader Michael Ristow, professor of energy metabolism at the health sciences and technology department at ETH Zurich.

However, this increase in defensive capability manifests not through the immune system, but rather by activating genes that produce certain enzymes such as superoxide dismutase (SOD) and catalase (CTL). It is these enzymes that inactivate the free radicals in the nematode; they are essentially endogenous antioxidants.

Ristow isn’t surprised to see this kind of mechanism at work. His research group showed back in 2009 that the reason sport promotes health is because sporting activities [i.e. exercise] increase oxidative stress in the short term, thus improving the body’s defenses.

Consuming fewer calories has the same effect, as has been shown several times in animals. Mice fed a reduced-calorie diet live longer than those fed a normal, high-calorie diet. “So it made sense to me that the catechins in green tea would work in a similar way,” Ristow explains.

He goes on to say that the findings from this study translate well to humans. The basic biochemical processes by which organisms neutralize oxygen free radicals are conserved in evolutionary history and are present in everything from unicellular yeast to humans.

Ristow himself drinks green tea every day, a practice he recommends. But he advises against taking green tea extracts or concentrates. “At a certain concentration, it becomes toxic,” he says. High-dose catechins inhibit mitochondria to such an extent that cell death ensues, which can be particularly dangerous in the liver. Anyone consuming these polyphenols in excessive doses risks damaging their organs.

While the highest levels of catechins are found in Japanese varieties of green tea, other green teas also contain sufficient amounts of these polyphenols. Black tea, on the other hand, contains a much lower level of catechins, since these are largely destroyed by the fermentation process.


~  Many people believe free radicals, the sometimes-toxic molecules produced by our bodies as we process oxygen, are the culprit behind aging. Yet a number of studies in recent years have produced evidence that the opposite may be true.

Now researchers at McGill University have taken this finding a step further by showing how free radicals promote longevity in an experimental model organism, the roundworm C. elegans. Surprisingly, the team discovered that free radicals—also known as oxidants—act on a molecular mechanism that, in other circumstances, tells a cell to kill itself.

Programmed cell death, or apoptosis, is a process by which damaged cells commit suicide in a variety of situations: to avoid becoming cancerous, to avoid inducing auto-immune disease, or to kill off viruses that have invaded the cell. The main molecular mechanism by which this happens is well conserved in all animals, but was first discovered in C. elegans—a discovery that resulted in a Nobel Prize.

The McGill researchers found that this same mechanism, when stimulated in the right way by free radicals, actually reinforces the cell’s defenses and increases its lifespan. Their findings are reported in a study published online in the journal Cell.

“People believe that free radicals are damaging and cause aging, but the so-called ‘free radical theory of aging’ is incorrect,” says Siegfried Hekimi, a biology professor and the study’s senior author.

“We have turned this theory on its head by proving that free radical production increases during aging because free radicals actually combat—not cause—aging. In fact, in our model organism we can elevate free radical generation and thus induce a substantially longer life.”

According to Hekimi, the findings suggest that apoptosis signaling “can be used to stimulate mechanisms that slow down aging.”

“Since the mechanism of apoptosis has been extensively studied in people, because of its medical importance in immunity and in cancer, a lot of pharmacological tools already exist to manipulate apoptotic signaling. But that doesn’t mean it will be easy.”

Stimulating pro-longevity apoptotic signaling could be particularly important in neurodegenerative diseases, says Hekimi.

In the brain the apoptotic signaling might be particularly tilted toward increasing the stress resistance of damaged cells rather than killing them, explains Hekimi. That’s because it is harder to replace dead neurons than other kinds of cells, partly because of the complexity of the connections between neurons. ~

ending on beauty:


A day, another day,
A wave, another wave.
Where are you going? One and all?
Earth bruised by so many wanderers!
Earth enriched by so many of our corpses.
But the earth is ourselves,
We are not on it,
But in it, and always have been.

~ Robert Desnos