Saturday, March 31, 2018


Crane dancing, Hokkaido; Stefan Senft

America I’ve given you all and now I’m nothing.
America two dollars and twentyseven cents January 17, 1956.
I can’t stand my own mind.
America when will we end the human war?
Go fuck yourself with your atom bomb.
I don’t feel good don’t bother me.
I won’t write my poem till I’m in my right mind.
America when will you be angelic?
When will you take off your clothes?
When will you look at yourself through the grave?
When will you be worthy of your million Trotskyites?
America why are your libraries full of tears?
America when will you send your eggs to India?
I’m sick of your insane demands.
When can I go into the supermarket and buy what I need with my good looks?
America after all it is you and I who are perfect not the next world.

~ Allen Ginsberg, opening of “America”

This was written in 1956 and remains wonderfully “now.” This is the mysterious power of poetry — it’s “the news that stays news,” as Ezra Pound put it. And it’s not that poets and writers are seers who can foretell the future. It’s rather that they acutely see the present.

I’ve always loved the opening lines of this poem:

America I’ve given you all and now I’m nothing.
America two dollars and twentyseven cents January 17, 1956.

Another part of the mysterious power of poetry is that it feels personally true — at least those lines that really “hit.” I can’t explain why, but “America I’ve given you all and now I’m nothing” does feel true for me (call me irrational — as I said, I can’t explain it). Likewise with the line that follows, which I can explain even less. Both lines are magical, and the figure of two dollars and twenty-seven cents is exactly right — no other sum would do.

It’s the fourth line that starts the more serious lamentation, but it’s the mix of seriousness and humor that captures the insanity and contradictions better than a straight-faced text would.

This caricature came with a brief description of Ginsberg chanting Ohmmm in a Chicago courtroom. (“He’s trying to calm us down,” his lawyer said — “us” included the judge.)
Allen Ginsberg by David Levine, New York Review of Books


~ “As director of the newly minted Institute of Cytology and Genetics at Novosibirsk, Dmitry Belyaev was curious as to how dogs first became domesticated. In the late 1950s, he decided that to fully understand the process, he must attempt to replicate the early days of domestication. He picked foxes for the experiment because of their close family ties with dogs (both are canids). His research team visited fur farms across the Soviet Union and purchased the tamest foxes on hand. They figured using the most docile of the wild foxes for their breeding program would hasten the pace of domestication, relative to the thousands of years it took to breed dogs.

a wild silver fox
Unfortunately, Belyaev died before seeing the final results. But today, 58 years after the start of the program, there is a large, sustainable population of domesticated foxes. These animals have no fear of humans, and actively seek out human companionship. The most friendly are known as “elite” foxes.

“By the tenth generation, 18 percent of fox pups were elite; by the 20th, the figure had reached 35 percent,” Lyudmilla Trut, one of the lead researchers at the Institute of Cytology and Genetics, wrote in a paper describing the experiment in 1999. “Today elite foxes make up 70 to 80 percent of our experimentally selected population.”

One of the lab’s most interesting findings is that the friendly foxes exhibit physical traits not seen in the wild, such as spots in their fur and curled tails. Their ears show weird traits, too.

Like puppies, young foxes have floppy ears. But the ears of domesticated foxes stay floppier for a longer time after birth, said Jennifer Johnson, a biologist who has worked with Kukekova since the early 2000s.

a domesticated silver fox

After the collapse of the Soviet Union, the domesticated fox experiment fell on hard times as public funding for the project evaporated. The researchers realized quickly that keeping more than 300 foxes is an expensive enterprise. In the 1990s, the lab switched to selling some of the foxes as fur pelts to sustain the breeding program.

“The current situation is not catastrophic, but not stable at the same time,” Institute of Cytology and Genetics research assistant Anastasiya Kharlamova told BBC Earth last year. Now, the lab’s primary source of revenue is selling the foxes to people and organizations across the globe.

One customer is the Judith A. Bassett Canid Education and Conservation Center, located near San Diego. The center keeps six foxes — five of which are domesticated — as ambassadors for their species, so that people can get an up-close-and-personal view of the animals.

“We have a fox whose name is Boris, and as soon as someone walks in, he’ll run up to them like a dog will,” said David Bassett, president of the Conservation Center. “He wants to be scratched and if you don’t scratch him he’ll make you.”

While domesticated foxes are friendlier than those in the wild, they can still be unpredictable.

“[You can be] sitting there drinking your cup of coffee and turning your head for a second, and then taking a swig and realizing, ‘Yeah, Boris came up here and peed in my coffee cup,’” said Amy Bassett, the Canid Conservation Center’s founder. “You can easily train and manage behavioral problems in dogs, but there are a lot of behaviors in foxes, regardless of if they’re Russian or U.S., that you will never be able to manage.”


There’s been a further discovery linking domestication (i.e. the kind of friendly temperament that would be unusual in wild animals) with spotty fur, shorter faces, floppy ears (at least in young animals) and curly tails.


~ “In 1868, the same year that Darwin published an entire monograph on domestication, the Swiss anatomist Wilhelm His Sr (1831–1904) described what became known as the embryonic neural crest.

Vertebrate embryos at an early stage of development consist of three “germ layers”. He described a strip of cells in the outer layer (ectoderm), between the part that produces skin and the part that produces the central nervous system. It’s now called the neural crest.

Wilkins and colleagues now propose a hypothesis that links the development of the neural crest with the body changes that accompany domestication.

The neural crest produces not only facial skeletal and connective tissues, teeth and external ears but also pigment cells, nerves and adrenal glands, which mediate the “fight or flight” response.

Neural crest cells are also important for stimulating the development of parts of the forebrain and for several hormonal glands.

The researchers argue that the domestication process selects for pre-existing variants in a number of genes that affect neural crest development. This causes a modest reduction in neural crest cell number or activity. This in turn affects the broad range of structures derived from the neural crest, giving rise to domestication syndrome.

Interestingly, deleterious alterations in genes controlling neural crest development cause wide-ranging syndromes called neurocristopathies in humans and in animals. The researchers suggest that the domestication syndrome resembles a mild multi-gene neurocristopathy.” ~


Wolves too can be tamed. Norway has a national park where you can pet the wolves.

You may be wondering about humans. Did humans in essence domesticate themselves? Did they selectively breed for more friendly, pro-social behavior? I suspect most people would say that the success has been partial at best. Still, scientists point to clues such as differences between human and Neanderthal skulls. The human skulls show features associated with domestication: for instance, our faces are shorter, and the faces of men and women are relatively similar — an important part of the DOMESTICATION SYNDROME.


~ “Tameness, says evolutionary biologist and primatologist Richard Wrangham of Harvard University, may boil down to a reduction in reactive aggression — the fly-off-the-handle temperament that makes an animal bare its teeth at the slightest challenge. In this sense, he says, humans are fairly tame. We might show great capacity for premeditated aggression, but we don’t attack every stranger we encounter.

Sometime in the last 200,000 years, humans began weeding out people with an overdose of reactive aggression, Wrangham suggests. Increasingly complex social skills would have allowed early humans to gang up against bullies, he proposes, pointing out that hunter-gatherers today have been known to do the same. Those who got along, got ahead.

Once humans began to self-domesticate, changes to neural crest cells could have nudged us toward a highly communicative species. Something similar happens in songbirds: Domesticated birds have more complex songs than their wild counterparts. What’s more, self-domestication may be more common than once thought. Bonobos, Wrangham notes, live in peaceful groups compared with closely related, but more violent, chimpanzees. If humans took steps to domesticate themselves, perhaps they weren’t the only ones.” ~


I think the self-domestication of humans is still ongoing . . .  And cats are only partly domesticated, scientists agree.

By the way, it’s important to note the difference between tameness and domestication. A wild animal can be tamed, i.e. become non-fearful and non-aggressive around humans — but it doesn’t pass on that tameness to its offspring. That takes generations of selective breeding for less aggression and what might be called “friendliness.” 

While people assume that domestication means mainly lower testosterone levels, testosterone is not the critical hormone here. Rather, we see lower levels of stress hormones (e.g. cortisol) — some scientists even speak of “hypoadrenia” of domesticated animals. Darwin assumed that the lesser fear response stemmed from “gentler living conditions.” But, again, it takes many generations of selective breeding to produce less fearful animals. 


I remember seeing the story of the Russian fox-domestication experiment in a documentary about our relationship with dogs. So much is interesting about this experiment — perhaps most intriguing is that the changes in the “domesticated” animals appeared fairly rapidly, after a couple dozen generations. This challenges our sense that evolution is a long, slow process: under pressure, whether from environmental factors or human interference, evolutionary change can be quick. We can watch it happen, and not only in microscopic organisms that spin through generations in minutes or hours, but in complex, long-lived mammals like foxes and dogs. It is interesting to note that the “domesticated” silver fox looks a lot like a puppy. These domesticated animals, like humans, retain “infantile” features, like the floppy ears and short snout . . . “babyish” features that we find irresistibly cute and endearing.

So, domestication becomes not the simple taming of an animal, but the development of a mutual relationship between two species. The dog/human relationship is unique. Dogs are the only animals that look us in the eye, that seek out the intimacy of the gaze rather than finding it disturbing. They are attuned to our moods and expressions in an active way, not only to follow commands and be rewarded, but because they seem to want and enjoy the connection. At the beginning of this interspecies relationship there was a situation of mutual advantage: dogs benefited from the shared food, humans benefited from the dogs’ protection and warnings, and each benefited from cooperation in the hunt. Recent theory suggests this relationship was one of co-evolution: we changed each other in tandem. Humans may not only have domesticated themselves by selecting for cooperation over aggression; the two companion species domesticated each other in ways that changed and benefited both.

As you note, there is an essential difference between “tamed” and “domesticated.” You may tame a lion, or a wolf, but don’t imagine it loves you. I think of an instance where a woman kept and bred wolves, and even sold them to fools who wanted one rather than a dog. One day her wolves tore her to pieces. There were no witnesses, but I am sure it happened simply because to a wolf you are always more a potential meal than a potential ally.


You are right about the persistence of “juvenile” traits in domesticated animals. The domesticated foxes show an extended puppyhood: they remain curious and playful, roll over for belly rubs, and otherwise treat a human as “mother.”

And yet even with certain large-size dog breeds, the closeness to the wolf remains; fatal attacks on a human, though rare, have happened. Parents are warned not to leave a baby alone with a dog large and strong enough to kill it, no matter how sweet and laid-back the animal usually is. 

The woman killed by her “tame” wolves is a story to heed, not the strange myths about being raised by wolves. As far as I know, no animal likes human babies, which have a strong smell that canines especially dislike. And a baby’s screaming, hard enough on humans, must be painful to dogs’ sensitive ears. Sooner the sirens (I love it when dogs howl to sirens — I sing with them).

Also, when a human merely tames a wild animal, e.g. by taking care of that animal during sickness, providing food and petting, the reciprocal affection — if it happens — will be directed toward that specific human (who had better keep up the daily feeding and petting, making sure the petting is the kind that is truly pleasant to the animal). The merely tame rather than truly domesticated animal may still attack another human who unwittingly “corners” or otherwise threatens it, sometimes over a mere detail of appearance (e.g. dogs allegedly dislike certain kinds of clothing).

This fox became tame because her caretaker saved her life and nursed her back to health.

I suspect that the woman’s wolves weren’t really hungry, but she might have inadvertently done something that triggered their fear/attack response. They probably didn’t see her as a meal, but all of a sudden they saw her as a threat. From all I’ve read, one statement really stays engraved in my mind: if you come too close to a wild animal, it will bare its teeth. What we perceive as aggression is actually fear.

Of course we are not talking about cocker spaniels here. And I wonder if ears that remain permanently floppy are a sign of a deeper domestication, which essentially signals hypoadrenia — such an animal is physiologically pretty incapable of becoming enraged.

If humans control the breeding, then domestication can indeed progress rapidly. I remember reading another recent article about the foxes where a scientist is quoted as saying, “What a difference just fifteen years can make! Now all the foxes here are ‘elite’ — i.e. they show no fear of humans and come over to be petted.”

Though domestication produces many changes, some of them merely incidental, I suspect a mild hypoadrenia is the most essential of those. Truly domesticated animals have hypoactive adrenals and don’t produce the levels of the fight-or-flight stress hormones that their wild equivalents do. Fear and aggression are highly related; affection toward a human, the acceptance of that human as a protective “mother” rather than a threat, can happen in a wild animal under certain unusual circumstances, but it will not be passed on to its offspring. That takes selective breeding.

When it comes to humans, it’s been pointed out that women don’t really control reproduction to the degree it would take to breed the bullies out of the gene pool — but social means other than sexual selection by women exist, and I think they have operated throughout thousands of years to reduce the levels of violence. As women become more empowered, the process can accelerate.

As a thought experiment, imagine if only the most affectionate individuals had children. But with humans everything is more complicated because of the huge influence of culture. Still, genetic factors can be very powerful, and our experience with breeding dogs tells us humans too could be bred for size, or intelligence, longevity, cancer resistance, or any other trait. But we also know about the law of unintended consequences (you change one thing, and you simultaneously change a gazillion other things), and are wisely afraid to tamper.

Did dogs domesticate us? We did indeed breed dogs mainly for affection, and the affection we get from them (and give to them) brings us great pleasure. And men seen with dogs were indeed found to get a lot more interest from women (just today I chatted up a man who was merely carrying a bag of dog food; it was as if he carried a sign that said, I'm a loving person). 

But I’ve also noticed something disquieting: some people, including married couples, prefer dogs to such an extent that they aren’t interested in having children. I think we need to make childcare less stressful — e.g. make help with childcare chores readily available, so the primary caretaker can be rested and more relaxed. France has done this, and the birthrate has gone up: the availability of quality childcare had a much bigger effect than financial incentives.

The lowering of stress seems the key. 


The story of the woman killed by her wolves was something my husband learned about back in Pittsburgh. The garage he took his work truck to for inspections and maintenance was near this woman’s property, and the workers there were all abuzz after she (or what was left of her) was found. I think all the wolves were euthanized.

And I believe fear is absolutely the trigger for most violence, the perception of some threat, even if that perception is inaccurate. My beloved chocolate lab, who was a real sweetheart, had never been around an infant. We had visitors one day who had a baby with them—she was in her little carry seat, and they set her down on the coffee table. We were all sitting around talking, and I didn’t see the dog. The baby was not crying, just making those little chortling sounds babies sometimes do. I found the dog standing back behind the couch, anxiously peering over the back at the baby, obviously nervous about this strange creature, afraid of what it might mean for him. Exactly the kind of circumstance where even a laid back dog might snap or bite.

And the kind of rabid hate talk that comes from the political right is certainly based in fear—fear of threats real or imagined, often of the most primitive kind. I think of some of the astonishing ideas they have spouted about women, and women’s sexuality, and the way any variation on “traditional” gender definitions is perceived as a threat—an intimate and powerful threat—to their own sense of identity. So much fear that even the smallest chink in the fortress of their world might bring it all down in ruins. Gay Marriage, for instance, can only lead to Armageddon.


This is so funny: “Gay Marriage, for instance, can only lead to Armageddon.” On the other hand, many of those right-wingers yearn for Armageddon. They want to see the whole world destroyed: a marvelous (to them) fulfillment of crazed prophecies. The weaker the evidence, the more rabid the zeal.

Interesting that powerful white males would feel so threatened by women, gays, poor people, immigrants . . . Maybe Shakespeare offers insight into this when he speaks about sleepless, anxiety-ridden kings: “Uneasy lies the head that wears the crown” (Henry IV, pt 2).


The latest UN projections expect the world’s population to grow by 2.9 billion – equal to another China and India – in the next 33 years, and possibly by a further three billion by the end of the century. By then, says the UN, humanity is expected to have developed into an almost exclusively urban species, with 80–90% of people living in cities.

Whether those cities develop into sprawling, chaotic slums – with unbreathable air, uncontrolled emissions and impoverished populations starved of food and water – or become truly sustainable depends on how they respond. Many economists argue that population growth is needed to create wealth, and that urbanization significantly reduces humanity’s environmental impact. Other observers fear cities are becoming ungovernable – too unwieldy to adapt to rising temperatures and sea levels, and prone to pollution, water shortages and ill health.

Many cities are already investing in clean transport and water, sewage, renewable energy, planning, wellbeing and good housing for all. Others face what seem like insurmountable problems.


India, which is widely expected to be the most populous country in the world with more than 1.5 billion people by 2050, has seen its urban population double in 30 years, to nearly 600 million. Its megacities, like Mumbai and Delhi, are not expected to grow much more; instead, smaller cities are rapidly expanding.

Ramachandra and his colleague Bharath H Aithal have documented the environmental effects of breakneck urban growth in Bangalore. They say the temperature in the city has increased by 2–2.5°C over the past three decades, while the water table has sunk in places from 28 metres to 300 metres deep; there has been an 88% loss of vegetation and a 79% loss of wetlands, and frequent flooding even during normal rainfall.

Ramachandra fears that what has happened to Bangalore will happen to all Indian cities. “Air pollution is at dangerous levels, the water is polluted, there is nowhere for the waste to go, and the lakes have been killed,” he says.

The “frenzy of unplanned urbanization” is threatening nature as never before, says Prerna Bindra, author of The Vanishing, a new analysis of how urbanization and economic growth have affected India’s rich wildlife. “Wetlands, lakes, green spaces are giving way to glass and concrete. The retreat of natural habitats has meant the rapid decline of urban wildlife – even the once ubiquitous – house sparrows, or the bullfrogs and common toads that serenaded the monsoons, or jackals [which were] once not a very uncommon sight on urban fringes.”

The solution may be in the hands of the many strong indigenous and middle-class groups that have sprung up in the last 20 years to demand less destructive development and to attempt to reduce the use of polluting fossil fuels, enforce conservation laws and educate the authorities. But there is a long way to go.


The scale and speed of China’s shift to cities is shocking – possibly the fastest and largest migration of a human population in history. In just 30 years, nearly 500 million people have moved from rural areas into China’s 622 main cities, and a predominantly rural country has become nearly 60% urban. By 2025, over one billion Chinese – two in three people – will live in cities.

Guiyang is a model of central urban planning from the perspective of people. It has few slums and little sprawl, and its growth has been ordered. But urbanization has been an ecological disaster. In the early days, pollution turned the Nanming river black and stinking. Air pollution was allowed to continue unchecked, while carbon dioxide emissions rocketed from coal-fired industry, forests were cleared and soil was contaminated on a massive scale. And China’s breakneck urbanization extends beyond its borders, devastating vast areas of Africa and Latin America, where it turned for the raw materials for its industrial revolution.

“Rapid urbanization was encouraged. It was the way China grew its economy,” says Gordon McGranahan of the Institute of Development Studies in the UK, who specializes in global urbanization. “China used cities to generate growth and land to generate investment. It had to bring people to the cities; it experimented with converting land to urban areas. Its cities were critical to its growth. No one paid much attention to the environment until it hit them in the face.” ~



I hope that increasing urbanization will mean that a large portion of the earth can eventually be returned to wilderness. It will of course take time — and the megacities will have to find a way to be more sustainable. 


“It is not a lack of love, but a lack of friendship that makes unhappy marriages.” ~  Friedrich Nietzsche


The word “Caucasian” came up and I said, “That’s so much nonsense.” ~ “Of course it is — unless you are Hungarian,” my interlocutor earnestly replied. I decided to pass. I constantly run into ignorance so deep it’s best to just talk about the weather. Weren’t we supposed to get rain? I suppose it went to Hungary instead, the home of the true “Caucasians.”


~ “No line about class in the United States is more famous than the one written by the German sociologist Werner Sombart in 1906. Class consciousness in America, he contended, foundered “on the shoals of roast beef and apple pie.” Sombart was among the first scholars to ask the question, “Why is there no socialism in the United States?” His answer, now solidified into conventional wisdom about American exceptionalism, was simple: “America is a freer and more egalitarian society than Europe.” In the United States, he argued, “there is not the stigma of being the class apart that almost all European workers have about them. . . . The bowing and scraping before the ‘upper classes,’ which produces such an unpleasant impression in Europe, is completely unknown.”

In “White Trash,” Nancy Isenberg joins a long list of historians over the last century who have sent Sombart’s theory crashing on the shoals of history. The prolific Charles and Mary Beard, progressive historians in the first third of the 20th century, reinterpreted American history as a struggle for economic power between the haves and have-nots. W.E.B. Du Bois interpreted Reconstruction as a great class rebellion, as freed slaves fought to control their own working conditions and wages. Labor and political historians in the 1970s and 1980s recovered a forgotten history of blue-collar consciousness and grass-roots radicalism, from the Workingmen’s Party in Andrew Jackson’s America to the late-19th-century populists of upcountry Georgia to the Depression-era leftist unions of the Congress of Industrial Organizations. Historians of public policy, like the influential Michael B. Katz, emphasized the persistence of notions of “the undeserving poor,” an ideology that blamed economic deprivation on the alleged pathological behavior of poor people themselves and eroded support for welfare programs.

Isenberg — a historian at Louisiana State University whose previous books include a biography of Aaron Burr — provides a cultural history of changing concepts of class and inferiority. She argues that British colonizers saw their North American empire as a place to dump their human waste: the idle, indigent and criminal. Richard Hakluyt the younger, one of the many colorful characters who fill these pages, saw the continent as “one giant workhouse,” in Isenberg’s phrase, where the feckless poor could be turned into industrious drudges.

 That process of shunting outsiders to the nation’s margins, she argues, continued in the early Republic and in the 19th century, when landless white settlers began to fill in the backcountry of Appalachia and the swamps of the lowland South, living in lowly cabins, dreaming of landownership but mostly toiling as exploited tenant farmers or itinerant laborers.

In the book’s most ingenious passages, Isenberg offers a catalog of the insulting terms well-off Americans used to denigrate their economic inferiors. In 17th-century Virginia, critics of rebellious indentured servants denounced them as society’s “offscourings,” a term for fecal matter. A hundred years later, elites railed against the “useless lubbers” of “Poor Carolina,” a place she calls the “first white trash colony.” In the early 19th century, landowners described the landless rural poor as boisterous, foolish “crackers” and idle, vagabond “squatters.”

Not all stereotypes of the white poor were negative. In the Jacksonian period, populists celebrated Davy Crockett and his coonskin cap. Lincoln might be derided as a poor woodsman, but he was also valorized for his log cabin roots. During the Great Depression, New Deal photographers and writers depicted farmers displaced by the Dust Bowl as virtuous people, victims of economic forces beyond their control.

By the second half of the 19th century and into the 20th, Isenberg shows, crude caricatures gave way to seemingly scientific explanations of lower-class status. “Class was congenital,” she writes, summarizing a mid-19th-century view of poor whites. One writer highlighted the “runtish forefathers” and “consumptive parents” who birthed a “notorious race” of inferior white people. Essayists described human differences by borrowing terminology from specialists in animal husbandry. Just as dogs could be distinguished by their breeds and horses distinguished from mules, so could people be characterized as superior or inferior based on their physical traits.

The story of eugenics offers an example of the ways that, throughout the American past, questions of class status have been entangled with notions of racial inferiority. Isenberg makes a strong case that one of the most common ways of stigmatizing poor people was to question their racial identity. Backcountry vagabonds were often compared unfavorably with the “savage,” nomadic Indian. Sun-browned tenant farmers faced derision for their less-than-white appearance. After the emancipation of slaves, politicians warned of the rise of a “mongrel” nation, fearful that white bloodlines would be contaminated by blacks, a process that might expand the ranks of “trash” people.

“Class,” she writes, “had its own singular and powerful dynamic, apart from its intersection with race.” Thus we get a history of class in America that discusses white tenant farmers at length, but scarcely mentions black sharecroppers or Mexican farmworkers, as if somehow their race segregated them from America’s history of class subjugation. Native Americans make cameo appearances playing their role as a degraded race or as the noble savage — as ideal types rather than as exploited and impoverished peoples themselves. The “coolie” Asian workers imported to the post-Civil War South, the Filipino agricultural laborers of California’s Central Valley and the inhabitants of San Francisco’s and New York’s 19th-century Chinatowns, all workers, most at the bottom of the economic ladder, are virtually absent from these pages, even though they were subject to caricatures stunningly similar to those hurled at backcountry “squatters” and “hillbillies.”

It is a commonplace argument in American politics that somehow race and class stand apart. Pundits charge that racial minorities practice a self-segregating “identity politics” rather than uniting around shared economic grievances. But a history of class in America that assumes its whiteness and relegates the nonwhite poor to the backstage is one that misses the fundamental reality of economic inequality in American history, that race and class were — and are — fundamentally entwined.” ~

From another source:

~ “The wretched and landless poor have existed from the time of the earliest British colonial settlement. They were alternately known as "waste people," "offals," "rubbish," "lazy lubbers," and "crackers." By the 1850s, the downtrodden included so-called "clay eaters" and "sandhillers," known for prematurely aged children distinguished by their yellowish skin, ragged clothing, and listless minds.

Surveying political rhetoric and policy, popular literature and scientific theories over four hundred years, Isenberg upends assumptions about America's supposedly class-free society – where liberty and hard work were meant to ensure real social mobility. Poor whites were central to the rise of the Republican Party in the early nineteenth century, and the Civil War itself was fought over class issues nearly as much as it was fought over slavery. 

Reconstruction pitted poor white trash against newly freed slaves, which factored in the rise of eugenics — a widely popular movement embraced by Theodore Roosevelt that targeted poor whites for sterilization. These poor were at the heart of New Deal reforms and LBJ's Great Society; they haunt us in reality TV shows like Here Comes Honey Boo Boo and Duck Dynasty. Marginalized as a class, white trash have always been at or near the center of major political debates over the character of the American identity.

We acknowledge racial injustice as an ugly stain on our nation's history. With Isenberg's landmark book, we will have to face the truth about the enduring, malevolent nature of class as well.” ~ 


From yet another:

~ “As Nancy Isenberg describes in her new book, White Trash: The 400-Year Untold History of Class in America, one question that polite American society has always asked itself is whether poor whites can really be considered white (or even truly human). Usually, the answer has been that the people whom the upper classes have alternately called offscourings, bogtrotters, clay-eaters, swamp people, mudsills, hillbillies and rednecks are indeed a breed apart, deserving of sympathy or scorn but rarely solidarity.

White Trash documents in exhaustive detail how every stage in the continent’s development – from the arrival of the Pilgrims to the inauguration of President Donald Trump – has seen its elites construct their own taxonomies of deplorable (and expendable) white people. Isenberg’s purpose in doing so is to undermine the belief among Americans that their society miraculously shed the burdens of class and pedigree that prevailed in the mother country of England. ‘Far more than we choose to acknowledge’, she writes, ‘our relentless class system evolved out of recurring agrarian notions regarding the character and potential of the land, the value of labor, and critical concepts of breeding’.

This story begins with the evacuation of Britain’s human ‘waste’ from the country’s slums as part of the English nobility’s efforts to ‘fertilize’ the New World. In Isenberg’s words, ‘the idle poor, the dregs of society, were to be sent thither to throw down manure and die in a vacuous muck.’ Agricultural metaphors and the logic of animal husbandry persisted into the revolutionary period: Benjamin Franklin’s thoughts on populating the country were shaped by his experiences raising pigeons, while Thomas Jefferson’s vision of future US leadership rested on his faith in a ‘fortuitous concourse of breeders’ that would give birth to a natural aristocracy. Isenberg makes a strong case that the new republic was not – and was never envisioned by its founders to be – an egalitarian and classless society, but rather one based on rank and privilege.

With the advent of the Civil War, the language of class identity took a new turn: destitute southern whites in particular became a ‘notorious race’, which according to some critics had ‘fallen below African slaves on the scale of humanity’. As one writer for the Atlantic Monthly asked in 1865, why should the victorious Union keep ‘the humble, quiet, hard-working negro’ disenfranchised and leave the North vulnerable to the vote of the ‘worthless’ and ‘vicious’ poor whites, who were ‘fit for no decent employment on earth except manual labor’? It was from this point onwards that the label of ‘poor white trash’ began to stick, and the image of the inbred, ignorant and immoral southern redneck emerged as the ridiculous and frightening figure that is still so firmly entrenched in modern culture (the 1960s sitcom The Beverly Hillbillies and the 1972 film Deliverance are just two of the examples that Isenberg examines in this regard).” ~

I was particularly struck by the observation that poverty is like traveling back in time: houses are old, and haven’t been renovated or even painted in decades. The porches are sagging. The furniture is decades old. The same liquor stores and tattoo parlors line the streets, in the same dilapidated condition.

In Johnson, Vermont, I was especially intrigued by a low-slung house whose small windows were covered almost to the top by some mysterious stuff inside. Finally I managed to get close enough to see: those were plastic bags. Whoever lived there collected plastic bags over the years and just kept piling them on the floor the way some people keep stacking newspapers and magazines. This made me flash back to a homeless man I once met, with not one shopping cart, but two. I assumed the carts were filled with blankets and other practical stuff, but no — they were filled with crumpled plastic bags jammed in to overflowing.

I think there is a special name for this pathology: hoarding. It doesn’t absolutely have to go with poverty, but perhaps it’s the ultimate in “poverty mentality.” You don’t throw anything away — it might “come in handy” one day. Nothing is designated as “trash” — except, tragically, the people who can’t seem to part with it.


As the distance between the wealthy one percent and the rest of us becomes steadily and obscenely greater, it gets harder and harder to ignore. As long, however, as that powerful one percent can depend on and cultivate racial and ethnic division, we can be set against each other, so busy blaming and hating our fellows that we defeat ourselves and act against our own best interests — witness the huge white working-class support for Trump, which still continues despite his abandonment of campaign promises. It seems hugely irrational, as irrational as racism and ethnic hatred always are — and just as hugely powerful. These are the engines that created the Holocaust, that continue to fuel genocide, that remain a real possibility, even now, even here, even for us.

With all that, I feel that now is also a moment full of potential. Many voices are being raised, voices that have been silenced in the past and are now refusing silence. Changes are coming.

I am glad to be here now, in this singular place and time, looking for great things to come.


I feel both frightened by what’s happening, especially the rise in hate crimes, and cautiously optimistic when I hear words I thought I’d never get to hear, e.g. Medicare for everyone — and see the young rise up for their right to go to school, say, without having to fear for their lives. I hope I’ll live long enough to see campaign finance reform and many other necessary changes — after all, I have already lived long enough to witness quite a cultural change.

Since we’ve talked about domestication and the diminishment of the fear response, it’s interesting that conservatives show a more intense fear response (as revealed by neuro-imaging, not merely through answers to questions about immigration and so forth).

Right-wing demagogues know that their most powerful tool is an appeal to fear. Fear pushes a person to the right, while making someone feel safer accomplishes the opposite. Much follows from that. FDR was brilliant when he said, “The only thing we have to fear is fear itself.”

We definitely live in a turbulent time. One thing we’ve learned is that the Germans are probably correct when they say, “If you were in our place, you’d have acted the same.”

But maybe not — when I watch the young speak out without fear, I am reminded that this is something special about America.



~ “The notion that tarot cards somehow survived the cultural wreck of ancient Egypt has a surprisingly specific origin: Paris, 1781, when France seethed with secret societies and private clubs. Some were radically political, as would soon bear fruit. Many more had pretensions to having privileged access to occult traditions.

Freemasonry was the most fashionable of these societies, with its claims to a heritage deriving from the Knights Templar, and back further, from the architects of Solomon’s temple itself. But it bobbed in a rich esoteric stew that looks familiar to any observer of the present-day ‘New Age’: Rosicrucianism, theosophy, Swedenborgianism, Mesmerism, Martinism, Hermeticism (believed to derive ultimately from the ancient Egyptian god Thoth). Esoteric ideas and traditions were widely explored, elaborated and, often, invented.

The mood has been dubbed ‘anti-Enlightenment’ – or Counter-Enlightenment, to use Isaiah Berlin’s term. It owes something to the Romantic taste for the exotic and mystical, though anti-clerical and anti-authoritarian impulses were present too. Delving into ‘dark’ mysteries was conceived as an intellectual and ethical riposte to the hyper-rationalism of ‘enlightened’ secularism. Emanuel Schikaneder’s libretto for Mozart’s opera The Magic Flute (1791) frames the opposition neatly by setting the crepuscular, murderously matriarchal Queen of the Night against Sarastro, the rationalist, Masonic, Isis-and-Osiris-worshipping priest of the Sun.

Ancient Egypt was particularly fashionable in early 1780s Paris. (And not just in Paris: the Great Seal of the United States, with its prominent pyramid and Masonic eye, was designed in 1782.) Between 1781 and 1785, the Italian charlatan Giuseppe Balsamo, the self-styled Count Cagliostro, founded his own Egyptian ‘rite’ of Freemasonry. A fashionable Cagliostro lodge was reported in 1785 as being decorated with statues of Egyptian gods, hieroglyphics, a stuffed ibis and an embalmed crocodile.

However, the immediate source for the suits of European tarot was uncovered in 1939, when the archaeologist Leo Mayer found a 15th-century deck of Mamluk tarot cards in Istanbul’s Topkapi Palace. The correspondences revealed are temptingly neat. Children inducted into the Mamluk military slave retinue progressed through the ranks from page to equerry to khassakiyah, or elite soldier. The most trusted khassakiyah carried symbols of office, including the cup (the cup-bearer), sword (sword-bearer), and polo stick – emblems that were frequently depicted on coins. The polo stick was presumably translated into the baton or club by mystified Italian Renaissance card-makers who made the first European tarot decks, copied from the east.

The Mamluk origin theory is the best on offer, but it is not definitive. Nor does it explain the separate suit of 21 named trumps: the Magician, Empress, Tower, Moon and so on – 22, including the Fool. They emerged later, in the aristocratic courts of the Italian Renaissance – as is suggested by tarot’s original name. In 15th-century Italy, the cards were known as trionfi. They drew from the imagery of the ‘triumphs’, or allegorically themed carnival parades, of which modern carnival floats are the descendants.

Even at the very birth of tarot, card-makers were cherry-picking from diverse traditions and seizing on the allegories that suited them. The beguiling obscurity of the symbolism lent itself to the process. So it’s almost surprising that the idea of using tarot for divination took so long to emerge. But, like the cards’ supposed ancient Egyptian origin, this notion belongs very specifically to the 1780s.

The querent is complicit in the process. What drives this complicity was first measured in 1949, when the psychologist Bertram Forer asked his students in Los Angeles to fill out what he told them was a personality test, the ‘Diagnostic Interest Blank’. When Forer handed back the ‘results’, he told his students that they were based on the tests. In fact, he had culled 13 broad-brush statements from published horoscopes, including such devastating insights as ‘You have a great deal of unused capacity which you have not turned to your advantage’ and ‘At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.’ No! You too?

Every student was asked to assess the test’s accuracy on a scale of 0 to 5, where 5 indicated perfection. Their average score, astonishingly, was 4.26. Feeling smugly sure that I wouldn’t have fallen for it myself, I spoke to the psychologist Christopher French, who researches paranormal beliefs at Goldsmiths, University of London. He told me that most of the 13 personality descriptions in Forer’s original test are ‘two-headed statements that describe the human condition – and that’s why they resonate so much’. If they don’t describe you, French delightedly tells his own students, ‘you’re probably a psychopath’. But the universality of the statements is only part of the point: Forer’s experiment demonstrated that the statements were not perceived as universal but as highly individual.

Confirmation bias is also at work. We prefer to have our existing beliefs confirmed, and selectively pay attention to statements that perform this happy function for us. So when a tarot reading is momentarily inaccurate, we ignore or forget it. When it hits the mark, we are struck by its success.

Why does tarot survive? In a sense, tarot does encode wisdom – albeit within an invented tradition rather than a secret one. It is a system for describing aspirations and emotional concerns. It is a closed system rather than one based on evidence but, as such, it is not dissimilar to psychoanalysis, another highly systematized, invented tradition whose clinical efficacy depends ultimately on the relationship between client and practitioner.

In The Occult Tradition (2005), the historian David S Katz describes how deeply psychoanalytic theory, and Jung in particular, drank from the well of occult literature. The same combination of therapeutic aims and occult mystery was irresistible to the New Age too. Farley describes the cards as the ‘New Age tool par excellence’, able to shift fluidly from play to fortunetelling to ‘healing’.

It would require a sociological study to be sure, but it is possible that tarot offers a means to practice therapy for people who in some way stand outside formal or orthodox educational systems. It certainly seems to flourish in places such as Brighton and Glastonbury, not Cambridge and Hampstead.

And in a larger sense, the occult pseudo-history of tarot grasps a wisp of truth along with its armfuls of chaff. The cards are not the last survival of an ancient and exotic wisdom tradition. They are not the lost Book of Thoth. They are, however, a fairly unique remnant of the esoteric wisdom traditions of the European Renaissance, and they offer a form of informal, popular, easily accessed therapy. Meditating on the meaning and relevance of the four virtues, of Time, Love and Death, of the Hanged Man, the Angel and the Wheel of Fortune, can be valuable. The same is true, even, of meditating on the Fool.” ~

Italian tarot cards, 1466


"Now that the Easter Bunny is caged for another year, here's my question: If God raised Jesus from the dead, why not make it public so everyone could see and believe? If the empty tomb is a deal breaker — all that gnashing of teeth and eternal damnation — why be so secretive? What kind of God plays games like that?

I think it's more likely that it never happened. It's likely that our annual Easter hoopla was never supposed to be about raising Jesus from the dead. There's no call to remember it in the bible. In fact, the bible sets aside all kinds of days as religious holidays for believers, but no mention of Easter.

But what we do know is that ancient civilizations have been celebrating Easter for at least 5,000 years — long before Jesus and long before Judaism.

In fact, we can say that the driving theme of the entire human story has always been the "dawn of new day” — the resurrection of the earth, and the call to new life. Hence, every culture of antiquity found unique ways to celebrate the triumph of spring. And they called it by similar names — Ishtar, Eostre, Easter. You get the point.

We know that, in the 4th century, the Emperor Constantine and the Council of Nicaea, hijacked yet another popular festival (winter solstice comes to mind) and "repurposed" the old Easter holiday for their religious holy day. They made Easter a religious day for Christians only.

But the question remains: If Jesus wanted everyone in the world to know that he resurrected from the tomb, why didn't he make at least one public appearance, for history's sake? Why appear only to a handful of biased believers who didn't even know how to read or write?

Why not appear to the people who killed him — like say, Pontius Pilate, or Judas Iscariot, or the Sanhedrin Council who'd been plotting his death for years? If all these were too scary, then why not at least show up for Pilate's wife who, according to the bible, told her husband to leave the poor guy alone! That's honorable mention stuff right there.

Or better yet, why not go back to the vulgar crowds in Jerusalem — the fickle folks who supposedly threw palms at his feet before they turned against him? These were the people who needed convincing, for crying out loud, not his mom and his best friends.

So now we've got a problem. All these non-appearances in town mean that no legitimate historian, and not even a disinterested bystander, could ever report that they saw Jesus — even from a distance — after he was dead and buried.

Let's face it: Since the future of the world depended on it — not to mention the billions of people about to go to hell for lack of even a shred of evidence or logic — we're looking at a totally unfair and unjust religious belief system. This makes no sense. None. Nada. Zip. Zero.

We could argue that, “Jesus wanted to reward his disciples for their faithfulness, so that's why he didn’t let anyone else see him.”

But that means he put the whole burden on his disciples to go forth and convince the world. That makes no sense because they had no credibility — people knew they were already believers. And besides, they were not only illiterate, but they'd likely be dead before anyone else started to write about the life and times of Jesus of Nazareth.

In fact, it would be 40 more years before the anonymous "Mark" wrote about the death of Jesus, and guess what? Mark never mentioned a single sighting of Jesus, or the name of anyone who claimed he or she had ever SEEN Jesus after he died.

And then there's this: The tomb of Jesus was located at the thriving city of Jerusalem — the historic and religious center of the whole Jewish religion. Wait. Weren't these the same Jewish people whom Jesus said he came to save? So why did Jesus not at least go into Jerusalem after the resurrection, and shake a few hands?

Why would God not want the Jews of Jerusalem to see and hear for themselves that Jesus was alive and well? This makes absolutely no sense.

UNLESS . . .  there was no resurrection. Unless Jesus was actually a Jewish rabbi — and a revolutionary who turned the tables on the status quo. In that case, it all makes sense. The mission of Jesus was to bust down the doors of the encrusted, established religion of the day — and bust the whole crowd of rich, religious hypocrites for the ways in which they oppressed the poor and the outcast. His mission was to call them out for their lack of love, compassion, grace, and basic decency. He busted them for failing to be human.
Their privilege had corrupted their humanity, and they didn't like being exposed and busted.

In that case, there was no need for a resurrection. Only a crucifixion.” ~ Noah Einstein (Facebook, Einstein’s God)


The larger question here is Yahweh's secretiveness, not just Jesus’ — or secretiveness in regard to Jesus (odd: even when the divine allegedly shows itself on a grilled-cheese sandwich or in a mountain grotto, it’s always Jesus or Mary, never the Big Daddy). If Yahweh is the only true god and all the other gods are false, why not be more public? Noah writes about this too in a different essay. Ehrman focuses on the contradictions between the four gospels, but Noah here addresses the even more bothersome issue of secretiveness.

By the way, I used to be under the impression that a whole crowd gathered on the banks of the Jordan heard Yahweh’s words from the opening in the clouds during the baptism of Jesus — but then I happened on the text in Mark that makes it clear that only Jesus heard that voice. Likewise — and this may be even more indicative of schizophrenia — he hears “unclean spirits” calling him the Son of God.

Of course there is no way to prove that Jesus even existed, much less rose from the dead. But various end-of-days preachers did roam the countryside, and more than one may have gotten crucified — though not exactly as atonement for humanity’s sins, starting with the eating of the Forbidden Fruit in Eden. That interpretation is in itself more extreme than the claim of

 created by Bob Boldt

ending on beauty:


“Hiking: I don't like either the word or the thing. People ought to saunter in the mountains — not hike! Do you know the origin of that word 'saunter?' It's a beautiful word. Away back in the Middle Ages people used to go on pilgrimages to the Holy Land, and when people in the villages through which they passed asked where they were going, they would reply, 'A la sainte terre,' 'To the Holy Land.' And so they became known as sainte-terre-ers or saunterers. Now these mountains are our Holy Land, and we ought to saunter through them reverently, not ‘hike’ through them.” ~ John Muir

Saturday, March 24, 2018


Philadelphia, Rittenhouse Square, 3-21-18

It goes on and hurries to some end,
circling and turning without a goal.
Flashes of red, of green, of gray whirl past,
solid shapes barely glimpsed.

Sometimes a smile comes toward us,
and, like a blessing, shines and is gone
in this dizzying parade with no destination.

~ Rilke, New Poems

I prefer this carousel poem, by far, to the more famous one with the white elephant: the smile coming toward us and then disappearing, the carousel as a metaphor for life. It would be an unbearable metaphor without that smile

Sometimes a smile comes toward us,
and, like a blessing, shines and is gone

but it has already made us just a touch happier than we were a moment before. Yes, a blessing.


The carousel goes and goes and gets nowhere, and yet is so enchanting — not a thrill ride, but gentle, soothing motion, and a celebration of the imagination: the lights, the fabulous carved animals, the decorative painting. All part of that carnival experience, where it isn't the destination that matters, but the sensation.


Thank you, Mary, for this beautiful insight about what the carousel “teaches” about life: it’s not the destination that matters (the carousel has none), but the sensation, the enchantment, the music, the magical details.

I think it's quite significant that Rilke was fascinated by the carousel and the fountain — the circularity of that motion, the return to where it started. By contrast, he was not especially drawn toward rivers, for instance — or at least he doesn’t mention them again and again in his poems. Because “life is a journey” is such a common saying, and a journey presupposes a destination, this focus on images that don’t suggest a linear progression is quite striking. Sure, we keep saying that it's the journey itself that matters, and not the destination — but you can't quite remove destination from the concept of the journey. Rilke (who traveled a lot) manages to find other metaphors.

If we think of life as a journey, there is no avoiding the knowledge that a typical life ends in old age and ultimately death. Much as we’d love to say that life is a journey toward wisdom — and in part it is, at least for some people — we don’t have a good answer to the statement that life is a journey toward death. Unless you expect eternal bliss afterwards (and even then), that fact is just not something we like to think about. “It doesn’t help me get out of bed in the morning and go on living,” is how more than one friend put it.

We say that death is what makes life precious, but, along with Woody Allen, we’d “rather not be there when it happens.” So a way to think of life in terms other than a journey from birth to death is of more than theoretical interest.

A few have suggested a circular motion, but in the form of a spiral — we keep coming back to the same central themes, but at different stages of life, with new perception, a new understanding. Every few years, our priorities are different. At fifty, our philosophy of life may be quite the opposite of what it was at twenty-five.

Melk Abbey, Spiral Staircase

And disappointments are inevitable. Shattered dreams everywhere! Or dreams that have come true, but the fulfillment was less gratifying than we expected, or perhaps downright depressing (Teresa of Avila’s remark that more tears have been shed over answered prayers than over unanswered ones comes to mind here; the Buddha too spoke negatively about fulfilled desires). But some fulfilled dreams have not proved disappointing (great art, the beauty of nature) — and there are the grace notes, the beautiful details, the moments of happiness — in Rilke’s poem, it’s that smile we receive as the carousel whirls on.

Note also these lines in the Second Duino Elegy:

Oh smile, where are you going? Oh upturned glance,
a new, warm, receding wave from the heart —
alas, it’s in us all . . .
Are we mixed into [the angels’] features
as slightly as that vague look in the faces
of pregnant women? Unnoticed by them in their whirling
return to themselves. (How should they notice it?)

“Oh smile, where are you going?” But that’s not quite the right question. The afterlife of a smile is irrelevant next to the gift of the smile in the moment. 

“For if life were questioned a thousand years and asked, ‘Why live?’ and if there were an answer, it could be no more than this: ‘I live only to live!’ And this is because Life is its own reason for being, springs from its own source, and goes on and on, without ever asking why — just because it is life.” ~ Meister Eckhart

Dali: Don Quixote (1)

From the preface to the 1964 edition of Labyrinths by Jorge Luis Borges:

~ Pierre Menard undertakes to compose Don Quixote — not another Quixote, but the Quixote. His method? To know Spanish well, to rediscover the Catholic faith, to war against the Moors, to forget the history of Europe — in short, to be Miguel de Cervantes. The coincidence then becomes so real that the twentieth-century author rewrites Cervantes’ novel literally, word for word, and without referring to the original. ~

And here Borges has this astonishing sentence: “The text of Cervantes and that of Menard are verbally identical, but the second is almost infinitely richer.” This he triumphantly demonstrates, for this apparently absurd subject in fact expresses a real idea: the Quixote that we read is not that of Cervantes any more than our Madame Bovary is that of Flaubert. Each twentieth-century reader involuntarily rewrites in his own way the masterpieces of past centuries.

Thus, the reader changes the meaning of the text not only because the mentality of each reader is different, but because a modern reader has a mentality vastly removed from that of someone writing long ago. We certainly read the Odyssey, say, very differently than Homer could have conceived it. But even something written in the 19th century is a very different text to us than it was to 19th-century readers.

Dali: Don Quixote (2)


With all respect to Borges, I do not believe his hypothetical re-writer of Cervantes’ Quixote could succeed in reproducing that original, word for word. The impossibility is in that “forgetting European history” — and he would have to forget his own personal history as well, and come into possession of Cervantes’ own history and memories. We are the products of our own times, our own experiences, shaped and tempered by place and time, the fears and dreams, the very shape of desire as we know it here and now, all both the same and utterly different from what they were in other places, other times. And the distance need not be that great — a few decades, a generation — a new, or at least different, world.

When I read Clarissa, I enter her world, so unlike what I know that it seems strange, impossible, alien — but drawn by a master’s hand, so that I recognize what we share despite those differences. Even the unusual length of her story (four long volumes of letters! itself an antiquity of communication) serves to capture and hold me there with her, until I feel and suffer with her as she is imprisoned, cut off from any possibility of escape, any hope — and I experience that suffocation with her, but always as someone for whom those circumstances remain part of another, distant, vanished world. We can never read anything without re-creating it as we read, because we cannot lose our own history and memories; we see everything with and through a sensibility particular to our time, place, circumstance and personality. Our angle of vision, our lens to the world.


Borges was being literary, not literal — I'm sure you know that. He made up a surreal, impossible story in order to make his point about how readers change the meaning — especially over the centuries as the culture changes — as everything changes. And I agree that it doesn’t have to be centuries — decades is enough — even a few years, these days, when events rush on.

In fact, a mere month — even a week — might do it if something significant happens! We can never see Islam the same way after 9/11. In fact, we can never see our country or our world the same way. And that subtly or heavily affects everything else, including the way we see meaning in books and movies.

(A bit of a shameless digression, perhaps, reaching back to the previous blog — one man wrote that he saw the story of Abraham and Isaac in a different light after he became a father. For the first time the idea of a father killing his own son — a foreshadowing of god-the-father in effect killing his own son as a sacrifice to himself — became unbearable to him.)

So yes, of course we are a product of our times, circumstances, and unique experiences, both collective and individual — and this shapes the way each generation reads The Iliad, say, or any other work of literature.

And yet, as you point out, it’s possible for us to enter a remote, alien, vanished world portrayed in the kind of literature that tends to survive (a small percentage, especially as we move closer to modern times) — to “re-create” that world as we read about it, an excellent way to describe it — and empathize with the characters. We understand love and hate, we understand grief, or being betrayed by someone you trusted. There are enough universals so we can still connect even with Don Quixote’s deluded idealism, his need to believe in knightly heroism and courtly love. In the end we can even connect with sensible Sancho’s newly found love of adventure.

By the way, I love the way you describe entering the world of Clarissa (a book I found too painful to continue reading) — the very idea of an epistolary novel having become an antiquity.


~ “Everybody's heard of the My Lai massacre — March 16, 1968, 50 years ago today — but not many know about the man who stopped it: Hugh Thompson, an Army helicopter pilot. When he arrived, American soldiers had already killed 504 Vietnamese civilians (that's the Vietnamese count; the U.S. Army said 347). They were going to kill more, but they didn't — because of what Thompson did.

I met Thompson in 2000 and interviewed him for my radio program on KPFK in Los Angeles. He told the story of what happened that day, when he and his two-man crew flew over My Lai, in support of troops who were looking for Viet Cong fighters.

"We started noticing these large numbers of bodies everywhere," he told me, "people on the road dead, wounded. And just sitting there saying, 'God, how'd this happen? What's going on?' And we started thinking what might have happened, but you didn't want to accept that thought — because if you accepted it, that means your own fellow Americans, people you were there to protect, were doing something very evil."

Who were the people lying in the roads and in the ditch, wounded and killed?

"They were not combatants. They were old women, old men, children, kids, babies."

Then Thompson and his crew chief, Glenn Andreotta, and his gunner, Lawrence Colburn, "saw some civilians hiding in a bunker, cowering, looking out the door. Saw some advancing Americans coming that way. I just figured it was time to do something, to not let these people get killed. Landed the aircraft in between the Americans and the Vietnamese, told my crew chief and gunner to cover me, got out of the aircraft, went over to the American side."

What happened next was one of the most remarkable events of the entire war, and perhaps unique: Thompson told the American troops that, if they opened fire on the Vietnamese civilians in the bunker, he and his crew would open fire on them.

"You risked your lives," I said, "to protect those Vietnamese civilians."

"Well, it didn't come to that," he replied. "I thank God to this day that everybody did stay cool and nobody opened up. ... It was time to stop it, and I figured, at that point, that was the only way the madness, or whatever you want to call it, could be stopped.”

Back at their base he filed a complaint about the killing of civilians that he had witnessed. The Army covered it up. But eventually the journalist Seymour Hersh found out about the massacre, and his report made it worldwide news and a turning point in the war. Afterwards Thompson testified at the trial of Lt. William Calley, the commanding officer during the massacre.

Then came the backlash. Calley had many supporters, who condemned and harassed Thompson. He didn’t have much support — for decades. It took the Army 30 years, but in 1998, they finally acknowledged that Thompson had done something good. They awarded him the Soldier's Medal for “heroism not involving actual conflict with an enemy.”

On the 30th anniversary of the massacre, Thompson went back to My Lai and met some of the people whose lives he had saved. "There were real good highs," he told me, "and very low lows. One of the ladies that we had helped out that day came up to me and asked, 'Why didn't the people who committed these acts come back with you?' And I was just devastated. And then she finished her sentence: she said, 'So we could forgive them.' I'm not man enough to do that. I'm sorry. I wish I was, but I won't lie to anybody. I'm not that much of a man."

And what were the highs?

“I always questioned, in my mind, did anybody know we all aren't like that? Did they know that somebody tried to help? And yes, they did know that. That aspect of it made me feel real good.”

Today there's a little museum in My Lai, where Thompson is honored, and which displays a list of the names and ages of people killed that day. Trent Angers, Thompson's biographer and friend, analyzed the list and found about 50 there who were 3 years old or younger. He found 69 between the ages of 4 and 7, and 91 between the ages of 8 and 12.

Hugh Thompson died in 2006, when he was only 62. I wish we could have done more to thank him.” ~

Hugh Thompson 1969


~ “Whether you are an optimist or a pessimist is not just a question of personal temperament. It is also, increasingly, a question of politics. The divide between the optimists and the pessimists is as acute as any in contemporary politics and like many others—the generational divide between old and young, the educational divide between people who did and didn’t go to college—it cuts across left and right. There are left pessimists and right pessimists; left optimists and right optimists. What there isn’t is much common ground between them. Competing views about whether the world is getting better or worse have become another dialogue of the deaf.

Steven Pinker’s new book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress—along with its critical reception—illustrates how dug in the two sides are. Pinker argues that most people have lost sight of the incredible benefits that liberal democratic values continue to deliver because too many of us have a bias in favor of bad news. He blames the things he doesn’t like—including Donald Trump’s presidency—on this innate and deeply misguided pessimism about the possibility of progress. “The most consistent predictor of Trump support,” he writes, “was pessimism.” He accuses these pessimists of fatalism, because they assume that any good news they hear is essentially fake news. They discount progress because of their deep faith in the inexorable pull of the worst that modern societies have to offer. He thinks that the pessimists have effectively given up on the capacity of human beings to make a better future.

Pinker is at pains to insist that there is nothing fatalistic about his own conception of progress. The point of his polemic is to warn that we will toss it all away if we give up on progressive values. No one should take progress for granted. As Pinker says, “a belief that things will always get better is no more rational than a belief that things will always get worse.”

Lying behind Pinker’s account is the suggestion that the main thing we can do to progress is screw it up with our stupid pessimism. It is better on this account to ignore bad news than to overreact to it. That attitude is not the same as fatalism, but it is innately passive. It is fatalism light.

Pinker’s previous book, The Better Angels of Our Nature (2011), was a more tightly argued account of progress across a particular domain — the relative decline of violence over both the long and the short term.  It provoked a similar reaction — readers divided on the basis of their prior convictions about the state of the world. They read the evidence according to what they thought should be true, rather than adjusting what they thought was true in the light of the evidence. 

Optimists see acts of violence as the exception not the rule; pessimists see them as the rule not the exception. When a terrible act of violence takes place, we tend to filter it through the stories we tell ourselves about the possibility of progress. On the one hand, we can argue that it shouldn’t derail the good news; on the other, we can argue that it makes a mockery of the good news. Either way, we can be left with a feeling of helplessness.

The recent Stoneman Douglas High School shooting in Florida, like many previous horrors, illustrated this, but it also illustrated something else: the possibility of a space between the twin fatalisms of the optimists and the pessimists. Gun control in the United States is a classic example of an issue that can be captured by fatalism on both sides. It is tempting to think there is nothing to be done, because the divided worldview of the two sides is too great to bridge.

But there is an alternative to discounting the horror as a blip against the wider story of progress and also to treating it as evidence that progressive values are a lie. When something terrible happens, we can seek to use it as evidence of the need for change. This is what the pupils at Stoneman Douglas and their supporters have been doing. Change is possible when people stop treating what happens in the world as confirmation of their view of the direction of travel of the world and instead start trying to see it for what it is: the consequence of particular political choices and therefore subject to further political choices.

It is possible to see things like this. But it is also incredibly difficult, especially in an age when the pull of fatalistic narratives on both sides of our political divides is so strong. Worse, it is hard enough with cases of violence, but it is even harder when dealing with issues that lack the same sense of immediacy. There often appears to be a close relationship, for instance, between environmental politics and a sense of fatalism. Environmental threats can leave people feeling powerless. It is a small step from feelings of powerlessness to shoulder-shrugging resignation: if we believe there is little we can do to remedy a situation, then there is little point in trying to remedy it.


In a series of exchanges with his friend John Stuart Mill, Tocqueville discussed the varieties of fatalism. Mill distinguished between what he called “pure” fatalism and what he described as the prevailing “modified” version. Pure fatalism was a belief that the future had already been decided. It made its adherents either stupefied or serene; in either case they were accepting of the path that was chosen for them. It was almost always a manifestation of a set of religious or traditional beliefs and Mill associated it with the philosophy of the Far East.

Modified fatalism was a more modern and Western phenomenon. It derived from an understanding that individuals are the product of social and economic forces beyond their power to control; it could be reinforced by social science (including the new science of psychology). Modified fatalists might be passive and resigned as a result of a sense of their powerlessness. But just as likely they would become impatient, complaining, dissatisfied. The impersonal forces that conditioned their fate might be a provocation to rail against it. Equally they might become complacent, happy to count their blessings.

Pinker is right: ours is a far less violent world than it was in the last century, never mind in the century before that. But the tyranny of the majority in the twenty-first century has more in common with the intemperate passions of Tocqueville’s America of the early 1830s than with the exhausted fears of Hayek’s Europe of the late 1940s. It is ardent, not resigned, and it makes it very hard for elected politicians to do anything that might rouse its anger. Indeed, as the case of environmental politics in contemporary democracies shows, our politics are not as violent as nineteenth-century American politics, but they are just as confrontational.

So it is not a straight choice between fatalism and freedom. We need to find a way between the fatalism that gives up on personal freedom and the fatalism that can’t give up on it even when it needs to. [We can’t] preclude the possibility of a future in which this turns out to be the wrong approach because the risk of disaster is too great. The reason Pinkerian faith in progress opens the door to fatalism is because it assumes that present problems are just future solutions waiting to happen. But what if some of our present problems end up making future solutions impossible?

Optimistic fatalists imagine that there is no mistake that cannot be corrected in time so long as we leave the future open. One reason for pessimism in the present is that the optimistic fatalists appear to have the upper hand. In this respect, their fatalism might yet be self-fulfilling. Climate change raises the risk of getting trapped in a future whose effects, while presently unknown, will be bad enough to trump the capacity of human ingenuity to ameliorate them. This is what makes the politics of environmental catastrophe different and it is the reason why all forms of fatalism are worth resisting while we can.

“I can’t tell any one person what to do, but I will say this: despair is not an option. Now more than ever, we need you to fight back.” ~ Bernie Sanders

Dürer: Self-Portrait as Man of Sorrow, 1522


said the big sign on the side of a car parked at the post office. In smaller typeface, it was a personal ad: Dave described himself as good with children and pets, honest, faithful, etc. “Open to all ethnic” he stated toward the end, before concluding with the marvelous statement: “Can be clean-shaven or not.” And then: “Over 69.”

Dave was in fact sitting in the car, and I wish I’d walked over to the driver’s side for a quick chat — just to learn a bit more about Dave, what made him paste that ad on his car, and the responses he’s been getting — or not getting. The old childhood prohibition against approaching male strangers still prevailed, though now I see how perfectly safe it would be, and no, I wouldn’t have to get into Dave’s car . . .

Anyway, I blew a chance for what perhaps would have been an unusual chat — though it might also have given Dave a moment of hope I’d quickly have to disappoint. I guess men still assume that women want to marry them even if they are “over 69.” At least Dave wasn’t fussy: “Open to all ethnic” [I assume he meant “ethnic groups,” but there is only so much space on the side of an average car]. And reasonably eager to please: “Can be clean-shaven or not.”

So I just drove home, past the church sign that said: “Easter Musical: The Road to Cavalry.” Not interested. If I were to put a sign on my car, it would say, “Not interested.”

Glory lily and morpho butterfly by Georg Dionysius Ehret, 1758


~ “A 2009 poll in eight east European countries asked if the economic situation for ordinary people was ‘better, worse or about the same as it was under communism’. The results stunned observers: 72 per cent of Hungarians, and 62 per cent of both Ukrainians and Bulgarians believed that most people were worse off after 1989. 

In no country did more than 47 per cent of those surveyed agree that their lives improved after the advent of free markets. Subsequent polls and qualitative research across Russia and eastern Europe confirm the persistence of these sentiments as popular discontent with the failed promises of free-market prosperity has grown, especially among older people.

Thoughtful observers should suspect any historical narrative that paints the world in black and white. Since nuance in the story of 20th-century communism might ‘reduce the ease of our thoughts and the clarity of our feelings’, anti-communists will attack, dismiss or discredit any archival findings, interviews or survey results recalling Eastern Bloc achievements in science, culture, education, health care or women’s rights. They were bad people, and everything they did must be bad; we invert the ‘halo’ terminology and call this the ‘pitchfork effect’. Those offering a more nuanced narrative than one of unending totalitarian terror are dismissed as apologists or useful idiots. Contemporary intellectual opposition to the idea that ‘bad people are all bad’ elicits outrage and an immediate accusation that you are no better than those out to rob us of our ‘God-given rights’.

Contrarian Twitter users [ask]: ‘And are you going to expose the horrendous record of slavery, murders, and all the capitalism crimes too?’ East Europeans suffering from the severe downturn in economic growth after 1989 might ask this same question. Ethnographic research on the persistence of red nostalgia shows that it has less to do with a wistfulness for lost youth than with a deep disillusionment with free markets. Communism looks better today because, for many, capitalism looks worse. But mentioning the possible existence of victims of capitalism gets dismissed as mere ‘whataboutism’, a term implying that only atrocities perpetrated by communists merit attention. 

Conservative and nationalist political leaders in the US and across Europe already incite fear with tales of the twin monsters of Islamic fundamentalism and illegal immigration. But not everyone believes that immigration is a terrible threat, and most Right-wing conservatives don’t think that Western countries are at risk of becoming theocratic states under Sharia law. Communism, on the other hand, provides the perfect new (old) enemy. If your main policy agenda is shoring up free-market capitalism, protecting the wealth of the superrich and dismantling what little is left of social safety nets, then it is useful to paint those who envision more redistributive politics as wild-eyed Marxists bent on the destruction of Western civilization.

What better time to resurrect the specter of communism? As youth across the world become increasingly disenchanted with the savage inequalities of capitalism, defenders of the status quo will stop at nothing to convince younger voters about the evils of collectivist ideas. They will rewrite history textbooks, build memorials, and declare days of commemoration for the victims of communism – all to ensure that calls for social justice or redistribution are forever equated with forced labor camps and famine.

Responsible and rational citizens need to be critical of simplistic historical narratives that rely on the pitchfork effect to demonize anyone on the Left. We should all embrace Geertz’s idea of an anti-anti-communism in hopes that critical engagement with the lessons of the 20th century might help us to find a new path that navigates between, or rises above, the many crimes of both communism and capitalism.” ~ 

Soviet soldier feeding a Ural owl; Stanislav Lvovsky


The longer I live, the more I perceive that (barring extremes) nothing is all good or all bad. And some of the good is quite surprising. Thus, the Golden Age of Polish Poetry unfolded in the second half of the twentieth century before the political change; the same is true of the Golden Age of Polish cinema, theater (including the Jewish theater), radio and TV comedy, literary criticism, and the arts in general.

Stanislaw Baranczak once caused an uproar during a lecture at UCLA by saying that censorship turned out to be good for Polish poetry, forcing a greater reliance on metaphor, analogy, and indirectness in general; poets had to be more subtle and eschew nationalism, which made their work more universal. 

(But most people who came to Baranczak's lecture were the "older generation," embedded in the God-and-Fatherland brainwashing of their youth, I suspected, not willing to grant to this wonderful man the brilliance of his insights.)

The Warsaw in which I grew up was a vibrant metropolis. When I returned for a visit after the fall of communism, it seemed muted and half-empty. Where did the crowds go? Where did the energy go? I kept asking myself, knowing I’d not find an answer. Even if an answer existed, it would consist of tangled strands of economics, the stranglehold of Catholicism, the growing emphasis on private as opposed to social life, and dozens of other factors.

I am not implying a lost paradise — far from it. And no one cites the flowering of the arts as an excuse for the government’s servility to Moscow and the privileges of the party elite. Land reform, health care, the spread of education, roads, electricity — is it right to forget the good and concentrate only on the bad? Would the system ever have had charisma and inspired millions if it were devoid of idealism and a vision of a better world for many, not just the few? What is needed is a more balanced assessment. Alas, it’s not forthcoming.


"We have to cultivate our garden" which is not easy in California, with the drought. I do my best (even my ridiculous best) to recycle water. 

I love gardening as a metaphor for any useful work we do, hopefully helping to create beauty as well.

DELUSIONS CAN BE ADAPTIVE (just read the fabulous opening example)

~ “A patient lies in a hospital bed in the neurological ward, his head wrapped in bandages. He’s just suffered a major trauma to the brain. The injury has wiped out the region that controls motion in his left arm. More than that, it’s destroyed the man’s ability to even conceive of what moving his arm would be like.

He’s paralyzed, in other words, but he doesn’t know that. He can’t know.

“Would you be so kind as to raise your left hand?” his doctor asks.

“Certainly,” the patient says. But the hand remains where it is. “It’s gotten tangled up in the sheets,” the man explains.

The doctor points out that his arm is lying free and unencumbered on top of the sheets.

“Well, yes,” the man says. “But I just don’t feel like lifting it right now.”

The inability to recognize one’s own disability is a disorder called anosognosia, and it offers an unusually clear window into that peculiarly infuriating and astonishing aspect of human psychology: our seemingly boundless capacity for delusion. Faced with stark and unambiguous information that a part of their body is paralyzed, anosognosia sufferers can effortlessly produce a stream of arguments as to why this is simply not the case. They’re not lying; they themselves actually believe in the validity of their claims.

We’d like to think that we mold our beliefs to fit with the reality that surrounds us, but there’s a natural human impulse to do the reverse: to mold our reality so that it fits with our beliefs, no matter how flimsy their justification may be.

Psychologists define “delusion” as a manifestly absurd belief held in the face of overwhelming evidence to the contrary, specifically as a symptom of a disorder like schizophrenia or bipolar disorder. But we’re all delusional to some degree. In fact, a certain amount of delusion may be essential for our mental health.

As we go about our lives, we form all sorts of beliefs and opinions about the world, which psychologists divide into two types. The first kind, “instrumental” beliefs, are ideas that can directly help us accomplish our goals. I believe that a chain saw can cut down a tree; I believe that the price of a first-class postage stamp is 49 cents. These kinds of beliefs tend to be directly testable: if I rely on them and they fail, I’ll have to revise my understanding.

The other kind of belief, the “philosophical” kind, is not so easily tested. These are ideas we hold not because they are demonstrably true, but because of the emotional benefits of holding them. When I say that I live in the greatest country on earth, or that true love lasts forever, I can’t really offer any evidence supporting these ideas, and that’s okay. They’re worth believing because they fulfill my emotional needs.

We get into trouble when we confuse the two types, and start holding instrumental beliefs for emotional reasons.

What kind of emotion tends to lead us astray? Well, one of the most powerful is the need to feel in control. Countless psychological experiments have shown that for both humans and animals, helplessness in the face of danger is intensely stressful. Believing that we have power over our destiny helps relieve that negative experience, even when that belief is unfounded. Hence the enormous appeal of “magical thinking” — the belief that one’s thoughts and private gestures by themselves can influence the surrounding world. If you’ve ever put on a lucky shirt because you thought it would help your favorite sports team win, leaned sideways to keep a bowling ball out of the gutter, or felt like you were more likely to win the lottery because you used numbers that had special significance to you, then you’ve succumbed to the delusion of magical thinking.

When you start relying on emotionally motivated beliefs to make decisions with real consequences, you’re treading in dangerous territory. One fellow I know had to sell his small business and move lock, stock and barrel to rural Idaho because his wife had a dream about the Apocalypse. She said if he didn’t come with her, it would mean divorce. I like Idaho, but having to move there in the dead of winter strikes me as a steep price for confusing two modes of belief.

I wish I could wrap up this essay by giving you the secret key for avoiding delusion, but it’s not easy. The whole problem with delusion is that we don’t want to escape from its clutches. Even I don’t. I mean, look at us: suspended on a tiny dot in the middle of the vastness of empty space, doomed to suffer and die, and never know the reason why. If we woke up every morning and stared reality in the face, we’d slit our wrists. Maybe literally. Psychologists have long known that depressed people are less delusional than the rest of us; they’re much more perceptive of their own flaws, a phenomenon called “depressive realism.” (Imagine knowing exactly how flawed your “Call Me Maybe” is.) So I say: Raise a cheer and throw up your arms, assuming you’re able. Enjoy your delusions while you can. Let’s just hope that they don’t wreak too much havoc along the way.

Middle of March; John Bellinger


The tendency to cling to delusion and magical thinking seems directly connected to the unbearable fear of being helpless and out of control. Certainly a factor in all systems of religious and superstitious belief, the fear of helplessness is replaced by belief in the efficacy of prayer, which, like a magic spell, can coerce god, or the universe itself, to rule in your favor and give you what you want. Thus we have faith healers, various “no fail” prayers, chain letters, lucky talismans, and even things posted on the internet that urge you to repeat or share them so many times in order to be spectacularly rewarded (usually with money).

The temptation of this kind of magical thinking is powerful. I have been a sceptic since age 13, but when my young brother was diagnosed with advanced lung cancer, inoperable and sure to be fatal, in grief and panic I bought a prayer book, searching for the saints and novenas promising relief — there actually is a patron saint for cancer. But I couldn’t find any way to believe, even in desperation, and put the book away. In that pain and helplessness, I flirted with the comfort of delusion, even knowing it was delusion. In the end, as you say, comfort comes with love, with beauty, with work, and with intellectual discovery—wonder and joy.


Thank you, Mary, for sharing the story of your brother’s illness and how in your desperate need you bought a prayer book. In my mid-thirties I too tried to “re-believe” on various occasions — oddly enough, not in connection with acute suffering (this may seem like a poor joke, but I’ve heard it from a few others too: “I was too busy suffering”), but rather just while wondering about life and reality, trying to find some evidence of a “real god.” The god of the bible had no appeal, though I did go through a period of trying to imagine Jesus as a real person standing next to me, listening to me even though he never spoke. So I can understand those who interpret some of their own thoughts as Jesus speaking to them, and thus claim to have a “personal relationship with Jesus” — sure, you can create anything in your head!

And I understand Catholics praying to their favorite saint, or to a special saint who is supposed to be the “patron” of some select group of people — for instance, Saint Barbara is the patron saint of miners, and Saint Catherine of Alexandria, a saint who didn’t even exist, of wheelwrights (due to a detail of her fictitious martyrdom), and of philosophers, preachers, and lawyers (because of her supposed eloquence). (By the way, the church removed her from the calendar of saints, but later semi-restored her, making her veneration “optional.”)

Maybe wanting to be loved completely the way only god can allegedly love us has something to do with it on an unconscious level — but I know I'm being influenced by a post which said, about a dog, “for the first time in my life, I felt completely loved.”

Entirely believable when said about a dog!

But I agree that mainly it’s about not wanting to feel helpless.
The paradox is that at least at times accepting one’s helplessness is liberating because we stop trying, hoping and crashing. We can finally move on. But that’s a topic for another post. 

St. Catherine of Alexandria by Carlo Crivelli


“My shortcomings are my voice, my height, my gestures, my lack of culture and education, my frankness and my lack of personality . . . I am incorrigible . . . I say ‘merde’ to anybody, however important he is, when I feel like it.” ~ Charles Aznavour

The statue of Aznavour (real name: Aznavourian) in Gyumri, Armenia

~ "The Book of Revelation is war literature," Pagels explained. John of Patmos was a war refugee, writing sixty years after the death of Jesus and twenty years after 60,000 Roman troops crushed the Jewish rebellion in Judea and destroyed Jerusalem.

In the nightmarish visions of John’s prophecy, Rome is Babylon, the embodiment of monstrous power and decadence. That power was expressed by Rome as religious. John would have seen in nearby Ephesus massive propaganda sculptures depicting the contemporary emperors as gods slaughtering female slaves identified as Rome’s subject nations. 

And so in the prophecy the ascending violence reaches a crescendo of war in heaven. Finally, summarized Pagels, "Jesus judges the whole world; and all who have worshipped other gods, committed murder, magic, or illicit sexual acts are thrown down to be tormented forever in a lake of fire, while God’s faithful are invited to enter a new city of Jerusalem that descends from heaven, where Christ and his people reign in triumph for 1000 years.”

Just one among the dozens of revelations of the time (Ezra’s, Zostrianos’, Peter’s, a different John’s), the vision of John of Patmos became popular among the oppressed of Rome. Three centuries later, in 367 CE, Bishop Athanasius of Alexandria confirmed it as the concluding book in the Christian canon that became the New Testament.

As a tale of conflict where one side is wholly righteous and the other wholly evil, the Book of Revelation keeps being evoked century after century. Martin Luther declared the Pope to be the Whore of Babylon. Both sides of the American Civil War declared the opposing cause to be Bestial, though the North had the better music — “He hath loosed the fateful lightning of His terrible swift sword." African-American slaves echoed John’s lament: "How long before you judge and avenge our blood on the inhabitants of the earth?"

But like many Christians through the years, Pagels wishes that John’s divisive vision had not become part of the Biblical canon. Among the better choices from that time, she quoted from the so-called "Secret Revelation of John": "Jesus says to John, ‘The souls of everyone will live in the pure light, because if you did not have God’s spirit, you could not even stand up.’
"The other revelations are universal, instead of being about the saved versus the damned.”

Woman and dragon, Apocalypse 12, Beatus d'Osma, 11th century


~ “Antimicrobial peptides are part of the ancient immune system that's found in all forms of life and plays an important role in protecting the human brain.

One way antimicrobial peptides protect us is by engulfing and neutralizing a germ or some other foreign invader. That gives newer parts of the immune system time to get mobilized.

These peptides are "extremely important," Moir says. "They're not like legacies from an immune system we don't use anymore. If you don't have them, you're going to die in a couple of hours.”

One of these ancient molecules, known as LL-37, looked a lot like a molecule closely associated with Alzheimer's. That molecule is called amyloid-beta and it forms the sticky plaques that tend to build up in the brains of people with dementia.

LL-37 and amyloid-beta "looked just like peas in a pod," Moir says.

The two scientists began to discuss a wild idea. What if amyloid-beta was an integral part of the ancient immune system? What if those sticky plaques were actually an effort to protect the brain by encapsulating foreign invaders?

Their idea was that the brain was producing amyloid for much the same reason an oyster forms a pearl — for self-defense. "Maybe amyloid plaques are a brain pearl," Moir says, "a way for our body to trap and permanently sequester these invading pathogens.”

Tanzi and Moir set out to prove that amyloid really is part of the immune system. And they were lucky enough to have a funder, the Cure Alzheimer's Fund, that was willing to take a chance on their idea.

The effort took years. But in 2010, Moir, Tanzi, and their team demonstrated that amyloid is really good at killing viruses and bacteria in a test tube. And, in 2016, they showed it did the same thing in worms and mice.

"It was very clear that amyloid protected against infection," Tanzi says. "If a mouse had meningitis or encephalitis, [and] if that mouse was making amyloid it lived longer." In contrast, mice that did not produce amyloid died quickly from the infection.

"Even though we really concentrate on these plaques and tangles in Alzheimer's disease, it looks like it's the brain's immune system — the very primitive immune system of the brain — that's gone awry," Tanzi says, "and the plaques and tangles are a part of that system."

The question now is: What's causing the glitch in the ancient immune system?

One possibility is that it's overreacting to viruses and bacteria that get into the brain. Or, the system could be getting confused and attacking healthy cells — a lot like what happens in diseases like lupus or multiple sclerosis.” ~


Interesting that the prevalence of Alzheimer’s is almost double among women — already among women in their sixties, before differential life expectancy could account for it. Female prevalence is one of the clues that autoimmunity is likely to be involved.


ending on beauty:

The grass between the tombs is intensely green.
From steep slopes a view onto the bay,
Onto islands and cities below. The sunset
grows garish, slowly fades. At dusk
Light prancing creatures. A doe and a fawn
Are here, as every evening, to eat the flowers
Which people brought for their beloved dead.

~ Milosz, Provinces