Saturday, August 1, 2020

FIRST COLONISTS’ BRUTAL WINTERS; THOREAU AS A HUMORIST; RELIGION MORE LIKE SEX THAN SCHOOL; PARKINSON’S STARTS IN THE GUT; TALL PEOPLE AT GREATER COVID RISK

Tenerife, Canary Islands. Jack Gilbert: “We have already lived in the real paradise.” 
 
*
WHAT HE THOUGHT
for Fabbio Doplicher

 . . . We last Americans
were due to leave tomorrow. For our parting evening then


our host chose something in a family restaurant, and there

we sat and chatted, sat and chewed,

till, sensible it was our last

big chance to be poetic, make

our mark, one of us asked

"What's poetry?
Is it the fruits and vegetables and


marketplace of Campo dei Fiori, or

the statue there?" Because I was

the glib one, I identified the answer


instantly, I didn't have to think—
"The truth

is both, it's both," I blurted out. But that

was easy. That was easiest to say. What followed

taught me something about difficulty,

for our underestimated host spoke out,

all of a sudden, with a rising passion, and he said:
The statue represents Giordano Bruno,


brought to be burned in the public square

because of his offense against

authority, which is to say

the Church. His crime was his belief
the universe does not revolve around


the human being: God is no

fixed point or central government, but rather is

poured in waves through all things. All things

move. "If God is not the soul itself, He is

the soul of the soul of the world." Such was

his heresy. The day they brought him

forth to die, they feared he might

incite the crowd (the man was famous

for his eloquence). And so his captors

placed upon his face

an iron mask, in which

he could not speak.

That's

how they burned him. That is how

he died: without a word, in front

of everyone.

And poetry—

(we'd all

put down our forks by now, to listen to

the man in gray; he went on

softly)—

poetry is what
he thought, but did not say.


~ Heather McHugh



statue of Giordano Bruno in Campo dei Fiori


My favorite lines:

                                               God is no

fixed point or central government, but rather is

poured in waves through all things. All things

move. "If God is not the soul itself, He is

the soul of the soul of the world."

*

THOREAU AS A HUMORIST

~ When I was 14 my mother, exasperated by the onset of my teenage angst, handed me a Penguin paperback of Thoreau’s Walden and said, “Read this. The guy who wrote it was a rebel like you.” For some reason, I did as she suggested, and in Walden’s transcendental rants I found all my angsty teenage convictions gloriously and authoritatively ratified. 


Institutions were bad: they wanted to straitjacket my thoughts and crush my creativity; my elders were either corrupt or absent-mindedly hypocritical, either tyrannical or brainwashed, tragic or just pitiful; the dictates of fashion and good form were stifling and almost always ridiculous; money was a golden calf, prosperity overrated, and “making a good living” was a fool’s errand; as for our so-called government, it was just one big immoral business.

It was all there in Thoreau. On institutions: “Wherever a man goes, men will pursue and paw him with their dirty institutions, and, if they can, constrain him to belong to their desperate odd-fellow society.”


And concerning one’s elders: “One generation abandons the enterprises of another like stranded vessels.”


And about the absurdity of fashions: “We know but few men, a great many coats and breeches . . . The head monkey at Paris puts on a traveler’s cap, and all the monkeys in America do the same.”


And regarding materialism: “A man is rich in proportion to the number of things which he can afford to let alone.” “Give me the poverty that enjoys true wealth.”


And about the corruption of government: “The only true America is that country where…the state does not endeavor to compel you to sustain . . . slavery and war and other superfluous expenses.”


And all this in a book that was 140 years old! Could my mother have known, as she plunked down $8.95 for that crisp paperback with its innocuous cover (Lady Liberty in an 18th-century allegorical painting), that she was financing the most significant turn in her son’s life? The pliable goop of my pre-frontal cortex could not have met with any text more salutary to that hour in my development and its obsessions.


But it wasn’t just Walden’s relentless exposure of every innately North American hypocrisy that caused me—like so many young people before and after me—to adopt it as a gospel; it was also the things the book extolled: the ideal of liberty, the beauty of being out of step, the value of standing aside and taking one’s time, our all-encompassing quotidian natural wonders. Walden was as much a spiritualized embrace of a few choice things as a sneering rejection of others. Even here, though, my adoption of the book was more blindly evangelical than truly comprehending. I needed so badly to attach myself to an idealized Thoreau and ventriloquize through that fearlessly eloquent, time-tested voice—and so keenly did I embody Erikson’s fifth developmental stage (identity vs. identity confusion)—that I was neurologically incapable of any lucid appreciation of just what Thoreau the writer was up to.


*
Since around the centennial of his death in the early 1960s, when the advent of the Beats and Civil Rights and Environmentalism engendered a new attachment to Walden’s bygone prophet, we’ve made such good use of old Henry as mouthpiece for one worthy cause or another that we’ve lost all perspective on his marvelous authorial range. In particular, we’ve totally failed to get his jokes.


For me, Thoreau’s writing was a drug. It knocked my neurons around. It worked me over completely, induced a sort of insanity, and actually changed the course of my life forever. And still, until quite recently, I did not get the jokes. Had you taken the pains to point out to me, at fourteen, the extent of the levity that permeates Walden and much of Thoreau’s writing, I might have punched you in the nose. You see, my upbringing was not only religious, but of the proselytizing kind, and that fervor was still in my bones. None of us at church, incidentally, got the jokes in Jesus either. Beam in the eye? Camel through a needle? Sheep and goats? You could cue us with a high-hat and snare—ba-dum-chah!—but beyond the ensuing song of crickets all you’d get back was grim reverence. Our common assumption was that if one was talking of serious things (in Jesus’ case, salvation of the soul; in Thoreau’s, a secular version of the same), one could only do so, well, seriously.


But though at 14 I wished to believe otherwise, Thoreau never intended for Walden to serve as a self-help manual for virtuous living. His intention was to arrive at certain inalienable truths, but his means of getting there was often oblique rather than direct, and always more artful than strictly autobiographical. He was no social architect, had no such pretensions, and thus was not in the business of issuing blueprints. Throughout his body of writing he relied on exaggeration, sarcasm, paradox, and aphoristic hyperbole over straightforward statement. He frequently opted for the literary and allusive over the literal. And he larded his lines with puns and submerged secondary meanings that frequently aimed for laughs. Depicting himself ironically well-employed as nature’s surveyor and inspector in Walden’s first chapter, he boasts: “I have watered the red huckleberry, the sand cherry and the nettle tree, the red pine and the black ash, the white grape and the yellow violet, which might have withered else in dry seasons.” Like much of Walden’s wit, the Falstaffian irreverence here, wherein Thoreau confesses to being a serial outdoor pisser, went—to deploy a pun of my own—whizzing right over the teenage reader’s head.


Strangely enough, during Henry’s lifetime his problem was precisely the opposite: his audience had learned to anticipate laughs whenever he appeared in print or rose to speak. Thoreau was reputed to be a humorist (it was by that term that Nathaniel Hawthorne described him in a letter to a friend), and on the lyceum circuit he tended to crack up the crowds. Following lectures in Salem and Gloucester in 1848, the reviews were unanimous. His Salem talk, as reported in the local Observer, was “done in an admirable manner, in a strain of exquisite humor, with a strong undercurrent of delicate satire against the follies of the times. Then there were the interspersed observations, speculations, and suggestions upon dress, fashions, food, dwellings, furniture, &c., &c., sufficiently queer to keep the audience in almost constant mirth…The performance has created ‘quite a sensation’ amongst lyceum goers.” Gloucester’s Telegraph also lauded his humor, noting his facility for “‘bringing down the house’ by his quaint remarks.” In both places, he had lectured from what would become Walden’s overture, “Economy,” which today is popularly (and unjustly) considered a dry and over-long barrier to entry for new readers of the book.

Following his public successes, Thoreau was importuned for more of the same. “Curators of lyceums write to me,” he noted in his journal. “Dear Sir,—I hear that you have a lecture of some humor. Will you do us the favor to read it before the Bungtown Institute?”


He was a comic savant for hire, or so they hoped. At the lectern or not, he tended to give the impression that he was kidding around. After Walden’s 1854 publication, Thoreau bemusedly recorded in the journal that a friend of Emerson had been “much interested in my Walden, but relished it merely as a capital satire and joke, and even thought that the survey and map of the pond were not real, but a caricature of the Coast surveys.”


How can we square these facts with Thoreau’s more current reputation? For alongside the idolators and pedestal-makers, Henry has his haters, and one of their favorite complaints is that he was all scowling prophecy, zero funnybone. These dour literalists would show us in Walden and Thoreau’s other works a pouting town pariah, a preachy puritan whose proto-tiny-house experiment in living was no more than a white man’s delusional holiday, a mean little humorless bastard of a scourge on par with Dickens’ Scrooge, and/or a wannabe woodsman disingenuously fortified, in his rustic cabin, by pies from Mom’s kitchen.

Still, while such philippics crop up with boring semi-annual regularity, Thoreau’s standing in our literature is unshaken. And as we continue to read him, he will continue to be many things to many people—wonderfully so. Speaking to change-makers and ecologists, rainbow-tog Buddhists and curriculum committees, his work is vast in its importance and appeal. And yet, having planted our peace flags, hiking boots, and sundry other emblems and totems in the fecund soil of his pond-side ode to nonconformity, let’s take care not to reduce the richness of Thoreau’s literary genius and flatten out our sense of his roundness as a writer.


Let us remember, especially, how funny he could be—how alive to the special force of wisecracking wit. Here he is, for instance, in the chapter of Walden called “Reading”:


“I confess I do not make any very broad distinction between the illiterateness of my townsmen who cannot read at all, and the illiterateness of him who has learned to read only what is for children and feeble intellects. We should be as good as the worthies of antiquity, but partly by first knowing how good they were. We are a race of tit-men, and soar but little higher in our intellectual flights than the columns of the daily paper.”


And in the chapter called “Solitude”:


“Society is commonly too cheap. We meet at very short intervals, not having had time to acquire any new value for each other. We meet at meals three times a day, and give each other a new taste of that old musty cheese that we are.”


And in “Visitors”:


“I had more visitors while I lived in the woods than at any other period of my life; I mean that I had some.”


And in “Economy”:


“Probably I should not consciously and deliberately forsake my particular calling to do the good which society demands of me, to save the universe from annihilation; and I believe that a like but infinitely greater steadfastness elsewhere is all that now preserves it.”


And in his journal:


“The merchants and banks are suspending and failing all the country over, but not the sand banks, solid and warm . . . You may run on them as much as you please,—even as the crickets do, and find their account in it. . . . In these banks, too, and such as these, are my funds deposited, funds of health and enjoyment.” (October 14, 1857)


“There is some advantage in being the humblest, cheapest, least dignified man in the village, so that the very stable boys shall damn you. Methinks I enjoy that advantage to an unusual extent.” (July 6, 1851)


He is often at his most sparkling when describing animals, as in this 1857 journal entry:


“I hear the alarum of a small red squirrel. I see him running by fits and starts along a chestnut bough toward me. His head looks disproportionately large for his body, like a bulldog’s, perhaps because he has his chaps full of nuts. . . . He finds noise and activity for both of us. It is evident that all this ado does not proceed from fear. There is at the bottom, no doubt, an excess of inquisitiveness and caution, but the greater part is make-believe, and a love of the marvelous. He can hardly keep it up till I am gone, however, but takes out his nut and tastes it in the midst of his agitation. “See there, see there,” says he, “who’s that? O dear, what shall I do?” and makes believe run off, but does n’t get along an inch,—lets it all pass off by flashes through his tail, while he clings to the bark as if he were holding in a race-horse. He gets down the trunk at last upon a projecting knot, head downward, within a rod of you, and chirrups and chatters louder than ever. Tries to work himself into a fright. The hind part of his body is urging the forward part along, snapping the tail over it like a whip-lash, but the fore part, for the most part, clings fast to the bark with desperate energy.” (October 5, 1857)


What misanthrope writes like that?


Let’s bear in mind that Transcendentalism, beyond being a 19th-century pseudo-religion concerned with abolitionism and utopian farming communities (and long, long before it meant self-congratulatory hallucinogenic trips), was a literary-aesthetic movement. Its writers were practitioners of craft above all else. They knew what they were about on every page. Thoreau “shaped Walden specifically with his contemporary readers in mind,” writes David S. Reynolds in Beneath the American Renaissance: The Subversive Imagination in the Age of Emerson and Melville. “His adaptation of popular humor was an intrinsic element of his self-appointed mission to absorb the language of common Americans and make it the vehicle of uplifting notions about individualism and deliberate living. More frequently than any American writer of the period—more frequently, even, than Emerson—Thoreau underscored the necessity for persons of genius to incorporate into their style the popular idioms of their own time and culture. . . .  The humor of Walden is indeed popular humor, but it is popular humor carefully transformed by a philosopher who wishes to salvage both his culture and his culture’s favorite images.”


Does it seem superficial to plead for appreciation of Thoreau’s stylistic range, and to do so now, in our apocalyptic days of social and climatological unrest, when it is his messages—“In wildness is the preservation of the world,” etc.—that remain so urgent and resonant? The thing is, despite our contemporary enshrinement of “Content” as the keyword of progress, popularity, and profit, what makes a work of literature lasting is its form as much as what it contains. The order, shape, and semi-secret palimpsest of meanings matters. The design matters. That is to say, the art and discipline of literature matters.


In our understanding of old Henry, prophetic exponent of simplicity, what seems called for is an injection of salubrious complexity. This was no wannabe woodsman nor failed hermit nor mama’s boy. This was a writer. Generation after generation since Lowell, Thoreau’s haters have insisted on “exposing” him as fake, “unlikeable,” or humorless because they fail to grasp the distinction between literary license and the literal, or to appreciate his complex identity as an artist—that is, as a creator of literary conceits.


Is it because he sneered at so many of our still precious constructs that they love to sneer back at the construct of his masterful book? For artistic construct—deep, painstakingly developed, and nearly ten years in the writing—is what Walden has been all along. Ralph Waldo Emerson spoke to this aspect of Thoreau’s vision in the eulogy for his friend: “There was an excellent wisdom in him, proper to a rare class of men, which showed him the material world as a means and symbol. . . . The tendency to magnify the moment, to read all the laws of nature in the one object or one combination under your eye . . . The pond was a small ocean; the Atlantic, a large Walden Pond. He referred every minute fact to cosmical laws.”

From Walden’s opening pages Thoreau himself pitches the book’s whole premise at the level of parable:


I long ago lost a hound, a bay horse, and a turtle-dove, and am still on their trail. Many are the travelers I have spoken with concerning them, describing their tracks and what calls they answered to. I have met one or two who had heard the hound, and the tramp of the horse, and even seen the dove disappear behind a cloud, and they seemed anxious to recover them as if they had lost them themselves.


And in the following, which comes roughly at Walden’s midpoint, he lays bare the entire conceit. Just look at how he draws the world’s immensity into the luminous exurban environs of the book as into a visionary crystal ball:


“There is commonly sufficient space about us. Our horizon is never quite at our elbows. . . . For what reason have I this vast range and circuit, some square miles of unfrequented forest, for my privacy, abandoned to me by men? My nearest neighbor is a mile distant, and no house is visible from any place but the hill-tops within half a mile of my own. I have my horizon bounded by woods all to myself; a distant view of the railroad where it touches the pond on the one hand, and of the fence which skirts the woodland road on the other. But for the most part it is as solitary where I live as on the prairies. It is as much Asia or Africa as New England. I have, as it were, my own sun and moon and stars, and a little world all to myself. At night there was never a traveler passed my house, or knocked at my door, more than if I were the first or last man.”


Thoreau's cabin, reconstructed

And at Walden’s close, quite deliberately, Thoreau disowns the role of exemplar, thus discouraging the reader from taking his account too much at face value, as some kind of sociological reform paper:


“I left the woods for as good a reason as I went there. Perhaps it seemed to me that I had several more lives to live, and could not spare any more time for that one. It is remarkable how easily and insensibly we fall into a particular route, and make a beaten track for ourselves. I had not lived there a week before my feet wore a path from my door to the pond-side; and though it is five or six years since I trod it, it is still quite distinct. It is true, I fear that others may have fallen into it, and so helped to keep it open.”


“Walden aims at conversion,” noted John Updike. But this is true only insofar as we mean conversion of the mystic and personal order: introspective realization, not outward revolution. Inward moral clarity at the level of the individual situated within the natural world, not a holy-minded movement en masse. Thoreau never saw a crowd he could trust or would care to join. He’d said it in the first chapter, too:


“I would not have anyone adopt my mode of living on any account; for, besides that before he has fairly learned it I may have found out another for myself, I desire that there be as many different persons in the world as possible; but I would have each one be careful to find out and pursue his own way, and not his father’s or his mother’s or his neighbor’s instead.”

There it is, kids. And if any of you aged 14 or younger are now rowing out upon the rippling and prismatic infinities of Thoreau’s work, I wish you the best on what may become a lifelong exploration. Like Walden Pond itself, rumored in his time to have no bottom, old Henry won’t be plumbed. But I beg you, please, please give the guy some credit by laughing as you sink your line.


https://lithub.com/thoreau-was-actually-funny-as-hell/?fbclid=IwAR0JbUUWfju7UH4qa8jX_97qdbWEHGCCT2Ztm0bfxSS-vAeof71S1duu73M


 
Oriana:

Yet it will never be for his humor that we appreciate Thoreau, but rather for his poetic daring and lyricism. Listen to this, near the beginning of Walden: “I long ago lost a hound, a bay horse, and a turtle-dove, and am still on their trail”; and this, at the end: “The sun is but a Morning Star.”

I saw him and loved him mainly as a poet. As a sage, he reminded me of some of the gems in the Gospels and in Eastern Wisdom. As a poet he could speak with lyrical beauty, and one doesn't argue with beauty. As a visionary, he could indeed be criticized.

Imagery can be interpreted in infinite ways, and yet remain fresh and timeless. Thoreau’s skillful use of imagery is what makes Walden stand apart from mere political manifestos — which by their nature are doomed to be fallible and inadequate, and, no matter what they advocate, generally too extreme. But true poetry lives forever.

*
“My life has been the poem I would have writ
But I could not both live and utter it.”

~ Henry David Thoreau
Thoreau’s grave at Sleepy Hollow Cemetery
 
*

“One of the costliest errors human civilizations keep making is mistaking generic tyranny for the mask it happens to have worn last time around. The Soviet dictatorship wasn't communist. Middle-East dictatorships aren't Muslim. Aspiring dictators rarely wear the mask of past dictators because that's what people are looking out for.

"No more crooks in those masks!" we say, as new crooks in other masks fleece us.” ~ Jeremy Sherman


As Mikhail Iossel bluntly said: "The Soviet Union was never a communist country. It was a fascist dictatorship.”

And, come to think of it, Russia is again pretty much a fascist dictatorship.


THE COLLAPSE OF THE SOVIET UNION IS STILL GOING ON
 
~ When the Soviet Union collapsed 25 years ago, the world breathed a collective sigh of relief as the threat of nuclear annihilation was all but eliminated. Russia transitioned into a democracy, and the West could refocus its efforts on peace and prosperity. In the process, however, the pendulum swung from intense anxiety toward Moscow to inattention and neglect. 


Unfortunately, while the West was ignoring Russia, it was quietly mutating into something far more dangerous than the Soviet Union. 


With no real laws or institutions, 22 Russian oligarchs stole 40 percent of the country’s wealth from the state. The other 150 million Russians were left in destitution and poverty, and the average life expectancy for men dropped from 65 to 57 years. Professors had to earn a living as taxi drivers; nurses became prostitutes. The entire fabric of Russian society broke down. 


Meanwhile, the West wasn’t just ignoring the looting of Russia; it was actively facilitating it. Western banks accepted pilfered funds from Russian clients, and Western real estate agencies welcomed oligarchs to buy their most coveted properties in St-Tropez, Miami, and London. 


The injustice of it all was infuriating for average Russians, and they longed for a strongman to restore order. In 1999, they found one: Vladimir Putin. Rather than restoring order, however, Putin replaced the 22 oligarchs with himself alone at the top. From my own research, I estimate that in his 18 years in power he has stolen $200 billion from the Russian people.

Putin did allow a fraction of Russia’s oil wealth to seep into the population — just enough to prevent an uprising, but nowhere near enough to reverse the horrible injustice of the situation. But that didn’t last long either. As the oil boom waned, the suffering of ordinary Russians resumed, and people took to the streets in 2011 and 2012 to protest his rule. Putin’s method of dealing with an angry population comes from the standard dictator playbook: If your people are mad at you, start wars. This was the real reason behind his invasion of Ukraine, and it worked amazingly well: Putin’s approval rating skyrocketed from 65 percent to 89 percent in a few months. 


In response to the annexation of Crimea, the war in Ukraine, and the downing of Malaysia Airlines Flight 17, which killed 298 innocent people, the West had no choice but to respond with a range of sanctions against Russia. These sanctions, combined with the collapse of oil prices, led to more economic hardship, which made the Russian people even angrier. So Putin started another war, this time in Syria. 


The problem the world now faces is that Putin has effectively backed himself into a corner. Unlike any normal world leader, he cannot gracefully retire — he would lose his money, face imprisonment, or even be killed by his enemies. Therefore, what started out as a profit-maximizing endeavor for Putin has transformed into an exercise in world domination to ensure his survival. 


Twenty-five years after the fall of the Soviet Union, the West still faces a menacing threat from the Kremlin. It is now driven by kleptocracy rather than communist ideology. But it is still the same menace, with the same nuclear weapons, and an extremely dangerous attitude.

The real tragedy is that if Western governments hadn’t tolerated Russian kleptocracy over the last quarter century, we wouldn’t be where we are today. But as long as Putin and his cronies continue to keep their money safe in Western banks, there is still leverage: Assets can be frozen, and accounts can be refused. If one lesson is to be taken from the collapse of the Soviet Union, it is that we in the West cannot continue to keep our heads in the sand and ignore kleptocracy in Russia, because the consequences are disastrous. ~


https://getpocket.com/explore/item/the-soviet-union-is-gone-but-it-s-still-collapsing?utm_source=pocket-newtab

A mural in the officers’ building at the former Soviet military base on January 26, 2017 in Wuensdorf, Germany. Once called “The Forbidden City,” Wuensdorf was the biggest base for the Soviet armed forces in communist East Germany from 1945 until the last Soviet troops left in the early 1990s. Photo by Sean Gallup 
 
*

AN ICY CONQUEST

~ “We are starved! We are starved!” the sixty skeletal members of the English colony of Jamestown cried out in desperation as two ships arrived with provisions in June 1610. Of the roughly 240 people who were in Jamestown at the start of the winter of 1609–1610, they were the only ones left alive. They suffered from exhaustion, starvation, and malnutrition as well as from a strange sickness that “caused all our skinns to peele off, from head to foote, as if we had beene flayed.” Zooarchaeological evidence shows that during those pitiless months of “starving time” they turned to eating dogs, cats, rats, mice, venomous snakes, and other famine foods: mushrooms, toadstools, “or what els we founde growing upon the grounde that would fill either mouth or belly.” 


Some of the settlers reportedly ingested excrement and chewed the leather of their boots. Recent discoveries of human skeletons confirm the revelation of the colony’s president, George Percy, that they also resorted to cannibalism: “Some adventuringe to seeke releife in the woods, dyed as they sought it, and weare eaten by others who found them dead.” When one man confessed under torture to having murdered and eaten his wife, Percy ordered his execution.

That happened a mere three years after the first adventurous group of Englishmen arrived in Jamestown. From the beginning, it was a struggle for subsistence. Most of the settlers fell ill only a few weeks after landfall in May 1607. One colonist recalled that “scarse ten amongst us coulde either goe, or well stand, such extreame weaknes and sicknes oppressed us.” The corn withered in the summer drought, and as the flow of the James River waned in the unrelenting heat, salt water encroached from the sea, depriving the settlers of their main source of fresh water. Nor was divine assistance forthcoming. The Quiyoughcohannock Indians, scarcely better off, beseeched the Englishmen to intercede and ask their powerful God for supernatural intervention. But when the colonists’ prayers seemed to bring only more suffering instead of rain to Jamestown, the natives concluded that the Christian god must be a vindictive one, and their relations with the colonists deteriorated.

By September 1607, half the colony’s members were dead. “Our men were destroyed with cruell diseases as Swellings, Flixes, Burning Fevers, and by warres, and some departed suddenly,” Percy later recalled, “but for the most part they died of meere famine.” The next winter months would prove equally deadly. “It got so very cold and the frost so sharp that I and many others suffered frozen feet,” another witness wrote, adding that the cold was so severe that “the river at our fort froze almost all the way across.”

Fresh groups of colonists arrived in 1608 and 1609, but steady attrition and the “starving time” of 1609–1610 pushed the settlement to the brink. In June 1610, when the two ships arrived with provisions for the emaciated survivors, it seemed too late. Jamestown’s leaders announced to the settlers that they would all return to England by way of Newfoundland. 


“There was a general acclamation, and shoute of joy,” one person remembered. They set sail on June 17, but the next day, when they reached the small settlement on Mulberry Island along the James River just a few miles away, they sighted another boat, working its way up the river with news that an English relief fleet was on its way with more settlers and enough provisions to last a year. That chance encounter saved the colony of Jamestown. “God would not have it so abandoned,” one settler wrote. The following winter proved less harsh, and by 1614 colonists had begun lucrative exports of tobacco. In 1619 the Virginia House of Burgesses would hold its first assembly in Jamestown.

*

The brutal story of Jamestown scarcely fits the pageant of success that students are often taught in the condensed version of early American history that starts in 1492 when Columbus sailed the ocean blue and then jumps to the Pilgrims’ safe landing at Plymouth Rock in 1620 and their peaceful celebration of the first Thanksgiving the following year. But in his deeply researched and exciting new book, A Cold Welcome, the historian Sam White focuses on the true stories of the English, Spanish, and French colonial expeditions in North America. 


He tells strange and surprising tales of drought, famine, bitterly cold winters, desperation, and death, while anchoring his research in the methods and results of the science of climate change and historical climatology. In doing so, he erases what C.P. Snow, the British physicist and author of The Two Cultures, considered the damaging cultural barrier and “mutual incomprehension” estranging humanists and scientists from one another. “Historians can, and must, embrace this science,” White counsels.
 
He weaves an intricate, complex tapestry as he examines the effects both of climate—meteorological conditions over relatively long periods of time—and of weather—the conditions of the atmosphere over a short term—on vulnerable colonists in North America in the late sixteenth and early seventeenth centuries. The half-century that led up to the founding of permanent settlements saw, as White notes, “one of the steepest declines in Northern Hemisphere temperatures in perhaps thousands of years.”


His fresh account of the climatic forces shaping the colonization of North America differs significantly from long-standing interpretations of those early calamities. Edmund S. Morgan’s classic American Slavery, American Freedom: The Ordeal of Colonial Virginia (1975) contains a lengthy assessment of the reasons why the Jamestown colonists experienced their “Lord of the Flies” fate. Morgan faults the poor organization and direction of the colony but most of all points to sociological and psychological factors, especially the indolence of the colonists and the large number of “gentlemen” among them who were averse to descending to ordinary labor. “He that will not worke, shall not eate,” John Smith warned them to little avail. A Cold Welcome does not replace these well-grounded interpretations but rather supplements them by shining a spotlight on a wholly different dimension: the timing of these colonial enterprises, which ensnared them in what came to be known as the Little Ice Age.


*
As climatologists define it, the Little Ice Age was a long-term cooling of the Northern Hemisphere between 1300 and 1850. They locate maximum cooling in the early seventeenth century, just when European settlers were attempting to establish colonies in North America. To reconstruct past climate, scientists use indicators called climate “proxies,” such as ice cores, tree rings, and lake-bottom sediments that they analyze for indications of past temperatures and precipitation. In addition, zooarchaeologists examine animal bones to see what settlers ate, while bioarchaeologists study human skeletons to probe health and nutrition.


Climate proxies also provide important evidence of volcanic activity. Between the 1580s and 1600 large tropical volcanic eruptions spewed dust and sulfates high into the atmosphere, dimming sunlight, cooling Earth’s surface, and causing oscillations in atmospheric and oceanic circulation. Eruptions in Colima, Mexico, in 1586, in Nevado del Ruiz in present-day Colombia in 1595, and especially the huge Huaynaputina eruption in the Peruvian Andes in 1600 helped produce shockingly cold decades.


Even before colonists departed from Europe, their lack of reliable information about the extremes of weather in the Little Ice Age was compounded by fatal misconceptions linking geographical latitudes with climate. Educated in the work of the classical Greek geographer Ptolemy, for whom climate and latitude were synonymous, Europeans assumed that they would find a relatively mild climate in North America, since Britain lies latitudinally north of the continental United States and Paris north of Quebec, while Spain lines up with New Mexico. The confusion sowed by those misleading notions would doom many of their enterprises.


During those harrowing decades, European countries—England and Spain in particular—also suffered from freezing winters, cold, wet summers, intense rain, flooding, ruined crops, famine, outbreaks of disease, plague, and spikes in mortality. Economic and demographic factors, worsened by climate-related disasters, White argues, influenced the colonial ambitions of European nations: “The Little Ice Age came at a particular moment and in a particular way that helped to undermine Spain’s commitment to North American colonization but to reinforce England’s.” 


He suggests that a pervasive sense of overcrowding in England, worsened by an influx of poverty-stricken famine refugees into London, helped the planners and promoters of American colonies secure private investment and gather public support by depicting North America as an opportune overseas outlet for the surplus population. In Spain, meanwhile, a decline in imperial revenue, heavy military expenses, and disillusionment with the nation’s fragile settlements in North America, along with weather-related hardships and a general sense of crisis in the empire, led King Philip III to pull back on Spain’s North American claims, opening the way for the English and the French to establish their own colonies there and ultimately allowing for a decisive shift of power in the North Atlantic world.


*
Spain’s expeditions in the early sixteenth century to La Florida—today’s southeastern United States—resulted in lost lives and lost investments. Explorers and colonists expected to find a familiar Mediterranean climate in La Florida: hot, dry summers and cool, wet winters. Instead they encountered wet summers, storms, hurricanes, and freezing winters. “We were farming people in Spain,” wrote one bitterly disillusioned settler in Santa Elena, now Parris Island in South Carolina. “Here we are lost, old, weary, and full of sickness.” In 1587, the few remaining colonists in Santa Elena left for St. Augustine. Frustrated, Philip III was anxious to abandon La Florida and focus instead on New Spain—the territory encompassing the Caribbean and what is now Mexico. In 1608, however, he yielded to Franciscan missionaries who urged him to maintain the settlement in St. Augustine and not abandon the Indians who had been converted to Christianity.

The Spanish colony of New Mexico received a reprieve at the same time and for the same reason: the Franciscans convinced the viceroy of the need to minister to the more than seven thousand Indians who had been baptized. Ever since the colonists’ first arrival in 1540, the barren desert landscape had tested their endurance. In 1598 they set up a base about thirty miles north of present-day Santa Fe, built houses and a church, and dug irrigation channels for crops. But neither they nor the Pueblo Indians, born to that climate, were immune to the hazards of New Mexico’s Little Ice Age.


The nadir came in 1601 following the Huaynaputina eruption, when both colonists and natives found themselves unprepared, physically and psychologically, for one of the coldest and driest periods of the past millennium. During the long freezing winter months, fields of cotton and corn were destroyed, livestock perished in the snow, and even the Rio Grande froze over. Summer was no less discouraging. One witness reported that the four months of summer heat were “almost worse than the cold in winter; and so the saying there is, winter for eight months and hell for four.”


The New Mexico colony all but collapsed at the end of 1601. Gradually, though, the drought came to an end, the winters became less unforgiving, and in 1608 the colonists and missionaries were granted land to set up a new town called Santa Fe, making it, White comments, “an almost exact contemporary of Jamestown.”

In 1609, just when Spanish colonists were securing their settlement in Santa Fe and English colonists starved in Jamestown, the French explorer Samuel de Champlain established a settlement on low ground near the edge of the St. Lawrence River; it had good soil, streams, fresh water, and the protective shelter of high cliffs. He called the colony Quebec, a name derived from the Algonquin word kébec, meaning “where the river narrows.”


Champlain was by then painfully familiar with the climate and geography of the region. He and the explorers Pierre Dugua and François Gravé had already experienced the challenges of establishing settlements in Canada. Their first attempt to set up a colony on the island of St. Croix in the Bay of Fundy failed during the devastating winter of 1604–1605. “The cold is harsher and more excessive than in France and much longer,” Champlain discovered. In the summer of 1605, he and Dugua led the St. Croix colonists who hadn’t died of malnutrition and scurvy to a new site, Port Royal on Nova Scotia. Though the first winter in Port Royal was also deadly, the second one, Champlain noted, “was not so long as in preceding years.” 


The settlers on Port Royal chanced upon more fresh food, including berries, and suffered fewer instances of scurvy; Champlain’s beneficial creation of a social club, the Order of Good Cheer, also boosted morale. But just when the settlement began to thrive, King Henry IV abruptly canceled the fur trade monopoly that made Port Royal economically viable.

In the end, St. Croix and Port Royal contributed to the eventual success of the French in Canada, for Champlain was able to apply to Quebec what he had learned from the mistakes on St. Croix and the accomplishments in Port Royal. He grasped the importance of constructing storehouses with cellars to insulate food and drink from the winter cold and of locating dwellings around a compact central courtyard for defense against storms as well as Indian attacks. White also praises Champlain for having sought out Native Americans for their local knowledge, though the Frenchman could neither abide nor understand their consumption of raw organ meat—pancreas, kidney, tongue—one of the few sources of ascorbic acid that protected them from scurvy during the frigid winter months.


After decades of failed European expeditions and aborted settlements in North America, England, Spain, and France finally had their first enduring colonies in Jamestown, St. Augustine, Santa Fe, and Quebec in the early seventeenth century. At great cost in lives, money, and hopes and expectations, these colonies not only overcame the rigors and ravages of the Little Ice Age but would come to define much of the cultural heritage of the continent.


White remarks that, in undertaking this intriguing study, he was “conscious of the challenges posed by climate change” today. Indeed, he acknowledges that he wrote A Cold Welcome “from the vantage point of global warming” and that he saw in the colonial period “an era that addresses concerns of the present.” It was “another age when America spoke many languages and when its future, its environment, and its place in the world were all uncertain. It was another age when climatic change and extremes threatened lives and settlements.” But while the Europeans who traveled to North America in the sixteenth and seventeenth centuries were not responsible for the Little Ice Age, today the responsibility for the global climate lies largely with humanity.


The earliest North American colonies survived the Little Ice Age by the skin of their teeth, but as White points out, other longer-established colonies in the North Atlantic did not. Vikings first settled Greenland in the tenth century. They raised sheep, goats, and cattle, hunted seal and walrus, and had sporadic commerce with the Scandinavian mainland, yet by the mid-1400s nothing more was heard from them. Between 1605 and 1607, Denmark’s King Christian IV sent out three expeditions to find the colonies. His ships struggled through storms, frigid waters, “ilandes of ice,” and “ice piled upon ice so high,” as one contemporary chronicler wrote, “that it resembled great cliffs.” What the sailors finally discovered was a frozen, treeless land sparsely populated by Inuit natives. The Viking families, communities, and churches had vanished long before, victims of climatic change they could neither adapt to nor control.


Susan Dunn, the Massachusetts Professor of Humanities at Williams, is the author of “1940: FDR, Willkie, Lindbergh, Hitler—the Election Amid the Storm.”


https://getpocket.com/explore/item/an-icy-conquest?utm_source=pocket-newtab


Captain John Smith about to be shot

*
“When we look at the image of our own future 
provided by the old we do not believe it: 
an absurd inner voice whispers 
that that will never happen to us––
when that happens 
it will no longer be ourselves that it happens to.” ~ Simone de Beauvoir, The Coming of Age


*
CASTE VERSUS RACE: THE UNTOUCHABLES IN INDIA AND THE U.S.
 
~ In the winter of 1959, after leading the Montgomery bus boycott that arose from the arrest of Rosa Parks and before the trials and triumphs to come, Martin Luther King Jr and his wife, Coretta, landed in India, at Palam Airport in New Delhi, to visit the land of Mohandas K Gandhi, the father of nonviolent protest. They were covered in garlands upon arrival, and King told reporters: “To other countries, I may go as a tourist, but to India I come as a pilgrim.”


He had long dreamed of going to India, and they stayed an entire month. King wanted to see for himself the place whose fight for freedom from British rule had inspired his fight for justice in America. He wanted to see the so-called “untouchables”, the lowest caste in the ancient Indian caste system, whom he had read about and had sympathy for, but who had still been left behind after India gained its independence the decade before.


He discovered that people in India had been following the trials of his own oppressed people in the US, and knew of the bus boycott he had led. Wherever he went, the people on the streets of Bombay and Delhi crowded around him for an autograph. At one point in their trip, King and his wife journeyed to the southern tip of the country, to the city of Trivandrum in the state of Kerala, and visited with high-school students whose families had been untouchables. The principal made the introduction.


“Young people,” he said, “I would like to present to you a fellow untouchable from the United States of America.”


King was floored. He had not expected that term to be applied to him. He was, in fact, put off by it at first. He had flown in from another continent, and had dined with the prime minister. He did not see the connection, did not see what the Indian caste system had to do directly with him, did not immediately see why the lowest-caste people in India would view him, an American Negro and a distinguished visitor, as low-caste like themselves, see him as one of them. “For a moment,” he later recalled, “I was a bit shocked and peeved that I would be referred to as an untouchable.”


Then he began to think about the reality of the lives of the people he was fighting for – 20 million people, consigned to the lowest rank in the US for centuries, “still smothering in an airtight cage of poverty,” quarantined in isolated ghettos, exiled in their own country.

And he said to himself: “Yes, I am an untouchable, and every negro in the United States of America is an untouchable.” In that moment, he realized that the land of the free had imposed a caste system not unlike the caste system of India, and that he had lived under that system all of his life. It was what lay beneath the forces he was fighting in the US.


 
*
There was little confusion among some of the leading white supremacists of the previous century as to the connections between India’s caste system and that of the American south, where the purest legal caste system in the US existed. “A record of the desperate efforts of the conquering upper classes in India to preserve the purity of their blood persists until this very day in their carefully regulated system of castes,” wrote Madison Grant, a popular eugenicist, in his 1916 bestseller, The Passing of the Great Race. “In our Southern States, Jim Crow cars and social discrimination have exactly the same purpose.”


In 1913, Bhimrao Ambedkar, a man born to the bottom of India’s caste system, born an untouchable in the central provinces, arrived in New York City from Bombay. He came to the US to study economics as a graduate student at Columbia, focused on the differences between race, caste and class. Living just blocks from Harlem, he would see first-hand the condition of his counterparts in the US. He completed his thesis just as the film The Birth of a Nation – the incendiary homage to the Confederate south – premiered in New York in 1915. 


He would study further in London and return to India to become the foremost leader of the untouchables, and a pre-eminent intellectual who would help draft a new Indian constitution. He would work to dispense with the demeaning term “untouchable”. He rejected the term Harijans, which had been applied to them by Gandhi, to their minds patronizingly. He embraced the term Dalits, meaning “broken people” – which, due to the caste system, they were.

It is hard to know what effect his exposure to the American social order had on him personally. But over the years, he paid close attention, as did many Dalits, to the subordinate caste in the US. Indians had long been aware of the plight of enslaved Africans, and of their descendants in the US. Back in the 1870s, after the end of slavery and during the brief window of black advancement known as Reconstruction, an Indian social reformer named Jyotirao Phule found inspiration in the US abolitionists. He expressed hope “that my countrymen may take their example as their guide”.


Many decades later, in the summer of 1946, acting on news that black Americans were petitioning the United Nations for protection as minorities, Ambedkar reached out to the best-known African American intellectual of the day, WEB Du Bois. He told Du Bois that he had been a “student of the Negro problem” from across the oceans, and recognized their common fates.


“There is so much similarity between the position of the Untouchables in India and of the position of the Negroes in America,” Ambedkar wrote to Du Bois, “that the study of the latter is not only natural but necessary.”


Du Bois wrote back to Ambedkar to say that he was, indeed, familiar with him, and that he had “every sympathy with the Untouchables of India”. It had been Du Bois who seemed to have spoken for the marginalized in both countries as he identified the double consciousness of their existence. And it was Du Bois who, decades before, had invoked an Indian concept in channeling the “bitter cry” of his people in the US: “Why did God make me an outcast and a stranger in mine own house?”

*
While I was in the midst of my research, word of my inquiries spread to some Indian scholars of caste based in the US. They invited me to speak at an inaugural conference on caste and race at the University of Massachusetts in Amherst, the town where WEB Du Bois was born and where his papers are kept.


There, I told the audience that I had written a 600-page book about the Jim Crow era in the American south – the time of naked white supremacy – but that the word “racism” did not appear anywhere in the narrative. I told them that, after spending 15 years studying the topic and hearing the testimony of the survivors of the era, I had realized that the term was insufficient. “Caste” was the more accurate term, and I set out to them the reasons why. 


They were both stunned and heartened. 

At a closing ceremony, the hosts presented to me a bronze-colored bust of the patron saint of the low-born of India, Bhimrao Ambedkar, the Dalit leader who had written to Du Bois all those decades before.


It felt like an initiation into a caste to which I had somehow always belonged. Over and over, they shared stories of what they had endured, and I responded in personal recognition, as if even to anticipate some particular turn or outcome. To their astonishment, I began to be able to tell who was high-born and who was low-born among the Indian people there, not from what they looked like, as one might in the US, but on the basis of the universal human response to hierarchy – in the case of an upper-caste person, an inescapable certitude in bearing, demeanor, behavior and a visible expectation of centrality.


On the way home, I was snapped back into my own world when airport security flagged my suitcase for inspection. The TSA worker happened to be an African American who looked to be in his early 20s. He strapped on latex gloves to begin his work. He dug through my suitcase and excavated a small box, unwrapped the folds of paper and held in his palm the bust of Ambedkar that I had been given.


“This is what came up in the X-ray,” he said. It was heavy like a paperweight. He turned it upside down and inspected it from all sides, his gaze lingering at the bottom of it. He seemed concerned that something might be inside.


“I’ll have to swipe it,” he warned me. He came back after some time and declared it OK, and I could continue with it on my journey. He looked at the bespectacled face, with its receding hairline and steadfast expression, and seemed to wonder why I would be carrying what looked like a totem from another culture.


“So who is this?” he asked.


“Oh,” I said, “this is the Martin Luther King of India.”


“Pretty cool,” he said, satisfied now, and seeming a little proud.


He then wrapped Ambedkar back up as if he were King himself, and set him back gently into the suitcase. ~ 


https://www.theguardian.com/world/2020/jul/28/untouchables-caste-system-us-race-martin-luther-king-india?utm_source=pocket-newtab


Stained glass window donated to the 16th Street Baptist Church in Birmingham, AL, by the people of Wales

Oriana: 


I think it was George Bernard Shaw who said that the essence of being a Duchess is not her clothes or her carriage; it’s in how she is treated by others. She gets special respect. This applies more broadly: people are sensitive to how they are treated. It’s the amount of respect from others that practically defines our social standing. As the author of this article says: “To their astonishment, I began to be able to tell who was high-born and who was low-born among the Indian people there, not from what they looked like, as one might in the US, but on the basis of the universal human response to hierarchy – in the case of an upper-caste person, an inescapable certitude in bearing, demeanor, behavior and a visible expectation of centrality.”


But how a person treats another, be it someone rich or poor, young or old, also defines the person. In Europe and Latin America there is the notion of “culture,” meaning not so much education or going to the opera as precisely the respectful treatment of others: it’s the tone of voice, politeness, the implicit recognition of the dignity of another person.

*
THE ATHENS OF PERICLES AND THE US OF TODAY: THUCYDIDES STILL RELEVANT

 
~ On the morning after the 2016 presidential election I tried to distract myself by reading some pages of Thucydides that I had assigned for a class the next day, and found myself reading the clearest explanation I had seen of the vote that I was trying to forget. In the third book of his History of the Peloponnesian War, Thucydides describes the outbreak of civil war on the northern island of Corcyra in 427 BC: 


“There was the revenge taken in their hour of triumph by those who had in the past been arrogantly oppressed instead of wisely governed; there were the wicked resolutions taken by those who, particularly under the pressure of misfortune, wished to escape from their usual poverty and coveted the property of their neighbors; there were the savage and pitiless actions into which men were carried not so much for the sake of gain as because they were swept away into an internecine struggle by their ungovernable passions.”


The closest thing to a consolation that I found in the election was the catastrophic failure of almost every attempt to predict the outcome by using numerical data, instead of interpreting the passions that provoked it, as Thucydides interpreted the conflict in Corcyra. The most confident pre-election pollsters proclaimed themselves 99 percent certain of the result that didn’t happen. Even the least confident predicted exactly what did not occur. 


Everyone who reads Thucydides knows him as the most profound and convincing historian of empire, not only in his own age but also in his explicit and implicit predictions of later ages. One of his great themes is Athenian exceptionalism and the moral and military failures that inevitably issued from it. Early in his book, he reconstructs or invents the funeral oration spoken by the Athenian leader Pericles to commemorate those who died in the first year of the Peloponnesian War, with its brilliantly inspiring praise of Athenian democracy in its full flower, prospering and triumphant through its voluntary self-control, its mutual responsibility, its reverence for the law. 


Then, three sentences after the end of the oration, Thucydides, without making a point of the devastatingly subversive turn that his history is about to take, refutes almost everything Pericles said, everything that was self-deceiving and hollow. In the summer after the oration, Thucydides reports, Athens was afflicted by the plague, and its democracy collapsed into “a state of unprecedented lawlessness.” Those who survived cared only for “the pleasure of the moment and everything that might conceivably contribute to that pleasure,” and “no fear of god or law of man had a restraining influence.” Self-satisfied virtue is easy in prosperity, less so in crisis. Much of the book takes the form of debates that Thucydides records without explicitly endorsing either side, though it is almost always clear which side is closest to his own habits of thought. Pericles’s funeral oration seems to be a rare instance of a speech left unanswered by an opposing point of view. But the plague itself is the silent response to Pericles’s speech, and the clear victor in an unspoken but unmistakable debate. 


Athens, soon after the plague, recovered its confident conviction—voiced by Pericles himself—that its political and technical superiority over lesser states justified its program of imperial expansion. A few years later, ignoring the warnings of its wisest general, Athens launched its expedition against primitive Sicily, anticipating an easy conquest of its ragged and outnumbered opponent. Instead, trapped in remote and unfamiliar territory, incompetent to fight a culture it ignorantly disdained, it suffered total defeat of its armed forces, and faced economic ruin at home. 


Thucydides made this claim for his book: 


“It will be enough for me… if these words of mine are judged useful by those who want to understand clearly the events which happened in the past and which (human nature being what it is) will, at some time or other and in much the same ways, be repeated in the future. My work is not a piece of writing designed to meet the taste of an immediate public, but was done to last forever.”


He was proved right when Napoleon and Hitler sent their armies into Russia, when the Soviet Union invaded Afghanistan, when the United States sent its forces into Vietnam and Iraq. 


Historians argue among themselves whether Thucydides is a moralizing philosopher or, in a common phrase, “the first scientific historian.” What is radical about him, and gives him his unerring clear-sightedness, is that he is both. He understands morals not as a set of arbitrary rules imposed or wished upon reality, but as part of the fabric of reality itself, in the same way that Greek philosophy had begun to understand physical laws as inseparable from reality. Thucydides came to the same insight that Ludwig Wittgenstein recorded centuries later when he wrote that ethics “must be a condition of the world like logic.” 


In Thucydides’s morally coherent universe, moral action is also, inevitably, practical action, and immoral action is inevitably impractical, no matter how insistently short-sighted strategists pretend that it isn’t. He records a debate over capital punishment between Cleon, who was “remarkable among the Athenians for the violence of his character,” and the sober-minded Diodotus. Cleon wanted Athens to kill the captives it had taken in rebellious Mytilene. An empire, Cleon said, rules by fear; it must never be moved by pity. 


Diodotus responded by urging clemency, but insisted that he urged this choice not because it was the moral one, but because it seemed to him “the best thing for the state.” If Athens killed its captives, he said, cities that rebel in the future would have no motive to surrender, and Athens would waste its resources defeating opponents whose only practical choice was to fight to the death. If, however, Athens spared the defeated, they might even find reason to become its allies. After the debate, Athens for once made the moral and practical choice to spare the Mytilenians, though the vote was close—only for Athens, years later, to make its immoral and impractical choice to conquer Sicily.

In the two years since the 2016 US election, it seems ever more clear that Thucydides is the greatest historian not only of empire but also of contemporary politics. This excerpt is his account of civil war in Corcyra, 427 BC—and, equally, of politics in America, AD 2018: 


“So revolutions broke out in city after city, and in places where the revolutions occurred late the knowledge of what had happened previously in other places caused still new extravagances of revolutionary zeal, expressed by an elaboration in the methods of seizing power and by unheard-of atrocities in revenge. To fit in with the change of events, words, too, had to change their usual meanings. What used to be described as a thoughtless act of aggression was now regarded as the courage one would expect to find in a party member; to think of the future and wait was merely another way of saying one was a coward; any idea of moderation was just an attempt to disguise one’s unmanly character; ability to understand a question from all sides meant that one was totally unfitted for action. Fanatical enthusiasm was the mark of a real man, and to plot against an enemy behind his back was perfectly legitimate self-defense. Anyone who held violent opinions could always be trusted, and anyone who objected to them became a suspect. 


Revenge was more important than self-preservation. And if pacts of mutual security were made, they were entered into by the two parties only in order to meet some temporary difficulty, and remained in force only so long as there was no other weapon available. When the chance came, the one who first seized it boldly, catching his enemy off his guard, enjoyed a revenge that was all the sweeter from having been taken, not openly, but because of a breach of faith. It was safer that way, it was considered, and at the same time a victory won by treachery gave one a title for superior intelligence. And indeed most people are more ready to call villainy cleverness than simple-mindedness honesty. They are proud of the first quality and ashamed of the second. 


Love of power, operating through greed and through personal ambition, was the cause of all these evils. To this must be added the violent fanaticism which came into play once the struggle had broken out.” ~


https://getpocket.com/explore/item/what-thucydides-knew-about-the-us-today?utm_source=pocket-newtab


Pericles: Funeral Oration, by Philipp Foltz, 1877

Mary:

I am very interested in Thucydides’ idea that the moral action is also the most practical. The opposite would be action out of the base and selfish motives that seem so much in evidence now: greed, revenge, racial and ethnic hatreds, the defense of injustice in service to the privileged already in power. The examples (Napoleon in Russia, the US in Vietnam) are particularly telling, especially because the outcomes contradicted the aggressors’ own narrative, their confidence in their own superior power. Another instance of telling ourselves the wrong story despite everything, down to the bitter and unexpected end in defeat.


Oriana:

It's astonishing how people resist learning from history, and the same mistakes are repeated over and over. You'd think that after Napoleon, nobody would venture again to conquer the unconquerable vastness of Russia, with its killing winters — but there goes Hitler again. Thucydides realized that he was writing for all times. 

Denial, denial: after Napoleon, Hitler tries again — and again ends up with stacks of frozen corpses and the beginning of the end. And the U.S. gets involved in every possible foreign war — never mind Vietnam and the never-ending war in Afghanistan, never mind that military overextension was a huge part of the fall of the Roman Empire.

So it goes . . . just as Thucydides predicted many centuries ago. 


*
WHY SINGLE WOMEN ARE LEAVING CHRISTIANITY 

 
~ One of this blog’s central concerns remains the decline of Christianity in America, especially what I’ve come to call its churn rate. For many years, that decline has resulted in a decided gender skew: most churches consist mostly of women. But now it looks like that tide has turned. Today, let’s look at the flight of single women from Christianity, and what that might mean for the religion moving forward.

This article is especially about Gen Z women born after 1996. Gen Z women are less likely to be married than their older sisters, and they are also the least religious generation so far in our history.

The article that drew my attention to this topic comes to us from Relevant Magazine, a Millennial-aimed Christian news and blogging site. Katie Gaddini titled her April 28 article “Why Are So Many Single Women Leaving the Church?”

Gaddini, a sociologist interested in how women engage with religion, describes a convention of women seeking to reconcile their faith with their feminism:


Then a clear voice rang out: “I’m so tired of fighting Christian church leaders to be treated equally but I don’t want to leave the church. So, what do I do?” She paused before reformulating her question: “How do I stay?”


The plaintive question grabbed Gaddini’s attention, perhaps especially since she herself is one of the women who’ve left church culture. Outsiders to Christianity might look at all the injustices that women face in this religion and marvel that they don’t simply leave.
As it turns out, increasing numbers of single Christian women are doing exactly that.

She offers, in her article, three big reasons why she thinks this exodus is happening.


Single women can’t find good husbands in their church communities. Once they’re ready to get married, they find very few marriageable prospects in their churches. Many eventually look outside the church for mates, which means a high likelihood of ending up with a husband holding different beliefs. 


Their fellow Christians criticize, police, ostracize, exclude, and negate them if they don’t perfectly fit into stereotypically feminine molds.

Churches’ prurient over-focus on sex and puritanical control-grabs over women’s lives and bodies alienate many women.

Gaddini hints at some other potent pain points for single women, as well:


Single women feel like they have no place in a religion completely sold on the Cult of Family.


Married women who feel threatened by their presence can be really unkind to their single sisters.


Strangely, Gaddini very pointedly did not include causes I’ve seen many Christian women cite as their reasons for distancing themselves from church culture: Constant streams of scandals involving men abusing their power over women.


Church leaders blowing off their concerns and making pretty noises about caring but doing nothing whatsoever to alleviate the injustices against them.


Political agendas that specifically and pointedly seek to destroy women’s access to human rights and civil liberties.


Strangely, “Jesus” seems completely uninterested in the plight of the single women in his flocks. The churches devoted to him seem more like high school drama clubs than loving communities of Spirit-filled saints.


And the churches should be concerned. Churches may be very poorly-run businesses, but at heart they are, indeed, just Jesus-flavored businesses. Their leaders need their flocks like real shepherds need their sheep, and for the exact same reasons.

Church leaders and fervent male congregants spend a lot more time whining about the “feminization” of churches than they do bending over backwards to thank women for all the stuff they do to help those churches survive. They’ll miss those women when they’re gone, but they sure won’t act now to ensure those women want to stick around.


Their ingratitude should speak volumes to the women in their flocks. After all, they tell women constantly and in no uncertain terms that they are happy to accept women’s money and labor and time, but want to make sure women don’t expect anything in return for those expenditures.


https://www.patheos.com/blogs/rolltodisbelieve/2020/07/12/the-exodus-of-single-women-from-christianity-continues/


Oriana:

Here is the first stanza of one of my “grandmother poems.”

“Why are there no women priests?”
She shrugs: Because men rule the world.
Like we have to listen to the Bolshevik.
We’re doing Stations of the Cross.

In the Sixth Station, it’s a woman, St. Veronica, who dares show compassion to Jesus. His torturers are “Bolsheviks” for whom loyalty to an ideology (or the empire, be it Roman or Soviet) comes ahead of any sentimental concerns about kindness.

I can also understand why some single women are alienated by the cult of the family and the message that their main function is to breed more church members. Women who are interested in spiritual life may find a more appealing path in Buddhism — or in devotion to any of the arts, including gardening and organic farming. I think it’s artists (real ones, and being one takes a lot of sacrifice and hard work) who are “Spirit-filled saints.”

But I realize my own bias here. If a saint is someone who has a relationship with the divine, then a dedicated artist indeed fits. In more popular terms, however, a saint is someone dedicated to selfless service. I think that’s also the vision of true Christianity: the nurturing of others with loving words and deeds. And women are good at that. True Christianity is indeed more feminine. Instead of decrying the “feminization” of churches, church leaders should realize that their organizations are still not feminine enough — I mean “feminine” in the best sense of the word: loving and full of calm strength, not subservience. But then the whole world needs to become more caring and “feminine.” 



*

COULD IT BE THAT RELIGION IS MORE LIKE SEX THAN LIKE SCHOOL? 
 
~ There’s an increasing body of evidence to suggest that we need to think about religion not as a process of training or indoctrination, but as arising from some deep-seated instincts, hardwired into our brains and then shaped by our cultures. This is more like the way we think about sex, emotions and relationships. 


The shift in thinking arises from a field of study known as the cognitive science of religion, where cognitive psychologists and evolutionary theorists have joined forces to address a puzzling question. In the words of Jeffrey Schloss:


“Why, despite a century of presumed secularization, does religion persist in the western world, and why does it seem easier for human beings to be religious than to be secular?”


The answer they propose is that our brains are hardwired with cognitive biases that have evolved in order to help us to survive, but which have the side-effect of making it natural to develop religious belief. For example, we are cognitively predisposed to imagine that every rustle in the bushes is a creature watching our every move: this hyperactive agency detection device was of real benefit to early humans alone in the jungle. It might have caused our early ancestors to run away from a few imaginary tigers, but they also will have escaped one that might otherwise have eaten them. The side effect, however, is that we see unseen watchers everywhere. From this point, it is a relatively easy leap to believe in gods that watch over us, unseen. 
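
The evolutionary logic here is an asymmetry of costs: a false alarm (fleeing an imaginary tiger) is cheap, while a miss (ignoring a real one) can be fatal, so a jumpy detector comes out ahead on average. Here is a minimal sketch of that expected-cost comparison in Python; all the numbers are invented for illustration, and only the asymmetry between the two costs matters.

# Toy expected-cost comparison for a "jumpy" vs. a "skeptical" rustle-detector.
# All figures are invented for illustration; none come from the article.

p_tiger = 0.01            # chance a given rustle really is a tiger
cost_false_alarm = 1      # energy wasted running from nothing
cost_missed_tiger = 1000  # cost of ignoring a real tiger

def expected_cost(p_flee: float) -> float:
    """Average cost per rustle for an agent that flees with probability p_flee."""
    false_alarms = (1 - p_tiger) * p_flee * cost_false_alarm
    misses = p_tiger * (1 - p_flee) * cost_missed_tiger
    return false_alarms + misses

print("jumpy agent:    ", expected_cost(0.99))  # flees at almost every rustle
print("skeptical agent:", expected_cost(0.10))  # rarely flees, pays dearly for misses

With these made-up numbers the jumpy agent's average cost is roughly 1, the skeptical agent's roughly 9, which is the sense in which over-detection of hidden agents could have been selected for.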


According to this model, we did not evolve to be religious, but ended up with religion as a spandrel, an unintended by-product of the main evolutionary process. Nevertheless, unintended consequence or not, it is now part of our mental architecture and culturally infused throughout our societies – and this is why religious behavior proves so durable and persistent. 


The hyperactive agency detection device and other mechanisms become incorporated into our social and cultural life. They help keep us honest with each other, help us to care for each other and fight our common enemies, and they become codified into the religions that survive and evolve alongside human societies. It is in this sense that religion is more like sex than like school – we might choose to ignore it or decide to have nothing more to do with it, but it will keep returning to haunt us in some form or another. 


A NEW PERSPECTIVE


This evolutionary account of the existence and persistence of religion in most, if not all, human societies (it depends a lot on how you define it) is hotly debated and open to criticism from a number of angles. Opponents point out that the move from identifying in-built biases in human cognition to a theory of why we create entire religious universes that structure societies looks suspiciously like a “just-so story” – one that is highly speculative and requires us to make some assumptions for which there is little or no evidence. The cognitive science of religion gives us an interesting account of why we have religious intuitions, but tells us nothing about how these are translated into particular religious beliefs and practices. 


Nevertheless, its description of religion as driven by deep-seated desires rather than as rival accounts of reality opens up an intriguing set of questions and possibilities.


~ Whatever floats your boat. We no longer believe that everybody’s sexual life has to be the same. Some people choose to give up sex altogether, others have multiple partners. There is a whole range of LGBTQI+ preferences now recognized alongside “vanilla” heterosexual monogamy. Perhaps our religious desires and impulses should be allowed the same diversity and recognition.


~ You mean the whole world to me but … I do not expect everybody else to see how absolutely wonderful and perfect my partner is. What is absolutely true to me, religiously, may not make any sense to you. And that’s OK. Truth claims do not belong in affairs of the heart, or in affairs of the spirit. Arguments about whose religion is true similarly miss the point.


~ Don’t shut me out. Although the religious drive is nothing like as powerful or fundamental as the sex drive for most people, it would be unwise to attempt to repress it completely. Perhaps the rise of extremist religion is partly to do with the “return of the repressed”, the violence with which an aspect of our character may reassert itself when it has been pushed down and ignored for too long. 


~ I love you … I just don’t like you. We have ambiguous relationships with our partners, sometimes adoring them and sometimes hardly able to be in the same room as them. Sexual attraction is part habit, part mystery, part madness. Most religious people, if pushed, might say something similar about how their spiritual involvement or commitment fluctuates and varies over time. It’s much more complicated than can be captured by simple questions like “What do you believe?” or “Are you religious?”


This sort of approach to religion has the potential to upset devoutly religious people but also the “devout atheists” who can see no place for it. It provides an explanation of religion which can sit alongside, but does not require, appeals to the call of god or the truth of religious claims. It also stands as a warning to the devout atheists that religion will never go away, and that attacks on religious people as irrational will not make any real difference. At the same time, it opens up a new and intriguing set of possibilities for thinking differently about how religion fits into our world, and how we might learn to express our religious instincts in a diverse society without blind dogmatism or violence.


https://theconversation.com/could-it-be-that-religion-is-more-like-sex-than-school-84168

St. Teresa, by Bernini



Oriana:

This aligns with the findings of the cognitive psychologist Jesse Bering, author of The Belief Instinct: religion stems from cognitive errors. The difference is that he clearly treats assumptions such as “if something exists, it must have been created by an intelligent maker,” or seeing natural phenomena as signs and omens, as common mistakes that can be corrected with more knowledge and more rigorous (rather than wishful) thinking. Reading Bering’s exposition of religion as founded on cognitive errors made me get off the “agnostic” fence and declare myself an atheist.
 

At the same time I can’t deny that that has been my bodily intuition from the start — everything is nature, and in nature there is nothing supernatural (though our understanding of nature is incomplete and will always be so, no matter how much science progresses). I said “bodily intuition” rather than “emotional intuition” because my sense of what some call metaphysics is basically physiological. Thus, the erasure of consciousness under anesthesia or simply during dreamless sleep provides me with the taste of the afterlife. 

It’s not the kind of afterlife I’d like to experience — my choice would be an intellectual heaven — but it’s better than trying to force myself to believe in the traditional Hell, Purgatory, and Paradise. 

Still, thanks to the school of suffering I’ve become softer toward believers. Life can be so terrible and suffering so random and undeserved that “whatever floats your boat” is fine with me. This was the pragmatic approach of William James: if your religion helps you live, that’s good enough. Don’t bother about whether it’s objectively true or try to solve theological puzzles. (Besides, as one theologian explained to me, theologians don’t really believe — and in any case, they can always appeal to Mystery.)


The trouble is that “those who can make you believe in absurdities can make you commit atrocities” (Voltaire). If a religion sprouts a poisonous extremist branch and advocates the need for atrocities to guard the purity of its doctrines (and the purity of women — all major religions seem obsessed with regulating the woman’s body), major problems can result. 


Religion is fine when it becomes relaxed and psychologically supportive, like self-help books. But there is always the possibility that the hate-filled lunatic fringe will grow more powerful. That’s why seeing that all religions are man-made helps provide an emotional distance, neutralizing the potential fanaticism. 

Comparative religion should be a required subject. Of course I realize how difficult it would be to make this a reality — “true believers” are suspicious even of yoga classes, much less a serious examination of Buddhism, not to mention the doctrines of Judaism or Islam. 


“Soft,” humanitarian religion is fine; the “hard” and anti-human kind of religion needs to be recognized as evil. 


While recognizing the evil, we need also to recognize the good: the gems of wisdom contained in the various religious teachings, the beauty of religious art and music. The beliefs that inspired the builders of Gothic cathedrals may strike us as absurd, and sometimes even revolting, but we can still enjoy the cathedrals. Life is full of paradoxes, and that’s one of them. And let’s face it: paradoxes make life more interesting. 


*

PEOPLE OVER SIX FEET TALL ARE MORE THAN TWICE AS LIKELY TO BE DIAGNOSED WITH THE CORONAVIRUS, the results of a new survey reveal.

The global team of researchers, including experts from the University of Manchester and the Open University, surveyed 2,000 people in the UK, as well as in the US, to determine whether their personal attributes, work and living practices might play a role in transmission, The Telegraph reported.


The results found that taller people are at a higher risk, which researchers say suggests that the contagion is spreading through the air — because height would not be a factor if the virus were transmitted only through droplets, according to the report.


The study also found that using a shared kitchen or accommodation played a large role — especially in the US, where those circumstances made the chances of contracting the bug 3.5 times as high.


In the UK, chances were 1.7 times higher.


https://nypost.com/2020/07/28/people-over-6-feet-tall-are-more-likely-to-contract-coronavirus/


Oriana:


The results of this study need to be interpreted with caution until it is replicated. On the whole, though, we know that there is a height range optimal for health, and being too tall can mean an early excess of growth hormone, which has harmful effects later in life. 


Among breeds of dogs, we know that small dogs live longest, while huge dogs like Great Danes often die when they are only five or six years old. But it’s not possible to make a leap to human populations with scores of factors affecting our life span and susceptibility to various diseases. 


There are both advantages and disadvantages to being tall. Taller individuals are more susceptible to blood clots, atrial fibrillation, and certain types of cancer, but are less likely to have high blood pressure, diabetes, or dementia. 


Still, we are talking about statistical averages and not individual cases. When it comes to Covid infection, don’t worry about your height. Instead, make sure you wear an adequate mask and maintain social distancing. If you go to a restaurant, sit outdoors. Avoid crowded enclosed spaces. Don’t use public transportation unless you absolutely have to. 


Dr. Fauci said that he wouldn’t board a plane or eat at an indoor restaurant. We’d be wise to follow suit, remembering that we are far from helpless.


Regardless of height, masks work. 



*

PARKINSON’S STARTS IN THE GUT

~ New research indicates that Parkinson’s disease may begin in the gastrointestinal tract and spread through the vagus nerve to the brain.
“We have conducted a registry study of almost 15,000 patients who have had the vagus nerve in their stomach severed. Between approximately 1970 and 1995 this procedure was a very common method of ulcer treatment. If it really is correct that Parkinson’s starts in the gut and spreads through the vagus nerve, then these vagotomy patients should naturally be protected against developing Parkinson’s disease,” explains postdoc at Aarhus University Elisabeth Svensson on the hypothesis behind the study.


A hypothesis that turned out to be correct:


“Our study shows that patients who have had the entire vagus nerve severed were protected against Parkinson’s disease. Their risk was halved after 20 years. However, patients who had only had a small part of the vagus nerve severed were not protected. This also fits the hypothesis that the disease process is strongly dependent on a fully or partially intact vagus nerve to be able to reach and affect the brain,” she says.


The research project has just been published in the internationally recognized journal Annals of Neurology.


The first clinical examination


The research has presented strong evidence that Parkinson’s disease begins in the gastrointestinal tract and spreads via the vagus nerve to the brain. Many patients have also suffered from gastrointestinal symptoms before the Parkinson’s diagnosis is made.


“Patients with Parkinson’s disease are often constipated many years before they receive the diagnosis, which may be an early marker of the link between neurologic and gastroenterologic pathology related to the vagus nerve,” says Elisabeth Svensson.


Previous hypotheses about the relationship between Parkinson’s and the vagus nerve have led to animal studies and cell studies in the field. However, the current study is the first and largest epidemiological study in humans.


The research project is an important piece of the puzzle in terms of the causes of the disease. In the future the researchers expect to be able to use the new knowledge to identify risk factors for Parkinson’s disease and thus prevent the disease.


“Now that we have found an association between the vagus nerve and the development of Parkinson’s disease, it is important to carry out research into the factors that may trigger this neurological degeneration, so that we can prevent the development of the disease. To be able to do this will naturally be a major breakthrough,” says Elisabeth Svensson.


https://neurosciencenews.com/parkinsons-gastrointestinal-tract-neurology-2150/


from another source:

~ The earliest evidence that the gut might be involved in Parkinson’s emerged more than 200 years ago. In 1817, the English surgeon James Parkinson reported that some patients with a condition he termed “shaking palsy” experienced constipation. In one of the six cases he described, treating the gastrointestinal complaints appeared to alleviate the movement-related problems associated with the disease.

Since then, physicians have noted that constipation is one of the most common symptoms of Parkinson’s, appearing in around half the individuals diagnosed with the condition and often preceding the onset of movement-related impairments. Still, for many decades, the research into the disease has focused on the brain. Scientists initially concentrated on the loss of neurons producing dopamine, a molecule involved in many functions including movement. More recently, they have also focused on the aggregation of alpha synuclein, a protein that twists into an aberrant shape in Parkinson’s patients. A shift came in 2003, when Heiko Braak, a neuroanatomist at the University of Ulm in Germany, and his colleagues proposed that Parkinson’s may actually originate in the gut rather than the brain.

Braak’s theory was grounded in the observation that in post-mortem samples of Parkinson’s patients, Lewy bodies, clumps of alpha synuclein, appeared in both the brain and the gastrointestinal nervous system that controls the functioning of the gut. The work by Braak and his colleagues also suggested that the pathological changes in patients typically developed in predictable stages that start in the gut and end in the brain. At the time, the researchers speculated that this process was linked to a “yet unidentified pathogen” that travels through the vagus nerve—a bundle of fibers connecting major bodily organs to the brainstem, which joins the spinal cord to the brain.

Microbes themselves are another potential trigger for promoting the build-up of intestinal alpha-synuclein. Researchers have found that, in mice, bacterial proteins could trigger the aggregation of the alpha-synuclein in the gut and the brain. Some proteins made by bacteria may form small, tough fibers, whose shape could cause nearby proteins to misfold and aggregate in a manner akin to the prions responsible for mad cow disease, explains Robert Friedland, a neurologist at the University of Louisville who coauthored that study.

The microbiome, the totality of microorganisms in the human body, has spurred intense interest among Parkinson’s researchers. A number of reports have noted that individuals with the disease harbor a unique composition of gut microbes, and scientists have also found that transplanting fecal microbes from patients into rodents predisposed to develop Parkinson’s can worsen motor symptoms of the disease and increase alpha-synuclein aggregation in the brain.

But rather than bacterial proteins triggering misfolding, Sarkis Mazmanian, a Caltech microbiologist, believes that these microbes could be acting through the metabolites they produce, such as short-chain fatty acids. Mouse experiments from his lab have shown that these molecules appear to activate microglia, the immune cells of the brain. The metabolites, Mazmanian adds, may send a signal through the vagus nerve or bypass it completely through another pathway such as the bloodstream. Because epidemiological studies find that vagus nerve removal does not completely eliminate the risk of Parkinson’s, other brain-gut routes may also be involved.

Yet another idea holds that intestinal inflammation, possibly from gut microbes, could give rise to Parkinson’s disease. The latest evidence supporting this idea comes from a large epidemiological study, in which Inga Peter, a genetic epidemiologist at the Icahn School of Medicine at Mount Sinai, and her colleagues scanned through two large U.S. medical databases to investigate the overlap between inflammatory bowel diseases and Parkinson’s.

Their analysis compared 144,018 individuals with Crohn’s or ulcerative colitis and 720,090 healthy controls. It revealed that the prevalence of Parkinson’s was 28 percent higher in individuals with the inflammatory bowel diseases than in those in the control group, supporting prior findings from the same researchers that the two disorders share genetic links. In addition, the research team discovered that in people who received drugs used to reduce inflammation—tumor necrosis factor (TNF) inhibitors—the incidence of the neurodegenerative disease dropped 78 percent. The anti-TNF finding suggests that the overlap between the two diseases might be primarily mediated by inflammation.
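
A quick note on the arithmetic: a prevalence that is 28 percent higher means the ratio of the two groups’ prevalences is about 1.28. Here is a minimal sketch of that calculation in Python; the cohort sizes come from the study as reported above, but the Parkinson’s case counts are hypothetical, since the article does not give the raw numbers.

# Sketch of a prevalence-ratio calculation.
# Cohort sizes are from the study as reported above; the case counts
# below are hypothetical, chosen only so the ratio lands near 1.28.

ibd_total = 144_018        # people with Crohn's or ulcerative colitis
control_total = 720_090    # healthy controls

ibd_cases = 460            # hypothetical Parkinson's cases in the IBD group
control_cases = 1_800      # hypothetical Parkinson's cases among controls

prevalence_ibd = ibd_cases / ibd_total
prevalence_control = control_cases / control_total

ratio = prevalence_ibd / prevalence_control
print(f"prevalence ratio: {ratio:.2f}")  # about 1.28, i.e. 28 percent higher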

While many lines of evidence support the gut origins of Parkinson’s, the question of how early the gastrointestinal changes occur remains, Tansey says. In addition, other scientists have suggested that it is still possible that the disease begins elsewhere in the body. In fact, Braak and his colleagues also found Lewy bodies in the olfactory bulb, which led them to propose the nose as another potential place of initiation. “I think there’s likely multiple sites of origin for Parkinson’s disease,” says Viviane Labrie, a neuroscientist at the Van Andel Research Institute in Michigan. “For some individuals, it might be the gut, for others it might be the olfactory system—or it might just be something that occurs in the brain.”


https://www.scientificamerican.com/article/does-parkinsons-begin-in-the-gut/

Neurons (green) with Lewy bodies (orange)
 

Oriana:

I have a personal interest in Parkinson’s since my father died of this terrible disease. The finding that “Parkinson’s starts in the gut” suggests that diet may have an influence on the development and onset of the symptoms. A Japanese-type diet seems protective, as does drinking a lot of tea and coffee. A ketogenic diet also shows promise, as does supplementation with vitamins D and K2. 


Mary: RELATIONSHIPS, NOT SEPARATE PARTS

The discussion of the Pilgrims' starving time and of the processes involved in Parkinson's disease both tell us something essential about our habits of thinking and organizing knowledge. We tend to organize things in separate categories and focus on the categories themselves rather than the relationships between them. This is not absolute, and in fact we have been steadily moving away from it, notably in studying ecological systems, but it still remains part of our thinking in a very basic way. When I studied nursing, anatomy was taught in terms of "systems"...skeletal, vascular, muscle etc. Care was also organized by "system" or medical specialty...orthopedics, cardiovascular, neurologic. Our health care system is fragmented across the same lines...experts focusing within the boundaries of their chosen specialty.

The problem is that none of these things exist or operate in isolation. A living organism, multi-systemed, alive in a specific time, place and environment, is constantly in conversation with itself, chemical, physical, electrical signals going back and forth, influencing and changing the state of the whole. Rather than puzzling that the gut is part of what we see operating in Parkinson's, we should be amazed if it had no part, no influence.
My father also suffered from this disease, and the symptoms seemed to involve every system: constipation was a factor, movement disturbances from the smallest tremors to the Parkinson's stance and shuffle, muscle wasting, even late mental changes and delusions. And we were so obtuse, my siblings and I, despite having medical training: we saw him with all the classic movement and gait symptoms and yet didn't let it add up in our minds to "Dad has Parkinson's." Denial and habit of mind in action.

As for the situation of the first settlers, something similar applies. To think of the colony in isolation is not enough to account for the starving time...searching for explanations in what the colonists did or didn't do, knew or didn’t know. This is immediately clarified when the situation of the Native Americans is taken into consideration. They were also suffering, so it was something more than the Pilgrims' ineptitude going on. Once the focus is again widened to include the climate, the "little ice age," all becomes clear.

What we learn from all this is to enlarge our focus, to engage with as many parts of an integrated system as we can, in order to approach true understanding. See connections rather than divisions, symphonies rather than single harmonic lines. This may get us far closer to the heart of those stories we need to understand, and the ones we need to change.


Oriana:

Yes, to understand anything we must not look at it in isolation, but as part of a complex interaction with the whole. And there is the problem of deliberate blindness. We’ve known forever about stress being a factor as huge as smoking in the development of heart disease, and now it turns out that it’s huge in dementia as well — but we want scientists to come up with a magic pill, and refuse to do anything about stress, often self-imposed. 


*
ending on beauty:


Things are violin-bodies


filled with murmuring darkness:

the weeping of women dreams in it,

the resentment of whole generations

stirs in its sleep.

~ Rilke, “At the Shore of the Night”



