Saturday, June 26, 2021


Driftwood sculpture by Paul Baliker . . . ‘that certain bounds hold against chaos, that is a place of first permission, everlasting omen of what is.’ ~ Robert Duncan


It was supposed to be Arts & Crafts for a week,
but when she came home
with the "Jesus Saves" button, we knew what art
was up, what ancient craft.

She liked her little friends. She liked the songs

they sang when they weren't 

twisting and folding paper into dolls. 

What could be so bad? 

Jesus had been a good man, 
and putting faith 
in good men was what 

we had to do to stay this side of cynicism, 

that other sadness. 
OK, we said, One week. But when she came home 

singing "Jesus loves me, 
the Bible 
tells me so," it was time to talk. 

Could we say Jesus 

doesn't love you? Could I tell her the Bible

is a great book certain people use 

to make you feel bad? We sent her back 

without a word. 

It had been so long since we believed, so long 

since we needed Jesus 

as our nemesis and friend, 
that we thought he was 
sufficiently dead, 
that our children would think of him 
like Lincoln 
or Thomas Jefferson. 

Soon it became clear to us: you can't teach disbelief 

to a child, 

only wonderful stories, and we hadn't a story 

nearly as good. 
On parents' night 
there were the Arts & Crafts 

all spread out

like appetizers. Then we took our seats 

in the church 

and the children sang a song about the Ark, 

and Hallelujah 

and one in which they had to jump up and down 

for Jesus. 

I can't remember ever feeling so uncertain 

about what's comic, what's serious. 
Evolution is magical but devoid of heroes. 

You can't say to your child 

"Evolution loves you." 
The story stinks 
of extinction and nothing

exciting happens for centuries. I didn't have 

a wonderful story for my child 

and she was beaming. All the way home in the car 

she sang the songs, 

occasionally standing up for Jesus. 

There was nothing to do 
but drive, ride it out, sing along 

in silence.

~ Stephen Dunn (1939 - 2021)


On June 24, 2021 we lost Stephen Dunn, one of the most engaging modern American poets. I had the privilege of meeting him at a poetry conference, and I remember hearing him read this poem with particular pleasure. This is the one that stays with me, along with “Tenderness,” about an abused woman for whom a brief relationship with the speaker was her discovery of tenderness.

Tenderness toward all that is human could indeed be Dunn’s motto — the wisdom of understanding, which leads to tenderness.

My favorite lines:

It had been so long since we believed, 
so long 
since we needed Jesus 

as our nemesis and friend, 
that we thought he was sufficiently dead,

that our children would think of him like Lincoln 

or Thomas Jefferson. 

Soon it became clear to us: you can't teach disbelief 

to a child, 
only wonderful stories, 
and we hadn't a story 
nearly as good. 

That is a problem, unless you become really fascinated with the nature of things, with science. Look at something deeply enough, and an infinity opens up, as Nietzsche observed. The real is more interesting and mysterious than even the best fables of religion. I remember discovering a book on paleontology, and it was more interesting by far than the creation of the animals in Genesis. True, there is extinction, but no supernatural punishment. To me religion was about sin and punishment; science was about discovery and ever-growing knowledge. 


The same lines from Dunn's poem struck me most: that it had been so long since they believed, they thought Jesus was "sufficiently dead," and were surprised to encounter those still going on with religious activities they had long abandoned. I have felt that same sort of surprise, as though expecting everyone must have come to the same conclusions I did at thirteen, and moved away from the mythology and institution of religion.

It's a kind of shock, of surprise that they all “haven't gotten over it,” that everyone didn't share with me the conclusion I came to: that there was no real apology for the god that allowed suffering and sat in judgement on those who were more kind, larger in spirit, less vengeful and demanding than their supposed creator. Once you raise certain questions, take certain steps, it is impossible to go back, to undo them. Return to belief is impossible, and observance of rituals becomes progressively uncomfortable, like wearing an ill-fitting costume for a part in a play whose lines are in an imaginary language you no longer remember.

But, like the parents in the poem, you can't simply impose your conclusions, especially on a child too young to disbelieve. The state of unbelief comes at the end of a very personal journey, and only with a certain maturity of thought and emotion. It can't be thrust on someone else, or presented as a foregone conclusion...that would be the creation of yet another orthodoxy, another system of institutionalized belief. I think everyone must take the road to unbelief on their own, that each journey is different, and even the refusal to take the first steps on that journey is a matter particular and private, that no one should deny or abbreviate or dismiss for anyone else. For me, there can be no Evangelical unbelief.


So true: each journey toward non-belief in the archaic concept of a Creator/Judge/Sadistic Torturer is individual. There is no Church of Atheism. But there is literature, science, visual art, movies, gardening, yoga classes . . . and so much more to immediately occupy the mind and time of a curious person. So much to learn, so much to do! In fact, there is no time for religion, and no unbeliever seems to miss it as such; they may miss the communitarian aspect of the church (which my church failed to provide).

I thought that perhaps it’s possible to return to religion, if not Catholicism then perhaps one of the Protestant denominations. I went through a period of occasionally “trying out” this or that church. The Greek church pleased me most, but I realized I’d always be an outsider.

For whatever it’s worth, I read somewhere that former Catholics and former Fundamentalists make the most ardent atheists, while former Lutherans, Presbyterians, etc., fall away from religiosity gradually, without drama. No “fear and trembling” in the religion, no fear and trembling in dropping it. I suspect that the deity who’s so easily forgotten was never entirely real to the person, even in childhood, when Bible stories are first presented and have the most impact they’ll ever have (minus some rare cases of adult conversion).



~ A funny story. Edith Wharton was awarded the Pulitzer Prize for her novel The Age of Innocence in 1921 (it was published in 1920), but the jury had originally chosen to award it to Sinclair Lewis’s Main Street. The trustees, the actual powers-that-be within the organization, who make the final decision on the advice of the jury, balked at the choice because they thought Main Street was unwholesome. Back then, the prize was to be awarded to a novel “which shall best present the wholesome atmosphere of American life”, and Main Street, a trenchant satire on narrow-mindedness in a small midwestern town, ruffled some self-important feathers. In the book, a married man perhaps has an affair with a neighbor, while his wife contemplates an affair with a younger man, but does nothing about it – these are the only morally racy bits I can come up with. No, the problem was actually political: then, as now, the rural Midwest was considered to be the sacred beating heart of America (Mom and apple pie and all that), and it wouldn’t do to question that myth.

But back to Lewis. Another of his books was then awarded the Pulitzer in 1926; supremely annoyed by his first go-around with them, he gave the award back. Then in 1930 he was awarded the Nobel Prize in Literature – surely, one might think, establishing him as a global force in writing. Yet how many people today have even heard of Sinclair Lewis, let alone read one of his books? He has, for the most part, been swallowed by time (as have many of the Pulitzer laureates from that era: Ernest Poole, Margaret Wilson, Edna Ferber, Louis Bromfield, Julia Peterkin…), while Wharton flourishes. She has left him in her dust. In recent decades, two cinematic auteurs have even adapted her books into films stuffed with bankable stars: Martin Scorsese with The Age of Innocence (1993) and Terence Davies with The House of Mirth (2000). If Scorsese takes an interest in something, it must have become part of America’s DNA.

Why has The Age of Innocence endured? It and Main Street are alike in being almost anthropological explorations of tribal cultures: the former of upper-crust New York society in the 1870s, the latter of small-town Minnesota in the 1910s. Both are “hieroglyphic worlds” (Wharton’s phrase from The Age of Innocence) of brittle convention under which the human spirit roils. The protagonists of both rebel against society, but ultimately give up. Yet, while Main Street is supposed to be a naturalistic masterpiece (to the point, I would argue, of tedium), The Age of Innocence offers more: there is something elegant and complicated going on with its poetics (Wharton is a strikingly sophisticated writer without ever being showy), and the final chapter of the book leaps forward in time in such a way that the novel devastatingly captures the passage of an era. The author Colm Tóibín tells BBC Culture that he thinks of it as a “stylish book about disappointment, written… by someone who knew that world but is clear-eyed rather than nostalgic about its demise”.

The shift to modernity

Wharton was living in Paris during the Great War and threw herself into the war effort, “working on behalf of the hundreds of thousands of refugees who flooded across the French border”, as the writer Elif Batuman (herself shortlisted for a Pulitzer this year) notes in her foreword to the new Penguin edition of Age of Innocence. After the war, a changed person, Wharton sat down to write about how the Gilded Age of her girlhood had yielded inexorably to modernity. Janet Beer, professor of literature at Liverpool University and a Wharton expert, tells me that she regards The Age of Innocence as “an almost perfect example of the genre of historical novel”: it “has her at the height of her powers and totally in control of the subject, the genre, and the conviction that this was the only way to commemorate a society now dead”.

To get a sense of what a gilded cage the Gilded Age was, watch Scorsese’s film and marvel at the canyons of mahogany and reefs of crystal these people swam through – the hushed, heavily encrusted formality of an upmarket funeral home. Tóibín, in his introduction to the Scribner edition (2020), calls this the “twilight time” of the old New York aristocracy.  

Twilight indeed. The novel signals that the “elaborate futility” of an elite way of life, in which work features as a hobby, was about to be overcome by what Beer calls “the unstoppable power of new money, financial profiteering, rising materialism, and family breakdown”. It is an unsentimental modernist critique of a genteel stratum of society for whom the 20th Century figures as a storm on the horizon.

With an implacable eye, Wharton observes what Tóibín calls Old New York’s “stylized social rigidities”. As I read the novel, I am struck by how impossible people find it to put emotion into words: they are forever “blushing”, “reddening”, “flushing”, or otherwise indicating embarrassment at the unspoken. “The persons of their world lived in an atmosphere of faint implications and pale delicacies,” writes Wharton.

The book opens with its major characters attending an opera, but the real drama, the real motions being gone through here, are happening among the audience members, who are performing for each other. Batuman, who in 2018 actually ended up a writing fellow at the Mount, Wharton’s former home in Massachusetts, tells BBC Culture that while she was re-reading The Age of Innocence at the time, “I was deeply struck by… Wharton’s view of the transactional nature of family and the role-playing it requires”. The structural dishonesty of Old New York shows many signs of being under great strain. When Newland Archer, Wharton’s lead character, is pulled between the formal demands of being a member of a prestigious New York family, and the personal longing to lead an unconstrained, honest, and passionate life, his mind almost breaks.

As The Age of Innocence opens, Archer is on course to marry May Welland, a society ingenue (all well-bred girls are trained in “factitious purity”). It is an appropriate match – essentially an arranged marriage. But when the cosmopolitan and free-spirited Countess Olenska appears in their social circle, Archer is strongly drawn to her, and the scene is set for a psychological struggle between duty and desire, the slumber of convention and feverish personal awakening. So far, so period novel. Wharton’s brilliant move is to suddenly and brutally frame the period in a final chapter, a coda, that leaps into the future 26 years.

Archer, now 57, contemplates how he ended up sinking back into a marriage with May and having three children. He reviews “the packed regrets and stifled memories of an inarticulate lifetime” and accepts them. His cultural obsolescence weighs heavily on him. But he has also fought his way to a certain maturity; his acknowledgement of all his regrets somehow rounds him out as a sympathetic and admirable person. Batuman says that The Age of Innocence is a “touchstone” for her because it “plays with... the relationship between youth and middle age”. The final chapter of the book lifts it into the big league.

Her Pulitzer may have been a second choice, but Wharton was the first woman to receive one – and she has done it justice. When she was young, her mother was disgusted by her writing, and forbade her to read novels. After she freed herself of her own dud society marriage, in her 40s, she reveled in her liberty, and threw herself into her fiction. 

Chimamanda Ngozi Adichie said in 2019 that she was “reading all of Edith Wharton” because she understood “the texture of character and she gets human beings”. Jonathan Franzen is a great admirer: “She was a doer, an explorer, a bestower, a thinker.” Ta-Nehisi Coates has confessed to being blown away; “what I love about Wharton… is her empathy and ambivalence”.

Wharton was no friend of change. She didn’t like feminism, and she saw the worship of status in Old New York being swept away and replaced by the worship of money – hardly a forward step. But the story of Newland Archer and his tribe (also her own tribe, let’s not forget) is expressed with an elegant and complex ambivalence. I was in my early 20s when I first read the book, and then I saw it as a parable of cold-blooded repression and wasted youth. Now I am almost 57 myself, and I see it as a shrewd meditation on desire and regret. I bow to The Age of Innocence, 100 years old but ageless.


~ The folklore of Grace O'Malley begins in her childhood, when she supposedly begged her father to let her join him on a trade mission to Spain. When he refused his daughter's request on the grounds that her long hair would be hazardous on the rolling deck of a ship, she hacked off her mane, earning herself the nickname Gráinne Mhaol, or "Grace with cropped hair."

Though little is known of Grace's early life, when she was about 16 she made a political marriage to Dónal Ó Flaithbheartaigh, heir to the lands of Ó Flaithbheartaigh. It was an excellent dynastic match, but despite bearing her husband three children, Grace wasn't made for housewifery. She had more ambitious plans. 

Soon Grace was the driving force in the marriage, masterminding a trading network to Spain and Portugal and leading raids on the vessels that dared to sail close to her shores. When her husband was killed in an ambush by a rival clan around 1565, Grace retreated to Clare Island, and established a base of operations with a band of followers. According to legend, she also fell in love with a shipwrecked sailor—and for a time life was happy. But when her lover was murdered by a member of the neighboring MacMahon family, Grace led a brutal assault on the MacMahon castle at Doona and slaughtered his killers. Her actions earned her infamy as the Pirate Queen of Connaught. 

Though Grace remarried for the sake of expanding her political clout, she wasn't about to become a dutiful wife. Within a year she was divorced, though pregnant, and living at Rockfleet Castle, which she'd gained in the marriage and which became her center of operations. According to legend, the day after giving birth to her ex-husband’s son aboard a ship, she leapt from her bed and vanquished attacking corsairs.

Grace continued to lead raiding parties from the coast and seized English vessels and their cargo, all of which did little to endear her to the Tudors. She was known for her aggression in battle, and it's said that when her sons appeared to be shirking, she shamed them into action with a cry of "An ag iarraidh dul i bhfolach ar mo thóin atá tú, an áit a dtáinig tú as?"—which roughly translates as "Are you trying to hide in my arse, where you came out of?" 

In 1574 an English expedition sailed for Ireland with the aim of putting an end to her exploits once and for all. Though they besieged Rockfleet Castle, no one knew the coastline better than Grace, and she repulsed them with the might of her own ships. 

But Grace made history in 1593 after her son was captured by Sir Richard Bingham, the English governor of Connaught. Appointed in 1584, Bingham had taken office as part of English efforts to tighten their hold on Ireland, and in 1586 his men had been responsible for the death of one of Grace's sons. Bingham also took cattle and land from Grace, which only served to increase her thirst for revenge. Yet she was a politician as much as a warrior, and knew that she couldn't hope to beat Bingham and the forces of the English government single-handedly. 

Instead, she took the diplomatic route and traveled to England, where she requested an audience with Queen Elizabeth I to discuss the release of her son and the seizure of her lands. In addition, she challenged Gaelic law that denied her income from her husband's land and demanded that she receive appropriate recompense. She argued that the tumult reigning in Connacht had compelled her to "take arms and by force to maintain [my]self and [my] people by sea and land the space of forty years past." Bingham urged the queen to refuse the audience, claiming that Grace was "nurse to all rebellions in the province for 40 years," but Elizabeth ignored his entreaties. Perhaps the monarch was intrigued by this remarkable woman, because Grace's request was granted, and the two women met in September 1593.

Grace's Greenwich Palace summit with the queen has become legendary. She supposedly wouldn't bow to Elizabeth, whom she didn't recognize as the Queen of Ireland. Though dressed in a magnificent gown that befit her status, she also carried a dagger, which she refused to relinquish. The queen, however, was happy to receive her visitor—dagger and all. 

The summit was conducted in Latin, supposedly the only tongue the two women shared. Ignoring the fact that they were virtually the same age, Elizabeth decided that there was only "pity to be had of this aged woman" whom she believed "will fight in our quarrel with all the world." 

By the end of the long meeting, an agreement had been reached. Bingham would be instructed to return Grace's lands, pay her the funds she had demanded, and free her son. In return, Grace would withdraw her support of the Irish rebellion and attack only England's enemies. 

Yet the victory was short-lived. Though her son was freed, Bingham's censure was brief, and Grace received back none of the territory she had lost. Grace was furious, and she soon withdrew from public life. 

The last years of Grace O’Malley are shrouded in mystery. It’s believed that she died at Rockfleet Castle around 1603—the same year as Queen Elizabeth I. Her memory lives on, not least in the Irish ballads, which remember her with these verses: 

In the wild grandeur of her mien erect and high 

Before the English Queen she dauntless stood 

And none her bearing there could scorn as rude 

She seemed well used to power, as one that hath 

Dominion over men of savage mood 

And dared the tempest in its midnight wrath 

And thro' opposing billows cleft her fearless path.


~ Through much of the 20th century, the two political parties had clear identities and told distinct stories. The Republicans spoke for those who wanted to get ahead, and the Democrats spoke for those who wanted a fair shake. Republicans emphasized individual enterprise, and Democrats emphasized social solidarity, eventually including Black people and abandoning the party’s commitment to Jim Crow. But, unlike today, the two parties were arguing over the same recognizable country. This arrangement held until the late ’60s—still within living memory.

Since then, the two parties have just about traded places. By the turn of the millennium, the Democrats were becoming the home of affluent professionals, while the Republicans were starting to sound like populist insurgents. We have to understand this exchange in order to grasp how we got to where we are.


Call the first narrative “Free America.” In the past half century it’s been the most politically powerful of the four. Free America draws on libertarian ideas, which it installs in the high-powered engine of consumer capitalism. The freedom it champions is very different from Alexis de Tocqueville’s art of self-government. It’s personal freedom, without other people—the negative liberty of “Don’t tread on me.”

The conservative movement began to dominate the Republican Party in the 1970s, and then much of the country after 1980 with the presidency of Ronald Reagan. As the historian George H. Nash observed, it uneasily wove together several strands of thought. One was traditionalist, a reaction against the utopian plans and moral chaos of modern secular civilization. The traditionalists were sin-fearing Protestants, orthodox Catholics, southern agrarians, would-be aristocrats, alienated individualists—dissidents in postwar America. They were appalled by the complacent vulgarity of the semi-educated masses. Their hero was Edmund Burke, the avatar of conservative restraint, and their enemy was John Dewey, the philosopher of American democracy. The traditionalists’ elitism put them at odds with the main currents of American life—the one passage of American history that most appealed to them was the quasi-feudal Old South—but their writings inspired the next generation of conservatives, including William F. Buckley Jr., who introduced the first issue of National Review, in 1955, with the famous command to “Stand athwart history, yelling Stop.”

Adjacent to the traditionalists were the anti-Communists. Many of them were former Marxists, such as Whittaker Chambers and James Burnham, who carried their apocalyptic baggage with them when they moved from left to right. Politics for them was nothing less than the titanic struggle between good and evil, God and man. The main target of their energy was the ameliorative creed of Eleanor Roosevelt and Arthur Schlesinger Jr., good old liberalism, which they believed to be a paler communism—“the ideology of Western suicide,” Burnham called it. The anti-Communists, like the traditionalists, were skeptics of democracy—its softness would doom it to destruction when World War III broke out. If these hectoring pessimists were the sum of modern conservatism, the movement would have died of joylessness by 1960.

The libertarians were different. They slipped more easily into the American stream. In their insistence on freedom they could claim to be descendants of Locke, Jefferson, and the classical liberal tradition. Some of them interpreted the Constitution as a libertarian document for individual and states’ rights under a limited federal government, not as a framework for the strengthened nation that the authors of The Federalist Papers thought they were creating. 

Oddly, the most influential libertarians were Europeans, especially the Austrian economist Friedrich Hayek, whose polemic against collectivism, The Road to Serfdom, was a publishing sensation in America in 1944, during the most dramatic mobilization of economic resources by state power in history.

What distinguished libertarians from conventional, pro-business Republicans was their pure and uncompromising idea. What was it? Hayek: “Planning leads to dictatorship.” The purpose of government is to secure individual rights, and little else. One sip of social welfare and free government dies. A 1937 Supreme Court decision upholding parts of the New Deal was the beginning of America’s decline and fall. Libertarians were in rebellion against the mid-century mixed-economy consensus. In spirit they were more radical than conservative. No compromise with Social Security administrators and central bankers! Death to Keynesian fiscal policy!

Despite or because of the purity of their idea, libertarians made common cause with segregationists, and racism informed their political movement from the beginning. Their first hero, Senator Barry Goldwater, ran for president in 1964 as an insurgent against his own party’s establishment while opposing the civil-rights bill on states’-rights grounds.

How did Free America become the dogma of the Republican Party and set the terms of American politics for years? Like any great political change, this one depended on ideas, an authentic connection with people’s lives, and timing. Just as there would have been no Roosevelt revolution without the Great Depression, there would have been no Reagan revolution without the 1970s. After years of high inflation with high unemployment, gas shortages, chaos in liberal cities, and epic government corruption and incompetence, by 1980 a large audience of Americans was ready to listen when Milton and Rose Friedman, in a book and 10-part public-television series called Free to Choose, blamed the country’s decline on business regulations and other government interventions in the market.

But it took the alchemy of that year’s Republican nominee to transform the cold formula of tax cuts and deregulation into the warm vision of America as “the shining city on a hill”—land of the Pilgrims, beacon to a desperate world. In Reagan’s rhetoric, leveraged buyouts somehow rhymed with the spirit of New England town meetings. Reagan made Free America sound like the promised land, a place where all were welcome to pursue happiness. The descendants of Jefferson’s yeoman farmers, with their desire for independence, became sturdy car-company executives and investment bankers yearning to breathe free of big government.

In 1980, the first year I cast a vote, I feared and hated Reagan. Listening to his words 40 years later, I can hear their eloquence and understand their appeal, as long as I tune out many other things. Chief among them is Reagan’s half-spoken message to white Americans: Government helps only those people. Legal segregation was barely dead when Free America, using the libertarian language of individualism and property rights, pushed the country into its long decline in public investment. The advantages for business were easy to see. As for ordinary people, the Republican Party reckoned that some white Americans would rather go without than share the full benefits of prosperity with their newly equal Black compatriots.

The majority of Americans who elected Reagan president weren’t told that Free America would break unions and starve social programs, or that it would change antitrust policy to bring a new age of monopoly, making Walmart, Citigroup, Google, and Amazon the J.P. Morgan and Standard Oil of a second Gilded Age. They had never heard of Charles and David Koch—heirs to a family oil business, libertarian billionaires who would pour money into the lobbies and propaganda machines and political campaigns of Free America on behalf of corporate power and fossil fuels. Freedom sealed a deal between elected officials and business executives: campaign contributions in exchange for tax cuts and corporate welfare. The numerous scandals of the 1980s exposed the crony capitalism that lay at the heart of Free America.

Rather than finding new policies to rebuild declining communities, Republicans mobilized anger and despair while offering up scapegoats.

The aggressive new populism of talk radio and cable news did not have the “conservative orderly heart” that Norman Mailer had once found in the mainstream Republicans of the 1960s. It mocked self-government—both the political and the personal kind. It was rife with destructive impulses. It fed on rage and celebrity culture. The quality of Free America’s leaders steadily deteriorated—falling from Reagan to Gingrich to Ted Cruz, from William F. Buckley to Ann Coulter to Sean Hannity—with no bottom.

Government, which did so little for ordinary Americans, was still the enemy, along with “governing elites.” But for the sinking working class, freedom lost whatever economic meaning it had once had. It was a matter of personal dignity, identity. Members of this class began to see trespassers everywhere and embraced the slogan of a defiant and armed loneliness: Get the fuck off my property. Take this mask and shove it. It was the threatening image of a coiled rattlesnake: “Don’t tread on me.” It achieved its ultimate expression on January 6, in all those yellow Gadsden flags waving around the Capitol—a mob of freedom-loving Americans taking back their constitutional rights by shitting on the floors of Congress and hunting down elected representatives to kidnap and kill. That was their freedom in its pure and reduced form.

A character in Jonathan Franzen’s 2010 novel, Freedom, puts it this way: “If you don’t have money, you cling to your freedoms all the more angrily. Even if smoking kills you, even if you can’t afford to feed your kids, even if your kids are getting shot down by maniacs with assault rifles. You may be poor, but the one thing nobody can take away from you is the freedom to fuck up your life.” 


The new knowledge economy created a new class of Americans: men and women with college degrees, skilled with symbols and numbers—salaried professionals in information technology, computer engineering, scientific research, design, management consulting, the upper civil service, financial analysis, law, journalism, the arts, higher education. They go to college with one another, intermarry, gravitate to desirable neighborhoods in large metropolitan areas, and do all they can to pass on their advantages to their children. They are not 1 percenters—those are mainly executives and investors—but they dominate the top 10 percent of American incomes, with outsize economic and cultural influence.

They’re at ease in the world that modernity created. They were early adopters of things that make the surface of contemporary life agreeable: HBO, Lipitor, MileagePlus Platinum, the MacBook Pro, grass-fed organic beef, cold-brewed coffee, Amazon Prime. They welcome novelty and relish diversity. They believe that the transnational flow of human beings, information, goods, and capital ultimately benefits most people around the world. You have a hard time telling what part of the country they come from, because their local identities are submerged in the homogenizing culture of top universities and elite professions. They believe in credentials and expertise—not just as tools for success, but as qualifications for class entry. They’re not nationalistic—quite the opposite—but they have a national narrative. Call it “Smart America.”

The cosmopolitan outlook of Smart America overlaps in some areas with the libertarian views of Free America. Each embraces capitalism and the principle of meritocracy: the belief that your talent and effort should determine your reward. But to the meritocrats of Smart America, some government interventions are necessary for everyone to have an equal chance to move up. The long history of racial injustice demands remedies such as affirmative action, diversity hiring, and maybe even reparations. The poor need a social safety net and a living wage; poor children deserve higher spending on education and health care. Workers dislocated by trade agreements, automation, and other blows of the global economy should be retrained for new kinds of jobs.

Educated professionals pass on their money, connections, ambitions, and work ethic to their children, while less educated families fall further behind, with less and less chance of seeing their children move up. By kindergarten, the children of professionals are already a full two years ahead of their lower-class counterparts, and the achievement gap is almost unbridgeable. After seven decades of meritocracy, a lower-class child is nearly as unlikely to be admitted to one of the top three Ivy League universities as they would have been in 1954.

The winners in Smart America have lost the capacity and the need for a national identity, which is why they can’t grasp its importance for others.

Smart Americans are uneasy with patriotism. It’s an unpleasant relic of a more primitive time, like cigarette smoke or dog racing. It stirs emotions that can have ugly consequences. The winners in Smart America—connected by airplane, internet, and investments to the rest of the globe—have lost the capacity and the need for a national identity, which is why they can’t grasp its importance for others. Their passionate loyalty, the one that gives them a particular identity, goes to their family. The rest is diversity and efficiency, heirloom tomatoes and self-driving cars. They don’t see the point of patriotism.

Patriotism can be turned to good or ill purposes, but in most people it never dies. It’s a persistent attachment, like loyalty to your family, a source of meaning and togetherness, strongest when it’s hardly conscious. National loyalty is an attachment to what makes your country yours, distinct from the rest, even when you can’t stand it, even when it breaks your heart. This feeling can’t be wished out of existence. And because people still live their lives in an actual place, and the nation is the largest place with which they can identify—world citizenship is too abstract to be meaningful—patriotic feeling has to be tapped if you want to achieve anything big. If your goal is to slow climate change, or reverse inequality, or stop racism, or rebuild democracy, you will need the national solidarity that comes from patriotism.


Real America is a very old place. The idea that the authentic heart of democracy beats hardest in common people who work with their hands goes back to the 18th century. It was embryonic in the founding creed of equality. “State a moral case to a plowman and a professor,” Thomas Jefferson wrote in 1787. “The former will decide it as well, and often better than the latter, because he has not been led astray by artificial rules.” 

Moral equality was the basis for political equality. As the new republic became a more egalitarian society in the first decades of the 19th century, the democratic creed turned openly populist. Andrew Jackson came to power and governed as champion of “the humble members of society—the farmers, mechanics, and laborers,” the Real Americans of that age. The Democratic Party dominated elections by pinning the charge of aristocratic elitism on the Federalists, and then the Whigs, who learned that they had to campaign on log cabins and hard cider to compete.

The triumph of popular democracy brought an anti-intellectual bias to American politics that never entirely disappeared. Self-government didn’t require any special learning, just the native wisdom of the people. “Even in its earliest days,” Richard Hofstadter wrote, “the egalitarian impulse in America was linked with a distrust for what in its germinal form may be called political specialization and in its later forms expertise.” Hostility to aristocracy widened into a general suspicion of educated sophisticates. The more learned citizens were actually less fit to lead; the best politicians came from the ordinary people and stayed true to them. Making money didn’t violate the spirit of equality, but an air of superior knowledge did, especially when it cloaked special privileges.

The overwhelmingly white crowds that lined up to hear Palin speak were nothing new. Real America has always been a country of white people. Jackson himself was a slaver and an Indian-killer, and his “farmers, mechanics, and laborers” were the all-white forebears of William Jennings Bryan’s “producing masses,” Huey Long’s “little man,” George Wallace’s “rednecks,” Patrick Buchanan’s “pitchfork brigade,” and Palin’s “hardworking patriots.” The political positions of these groups changed, but their Real American identity—their belief in themselves as the bedrock of self-government—stayed firm. 

From time to time the common people’s politics has been interracial—the Populist Party at its founding in the early 1890s, the industrial-labor movement of the 1930s—but that never lasted. The unity soon disintegrated under the pressure of white supremacy. Real America has always needed to feel that both a shiftless underclass and a parasitic elite depend on its labor. In this way, it renders the Black working class invisible.

From its beginnings, Real America has also been religious, and in a particular way: evangelical and fundamentalist, hostile to modern ideas and intellectual authority. The truth will enter every simple heart, and it doesn’t come in shades of gray. “If we have to give up either religion or education, we should give up education,” said Bryan, in whom populist democracy and fundamentalist Christianity were joined until they broke him apart at the Scopes “monkey trial” in 1925.

Finally, Real America has a strong nationalist character. Its attitude toward the rest of the world is isolationist, hostile to humanitarianism and international engagement, but ready to respond aggressively to any incursion against national interests. The purity and strength of Americanism are always threatened by contamination from outside and betrayal from within. The narrative of Real America is white Christian nationalism.

Real America isn’t a shining city on a hill with its gates open to freedom-loving people everywhere. Nor is it a cosmopolitan club to which the right talents and credentials will get you admitted no matter who you are or where you’re from. It’s a provincial village where everyone knows everyone’s business, no one has much more money than anyone else, and only a few misfits ever move away. The villagers can fix their own boilers, and they go out of their way to help a neighbor in a jam. A new face on the street will draw immediate attention and suspicion.

Trump had a reptilian genius for intuiting the emotions of Real America—a foreign country to elites on the right and left. They were helpless to understand Trump and therefore to stop him.

Trump’s language was effective because it was attuned to American pop culture. It required no expert knowledge and had no code of hidden meanings. It gave rise almost spontaneously to memorable phrases: “Make America great again.” “Drain the swamp.” “Build the wall.” “Lock her up.” “Send her back.” It’s the way people talk when the inhibitors are off, and it’s available to anyone willing to join the mob.

Trump didn’t try to shape his people ideologically with new words and concepts. He used the low language of talk radio, reality TV, social media, and sports bars, and to his listeners this language seemed far more honest and grounded in common sense than the mincing obscurities of “politically correct” experts. His populism brought Jersey Shore to national politics. The goal of his speeches was not to whip up mass hysteria but to get rid of shame. 


In 2014, American character changed.

A large and influential generation came of age in the shadow of accumulating failures by the ruling class—especially by business and foreign-policy elites. This new generation had little faith in ideas that previous ones were raised on: All men are created equal. Work hard and you can be anything. Knowledge is power. Democracy and capitalism are the best systems—the only systems. America is a nation of immigrants. America is the leader of the free world.

Just America emerged as a national narrative in 2014. That summer, in Ferguson, Missouri, the police killing of a Black 18-year-old, whose body was left to lie in the street for hours, came in the context of numerous incidents, more and more of them caught on video, of Black people assaulted and killed by white police officers who faced no obvious threat. And those videos, widely distributed on social media and viewed millions of times, symbolized the wider injustices that still confronted Black Americans in prisons and neighborhoods and schools and workplaces—in the sixth year of the first Black presidency. The optimistic story of incremental progress and expanding opportunity in a multiracial society collapsed, seemingly overnight. The incident in Ferguson ignited a protest movement in cities and campuses around the country.

What is the narrative of Just America? It sees American society not as mixed and fluid, but as a fixed hierarchy, like a caste system. An outpouring of prizewinning books, essays, journalism, films, poetry, pop music, and scholarly work looks to the history of slavery and segregation in order to understand the present—as if to say, with Faulkner, “The past is never dead. It’s not even past.” The most famous of this work, The New York Times Magazine’s 1619 Project, declared its ambition to retell the entire story of America as the story of slavery and its consequences, tracing contemporary phenomena to their historical antecedents in racism, sometimes in disregard of contradictory facts. Any talk of progress is false consciousness—even “hurtful.” Whatever the actions of this or that individual, whatever new laws and practices come along, the hierarchical position of “whiteness” over “Blackness” is eternal.

Here is the revolutionary power of the narrative: What had been considered, broadly speaking, American history (or literature, philosophy, classics, even math) is explicitly defined as white, and therefore supremacist. What was innocent by default suddenly finds itself on trial, every idea is cross-examined, and nothing else can get done until the case is heard.

Just America has dramatically changed the way Americans think, talk, and act, but not the conditions in which they live. It reflects the fracturing distrust that defines our culture: Something is deeply wrong; our society is unjust; our institutions are corrupt. If the narrative helps to create a more humane criminal-justice system and bring Black Americans into the conditions of full equality, it will live up to its promise. But the grand systemic analysis usually ends in small symbolic politics. In some ways, Just America resembles Real America and has entered the same dubious conflict from the other side. The disillusionment with liberal capitalism that gave rise to identity politics has also produced a new authoritarianism among many young white men. Just and Real America share a skepticism, from opposing points of view, about the universal ideas of the founding documents and the promise of America as a multi-everything democracy.

But another way to understand Just America is in terms of class. Why does so much of its work take place in human-resources departments, reading lists, and awards ceremonies? In the summer of 2020, the protesters in the American streets were disproportionately Millennials with advanced degrees making more than $100,000 a year. Just America is a narrative of the young and well educated, which is why it continually misreads or ignores the Black and Latino working classes.  

The fate of this generation of young professionals has been cursed by economic stagnation and technological upheaval. The jobs their parents took for granted have become much harder to get, which makes the meritocratic rat race even more crushing. Law, medicine, academia, media—the most desirable professions—have all contracted. The result is a large population of overeducated, underemployed young people living in metropolitan areas.

The historian Peter Turchin coined the phrase elite overproduction to describe this phenomenon. He found that a constant source of instability and violence in previous eras of history, such as the late Roman Empire and the French Wars of Religion, was the frustration of social elites for whom there were not enough jobs. Turchin expects this country to undergo a similar breakdown in the coming decade. Just America attracts surplus elites and channels most of their anger at the narrative to which they’re closest—Smart America. 

The social-justice movement is a repudiation of meritocracy, a rebellion against the system handed down from parents to children. Students at elite universities no longer believe they deserve their coveted slots. Activists in New York want to abolish the tests that determine entry into the city’s most competitive high schools (where Asian American children now predominate). In some niche areas, such as literary magazines and graduate schools of education, the idea of merit as separate from identity no longer exists.

The rules in Just America are different, and they have been quickly learned by older liberals following a long series of defenestrations at The New York Times, Poetry magazine, Georgetown University, the Guggenheim Museum, and other leading institutions. The parameters of acceptable expression are a lot narrower than they used to be. A written thought can be a form of violence. The loudest public voices in a controversy will prevail. Offending them can cost your career. Justice is power. These new rules are not based on liberal values; they are post-liberal.

Just America’s origins in theory, its intolerant dogma, and its coercive tactics remind me of 1930s left-wing ideology. Liberalism as white supremacy recalls the Communist Party’s attack on social democracy as “social fascism.” Just American aesthetics are the new socialist realism.

The dead end of Just America is a tragedy. This country has had great movements for justice in the past and badly needs one now. But in order to work, it has to throw its arms out wide. It has to tell a story in which most of us can see ourselves, and start on a path that most of us want to follow.


All four of the narratives I’ve described emerged from America’s failure to sustain and enlarge the middle-class democracy of the postwar years. They all respond to real problems. Each offers a value that the others need and lacks ones that the others have. Free America celebrates the energy of the unencumbered individual. Smart America respects intelligence and welcomes change. Real America commits itself to a place and has a sense of limits. Just America demands a confrontation with what the others want to avoid. They rise from a single society, and even in one as polarized as ours they continually shape, absorb, and morph into one another. But their tendency is also to divide us, pitting tribe against tribe. These divisions impoverish each narrative into a cramped and ever more extreme version of itself.

All four narratives are also driven by a competition for status that generates fierce anxiety and resentment. They all anoint winners and losers. In Free America, the winners are the makers, and the losers are the takers who want to drag the rest down in perpetual dependency on a smothering government. In Smart America, the winners are the credentialed meritocrats, and the losers are the poorly educated who want to resist inevitable progress. In Real America, the winners are the hardworking folk of the white Christian heartland, and the losers are treacherous elites and contaminating others who want to destroy the country. In Just America, the winners are the marginalized groups, and the losers are the dominant groups that want to go on dominating.

Meanwhile, we remain trapped in two countries. Each one is split by two narratives—Smart and Just on one side, Free and Real on the other. Neither separation nor conquest is a tenable future. The tensions within each country will persist even as the cold civil war between them rages on. ~

This essay is adapted from George Packer’s new book, Last Best Hope: America in Crisis and Renewal. It appears in the July/August 2021 print edition with the headline “The Four Americas.”

Packer twists his title from a boast into an abject plea: “No one is going to save us. We are our last best hope.” (The Guardian)

from the Amazon review: 

In lively and biting prose, Packer shows that none of these narratives can sustain a democracy. To point a more hopeful way forward, he looks for a common American identity and finds it in the passion for equality—the “hidden code”—that Americans of diverse persuasions have held for centuries. Today, we are challenged again to fight for equality and renew what Alexis de Tocqueville called “the art” of self-government. In its strong voice and trenchant analysis, Last Best Hope is an essential contribution to the literature of national renewal.

from The Guardian:

~ George Packer finds the US caught in a ‘cold civil war’ between incompatible versions of the country after its ‘near-death experience’ with Donald Trump. An “epistemic rupture”, he says, has made Americans “profoundly unreal to one another”; lacking a shared reality, they have burrowed into partisan encampments or sealed themselves in digital ghettos, echo chambers of angry prejudice.

Packer believes that his country’s dualistic political parties have in effect changed places, with the Democrats now “the home of affluent professionals, while the Republicans… sound like populist insurgents”. Commenting on an American meritocracy whose sole merit is its luck on the stock market, Packer predicts: “As with any hereditary ruling class, political power will fall into the hands of increasingly inferior people.” 


George Packer is onto something. I like his labels, while realizing that someone else might come up with a different division and different labels — though for now I find Packer’s description quite compelling. America is a turbulent and complex country — never boring. 

I like the way Packer called Sarah Palin “John the Baptist” to Trump. I also like his critique of Just America — they know how to criticize but have failed to provide any viable answers. And because they are in constant attack mode, they are divisive rather than healing. 

Personally, I don't seem to fit anywhere — not even with Packer's "Smart America," which seems more Silicon-Valley oriented. What strikes me as most familiar and personally relevant is the overproduction of educated elites for whom there aren't enough jobs — something that Karl Marx acutely experienced. 

Of course, all this fine analysis becomes pointless if the climate catastrophe truly bears down on us. 


Packer's description of the Four Americas seems pretty accurate. It takes into account the class shifts in this country that seem to me the true foundation of these divisions and ideological splits. The worst part of all four is their increasing drift toward more extreme positions: all or nothing, good versus evil, with no moderation, no room for compromise or even conversation. That leaves us with people entrenched in extreme hardline positions, for whom all others are not only enemies but completely incomprehensible, people for whom there is no excuse, and certainly no room.

Think only, for instance, of Just America, whose issue is not cancel culture but cultural appropriation, and which reduces all the complexities of history to a simpler narrative — rigid, demanding, and absolute. Its adherents are not wrong: certainly slavery was key to our history as a nation, our original sin, whose ramifications are still at play — but it's not the whole story, or the only story. The problem here, as with the other three Americas, is that the idea barrels on to its farthest possible extreme, at which point it becomes not only untenable but tyrannical.

The pendulum of ideas and social forces always swings. The question is whether we can moderate the degree and ferocity of its movement. For instance, cultural appropriation exists, but to avoid its crimes, must no one ever speak from anything but the narrow point of their own identity and experience? Can imagination be outlawed, discredited, disowned? Understanding is more than judgment; it requires more than a simple tick of "correct" or "incorrect." Self-righteousness will get us nowhere, and we know it. That's the kind of thinking that leads to disasters: to purges, "cultural revolutions," crusades, gulags, and forced "corrections."

Will we find a way to recognize each other? To speak instead of shout, to listen rather than accuse? None of this will be easy or comfortable, but the alternative, continuing down the path of further extremism in our divisions and further entrenchment behind our own ideological barricades, will only mean disaster.


America is prone to extremes, and who knows, perhaps someone will tell us not to eat Italian food unless we really have Italian ancestors — I'm joking, of course. But even that minor example is revealing: the remarkable nature of America lies in its great diversity and the fusion of different cultures from all over the world. Nor are immigrants a representative sample of the original populations — it takes either desperation or a special kind of personality, or both, to become an immigrant.

The story of slavery is important, but not more important than the story of the westward waves of colonization, or of successive waves of immigration. There is also the story of how America became the dominant world economy, and the story of its huge cultural influence. I feel sorry for historians who try to arrive at some kind of “summary” of America. And I admire Packer for having at least traced more complex divisions than red and blue America.

However many groups there may be, can they talk to one another? Not if each is based in a radically different reality; I can't imagine, for example, having much to talk about with someone whose life revolves around a fundamentalist church. Perhaps we can only agree to disagree, and at least be polite about it: the differences may be unbridgeable, but so far the cold civil war has not led to bloody violence, nor gulags, nor outright dictatorship, and I hope it never will.

“Making the simple complicated is commonplace; making the complicated simple, awesomely simple, that's creativity.” ~ Charles Mingus


~ The idea for a tool to probe the basis of consciousness came to Gordon G. Gallup, Jr. while shaving. “It just occurred to me,” he says, “wouldn’t it be interesting to see if other creatures could recognize themselves in mirrors?”

Showing chimpanzees their reflections seemed like a fascinating little experiment when he first tried it in the summer of 1969. He didn’t imagine that this would become one of the most influential—and most controversial—tests in comparative psychology, ushering the mind into the realm of experimental science and foreshadowing questions on the depth of animal suffering. “It’s not the ability to recognize yourself in a mirror that is important,” he would come to believe. “It’s what that says about your ability to conceive of yourself in the first place.”

Gallup was a new professor at Tulane University in Louisiana, where he had access to the chimps and gorillas at what would later be known as the Tulane National Primate Research Center. The chimpanzees there had been caught as youngsters in Africa and shipped to America, where they were used mainly in biomedical research. By comparison, his experiment was far less invasive. He isolated two chimps in cages, and placed a mirror in each cage for eight hours at a time over 10 days. Through a hole in the wall, Gallup witnessed a shift in the chimps’ behavior. First they treated the reflection like it was another chimp, with a combination of social, sexual, and aggressive gestures. But over time, they started using it to explore their own bodies. “They’d use the mirror to look at the inside of their mouths, to make faces at the mirror, to inspect their genitals, to remove mucus from the corner of their eyes,” Gallup says.

Gallup was sure that the chimps had learned to recognize themselves in the mirror, but he didn’t trust that other researchers would be convinced by his descriptions. So he moved on to phase two of the experiment. He anesthetized the chimps, then painted one eyebrow ridge and the opposite ear tip with a red dye that the chimps wouldn’t be able to feel or smell. If they truly recognized themselves, he thought he knew what would happen: “It seemed pretty obvious that if I saw myself in a mirror with marks on my face, that I’d reach up and inspect those marks.” 

That’s exactly what the chimps did. As far as Gallup was concerned, that was proof: “the first experimental demonstration of a self-concept in a subhuman form,” he wrote in the resulting 1970 report in Science. “It was just clear as day,” he remembers. “It didn’t require any statistics. There it was. Bingo.” 

But the result that really blew Gallup’s mind came when he tested monkeys, and discovered that they did not do the same. The ability to recognize one’s reflection seemed not to be a matter of learning abilities, with some species being slower than others. It was an issue of higher intellectual capacity. Gallup had obtained the first good evidence that our closest relatives share with us a kind of self-awareness or even consciousness, to the exclusion of other animals. Here, finally, was an experimental handle on a topic that had been the subject of speculation for millennia: What is the nature of human consciousness?

Gallup wasn’t the first to come up with the notion that it might be significant if a person or animal recognizes itself in the mirror. He would only later learn that Charles Darwin had shown mirrors to orangutans, but they didn’t figure the mirror out, at least while he was watching. Darwin had also noted that, for their first few years, his children couldn’t recognize themselves in their reflections. In 1889, German researcher Wilhelm Preyer became the first to posit a connection between mirror self-recognition and an inner sense of self in people.  

More than 50 years later, French psychoanalyst Jacques Lacan conceived of a childhood “mirror stage,” in which mirrors contribute to the formation of the ego. By 1972, developmental psychologists started using mark tests similar to Gallup’s to pin down the age at which children begin to recognize themselves in the mirror: 18 to 24 months.

Meanwhile Gallup, who moved to the University at Albany-SUNY, became interested in whether any non-primates could pass. In the early 1990s, he encouraged one of his Ph.D. students, Lori Marino, to explore the question. Working with Diana Reiss at Marine World Africa USA in California, Marino exposed two bottlenose dolphins at an aquarium to a mirror. Like the chimpanzees, the dolphins learned to use the mirror in a variety of ways, even “having sex in front of the mirror with each other, which we call our dolphin porno tapes,” Marino says. The three researchers published the results, saying they were “suggestive” of mirror self-recognition. 

Still, they were missing the crucial mark test for another decade. The biggest hurdle was anatomical: The dolphins didn’t have hands to touch a mark. But Reiss and Marino, by then at the New York Aquarium, designed a modified test. When marked with black ink on various parts of their bodies, the dolphins flipped and wriggled in an attempt to see it, convincing the researchers and many others that they recognized themselves.

For Reiss and Marino, the dolphin study was not only convincing, it was a call to action. They and others argue that passing the mirror test indicates a level of self-awareness that makes it unethical to keep a species in captivity. “These animals have at least some level of self-awareness, and if they do, they know where they are, they can be aware of the limitations of their physical environment,” Marino says. She is now the science director for the Nonhuman Rights Project, which is attempting to gain legal rights for animals with higher-order cognitive abilities by getting courts to recognize them as “legal persons,” and Reiss advocates for dolphin protection. Key to their arguments is the scientific evidence that chimps, elephants, cetaceans, and other animals are self-aware like humans. Not only can they suffer, but they can think to themselves, I am suffering.

Gallup, now in his 70s, mainly stays away from advocacy work, but he likes to philosophize about what exactly mirror self-recognition shows, and why that capability might have evolved. Clearly, it has little to do with mirrors themselves, since, aside from the occasional still pond, our distant ancestors would never have encountered their reflections. He’s come to the conclusion that a pass of the mirror test indicates a profound level of consciousness that includes animals’ ability to contemplate their own thoughts and experiences as well as to imagine what others could be thinking and experiencing. This ability is called “theory of mind.” 

For support, he points to the fact that children start to demonstrate theory of mind at roughly around the same time that they start to recognize themselves in the mirror. “You have to be aware of yourself in the first place in order to begin to take into account what other people may know, want, or intend to do,” he says. He notes that people with schizophrenia often cannot recognize themselves in the mirror, and they struggle with theory of mind as well. For example, compared to controls, schizophrenic individuals were less likely to understand a request hidden in a husband’s statement to his wife, “I want to wear that blue shirt, but it’s very creased.” 

Gallup suggests that a powerful sense of self may have evolved because it helped great apes deal with complex social situations. “Intellectual prowess supplanted physical prowess as a means of achieving dominance,” he says. And, he suggests that strong self-awareness may also entail death-awareness. “The next step, it seems to me logically, is to confront and eventually grapple with the inevitability of your own individual demise,” he says. 

As for why dolphins and other non-primates recognize themselves in mirrors, Gallup isn’t yet convinced they do. He suggests an alternative explanation for why his former student’s dolphins wriggled in front of the mirror: to see marks on what they perceived as another dolphin peering back at them. And he wants to see replication of recent studies finding that elephants use their trunks to touch white crosses on their foreheads, and magpies dislodge stickers on their chests with their beaks. 

Then there are researchers who doubt that the mirror test says anything about theory of mind in any animal, including humans. Most notably, Gallup’s mentee, Daniel Povinelli. Like a son who witnesses his father’s foibles and decides to become his opposite, Povinelli, now at the University of Louisiana-Lafayette, has become one of Gallup’s most outspoken critics, even as they remain close on a personal level. 

He’s come to believe that a chimp doesn’t need to have an integrated sense of self in order to pass the mirror test. Instead, it needs only to notice that the body in the mirror looks and moves the same as its own body, and then make the connection that if there’s a spot on the body in the mirror, there could also be a spot on its own body. That ability would still be pretty sophisticated, Povinelli adds, and it might reflect a keen awareness of the position of body parts that would likely be very helpful for swinging through trees. Indeed, he speculates that this high-level physical self-awareness may have developed when our tree-dwelling ancestors increased in size and faced more challenges while navigating their branchy, leafy world.

Povinelli’s concerns stretch to other landmark studies on theory of mind in chimps, such as those that document how a subordinate chimp refrained from taking hidden food when she watched a dominant chimp see researchers hide the food. The authors of this study argued that this was because the subordinate chimp reasoned about what the dominant chimp had seen and what it would do. Combined with results from other experiments, they concluded that chimps can “understand both the goals and intentions of others as well as the perception and knowledge of others,” and they can predict the action that will result. 

But Povinelli calls this reasoning “folk psychology” — unscientific inferences based on our own human experiences. The subordinate chimp doesn’t have to know the dominant’s mind, he says; all she has to know is to avoid interfering with the dominant chimp. 

To apply Povinelli’s logic to humans, we may think deep, reflective thoughts when using a mirror to brush our teeth, but that doesn’t mean that the part of the brain that’s using the mirror to direct our toothbrush is the same part of the brain that’s contemplating the self. Those two abilities may develop at the same time in children, but that does not mean that they’re related, much less one and the same.  

Povinelli’s critiques aside, most comparative psychologists say there’s something to mirror recognition, not least because it has only been observed in highly intelligent animals. Neuroscientists are now trying to shed light on the matter by searching for a physical basis for the ability in the brain. Although they haven’t found a clear signal yet, Gallup remains undeterred. After nearly 45 years of fending off challengers, he is not likely to wake up in the morning, look in the mirror, and change his mind. ~


In 1953, geologist Marie Tharp created a detailed map of the Atlantic Ocean's floor and discovered the Mid-Atlantic Ridge, Earth's largest physical feature. Tharp's map provided key evidence for the then-controversial theory of plate tectonics and for seafloor spreading.

At the Lamont-Doherty Earth Observatory, Tharp's colleague, Bruce Heezen, groaned at her discoveries and dismissed them as "girl talk." But her meticulous work ultimately won over Heezen and the scientific community as a whole.

Working with only pens, ink, and rulers, Tharp took thousands of sonar readings and literally drew the underwater details of the ocean floor, longitude degree by latitude degree. Tharp used what is known as the physiographic mapping technique, using light and texture for her diagrams instead of color.  

Until then, the ocean floor had been envisioned as a flat plain of mud.

Tharp and Heezen's maps revealed an underwater ridge running 40,000 miles around the globe. In 1953, Tharp made another remarkable discovery in the Mid-Atlantic Ridge, a chain of mountainous volcanoes that runs north to south through the ocean: she observed a depression in the ridge that appeared to be a continuous crack along its length. Those observations led her to embrace the theory of continental drift, or seafloor spreading, the idea that the continents move as the seafloor spreads beneath them. Although unpopular at first, the theories of continental drift and plate tectonics were widely accepted by the scientific community over the years that followed.

In 1999, Tharp fondly remembered her time at the observatory: "The whole world was spread out before me. I had a blank canvas to fill with extraordinary possibilities. … It was a once-in-a-lifetime — a once-in-the-history-of-the-world — opportunity for anyone, but especially for a woman in the 1940s.”


I love the idea that "girl talk" is about the Mid-Atlantic Ridge!

To do justice to Bruce Heezen, he changed his attitude and became Tharp’s most important collaborator.

How come so many people know about Marie Curie while hardly anyone has heard of Marie Tharp, the pioneering ocean cartographer and a major contributor to the tectonic plate theory? I think it’s a matter of “romancing” a person’s life — just as Van Gogh’s sister-in-law made him famous, and an American woman journalist presented Marie Curie as a heroic pioneer woman scientist. Outstanding women scientists simply need more publicity — and their lives indeed provide colorful details.


~ A lot of research — much of it recent — has examined the different types and qualities of attention and their associations with mental health and cognitive functioning. This work has revealed that certain types of attention may tire out your brain and contribute to stress, willpower failures, and other problems.

Meanwhile, activities that broaden and soften your attention may reinvigorate your brain and promote psychological and cognitive wellbeing.

Whenever you train your attention on something — an act that cognitive scientists sometimes call “directed attention” — this requires effort. More effort is needed when other things (i.e. distractions) are vying for your attention, or if the thing you’re trying to focus on is boring.

According to a 2016 review from researchers at the University of Exeter Medical School in the U.K., your ability to effortfully focus your attention is finite. Just as an overworked muscle grows weak, overworking your attention seems to wear it out. When that happens, a lot can go wrong.

For one thing, your ability to concentrate plummets. Your willpower and decision-making abilities also take a hit. According to a 2019 study in the journal Occupational Health Science, attention fatigue may also contribute to stress and burnout.

There’s even some work linking attention fatigue to attention deficit hyperactivity disorder (ADHD). “The symptoms of ADHD and ‘attention fatigue’ so closely mirror each other that the Attention Deficit Disorders Evaluation Scale has been used as a measure of attention fatigue,” wrote the authors of a 2004 study in the American Journal of Public Health.

Experts are still trying to figure out exactly what resource in your brain is drained by effortful directed-attention tasks. But there’s evidence that directed attention involves frontal and parietal regions of the brain that are also involved in other “cognitive-control” processes. These are the activities that take you out of autopilot and steer you toward goal-directed thoughts and actions — the stuff that isn’t necessarily fun or engaging, but that supports your career, your relationships, and your health.

Distractions, multitasking behaviors, loud noises, bustling urban environments, poor sleep, and many other features of modern life seem to promote attention fatigue. On the other hand, certain activities seem to reinvigorate the brain in ways that support directed attention and self-regulation processes. And one of the most studied and effective of these — as you’ve probably heard — is spending time in nature.

“Getting out in nature seems to relax the brain’s frontal lobes and relieve this attention fatigue,” says Phil Stieg, MD, PhD, chairman of neurological surgery and neurosurgeon-in-chief at New York-Presbyterian/Weill Cornell Medical Center.

Exactly how nature does this is tricky to pin down. Stieg says that several overlapping mechanisms of benefit are likely at play.

But one that has garnered a lot of expert attention is termed “soft fascination.” The gist is that natural environments are just stimulating enough to gently engage the brain’s attention without unhelpfully concentrating it.

“[W]hat makes an environment restorative is the combination of attracting involuntary attention softly while at the same time limiting the need for directing attention,” wrote the authors of a 2010 study in Perspectives on Psychological Science. Nature, they added, seems to hit that sweet spot.

On the other hand, activities that grab and hold our attention too forcefully — books, social interactions, pretty much anything on a screen — entertaining though they may be, are unlikely to recharge our brain’s batteries. “Unlike soft fascination, hard fascination precludes thinking about anything else, thus making it less restorative,” the study authors added.

A lot of the work on soft fascination is folded into a psychological concept known as Attention Restoration Theory, or ART. While a lot of the ART research highlights time in nature as the optimal route to cognitive replenishment, it’s not the only route.

Mindfulness also promotes attention restoration.

In many ways, it’s a kind of soft-fascination training. Mindfulness attempts to loosen the mind’s preoccupation with self-focused thoughts and judgments while also broadening awareness of your surroundings. This seems a lot like what spending time in nature does automatically, and there’s evidence that moving mindfulness training into natural outdoor settings may augment the practice’s benefits.

Stieg, the New York-Presbyterian/Weill Cornell neurosurgeon, recently discussed the benefits of nature on his podcast This Is Your Brain. He agrees that mindfulness may be a helpful alternative for those who don’t have access to nature (or the time to get lost in it). He also says that avoiding things that fatigue attention — loud noises, multitasking, technology — could reduce your need to escape to the outdoors.

“If you’re on a cell phone for eight hours a day, your attention never gets a rest,” he says. “I don’t think spending time in nature provides all the answers, but there’s good evidence that it supports a longer, healthier, emotionally stable life.”

The bigger takeaway may be that your brain needs idle time to rest and recharge. Deprived of that time and the soft-fascination experiences that support it, your psychological and cognitive health may pay a price. ~


~ Being a Christian wasn’t a religion for us; it was a culture. I was raised by a community of believers who loved me almost as much as they loved God. I spent Sundays at the church, and Wednesday nights memorizing songs that taught me the books of the Bible. We weren’t the kind of family that prayed at a Chili’s or had crosses hanging on our living room walls, but we were surrounded by that kind of faith, that outward exuberance for the Lord.

What’s hard to understand now is how much I believed it. When I first started trying to explore my experience as an adult, the only way I could find to really process was through fiction. I needed to create a separate but similar environment to the one I’d grown up in to really see it. I needed to create someone else to be able to see myself. That work of processing became my debut novel God Spare the Girls.

Like the sisters in my book, at one time I saw God’s hand in everything that happened to me. I went to church three days a week and read my Bible almost every day. I loved that it was God’s Word, but I also loved that the stories were so good. There were beheadings and betrayals, redemption and salvation, murders and sacrifices. There were things I didn’t agree with the church about, certainly. I was young and artsy and secretly queer, after all. None of those questions, though, could usurp the absolute conviction I felt at my core.

But in college I lost that certainty. Far from home and attending a church that upset and offended me more often than it comforted me, I let myself ask questions I’d buried. Why did a loving God need to punish people? Why did the Bible say so many things that felt so natural to me were sins? Why weren’t women pastors? The answers were the same answers that had always been there, but I could no longer swallow them; I felt my mouth becoming acidic and bitter.

There is a verse in the Bible about how we as mere mortals cannot see truth on our own. God has to lift a veil from our eyes (sometimes literally, like in the case of the Apostle Paul) so we can see. A few steps outside the doors of my church-based life and the curtain rose. I saw the ways my abstinence-only education had failed me; how deep the scars were left by homophobic rhetoric; how much money these churches raised and how little of it they gave away; how many of the men leading these congregations used the power they had secured for their own gain above all else.

But it wasn’t so simple. For every way in which the church hurt me, there is another memory: one where a dozen women hug me tight, proud of me for something. Like the protagonist of my book, I had been taught to believe that there was a Christ-shaped hole in my body that would feel vacant without my faith. And that felt true after I left the church, but it wasn’t the faith I missed; it was the community: the feeling of belonging, the mutual support and care that makes a church feel like a home.

The problem with losing your religion is that you are left adrift in a world that used to have clear rules and guidelines for how to exist in it. Leaving evangelical Christianity is even more stark: there is no liturgy or culture without belief. I knew how to handle stress or disappointment or frustration as a Christian: to pray, to seek counsel, and to read my Bible. But if your only method of processing is prayer, what do you do when you aren’t sure if there is a God anymore to hear you?

For me, fiction became a space where it was safe to ask the questions I was afraid to look at. Initially, I worked on this story just for me. I was writing for an audience of one, writing to understand where I came from and to appreciate the culture that had both built and harmed me. In this story, I found the solace I could no longer find in my faith. Through these fictional girls, I learned how much the church gave me, and not just how much it took. 

I remembered everything: the youth group and the women who supported me, the hurtful beliefs I held and that were held against me. It was painful and terrible to dredge up both sides of my experience in the church. But it also gave me the space to recognize another feeling: that though leaving the church was the right decision for me, the decision itself was painful on its own. Writing gave me the ability to grieve that loss at the same time it gave me the tools to recognize just what I had lost.

That was the feeling I wanted to tackle when I started this book. I wanted to write a book for people like me: people who grew up with a faith wrapped around them until they were mummified inside of it, and who one day were cut out of it and left standing, free but insecure, unable to feel so snug and safe again. I wanted to write a book that acknowledges what a beautiful, cathartic, and communal thing a faith community could be while refusing to ignore the problems that can grow in those communities.

God Spare the Girls is a story about two sisters who know the evangelical church for all of its failures and triumphs and who are disproportionately affected by them. It is a story about sisterhood, and trying to figure out who you are, and sexuality. But at its core it is a story about the pain of questioning where you come from. It’s a book I wish I had been able to read, and that I hope can be a balm or at least a reassuring shoulder squeeze for others. ~


My first impulse is to say I never missed anything after leaving the church. Only after more reflection do I feel I missed the beauty of Vespers, for instance. Christmas and Easter became empty of any special meaning. But since the demands of school and the attractions of my interests were so strong, there was never the sense of an absence, the alleged “god-sized hole.” The time filled itself automatically. Simply reading books would have been enough, and my intense drive to master English. But I also had my curiosity about the world and my delight in the beauty of nature — and the drama of attending school, which provided some sense of community the way the church definitely didn’t. 

But I think I can sense what the author is trying to convey: after a church-centered life, it must feel lonely and even frightening to have left it behind. So how did Kelsey ever summon the courage to leave? And I remember Freud’s words in The Future of an Illusion: “The voice of the intellect is a soft one, but it does not rest until it has gained a hearing. Ultimately, after endless rebuffs, it succeeds. This is one of the few points in which one may be optimistic about the future of mankind.” 

At the same time, I did gain something, and oddly enough it pertains to literature. It would be difficult to understand much of the older literature, which is full of references to Bible stories, if I didn’t have the right background. I did most of my reading before the advent of the Internet and thus the ability to quickly look up obscure items. Much of old art would also have remained incomprehensible to me: Jacob wrestling with the angel, the return of the prodigal son, the parable of the sower, the endless Annunciations, Nativities and Resurrections, the Last Judgment, and a great deal more. Much of the culture of the past would be closed to me, or at least difficult to grasp. 


One type of religious community is the congregation or parish body of a church. A monastery or convent is a different religious community. In the article, the author misses the evangelical community of her childhood church. The community she describes, however, is one that promoted the very beliefs that were most painful to her, damaging her mental health.

Her example raises the question: What are the benefits of religious practice in a community? In 2009, the University of Pennsylvania completed a study designed to answer that question. The study suggested that there are benefits from meditation and prayer. To meditate is to engage in spiritual reflection. Prayer is a quiet time with a spiritual focus.

By engaging in these religious practices, a person improves their memory, cognition, and compassion while simultaneously reducing anxiety, depression, irritability, and stress. However, the social scientists did not draw the conclusion that these religious practices always produced favorable results.

They learned that those who focus their beliefs on a loving, forgiving, and compassionate God developed a more positive view of themselves and reaped clear-cut benefits from their religious practices. However, those who focused on a dispassionate, vengeful, and unforgiving God became more anxious, depressed, irritable, and stressed.

Jonathan Edwards's sermon, Sinners in the Hands of an Angry God, is an example of a religious focus on a vengeful and punishing God. Furthermore, studies on emotional intelligence and mindfulness demonstrate that a person can acquire positive benefits by practicing non-religious meditation and quiet time.

Paraphrasing Barbara Ehrenreich, quiet time is an interval dedicated to cultivating an attitude that is positive and compassionate. To accomplish this, one reads, listens to music, or sits in silent thought. The physical and mental improvements are comparable to the outcomes of those engaging in religious meditation and prayer.

Other than the above religious practices, a person’s involvement with a religious community seems to have a minimal impact on their mental health. However, studies indicate that community membership produces benefits if the members respect the feelings of others, listen to alternative views, and are not contemptuous in their criticism.


Before Covid I attended the meetings of two Unitarian groups, and found them a welcome addition to my restricted life. At the same time, it didn't escape my attention that we never spoke in religious terms. Was the word "god" ever mentioned? I can't remember. Mainly, we exchanged stories from our lives that were loosely related to a particular theme.

I fondly remember that small community, and hope that it can reconstitute itself. And, to steal from a poem by Robert Cording, "Nobody misses the missing god."  

As Ginette Paris observed, "It's still early after the death of god." Studies have shown that those members of a congregation who seemed to reap the health benefits were not the pious, but those who socialized the most. We need to discover how best to build supportive communities in whatever form, and, for the introvert, keep on teaching meditation or, indeed, simply "quiet time."

My old lover was Catholic and lied to me about the smallest things. Now he's dying and I'm trying to forgive everyone standing in line ahead of me at the grocery store. ~ S. Brook Corfman


Love God? Sometimes I hate him. ~ Martin Luther

Calvinist minister R. C. Sproul, in his book “What Is Reformed Theology?”, writes:

“Love for God is not natural to us. Even in the redeemed state our souls grow cold and we experience feelings of indifference toward him. When we pray, our minds wander and we indulge in woolgathering. In the midst of corporate worship, we are bored and find ourselves taking peeks at our watches . . . Our natural lack of love for God is confirmed by our natural lack of desire for him.”

Though the minister cites this as an example of our total depravity, how I wish I’d come across a passage like this one when I agonized over my inability to stay attentive during prayers! I didn’t know it was a normal phenomenon, and that everyone’s mind tended to stray. I thought that happened just to me, the wicked one.

But what worried me most was my inability to love god. I saw god as evil, so love was impossible. How could you love a being who, presumably to punish the parents, gives their child brain cancer? Jesus was supposedly sweet, but he too would come back to preside as judge at the Last Judgment, tossing the huge majority of people into hell for eternity.

Only Mary was non-punitive, so only Mary could be loved. Others must have felt that way too, since most churches were dedicated to her, as were most icons and most candles.

But here comes the real jewel. Sproul cites Luther’s answer to the question “Do you love God?”:

“Luther replied [prior to his rejection of Catholicism], ‘Love God? Sometimes I hate him.’ This is a rare admission among men. Even Luther’s candid reply was less than totally honest. Had he spoken the full truth, he would have said that he hated God all the time.” (Sproul, 127)

~ John Morreall, “Questions for Christians”

Luther throwing an ink pot at the devil


~ TIME Magazine has reported a discovery that will likely transform medical practice. The article describes age-reversal research performed at the Mayo Clinic and Scripps Research Institute.

The therapy targets and eliminates senescent cells, which are a factor in degenerative aging.

These findings represent a significant leap forward in reducing and even reversing aging at the cellular level.

The Los Angeles Times on July 10, 2018, reported further:

“Aging…is beginning to look more and more like a disease—and a treatable one at that.”

“...[this new study]...not only offers a clear look at the power of senescent cells to drive the aging process, but also a pharmaceutical cocktail that, in mice at least, can slow and even reverse it.”

“Compared to mice who aged normally, those that started the dasatinib-quercetin [senolytic] cocktail at an age equivalent to 75 to 90 years in humans ended up living roughly 36% longer, and with better physical function.”

The last quote indicates that elderly people may be able to restore physical functionality and live longer by purging their bodies of senescent cells.

As senescent cells accumulate they inflict systemic damage by:

1) Emitting pro-inflammatory signals and

2) Secreting protein-degrading enzymes.

Mayo Clinic researchers tested the natural compound quercetin in combination with the drug dasatinib [a cancer drug]. The purpose was to target and eliminate these zombie-like senescent cells. This experimental “senolytic” treatment restored youthful function and improved survival.

Scientists have studied the mechanisms of dasatinib and identified intriguing natural compounds (theaflavins) that function in a similar senolytic fashion.

Quercetin and theaflavins have complementary activity, and can provide a dual-action approach to help clear senescent cells from the body to restore youthful function at the cellular level.


What this Mayo Clinic discovery further revealed is the degree of toxicity inflicted by senescent cells:

If only one in 7,000 to 15,000 cells is senescent, age-related deterioration starts to occur in laboratory mice.

To reiterate, just one senescent cell out of 7,000-15,000 healthy cells can initiate degenerative aging.

To make matters worse, senescent cells seem to pass on their age-accelerating toxicity to normal cells, creating a spiral of pathologic disorders that result in chronic illnesses and premature death.

In our youth, cells naturally eliminate themselves if they become damaged or dysfunctional. 

This process, by which damaged (senescent) cells are “shut off,” is called apoptosis.

Apoptosis is often referred to as “programmed cell death.” This self-elimination is a normal and important part of maintaining and regenerating healthy tissues. As organs age, however, more cells become senescent.

These zombie-like cells emit harmful compounds that promote inflammation in the surrounding tissue. Chronic inflammation is a major contributor to degenerative disorders.

Published research demonstrates that buildup of senescent cells accelerates aging processes and increases risk of age-related diseases. Diabetes, obesity, stroke, vision loss, neurodegenerative disorders, osteoarthritis, emphysema, and cancer can be connected to the presence of senescent cells.

The possibility of removing senescent cells from the body provides a novel approach to modulating and reducing the cellular factors of aging.


Senolytic compounds can be drugs, peptides, or plant extracts that act to cleanse the body of senescent cells.

In human cell and animal studies, consistent findings show that removing senescent cells from the body improves various markers of aging and prolongs lifespan in some models. In a 2016 study published in Science, researchers demonstrated that in a mouse model of atherosclerosis, removal of senescent cells led to significant inhibition of growth and even regression of arterial plaque. This ability to block or reverse plaque growth could be an important step in preventing heart and blood vessel disease.

In another study published in Nature, a mouse model of aging demonstrated that removing senescent cells benefited multiple tissues, while delaying the onset and slowing progression of age-related disorders.


Senescent cells fail to undergo programmed self-elimination. Like a contagion, senescent cells pass on their accelerated aging properties to healthy cells by releasing a number of factors that can cause tissues to deteriorate.

Senolytic compounds target anti-apoptotic pathways. This causes senescent cells to self-destruct and be eliminated from the body.

Senolytics are able to specifically target these cells and activate their suicide switch so that they proceed to die a normal death. In this way, toxic cells are removed from the body without harming highly functioning cells.

[the article goes on to discuss theaflavins and quercetin as natural senolytics; fisetin (naturally found in strawberries) is also a potent senolytic.] ~

From another source: SENESCENT IMMUNE CELLS 

~ In a study recently published in Nature, University of Minnesota Medical School researchers found that senescent immune cells are the most dangerous type of senescent cell.

Cells become senescent when they are damaged or stressed in the body, and they accumulate in our organs as we age. Senescent cells drive inflammation and aging as well as most age-related diseases.

The research team — led by Laura Niedernhofer, MD, PhD, a professor in the Department of Biochemistry, Molecular Biology and Biophysics — discovered that senescent immune cells drive tissue damage all over the body and shorten lifespan. Therefore, senescent immune cells are detrimental and should be targeted with senolytics. ~


While fisetin may be even more potent than quercetin, liposomal quercetin is available in capsules, whereas I don’t see an equivalent convenient form for fisetin. Until then, you may want to eat a lot of strawberries.  

And don’t worry — your liposomal quercetin will help your body get rid of senescent cells.

In my opinion, the most important supplement is berberine — unless you have access to metformin. Berberine is the closest we can come to metformin by buying an inexpensive supplement. After berberine, it may indeed be quercetin, theaflavin (in black tea) and other senolytics.

Olive oil and MCT oil are also supposed to help the body get rid of the harmful senescent cells. And don’t forget to drink lots of tea, definitely including black tea, for those precious theaflavins. If black tea is literally not your cup of tea, try doubling your consumption of quality green tea; it also induces autophagy. (Coffee, dark chocolate, curcumin, exercise, fasting? Yes on all of these; fasting remains the most reliable means to induce autophagy.)

(By the way, have you noticed that if you're just beginning to feel hungry, tea (green or black) will kill that feeling? Yes, tea, both green and black, is an appetite suppressant; hence its use in weight-loss regimens.)

Spermidine is another compound that stimulates autophagy. A good food source of spermidine is aged cheese such as sharp and extra-sharp cheddar. The more mature the cheese, the more spermidine it contains. The Mediterranean diet provides an abundance of spermidine.

Is there an over-the-counter drug that induces autophagy? "We found that the ancient drug aspirin and its active metabolite salicylate stimulate autophagic flux by virtue of their inhibitory action on acetyltransferase EP300. The inhibition of EP300 results from a direct competition between salicylate and acetyl coenzyme A for binding to the catalytic domain of the enzyme." Yes, aspirin turns out to be yet another calorie-restriction mimic.

Food sources of quercetin (though the alleged cranberry looks like a blueberry to me; and why are the cherries green?). Marijuana leaves (which can be eaten like any other leafy greens) also contain quercetin.


ending on beauty:

I had come to believe what’s beautiful
had more to do with daring
to take yourself seriously, to stay
the course, whatever the course might be.

~ Stephen Dunn, Always Something More Beautiful