*
SCHLAMPEREI FOR SIGMUND FREUD
I see you raging, aging Herr Professor,
fist beating on the slippery
headpiece of the sofa,
as you rail at that American woman poet:
Trouble is, I am an old man —
you don’t think it worth your while
to love me.
Admit it: it was Torschlusspanik,
the terror of the closing door,
life teasing, “Come on, Handsome,
time is running out.”
Not for your “immortality project,”
safe in history’s coffin.
For the last chance to be the Beloved.
But love is shameless in its Schlamperei —
from Schlampe, slut,
slouching by the lamp-post:
“Darling, let’s pretend we do not know
that for one real kiss you’d give up all” —
the soul on her knees in that dingy light.
And you for forty years at Berggasse 19,
the entrance next to the butcher shop,
the butcher’s name also Sigmund —
your rich neurotics forced to pass
bloody slabs and halved carcasses.
Above, the room where you practice
vivisection of dreams —
except for the dream where you sit
in a barber chair, staring at the sign,
You are requested to close your eyes.
But because you said, The voice of the intellect
is a soft one, but it does not rest
until it’s been heard, I forgive you.
And because you knew
that, deeper than lust, everyone wants to die.
What did you care for the Schlamperei
of those “instincts” you tried to track —
The word rhymes with eye.
~ Oriana
(The American woman poet mentioned in the first stanza is H.D.)
*
THE STORY BEHIND “ONE DAY IN THE LIFE OF IVAN DENISOVICH”
By 1973, Aleksandr Solzhenitsyn was already “Russia’s pre‐eminent living writer,” according to the New York Times. His novels, including One Day in the Life of Ivan Denisovich, were well regarded, especially in the West. He had even won the 1970 Nobel Prize for Literature for “the ethical force with which he has pursued the indispensable traditions of Russian literature.”
But The Gulag Archipelago, published on December 28, 1973, by YMCA-Press in Paris, stood alone. Solzhenitsyn called it his “main” work, threading personal experience, history and literature into a sprawling 300,000-word narrative of Soviet prison camps. His broaching of this taboo topic, however, came at a great personal cost.
Alexander Solzhenitsyn in West Germany following his deportation from the Soviet Union in February 1974
Solzhenitsyn’s troubles with Soviet authorities began in 1945. He was a member of the Soviet Army at the time, stationed in East Prussia as World War II drew to a close. Agents from Smersh, the Soviet military counterintelligence agency, arrested the 26-year-old, who had been decorated for heroism in battle and was at the time a loyal Communist Party member, because he had referred to Joseph Stalin disrespectfully in a letter to an old friend.
A court sentenced him to eight years in the constellation of brutal prison camps across the Soviet Union that he later called the “gulag archipelago.”
“This Archipelago crisscrossed and patterned that other country within which it was located, like a gigantic patchwork, cutting into its cities, hovering over its streets,” Solzhenitsyn wrote in the preface to The Gulag Archipelago, which he composed between 1958 and 1968. “Yet there were many who did not even guess at its presence and many, many others who had heard something vague. And only those who had been there knew the whole truth.”
As Solzhenitsyn suggested, writing about the gulag system was verboten in large part because Soviet authorities sought to deny or obfuscate its very existence. Forced labor camps in the Soviet Union began in 1919 under Vladimir Lenin, but their population ballooned into the millions when Stalin rose to power. Under Stalin, more and more intellectuals, dissidents, prisoners of war, rich farmers and innocent people were persecuted, arrested and purged.
In the three-volume book, Solzhenitsyn offered a sweeping story of the Soviet prison system—from arrest to torture, execution, starvation and long hours of toil in the camps. He drew from his personal experience and interviews with hundreds of survivors. The New York Times called the text “a heroic accomplishment.”
Its official reception in the Soviet Union, however, was less enthusiastic due to Solzhenitsyn’s argument that the gulag system was not just an aberration under Stalin, but embedded deep in the rotting core of Soviet ideology.
In the aftermath of The Gulag Archipelago’s publication, Solzhenitsyn was deported from the USSR and stripped of his citizenship. At first he went to West Germany and stayed with leading postwar German author Heinrich Böll. After brief stints in Zurich and Canada, he and his family settled in Cavendish, a village in rural Vermont. In exile—yet close to Dartmouth College’s extensive library—the author labored away on his epic multi-volume novel about the Russian Revolution, The Red Wheel.
Solzhenitsyn’s exploration and critique of Soviet history and authoritarian power never ceased. Neither did his desire to return home from a long exile that began with his arrest in 1945—the moment when, he wrote, “the gate behind us, the gate to our past life, is slammed shut once and for all.”
https://www.smithsonianmag.com/smart-news/discover-the-story-behind-a-legendary-expose-of-the-brutality-of-the-soviet-union-180985658/?utm_source=firefox-newtab-en-us
from City Journal:
Aleksandr Isayevich Solzhenitsyn was a writer of immense talent and spiritual depth, the century’s greatest critic of the totalitarian immolation of liberty and human dignity, a thinker and moral witness who illumined the fate of the human soul hemmed in by barbed wire in the East and by a materialist cornucopia in the West. The mature Solzhenitsyn remained remarkably faithful to the twin imperatives of courage and truth. A modern Saint George, he slew the dragon of ideological despotism with rare eloquence, determination, and grit. For that alone, he deserves to be forever remembered.
He had two great “missions,” as he called them: to bear witness to those who suffered and perished in the Soviet prison-camp system (and accompanying manifestations of Communist repression); and to trace the roots of the Soviet tragedy in the great unfolding “red wheel,” especially in the February Revolution of 1917, which preceded the October Revolution later that year and made it all but inevitable.
He is the author of two great “literary cathedrals,” as the Solzhenitsyn scholar Georges Nivat put it: The Gulag Archipelago and The Red Wheel, two “experiments in literary investigation” that will require decades to come to terms with in any adequate way.
Many silly and even pernicious things have been written about Solzhenitsyn by those who confuse love of truth with dogmatism, and the “active struggle with evil,” as Solzhenitsyn once described it, with moral fanaticism. And among these tendentious critics are those who mock patriotism, repentance, self-limitation, and liberty under God—that is, all of Solzhenitsyn’s enduring themes and commitments.
Solzhenitsyn’s was a long but ultimately rewarding journey. Since early boyhood, he wished to become a writer. One of the key chapters of August 1914 (the first volume of The Red Wheel), depicting the Battle of Tannenberg and the suicide of General Samsonov, was already written in the fall of 1936, before Solzhenitsyn was 18. He dreaded what kind of writer he might have become without the experience of the Gulag.
It was in the prison camp in 1945 and 1946, as he describes it in various interviews and in “The Ascent”—his account in the central section of The Gulag Archipelago of how the scales of ideology fell from his eyes—that he was “completely cleansed of any Marxist belief.” His cellmates helped him see the light of truth and the unparalleled mendacity of the ideological lie, the destructive illusion that evil is not inherent in the human soul, that human beings and societies can be transformed at a revolutionary stroke, and that free will is subordinate to historical necessity.
Solzhenitsyn’s life is marked by this great paradox: in the camps, cold and hungry, and subject to limitless repression by camp guards and camp authorities, he recovered an appreciation of the purpose of things.
By the age of 27, Solzhenitsyn had an outlook that he retained until his death on August 3, 2008, a worldview deepened only by his subsequent reaffirmation of faith in the living God near the end of his imprisonment at Ekibastuz in the Kazakh steppe in the early 1950s. In the introduction (“Inception”) to his autobiographical poem The Road (7,000 lines composed in the camps and memorized without benefit of pen and paper), Solzhenitsyn had already spoken of his burning desire to convey the experience of the camps to an uncomprehending world. I quote from a translation being prepared by the author’s son, Ignat:
With lucid understanding, not with ire—
The time to write is now, precisely now!
Solzhenitsyn wrote with “lucid understanding,” and with no small dose of scorn, about the “Progressive Doctrine,” the inhuman ideology that justified terror and tyranny as no regime or ideological movement had ever justified the killing and repression of real or imagined “enemies of the People.” He showed that the heart of Bolshevism lay in a monstrous coming together of violence and lies that gave rise not to mere dictatorship but to a totalitarianism that transformed betrayal and lying into “forms of existence.” This totalitarianism demanded fierce resistance, both for the sake of liberty and for the right of the human soul to breathe freely, with the dignity afforded it by God.
Solzhenitsyn would become the most eloquent critic of ideological revolution, the “vain hope that revolution can improve human nature,” as he said in the Vendée in the fall of 1993. He saw many affinities between the French and Russian revolutions, not least the shared hope that revolution could transform human nature and regenerate the human race.
Instead, Solzhenitsyn stood for repentance and self-limitation, and for a conception of self-government (beginning with the arts of local liberty) that emphasized the importance of civic virtue. Here he was indebted to Tocqueville, to the zemstvos or nineteenth century Russian provincial and local councils, and to the experience of local liberty that he witnessed (and admired) during his western exile in Switzerland and New England between 1974 and 1994. He spoke with admiration for such local liberty in his farewell to the people of Cavendish, Vermont, on February 28, 1994. It was a tradition of liberty from the bottom up much needed in contemporary Russia, he observed.
Solzhenitsyn, denounced by some as a supporter of messianic nationalism (something he always repudiated, even when it manifested itself in a great writer like Dostoevsky), also provided an enduring model of constructive patriotism. He loved Russia profoundly but refused to identify his wounded nation with a Soviet despotism that stood for religious repression, collective farm slavery, and the elimination of political liberty and a tradition of literary reflection that spoke to the health of Russia and the permanent needs of the soul. He wanted Russia to abandon destructive dreams of empire and turn inward, but without forgetting the sorry fate of the 25 million Russians left in the “near abroad” after the break-up of the Soviet Union.
How one evaluates Solzhenitsyn tells us much about how one ultimately understands human liberty: Is it rooted in the gift of free will bestowed by a just, loving, and Providential God? Or is it rooted in an irreligious humanism, which all too often leads to human self-enslavement, as we saw with the totalitarian regimes of the twentieth century? Solzhenitsyn’s reasonable choice for “Liberty under God” has nothing to do with mysticism, authoritarianism, or some illiberal theocratic impulse. Those who attribute these positions to Solzhenitsyn cannot provide a single sentence to support such misrepresentations.
Solzhenitsyn spoke in the name of an older Western and Christian civilization, still connected to the “deep reserves of mercy and sacrifice” at the heart of ordered liberty. It is a mark of the erosion of that rich tradition that its voice is so hard to hear in our late modern world, more—and more single-mindedly—devoted to what Solzhenitsyn called “anthropocentricity,” an incoherent and self-destructive atheistic humanism. Solzhenitsyn asks no special privileges for biblical religion (and classical philosophy), just a place at the table and a serious consideration within our souls.
In 1998’s Russia in Collapse, he forcefully attacked “radical nationalism…the elevation of one’s nationality above our higher spiritual plank, above our humble stance before heaven.” And he never ceased castigating so-called Russian nationalists, who preferred “a small-minded alliance with [Russia’s] destroyers” (the Communists or Bolsheviks). He loved his country but loved truth and justice more.
But as Solzhenitsyn stated with great eloquence in the Nobel Lecture, “nations are the wealth of mankind, its generalized personalities.” He did not support the leveling of nations in the name of cosmopolitanism or of a pagan nationalism that forgot that all nations remain under the judgment of God and the moral law. In this regard, Solzhenitsyn combines patriotism with moderation or self-limitation. One does not learn from Solzhenitsyn to hate other peoples, or to deny each nation’s right to its special path, one that respects common morality and elementary human decency.
In a revealing conversation with the Italian Russianist Vittorio Strada on October 20, 2000, Solzhenitsyn was asked what his message was to the young generation of today. He told them to resist “the temptation of consumerism” and to develop a much-needed “interior shut-off mechanism, internal limits.” Once again we hear the voice of the great advocate of voluntary self-restriction. Solzhenitsyn also warned the young against historicist complacency, the view that we are necessarily “entering into a happy century,” after the wars and tyrannies of the twentieth century.
Solzhenitsyn always remained sensitive to the great harm caused by utopian illusions. Yet once again, the great Russian writer and moral witness ends on a note of hope. “Life will be hard,” he tells them, “but circumstances will never defeat the human will,” as long as that will “is concentrated and focused on what is true.”
We owe gratitude to this great man and writer for his courage, his fidelity to truth and to freedom (rightly understood), and his timeless reminder that, among the clamor of modern life, we should not lose focus on the underlying purpose of the human adventure, or the freedom granted to us. Beyond totalitarianism, there is the task of building regimes of self-government worthy of all the possibilities—and limits—of the human soul. In that great task, Solzhenitsyn remains as relevant as ever. And his art—and the profound philosophical reflections and insights embedded in his works—remain a gift to us all.
Solzhenitsyn embarked on a train journey across Russia in the summer of 1994, after nearly 20 years in exile.
https://www.city-journal.org/article/solzhenitsyn-a-centennial-tribute#:~:text=Aleksandr%20Solzhenitsyn:%20Reflections%20on%20totalitarianism%27s%20greatest%20critic%20%7C%20Daniel%20J.%20Mahoney%2C%20City%20Journal
*
WHAT PUTIN “GAINED” IN THE WAR WITH UKRAINE
Russia’s inflation is 35%, the national wealth fund is empty, there is no money to pay soldiers, the Soviet weapon stockpiles are finished, a million people have been maimed or killed, and after Ukraine began striking deep into Russian territory, people — even in Siberia — are getting worried that the war could come to their own homes.
In close to three years, nothing has gone according to Putin’s plan.
Russia failed to take Kyiv and was defeated at Kherson, Mykolaiv, Zaporizhia, Chernihiv, Sumy, and Kharkiv.
Russia lost at least 200,000 KIA, 20,000 combat vehicles.
At least 9 Russian generals killed.
Bottomless Soviet-era stockpiles of weapons exhausted, and now Russia has to beg Iran and North Korea for munitions and missiles.
Russia had to announce mobilization — the first time since World War II.
Black Sea fleet decimated, including the flagship ‘Moskva’.
Russia expended 10,000 missiles and failed to destroy Ukraine’s power grid, which Putin himself admitted was the goal.
Russia’s oil industry is taking severe blows from Ukrainian strikes.
Multiple oil depots and ammo depots obliterated.
After nearly 3 years of total war, Russia couldn’t occupy a single regional city (3 regional cities were occupied by Russia in 2014).
Ukraine still controls 80% of its territory.
NATO’s border with Russia grew by 1,300 km after Finland and Sweden joined the alliance. Missiles can now reach Putin’s residence in Valdai (between St. Petersburg and Moscow) even faster.
Ukraine acquired advanced western weapons and air defense systems that exceed the Russian analogues.
Russia lost the lucrative European gas market and is losing oil markets.
Nearly a million young, educated Russians have emigrated.
Ukraine invaded the Kursk region of Russia 5 months ago, and Russia has been unable to squeeze the Ukrainians out — despite bringing in North Korean troops to help.
Russia lost its bases in Syria and was unable to save the regime of Bashar al-Assad.
Russian troops got kicked out of Nagorno-Karabakh by Azerbaijan.
Russia’s economy is in deep trouble due to sanctions and to overheating caused by the militarization of productive industries.
The central bank’s base interest rate is 21%, home loans run at 30%, consumer loans at 40–50%.
Russia’s reliance on China has greatly increased.
Ukraine’s international weight skyrocketed.
Russia’s military potential will take decades to restore.
After nearly 3 years of total war, Russia is still stuck in Donbas.
Things are not going well for Vladimir Putin.
Not going well at all. ~ Elena Gold, Quora
One of Russia's overage recruits
*
PUTIN’S RUSSIA ROSE LIKE HITLER’S GERMANY
The striking similarities between Vladimir Putin’s Russia and Adolf Hitler’s Germany are not accidental. Both regimes had — the past tense is intentional — the same historical trajectory because both were the product of imperial collapse and its destabilizing aftermath on the one hand and the emergence of a strong leader promising to make the country great again on the other.
In contrast to most empires, which decay and progressively lose their colonial possessions over time, both Wilhelmine Germany and Czarist Russia collapsed — swiftly and completely — at the height of their power in 1917-1918.
Decay inures imperial elites to the loss of colonies, enables them to formulate different ideologies centered on the nation state, and reduces the number of institutional and economic ties between the imperial core and its colonies. The Ottoman Empire is an excellent example of the decay dynamic. Turkey’s founding father, Mustafa Kemal Atatürk, fought the Greeks but was perfectly satisfied with the Turkish state.
Empires that collapse — usually as a result of a war or some other severe crisis — experience a sudden severing of political ties between the imperial metropolis and the colonies, but the imperial mindset remains dominant in the metropolis and the economic and institutional connections between core and periphery remain strong.
Almost inevitably, the post-collapse economies, societies and cultures of the metropolis experienced enormous disarray — as in Germany in the 1920s and Russia in the 1990s. The blame for this sad state fell on the democratic elites who came to power after the authoritarian empire ended.
Once democracy was discredited, strong men appeared — Hitler and Putin — promising to return their countries to their rightful place in the sun and establishing cults of personality. The Nazis argued that Germany should have one people, one empire, and one Führer; the Putinists claimed that Putin embodied the state. Nazi propaganda emphasized Hitler’s genius and benevolence; Putinist propaganda focused on Putin’s virility and ability to outwit the world.
In such circumstances, the former metropolis had every incentive to rebuild the old empire. Imperial revival was popular, enhanced elite legitimacy, promised to revive the economy and extirpate humiliating memories of collapse, and seemed to guarantee great-power status.
Central to their attempts at re-imperialization was the false claim that their ethnic brethren in the newly independent colonies were being oppressed: the Germans in Austria, Czechoslovakia and Poland; the Russians in all the post-Soviet states, and especially Ukraine.
Tentative stabs at expansion followed. Hitler grabbed the Rhineland, Austria, and the Sudetenland. Putin grabbed Chechnya, parts of Georgia, and parts of Ukraine. Given their imperial mindsets, militaristic ambitions, personality cults and demonization of minorities (Jews and Ukrainians), it was almost inevitable that Hitler and Putin then embarked on major wars. In 1939, Hitler attacked Poland; in 1941, he attacked the USSR. Putin’s war with Ukraine began on Feb. 24, 2022.
As often happens with leaders who believe their own propaganda, both Hitler and Putin committed strategic mistakes that resulted in their downfall. The Bolsheviks, by contrast, had been able to reestablish most of the czarist empire because their militaries and economies were stronger than those of the former colonies, while the powerful countries of the West were distracted by the war.
Hitler’s and Putin’s fatal error was not to have heeded the Bolshevik example and, instead, to have antagonized a whole array of states with more hard power than they had. Expansion was one thing: Europe and the United States ignored or downplayed it. A major land war was another: it threatened the stability and survival of Eurasia and could not go unheeded.
Hitler’s generals knew they had lost when they failed to win the Battle of Britain and the United States entered the war. It took millions of dead and the Holocaust before Germany was finally defeated and Hitler committed suicide in his bunker.
Putin’s generals also appear to have known they would not win after their attempt at a blitzkrieg failed to capture Kyiv. It has taken thousands of dead and Russia’s genocide of Ukrainians to align scores of countries — and, in particular, the United States and the United Kingdom — with Ukraine and to provide it with the heavy weaponry it needs to defeat Russia.
Fittingly, Putin reportedly also resides in a bunker. In all likelihood, that’s where he, too, will meet his end.
The death and destruction will have been as enormous as they will have been unnecessary. But, as after World War II, the West again will have the opportunity to create a security architecture that provides for Russia’s de-Putinization and a durable peace.
*
FOUR WORDS OF WISDOM FROM ARNOLD SCHWARZENEGGER
“Stay Busy. Be Useful.”
Although not as universally recognized as "I'll be back", these four words have been a guiding star for Arnold Schwarzenegger, who has achieved massive success from bodybuilding to the silver screen, and even politics. This mantra, passed down from his father and deeply ingrained during his formative years, spurred him on a lifelong journey of continuous growth, instilled a relentless drive, and propelled him to dizzying heights of success. But more than a motivational quote, it encapsulates an essential component of our human experience — a sense of purpose.
Psychological research has repeatedly emphasized the relationship between a sense of purpose and well-being. Having a purpose or meaningful goal to work towards can act as a counterforce against depressive symptoms and a catalyst for happiness. But how does staying occupied or feeling useful translate into psychological benefits? Let's unpack some research.
The Benefits of Purpose
McKnight and Kashdan (2009) define purpose and highlight its role as a central aspect of well-being. They state that purpose isn't a simple yes-or-no matter; it's multi-dimensional. This encompasses three main areas: scope, strength, and awareness. Scope reflects the role purpose plays across different areas of a person's life. Strength involves the impact of purpose on one's actions and emotions. Awareness, on the other hand, involves the person's consciousness and ability to articulate their purpose. Interestingly, even when a purpose is not at the forefront of our minds at all times, it can still unconsciously motivate us and influence our behavior, especially when it is tied to meaningful cues in our environment.
Importantly, having a strong sense of purpose has direct health benefits too. McKnight and Kashdan outline that those who had a solid sense of purpose experienced less psychological distress and more well-being. It’s as if having a purpose serves as psychological armor, shielding us from life's adversities and offering tranquility in tumultuous times.
So, having a cause greater than oneself can soften the blow of life's inevitable challenges. A longitudinal study by Kim et al. (2014) supported this, suggesting that a robust sense of purpose was associated with greater use of several preventive healthcare services and also fewer nights spent hospitalized. Similarly, Koizumi et al. (2008) found that male seniors with a well-defined purpose experienced a lower risk of cardiovascular diseases.
To be clear, purpose is not about mindless hustling. It's about dedicating oneself to meaningful tasks and goals. When we have something to strive for, something that gives us a sense of purpose, we are better equipped to face life's challenges. Though a strong sense of purpose doesn't necessarily spare us from hardships, it equips us with the tools to tackle them. It promises not an effortless existence, but a fulfilling one. Being productive and useful, not just for ourselves but also for those around us, strengthens our sense of purpose, which adds to our own happiness and contributes to the well-being of others.
Finding Your Purpose
In a world that often confuses busyness with purpose, remember that true purpose stems from an inner alignment with our deeply held values and aspirations. Think of purpose as an anchor that keeps us steady, even when life gets stormy. But discovering and crafting our purpose is easier said than done. It requires patience and introspection. It requires real work. It’s not going to be easy. But it is going to be worth it in the long run.
The clearer you are about who you are and what your purpose is in life, the greater your contribution to this world. So, living by the motto "Stay busy. Be useful." might just be a good starting point for a more meaningful life.
https://www.inc.com/jessica-stillman/arnold-schwarzenegger-says-a-happy-life-boils-down-to-these-4-words-psychology-agrees-with-him/91107058
*
PERSONALITY TRAITS ASSOCIATED WITH LIFELONG SINGLEHOOD
In 2023, 46.4% of American adults were single. The proportion of adults who have never been married has never been higher.
Researchers sampled more than 77,000 individuals over age 50 to compare the Big Five personality traits of lifelong singles with those of people in a committed relationship.
They found that lifelong singles reported lower levels of extraversion, openness, and conscientiousness compared to ever-partnered people and rated life satisfaction lower.
It’s trendy to be single. According to a 2023 data release from the U.S. Census Bureau, 46.4% of American adults were single. The proportion of adults who have never been married reached record highs: Roughly 32% of women and 37% of men have never tied the knot, per the report. Those rates were 22% and 30% in 1980.
The increasing share of single and never-married individuals can be attributed to several factors, including greater longevity and more women in the workforce. And the trend toward singlehood is here to stay if Gen Z is any indication.
They’ve embraced a new style of in-between relationship: the “situationship.” Writing for CNN, recent college graduate Sara Forastieri Vicente described it as “more than a friendship but less than a committed relationship” involving “both emotional and physical intimacy.”
“We’ve created our own small world in this vast universe of romance and love, one that normalizes fluidity and casualness in romantic partners,” she wrote.
But is eschewing attachment a recipe for emotional fulfillment? If a recent study published in Psychological Science is any indication, perhaps not.
Personality traits and relationship status
An international team of researchers sampled more than 77,000 people over fifty years old living in 27 European countries and Israel. They sought to compare the Big Five personality traits of lifelong singles with people who have been in committed relationships. The Big Five personality traits are:
Openness. Reflects how curious and receptive someone is to novel experiences.
Conscientiousness. Describes how organized, responsible, and detail-oriented a person is.
Extraversion. Indicates how outgoing and sociable someone is.
Agreeableness. Represents how cooperative and empathetic someone is.
Neuroticism. Measures how prone someone is to negative emotions and emotional instability.
The researchers found that lifelong singles reported lower levels of extraversion, openness, and conscientiousness. They also rated their life satisfaction as lower than ever-partnered people did.
Personality differences between single and partnered individuals were relatively small for conscientiousness and openness — about three points lower on a 100-point scale. But the divide was greater for extraversion and life satisfaction. Lifelong singles scored just under six points lower in extraversion and just over four points lower in life satisfaction.
In many respects, the findings conform to conventional stereotypes. Extraverted and open individuals are more likely to get out, meet people, and potentially find themselves in a relationship. Moreover, being partnered can also force individuals to try new things. Conscientiousness is often prized in a partner. Being organized and responsible facilitates dating and cohabitation. Relationships can also encourage individuals to develop these skills.
The authors specifically looked at older individuals because the data would be more likely to capture people who are single or partnered by choice. Greater age also allows more time for these relationship decisions to affect personality.
Should Gen Zers be concerned about these results? Will a life full of “situationships” ultimately be less satisfying and dull their personalities? It’s uncertain. Core aspects of an individual’s personality tend to be stable, but significant changes can occur over decades. Moreover, the present study can’t parse whether lifelong singles’ personalities and life satisfaction differ due to their relationship choices or if their personalities dictate their relationship style.
The researchers note that what it means to be single is changing, so the results don’t necessarily portend the future for today’s young people.
They write: “More recent cohorts likely differ from older cohorts in norms and acceptance of singlehood, given that the importance of marriage for well-being is declining, that more people choose to stay single, and that younger cohorts report lower importance of partnership for happiness.”
https://bigthink.com/neuropsych/lifelong-singles-differ-in-personality/?utm_source=firefox-newtab-en-us
*
THE BRUTALIST: IMPRESSIVE BUT FLAWED
One of the many things that makes Brady Corbet’s “The Brutalist” so essential is how it defies easy categorization. It is “about” so many things without specifically hammering, highlighting, or bullet-pointing them. Sure, it’s impossible to miss the commentary on capitalism embedded in the script by Corbet and Mona Fastvold. Still, it’s also a story of immigration, addiction, Zionism, architecture, inequity, class, violence, and even filmmaking. The word ambitious is overused in modern criticism, but the very existence of “The Brutalist” feels like a miracle: An original story shot on VistaVision cameras, released in 70mm, complete with overture and intermission. It’s a film that turns inward into itself, winding its themes around its characters like a great American novel.
Adrien Brody does the best work of his career as László Tóth, who is introduced in an essential, tone-setting sequence. At first, it’s hard to tell where he is, surrounded by people in an overcrowded space with the cacophony of conversations around him and the booming score from Daniel Blumberg starting to make itself known. As he moves through the crowd, he pushes himself through doors and into sunlight, his face bursting with happiness at the sight of the Statue of Liberty, but Corbet and cinematographer Lol Crawley warp the moment by presenting the iconic structure upside down, at the top of the frame. The statue shifts to the side, but it’s never upright, a warped symbol of the American dream, an overture of the film’s main theme to follow in the form of an unforgettable image. This prologue also includes a quote from Goethe that feels like the most pronounced Corbet & Fastvold get in how to read what follows: “None are more hopelessly enslaved than those who falsely believe themselves free.”
Tóth believes himself free, getting a job at his cousin Attila’s (Alessandro Nivola, always good) furniture shop, notably named Miller & Sons despite there being no Miller and no sons. Like that floating statue, Corbet & Fastvold are seeding themes that will grow later, playing with the artifice of capitalism, a structure that sells the comfort of a family business over actual artistry.
When Tóth designs a chair to be put in the front window, Attila’s wife Audrey (Emma Laird) tells him it looks like a tricycle. This is a film that experiments with form while also being narratively about how people exploit artistry and value function over expression. It will eventually become a story of a hollow monument, a building with a benefactor who wants to make something for everyone but has no creative passion of his own to put in this empty structure.
László’s life changes when Harry Lee Van Buren (Joe Alwyn) comes to Miller & Sons to hire them to remodel his father Harrison’s library while he’s away from home. The project falls apart when Harrison (Guy Pearce) returns home in a fury, angry that his house is being torn apart by people he’s never met, and refusing to pay. The drama leads to an emotional decision by Attila, who kicks László out of his home, sending him into an addiction spiral with his friend Gordon (Isaach de Bankolé, perfectly understated), until Harrison returns with an apology. He brings László into his world of upper-class snobs, people who display their wealth like it has any meaning, even to a Holocaust survivor, who they see as another object to own.
Harrison offers to help László bring his wife Erzsébet (Felicity Jones) and their niece Zsófia (Raffey Cassidy) over from Europe, but it’s a prelude to what he really wants: The design of a community center that will serve as a tribute to Harrison’s recently deceased mother. It’s a place to gather, but also a place that he controls, and one that he claims will look forward but is anchored in the past by being a monument to his mother.
There’s a key scene in the film just before Harrison makes this public proposal in which he asks László why he’s an architect, and the survivor speaks about how his structures have reportedly survived the war and how they will speak for generations after the conflict. “My buildings were designed to endure such erosion,” he says. Kind of like film. It’s not hard to read “The Brutalist,” a work with technical ambition like no other this year, as a commentary on its own existence, a monument to the art of filmmaking as much as anything.
Harrison seeks to control László from the very beginning. He uses rage in that first scene, he literally throws money at László in a key second-half scene (and then asks him to give it back), and the climax of the pre-intermission first half sets their relationship perfectly. After offering legal assistance to make László’s dreams come true in a manner that will surely tie them together, Harrison then basically forces him to move in by not giving him a ride back that night and making him listen to his ideas the next day.
Harrison will eventually cross all lines of physical and moral righteousness, a clear parallel to how capitalism destroys art, taking from it what it wants and needs before disposing of it. Some have criticized the sharp turn that the film takes with Harrison and László, but repeat viewings make it clear how much that kind of brutal ownership is there from the very beginning.
Of course, an American epic like “The Brutalist” only works if the cast is on the same page as the creator, and the majority of the performers here deliver. Jones feels a bit miscast and stumbles in a late-movie dramatic confrontation, but Brody & Pearce make up for any flaws in the ensemble by carrying the second half of the film. Brody’s performance is one of broad expressiveness, the overflowing emotion of seeing the Statue of Liberty or the tears that roll down his face on hugging his cousin, and then watching that joy leave his countenance as the world around him erodes it away. It’s a strong contender for the best performance of the year in any film. Pearce balances him perfectly, playing Harrison as a force of selfish nature, giving the film a much-needed jolt from his very first moment on-screen, and perfectly capturing the kind of wealthy monster who discards anyone around him once he’s used them up.
“The Brutalist” is also a technical marvel, most notably in Crawley’s fluid cinematography, crafting compositions that look gorgeous in 70mm without ever feeling overly showy. His work is organic and beautiful, and it’s anchored by excellent editing from Dávid Jancsó and an effective score by Blumberg. The sound design as a whole is a load-bearing beam in this film’s construction, from the hum of that first scene to the many sequences of men at work, the background noise of the “American Dream” after World War II.
Some will look at the 215-minute runtime of “The Brutalist” and bring out that dreaded word when it comes to serious, long movies: pretentious. Of course it’s pretentious. You couldn’t make this movie effectively without pretension. But one person’s pretentious is another’s ambitious, and I wish we had more movies this pretentious, this unapologetic, this willing to do more with film than so many even consider.
“The Brutalist” is a work that incorporates well-known world history into two of the definitive forms of expression of the 20th century in architecture and filmmaking, becoming a commentary on both capitalism and art. Both are essential to the story of the human experience. Both can be beautiful. Both can be brutal.
https://www.rogerebert.com/reviews/the-brutalist-film-review-2024
from another source:
~ There is no swerving the irony that The Brutalist, a sweeping drama about the American immigrant experience, lands in cinemas in the very week that the 47th President of the United States starts the process of booting loads of them out.
Adrien Brody already has a Golden Globe for his lead performance as a Hungarian-Jewish architect who, released from the horrors of Buchenwald concentration camp, arrives in the United States and begins to rebuild his life and career.
There are many impressive things about this film, not least the acting, but for me it too often loses its narrative grip in the second act, veering off on tangents that feel unnecessary, distracting and self-indulgent.
Ostensibly, construction is what The Brutalist is largely about. Laszlo Toth (Brody) is a Bauhaus-trained architect whom we first meet amid the tumult of middle-Europe’s liberation from the Nazis. But before long he is arriving in New York Harbor where he gets a skew-whiff sighting of the Statue of Liberty, heavily symbolic of experiences to come as the American Dream turns out to be, if not illusory, decidedly compromised.
To begin with, though, it is full of heady promise. There is a very moving scene when Toth meets up with his already-assimilated cousin Attila (Alessandro Nivola) and the pair hug as if daring fate ever to pull them apart again.
Attila owns a furniture store in Philadelphia called Miller & Sons. He has anglicized his name and invented offspring because, he says, Americans love a family business. That’s the sort of thing that European Jews have to do to fit in, but the intense, passionate Toth never really does; he never entirely escapes the scourge of anti-Semitism which, in a way that only eventually becomes clear, even shapes the style and philosophy of his brutalist architecture.
Attila gives him a job, but after a commission goes wrong the cousins fall out.
Toth, increasingly reliant on alcohol and heroin, is forced to work on building sites until he falls under the patronage of a volatile millionaire industrialist and socialite called Harrison Lee Van Buren (Guy Pearce).
Once Van Buren hires him to design a mighty community center named after his beloved late mother, Toth’s American Dream seems complete, despite the sly enmity of his patron’s entitled and supercilious son Harry (Joe Alwyn).
Along with dozens of tons of concrete, he pours his heart and soul into the project. Yet it is built, metaphorically, on shifting sands. Toth might be difficult, stubborn and obstreperous, but the Van Burens are not worthy of his heart and cannot buy his soul.
All this unfolds absorbingly, but the narrative takes an unwelcome lurch sideways following that blessedly welcome intermission when, thanks to the Displaced Persons Act and some string-pulling by Van Buren, Toth’s osteoporosis-stricken wife Erzsebet (Felicity Jones) and niece Zsofia (Raffey Cassidy), an elective mute, join him in America.
Erzsebet’s arrival introduces a jolting psychosexual dimension to the drama which it simply doesn’t need. And later, be warned, there is a rape scene all the more shocking for being entirely unexpected.
As an exercise in story-telling, The Brutalist has genuine grandeur and ambition. It is beautifully shot and scored, and in several ways stands comparison with the great coming-to-America movies such as The Godfather: Part Two (1974), except that it replaces guns with girders.
Brody is excellent, so too Pearce. But the film does not soar like I kept hoping it would.
It whisks us well beyond the mezzanine level of expectation, but never to the roof.
https://www.bostonglobe.com/2025/01/09/arts/the-brutalist-adrien-brody-brady-corbet/
Oriana:
I was struck by the concentration-camp aura of some of the interiors — it made me think of the gas chamber in Auschwitz, though it was the walled-in entrance to the chamber that evoked that for me even more (known to me from photographs only). In perusing reviews, I found this: “the interiors of the community center were made to resemble the concentration camps László was subjected to, meaning his trauma was developed into his work.”
But ultimately the message of the movie is that it is the end result that matters, as László’s niece states in her speech: “No matter what the others try and sell you, it is the destination, not the journey.”
https://screenrant.com/the-brutalist-ending-explained/
*
SHOULD WOMEN STAY SINGLE?
More women pause before rushing into marriage. Historically, the institution represented women's only path for financial security, but social and economic pressures have subsided in advanced economies. Research studies along with cultural and demographic trends support decisions to delay marriage or not marry at all. “Women are increasingly educated, more economically independent, gaining more opportunities outside marriage and embracing freedom,” explains Joseph Chamie. “All these factors contribute to single women capable of maintaining a stable financial life.”
Marriage requires more work, caring for larger homes along with children. Although gender wage gaps persist, single women report being healthier and less stressed while enjoying more freedoms. Increasingly, government and workplace policies protect female workers from harassment and discrimination. One consequence of women delaying marriage to pursue education and work opportunities: Increasing numbers of men, especially the low skilled, struggle to find partners. Women expect gender equality and deliberate before making a lifelong commitment.
Women worldwide have high expectations for marriage, and more decide to stay single to pursue their own ambitions. One in seven Japanese women remain single.
Evidence concerning the state of marriage strongly suggests that women should give serious thought before making the momentous commitment to tie the knot — and demographic trends are making such a pause easier to do.
A number of studies report that single women tend to be healthier and less depressed, living longer than married women. Single women generally experience fewer stresses and compromises than married women. Furthermore, single women feel more empowered, enjoying greater personal autonomy and freedoms than married women largely because they don’t juggle challenging multiple roles at work and home.
Wives are generally less happy than single women, with many resentful of being married to the wrong man. Consequently, large numbers of marriages, estimated at no less than half in France, Russia and the United States, end in divorce or separation. Women are far more likely to file for divorce and report feeling happier after ending their marriages.
Such unhappiness is not limited to women married to men – women married to other women are more likely to divorce than married male same-sex couples. Women have higher expectations of marriage than men and higher demands for meaningful communication and relationship quality, especially with regard to affection and intimacy.
One oft-noted concern is the matter of sexual relations. Wives often complain that husbands want sex with little attention to other needs. This concern is exacerbated by the traditional view still held by many that marriage implies automatic conjugal rights, with husbands entitled to intimacy any time and wives duty bound to oblige.
Women’s expectations of marriage can be hard to satisfy, increasing the risk of disappointment and questions about whether marriage was the right decision. In many instances, husbands are unaware of wives’ dissatisfaction. Consequently, husbands are more likely than wives to be surprised by requests for divorce or separation.
Increasingly, both husbands and wives work outside the home. However, wives also generally manage the house, organize children’s lives and provide care to aging parents. Despite limited progress in the sharing of domestic and familial responsibilities, the wife, even when employed, is still viewed as the homemaker, primary parent and principal caregiver.
In traditional societies, especially throughout most of Asia, marriage and family roles are a bundled package, especially for women. In general, marriage, rearing children and caring for elderly parents are linked. Once a woman marries, she is expected to put aside her personal goals to prioritize family responsibilities. Given those normative expectations, growing numbers of women reject the package and decide to stay single.
Consequently, more women marry later or not at all. In Japan, for example, one in seven women were unmarried by age 50 in 2015, more than four times the level in 1970. In South Korea the proportion never married among women aged 30 to 34, 1.4 percent in 1970, reached nearly 30 percent by 2010. Similarly in China, 30 percent of urban women are single in their late 20s, compared to less than 5 percent in 1970.
Many countries, especially those in the West, have banned discrimination against women and promote policies of gender equality. However, most married women continue to make compromises in their professional ambitions and personal lives. And on average, married women do more of the housework than women in cohabiting relationships with men.
Globally, women spend far more time than men on unpaid work, including housework, child and adult care, shopping and volunteering. Even among Scandinavian countries where the participation of men in unpaid work is comparatively high at about three hours per day, women still spend more time on those activities. The lowest levels for men’s participation in unpaid work, less than an hour per day, are observed in India, Japan and South Korea.
A recent study in the United States found that mothers with a husband or live-in male partner sleep less and do more housework than single mothers. Why mothers do more housework when there’s a man in the household was not determined. Another study reported that husbands create about seven hours of extra housework a week for their wives.
Most women are not prepared to return to the matrimonial inequalities of the past, when husbands were household heads, controlling the finances and property and expecting wives to love, honor and obey on their terms. Having gained legal and economic rights in marriage, most women expect to be equal partners within a marriage as well as in divorce or separation. Women are increasingly educated, more economically independent, gaining more opportunities outside marriage and embracing freedom. All these factors contribute to single women capable of maintaining a stable financial life.
In general, husbands have not demonstrated willingness to increase their relatively low contributions to housework and childcare. In addition, many men, particularly in traditional settings, do not want to marry women who are equals or earn more money than they do. Consequently, as women make gains, many no longer view marriage as an attractive option and stay single longer than in the past.
Among OECD members, female mean ages at first marriage are well above 27 years in virtually all countries and exceed 30 years in most. One of the largest increases between 1990 and 2017 took place in Hungary, where the mean age of first marriages for women jumped from 22 to 30 years.
Meanwhile, people try alternatives to marriage. While marriage can have negative effects on women’s health and longevity, dog ownership provides benefits for both men and women, such as lower blood pressure and reduced cholesterol. Dogs readily express appreciation and provide nonjudgmental companionship. They are comparatively easy to train and don’t come with in-laws.
Women historically had little choice in the matter of marriage. Remaining single conflicted with pervasive sociocultural norms and resulted in considerable social pressures to marry. The social stigma of remaining single for women was reflected in terms such as old maid, spinster, vieille fille, alte Jungfer, solterona, zitelle, sheng nu, Christmas cake and leftover. Significantly, families and society have long regarded husbands, earning a paycheck and controlling distribution, as providing financial security for women.
Today, women face considerably less social and economic pressure to marry than in earlier periods. Article 16 of the Universal Declaration of Human Rights establishes the right of both women and men to make their own decisions, freely and with full consent, whether to marry. Nevertheless, forced marriage of women and even girls continues in many countries, especially in Africa and Asia.
Worldwide, it is estimated that millions of girls and young women are married against their will before age 18. In 42 countries, one third of girls are forced into child marriages. In addition to the traditional ideals of honor, shame and avoidance of sexual activity outside marriage, poverty is a major reason why parents force young daughters into marriage. By some estimates, as many as half of the world’s marriages are arranged, a term open to interpretation.
Perhaps an unintended consequence of women’s independence, delayed marriage and increased singlehood is the rise of “incels,” or the involuntarily celibate movement. Growing numbers of young men feel excluded from romance with little or no access to a suitable sexual partner, resulting in bitterness, misogyny and in some cases violence.
With greater gender equality, expanding opportunities for women and increasing individualism, women are more deliberate in making fateful decisions about marriage, with more deciding to remain single.
https://archive-yaleglobal.yale.edu/content/should-women-stay-single
*
CAN THERAPY CAUSE HARM?
The art of ‘being for another’ – following, listening to and making sense of another person’s world – has been practised for millennia. Humans have always discussed their lives, their values and their problems, trying to find meaning, solace and joy. Experts at this sort of discussion have been called wise women, shamans, priests – and now therapists. Then, starting with Sigmund Freud, came a series of attempts to create a science of psychotherapy out of it.
But there is very little science to it.
Being there for another person is uncomfortable. It is difficult. There is no peace, since the other continually changes, but that is the art. Doing it well takes experience, intelligence, wisdom and knowledge. Apart from a weekend’s worth of basic ground rules, it cannot be taught. From observing the people who do it well, I have concluded that it is an attitude perfected by seeing thousands of clients, reading hundreds of books on philosophy, art, science – and some trashy romance novels. It is enhanced when the practitioner has voted for a range of political parties, had a number of careers, and adhered to a religion or three – how could you possibly understand the anticipation, the fervor or the profound loss of purpose in seeking, finding and losing God unless you had undergone something like that yourself?
I became a psychotherapist and psychologist to maximize the good I could do in the world. It seemed obvious that helping people by engaging with the root of their suffering would be the most helpful thing to do. I also became a child psychotherapist to address the roots of suffering in childhood, where they seemed to lie. I experienced how deepening into a feeling could transform it, and learned about pre-natal trauma; I even wrote a doctorate on trauma.
Now, two decades into my career, I practice, lecture, supervise and write about all of these things, but increasingly I reject everything that I learned. Instead, I practice the art of ‘being for another’, an idea that arose in conversation with my colleague Sophie de Vieuxpont. I’m a mentor, a friend in an asymmetrical friendship, and a sounding board and critical ally assisting people as they go through the complexities, absurdities, devastations and joys of life.
Along the way, over years of practice, I lost faith that awareness was always curative, that resolving childhood trauma would liberate us all, that truly feeling the feelings would allow them to dissipate, in a complex feedback loop of theory and practice.
It started with a return to an old interest in evolutionary biology, prompted by the release of Robert Plomin’s book Blueprint (2018). An account of twin studies, the book draws upon decades of twin statistics from several countries, and the numbers are clear: childhood events and parenting rarely matter much in terms of how we turn out.
That caused me to re-read Judith Rich Harris’s book No Two Alike (2006), which also examined twin studies along with wide-ranging studies of other species. Harris proposed that the brain was a toolbox honed by evolution to deliver sets of skills, leaving each of us utterly unique.
These books are perhaps summed up best in the second law of behavioral genetics: the influence of genes on human behavior is greater than that of the shared family environment. I noticed my defenses popping up, desperately trying to find holes in the science. But at the end of the day, without cherry-picking data conforming to what I learned in my training, the simple fact was this: twin sisters with identical genes raised in totally different families developed very similar personalities, while adopted sisters with no genetic links raised in the same family had very different personalities.
That finding, from the journal Developmental Psychology, undermined years of learning in psychodynamic theory. It means that the effect of your family environment – whether you are raised by caring or distant parents, whether in a low-income or high-income family – matters very little when it comes to your personality. If you’ve ever had any training in therapy, this goes against everything you have been taught.
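To make the arithmetic behind such claims concrete, here is a minimal sketch in Python (my own illustration, not taken from Plomin, Harris, or the study above) of the classic ACE decomposition that twin researchers use, Falconer’s approximation; the correlation values are hypothetical placeholders, roughly in the range reported for personality traits.

# Illustrative sketch only: Falconer's classic ACE decomposition from twin correlations.
# A = additive genetic influence, C = shared family environment, E = everything else.
def ace_decomposition(r_mz: float, r_dz: float) -> dict:
    """Estimate A, C, E from identical (MZ) and fraternal (DZ) twin correlations."""
    a = 2 * (r_mz - r_dz)   # MZ twins share roughly twice the segregating genes of DZ twins
    c = 2 * r_dz - r_mz     # similarity that shared genes alone cannot account for
    e = 1 - r_mz            # non-shared environment plus measurement error
    return {"A (genes)": round(a, 2), "C (shared family)": round(c, 2), "E (non-shared)": round(e, 2)}

# Hypothetical correlations: MZ 0.45, DZ 0.20 (placeholders, not data from any cited study)
print(ace_decomposition(r_mz=0.45, r_dz=0.20))
# approximately: {'A (genes)': 0.5, 'C (shared family)': -0.05, 'E (non-shared)': 0.55}
# A shared-family term near zero is the pattern behind the "second law" cited above.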
The tenets of psychotherapy did not reflect my clients’ lived experience, or even my own. Instead, we see what we expect to see, and we make sense of our past based on how we feel now. If I am sad, I will recall deprivation and strife in my childhood, while my happier brother remembers a more positive situation; consider the memoirs Running with Scissors (2002), Be Different (2011) and The Long Journey Home (2011), each a radically different depiction of the same family.
In the few longitudinal studies that have been conducted, where children and their adverse childhood experiences (ACEs) are tracked from the early years to adulthood, there is no link between ACEs and subsequent adult mental ill health. There is only a link between adult mental ill health and the ‘recollection’ of ACEs. This may seem wildly counterintuitive to a profession steeped in trauma theory.
ACEs have not been shown to cause mental ill health; it is rather that, when we suffer as adults, we interpret our childhoods as having been bad. I’m convinced that there are rare exceptions to this, of truly horrendous childhood experiences that do leave a mark, but even that certainty falters when I consider the fact that events that supposedly traumatize one person in a group fail to traumatize the others.
If you are denying what I’ve just written out of hand, you may be doing what religious fundamentalists have been doing for millennia. What I say may feel heartless, cold or politically toxic, but feelings aren’t epistemically valid grounds for rejecting information.
Instead, consider this: it is possible to care about suffering while reassessing your analysis of how it is caused and how it can be addressed. Perhaps a vast majority of therapy trainings are wrong about why people suffer. People in other cultures with radically different worldviews about how suffering develops and how best to deal with it also care deeply about helping people – they simply have a different way of doing it.
We need to reconsider why people suffer to help them in a better way. Freud and more recent trauma proponents like Gabor Maté tell us that our personalities and sufferings stem from how we were treated as children. This may resonate with us, but it could actually be wrong. If it is wrong, our treatments could be largely pointless and potentially harmful, and we need to critically examine these theories more carefully before we, as a profession, do more harm.
Historically, in many cultures around the world, from Nigeria to Malaysia, or the West more than 50 years ago, childhood has been seen as just one of the stages we move through, with no sacred status. We learn all the time, but suffering stems from how we now, at this time, relate to the world and what our current circumstances are.
Isn’t it a bit arrogant that so many in the West assume that this new, unevidenced theory – that suffering stems from childhood – should be universally true, or even true for us? How does the psychodynamic therapist, faced with their suffering client, feel resolute that they should dredge up the past, when philosophical traditions from across the world say the answer lies in the here and now? The Buddha, Lao Tzu, Aristotle and Jesus didn’t say a word about childhood’s irreversible stain on the human condition – they saw us as individuals living through choices in the now. A millennium later, Al-Ghazali and Thomas Aquinas still worked on the here and now. Even in the 19th century, Hegel, Søren Kierkegaard and William James didn’t obsess about childhood.
If we were all doing brilliantly now, and if all the therapies that we pay so much money for worked as well as they claim to, maybe we could feel more confident in dismissing all of that. But neither is the case, and surveys of happiness indicate that many Western women – therapists’ main customer demographic – aren’t doing brilliantly either.
*
At first, I couldn’t accept all this. Then Abigail Shrier’s bestseller, Bad Therapy (2024), described how therapeutic culture exerts a toxic and often harmful effect on culture at large. I was working with children, but was psychodynamic therapy, where we discuss the past, actually good for them? Shrier does not discount the occasional usefulness of children talking to adults, but also highlights the risks involved in turning it into routine treatment.
Children, even more than adults, become what they focus upon. If the focus is difficult feelings, these difficult feelings usually amplify rather than decrease. Yet this is what child psychotherapy focuses on – getting the kids to notice their difficult feelings and talk about them in the vain hope that all these difficult feelings magically disappear. My experience echoes the research Shrier cites: children are far more likely to identify with the feelings and fall down a rabbit hole of ever-increasing distress. Get a child to play out their anxiety in the sandbox, as I was taught to do, or to describe where they feel it in their body or what the scary monsters in their nightmares look like, and you might feel you’re helping them explore their emotions, but they end up stewing in them instead, invariably far more anxious than when they arrived.
There is a place for specific techniques in dealing with social anxiety, phobias and panic attacks. Such straightforward techniques, today classified under the umbrella of cognitive behavioral therapy (CBT), draw upon Buddhist, Stoic and traditional wisdom in order to hack our minds when we careen off into unhelpful rabbit holes. But why ask children to deepen into difficult feelings? This disrupts children’s natural process of resilience and of finding the good.
The issue is not merely with the therapeutic approaches offered in schools. We need to see it as a larger cultural idea that seeps out through parenting. Feelings are seen as central, when in fact they are vague and transient approximations of a situation. The whole point of parenting a child is to scaffold and develop their executive functions so that they acquire adult emotional and intellectual capabilities.
This means teaching them that how they feel is not necessarily how things are, and that they may be held hostage by emotions if they don’t learn to move on from them. A child’s anger should not automatically be honored, and their resulting difficult behavior should most certainly not be rewarded with special accommodations.
As an example, a case study: Jim’s family contacted me in a state of anxiety about their increasing inability to cope with Jim’s behavior. They were highly caring parents who began to notice Jim struggling to self-regulate his impulses and lashing out at other kids. So, they sought professional advice from an experienced and fully accredited therapist. She recommended they sit with Jim and allow him to express his feelings safely, which often meant Jim lashing out at them.
When this didn’t help, and Jim began to get into trouble in school, their therapist recommended they take a trauma-informed approach. They had all lost a much-loved family member when Jim was small, a loss now theorized as accounting for Jim’s disrupted emotional development. Jim’s anger was understandable in the context of what had happened, and he needed to be free to express it. Trauma-informed behavior management entailed making sure Jim felt safe when he was dysregulated (which is a polite way of saying when he was screaming, hitting and kicking people) and then asking him what he needed, showing him affection and finding activities he enjoyed, such as gaming, to reconnect.
After that, they could ask Jim about his feelings. Jim’s school adopted a similar strategy, with a student support worker ever-present to play with him when he disrupted the class. The idea was that forging strong attachment relationships with caregivers in school would help Jim feel emotionally safe, and his behavior would then improve.
‘It didn’t,’ his tearful mum explained to me. He had recently been excluded and now refused to go to school.
These weren’t random crackpot theories. The psychological literature on attachment and trauma is extensive and mainstream. Many schools and mental health professionals have mandatory training in these approaches. But they also didn’t work. Jim was encouraged to identify with his anger, and to lean in to it rather than being expected and incentivized to move on. He was inadvertently taught that getting angry or violent in class would get him a free pass to play football instead of doing mathematics. The outcome was that, despite having thoughtful and motivated parents, Jim drifted further and further to the edges of the social world, because the therapeutic obsession with validating his emotion stopped him from learning the social rules that allow us to take part in it.
Because a central premise of child psychotherapy – helping children explore difficult emotions – is looking increasingly like it risks iatrogenic harm (ie, the treatment itself being harmful), I have started to decline work with children and am instead offering to work with parents or the broader system around the child.
Western morals draw upon ancient Greek and Christian ideas of virtue, humility and critical thinking. They form the core of psychotherapeutic thinking but have become increasingly imbalanced. The virtue and humility that was required to be for another has increasingly been distorted into victimhood for the client and heroic savior identities for the therapist, while critical thinking has effectively become a silencing of any critique of current therapeutic or ideological dogma.
We like to see ourselves as critical thinkers, but the critical thought never seems to mention that some 10 per cent of clients get worse after starting therapy – in these cases, therapy might be not merely unhelpful but actively harmful. If you’re told that you must listen to your momentary and subjective feelings of annoyance and hurt, and view them as your truth, minor interpersonal discomforts are much harder to let go of gracefully.
If you’re then told that your troubles with relationships stem from your parents’ failure to be fully present and meet your needs in childhood, the risk is that you will become more critical of your relationship with them at a time when perhaps you need that solid family bond the most. More than a quarter of Americans have cut off a family member; it is statistically improbable that most of these estrangements are for the sort of egregious abuse we might imagine merits it.
This is not talked about, taught or researched in our training institutions. Imagine if the benefits of capitalism or social media were extolled, and no one ever considered the shadow side? In a profession that gives a lot of airtime to Freud and Carl Jung, the Shadow should be central to our endeavors, yet we fail to consider this giant Shadow of our profession – that we regularly do harm.
This lack of self-critique coupled with a sense of infallible virtue creates a zeal for disseminating un-evidenced and potentially harmful memes from psychotherapeutic culture into mainstream culture. We see this when distress is increasingly explained as a reaction to ‘trauma’ or parents who in some way did us wrong. We see it in schools where children who would benefit more from clarity and boundaries instead get ‘trauma-informed’ carve-outs that stop them from being supported to develop the behavioral and life skills they will need to get on in the adult world.
Spurious trauma theories convince entire sections of society that they are deeply broken and require our services. How dare we believe that every other culture – indeed our own culture until Freud – somehow misunderstood how suffering works? Sri Lankans don’t see their civil war or their tsunami as traumatic – in fact, when an army of trauma counselors descended upon the nation after the tsunami of 2004, the University of Colombo pleaded with them to stop framing suffering as traumatization because it was undermining people’s resilience. Sri Lanka happens to top the charts for wellbeing in the ‘Mental State of the World in 2023’ report despite such denial of the gospel of trauma.
No other culture that I know of believes that bad events create indelible stains on our minds, stains that forever taint our experience of the world. Bad things have happened throughout time, and they were (and are) bad enough without adding to them by insisting that some ‘trauma is held in the body’ in inescapable ways. Telling people that they have been harmed forever by others in their lives creates resentment and harms relationships.
If you peddle these kinds of stories, or simply believe them, consider reading up on the shaky science on which they are founded. And while therapy talks the talk about cultural competence and learning from other ways of thinking, it rarely walks the walk. We can learn from non-Western cultures’ takes on suffering, where we stay in the now, suffer in the now, and heal in the now.
Could it be dangerous to normalize therapy as the go-to way of dealing with difficulty? Therapy is a parasocial relationship, a form of unbalanced friendship. The relational component of being a therapist is like being a sex worker – there is a relationship, but it exists fully for the client. The payment and boundary rules bend the relationship towards the client – it is about them, because the reciprocity has been paid for. In a perfect world and life, there wouldn’t be any need for sex workers and therapists, because we would all have good sexual and non-sexual relationships, but in the real world such goodness is not always on offer. So, therapists and sex workers step in to fill the gap, offering experience of relating, in their respective domains, and hopefully allowing the client to learn enough about themselves and others to go into the real world and find real, better relationships.
The danger arises when the therapeutic relationship becomes a replacement for real-world relationships – when we are encouraged to ‘take it to therapy’ rather than attempting to engage with family or friends about painful and sensitive matters. Real-world relationships are strengthened by difficult conversations, and communities evolve by discussing matters that lurk at the edge of the respectable.
The therapeutic space shouldn’t insulate real external relationships from the dark messy internal relationships of ourselves – it should serve to occasionally incubate, prepare and clarify difficulties only with the express intention of sending the client back out to real relating. To return to the sex worker analogy, my job should not be to replace the unwilling spouse, but rather to be the open-minded sex worker who offers different ways of being that can transform the real marriage out there.
*
I believe that the true therapeutic work is to battle resentment. Resentment is the core of all my ills; the pain itself isn’t. Resentment arises when we are in pain but believe that we are entitled to not feel pain. This is complicated to engage in, especially since it borders on rights and politics. If I feel that I have the right to publish this article in The New York Times or have the right not to be offended by critical reviews of it, then the pain of being rejected by The NYT and reading vicious takedowns of my sage wisdom will be infinitely multiplied. My entitlement will make my basic pain so much worse.
I also believe that forgiveness and gratitude are the greatest allies that we have to battle entitlement and resentment. And they are easily developed.
Notice that I wrote that I believe the above to be true. I don’t know it. It works for me, resonates with me, and has been a theme for religious and nonreligious theorists from Siddhartha Gautama and Jesus of Nazareth to Nietzsche. But the simplicity of the theory allows my client to work out whether it applies to her or not – there are no hidden maternal attachments or strange counter-transferential stuff that only I, the expert, can decode. If my client insists that rights-based thinking can co-exist with gratitude, I may even learn from her (I did – thank you – if you read this, you know who you are).
Having let go of empty theories, and having informed the client that no magic resolutions will be forthcoming from my end – that their life is their responsibility – I find myself grounded in millennia of wisdom, in a lineage of people offering support in how to live. I can lean in to my clients, be curious about their stories, engage in their dilemmas, and flesh out their life-worlds as an intrepid explorer leaving no stone unturned.
On one hand, I feel less burdened by responsibility since there is nothing I ‘should’ do; on the other hand, the responsibility is greater since I take on all of the client without obfuscating theoretical filters. I have let go of therapeutic theory, and I think I am a better therapist for it.
https://aeon.co/essays/i-am-a-better-therapist-since-i-let-go-of-therapeutic-theory?utm_source=Aeon+Newsletter&utm_campaign=19d630b572-EMAIL_CAMPAIGN_2025_01_24&utm_medium=email&utm_term=0_-19d630b572-838110632
Oriana:
I could never afford therapy, but I was always curious about the experience. I like to delve into in-depth analysis of anything, though I also think that in terms of practical life philosophy nothing really surpasses Arnold Schwarzenegger’s “four words of wisdom”: “Stay Busy. Be Useful.”
Therapy is expensive not only in terms of money, but also, and more importantly, in terms of time. Perhaps a cooking class would ultimately be of more value? Or an acting workshop?
Developing a useful skill could ultimately prove to be a greater source of happiness than pondering ways in which your parents were too strict or not strict enough, or crying again over the mistakes of youth.
Ultimately, regardless of bad past experiences, “we have to cultivate our garden.” Which reminds me that I want to get more pansies for my flower border. Creating and/or taking in beauty is the best therapy I know.
For others, it might be having a dog. Talk about a being that’s totally “here and now.” There’s a lot we can learn from dogs and cats. And dogs have been shown to be better than therapists when it comes to comforting grieving persons.
I don’t mean to disparage therapists — at the same time I’m grateful we have loving dogs and beautiful cats. I’m glad that life is full of surprises and “learning experiences.” A Pilates class that will improve your posture is worth any number of hours spent analyzing your childhood or your good-and-bad marriage.
However, I agree that overcoming entitlement and resentment is important. Life will take care of that, just as it will teach you not to waste the opportunities to be happy.
*
WHY FACTS DON’T CHANGE PEOPLE’S MINDS
The economist J.K. Galbraith once wrote, “Faced with a choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy with the proof.”
Leo Tolstoy was even bolder: “The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.”
What’s going on here? Why don’t facts change our minds? And why would someone continue to believe a false or inaccurate idea anyway? How do such behaviors serve us?
The logic of false beliefs
Humans need a reasonably accurate view of the world in order to survive. If your model of reality is wildly different from the actual world, then you struggle to take effective actions each day.
However, truth and accuracy are not the only things that matter to the human mind. Humans also seem to have a deep desire to belong.
In Atomic Habits, I wrote, “Humans are herd animals. We want to fit in, to bond with others, and to earn the respect and approval of our peers. Such inclinations are essential to our survival. For most of our evolutionary history, our ancestors lived in tribes. Becoming separated from the tribe—or worse, being cast out—was a death sentence.”
Understanding the truth of a situation is important, but so is remaining part of a tribe. While these two desires often work well together, they occasionally come into conflict.
In many circumstances, social connection is actually more helpful to your daily life than understanding the truth of a particular fact or idea. The Harvard psychologist Steven Pinker put it this way, “People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true.”
We don’t always believe things because they are correct. Sometimes we believe things because they make us look good to the people we care about.
I thought Kevin Simler put it well when he wrote, “If a brain anticipates that it will be rewarded for adopting a particular belief, it’s perfectly happy to do so, and doesn’t much care where the reward comes from — whether it’s pragmatic (better outcomes resulting from better decisions), social (better treatment from one’s peers), or some mix of the two.”
False beliefs can be useful in a social sense even if they are not useful in a factual sense. For lack of a better phrase, we might call this approach “factually false, but socially accurate.” When we have to choose between the two, people often select friends and family over facts.
This insight not only explains why we might hold our tongue at a dinner party or look the other way when our parents say something offensive, but also reveals a better way to change the minds of others.
Facts don’t change our minds. Friendship does.
Convincing someone to change their mind is really the process of convincing them to change their tribe. If they abandon their beliefs, they run the risk of losing social ties. You can’t expect someone to change their mind if you take away their community too. You have to give them somewhere to go. Nobody wants their worldview torn apart if loneliness is the outcome.
The way to change people’s minds is to become friends with them, to integrate them into your tribe, to bring them into your circle. Now, they can change their beliefs without the risk of being abandoned socially.
The British philosopher Alain de Botton suggests that we simply share meals with those who disagree with us:
“Sitting down at a table with a group of strangers has the incomparable and odd benefit of making it a little more difficult to hate them with impunity. Prejudice and ethnic strife feed off abstraction. However, the proximity required by a meal – something about handing dishes around, unfurling napkins at the same moment, even asking a stranger to pass the salt – disrupts our ability to cling to the belief that the outsiders who wear unusual clothes and speak in distinctive accents deserve to be sent home or assaulted. For all the large-scale political solutions which have been proposed to salve ethnic conflict, there are few more effective ways to promote tolerance between suspicious neighbors than to force them to eat supper together.”
Perhaps it is not difference, but distance that breeds tribalism and hostility. As proximity increases, so does understanding. I am reminded of Abraham Lincoln’s quote, “I don’t like that man. I must get to know him better.”
Facts don’t change our minds. Friendship does.
The Spectrum of Beliefs
Years ago, Ben Casnocha mentioned an idea to me that I haven’t been able to shake: The people who are most likely to change our minds are the ones we agree with on 98 percent of topics.
If someone you know, like, and trust believes a radical idea, you are more likely to give it merit, weight, or consideration. You already agree with them in most areas of life. Maybe you should change your mind on this one too. But if someone wildly different than you proposes the same radical idea, well, it’s easy to dismiss them as a crackpot.
One way to visualize this distinction is by mapping beliefs on a spectrum. If you divide this spectrum into 10 units and you find yourself at Position 7, then there is little sense in trying to convince someone at Position 1. The gap is too wide. When you’re at Position 7, your time is better spent connecting with people who are at Positions 6 and 8, gradually pulling them in your direction.
The most heated arguments often occur between people on opposite ends of the spectrum, but the most frequent learning occurs from people who are nearby. The closer you are to someone, the more likely it becomes that the one or two beliefs you don’t share will bleed over into your own mind and shape your thinking. The further away an idea is from your current position, the more likely you are to reject it outright.
When it comes to changing people’s minds, it is very difficult to jump from one side to another. You can’t jump down the spectrum. You have to slide down it.
Any idea that is sufficiently different from your current worldview will feel threatening. And the best place to ponder a threatening idea is in a non-threatening environment. As a result, books are often a better vehicle for transforming beliefs than conversations or debates.
In conversation, people have to carefully consider their status and appearance. They want to save face and avoid looking stupid. When confronted with an uncomfortable set of facts, the tendency is often to double down on their current position rather than publicly admit to being wrong.
Books resolve this tension. With a book, the conversation takes place inside someone’s head and without the risk of being judged by others. It’s easier to be open-minded when you aren’t feeling defensive.
Arguments are like a full frontal attack on a person’s identity. Reading a book is like slipping the seed of an idea into a person’s brain and letting it grow on their own terms. There’s enough wrestling going on in someone’s head when they are overcoming a pre-existing belief. They don’t need to wrestle with you too.
Why False Ideas Persist
There is another reason bad ideas continue to live on, which is that people continue to talk about them.
Silence is death for any idea. An idea that is never spoken or written down dies with the person who conceived it. Ideas can only be remembered when they are repeated. They can only be believed when they are repeated.
I have already pointed out that people repeat ideas to signal they are part of the same social group. But here’s a crucial point most people miss:
People also repeat bad ideas when they complain about them. Before you can criticize an idea, you have to reference that idea. You end up repeating the ideas you’re hoping people will forget—but, of course, people can’t forget them because you keep talking about them. The more you repeat a bad idea, the more likely people are to believe it.
Let’s call this phenomenon Clear’s Law of Recurrence: The number of people who believe an idea is directly proportional to the number of times it has been repeated during the last year—even if the idea is false.
Each time you attack a bad idea, you are feeding the very monster you are trying to destroy. As one Twitter employee wrote, “Every time you retweet or quote tweet someone you’re angry with, it helps them. It disseminates their BS. Hell for the ideas you deplore is silence. Have the discipline to give it to them.”
Your time is better spent championing good ideas than tearing down bad ones. Don’t waste time explaining why bad ideas are bad. You are simply fanning the flame of ignorance and stupidity.
The best thing that can happen to a bad idea is that it is forgotten. The best thing that can happen to a good idea is that it is shared. It makes me think of Tyler Cowen’s quote, “Spend as little time as possible talking about how other people are wrong.”
Feed the good ideas and let bad ideas die of starvation.
The Intellectual Soldier
I know what you might be thinking. “James, are you serious right now? I’m just supposed to let these idiots get away with this?”
Let me be clear. I’m not saying it’s never useful to point out an error or criticize a bad idea. But you have to ask yourself, “What is the goal?”
Why do you want to criticize bad ideas in the first place? Presumably, you want to criticize bad ideas because you think the world would be better off if fewer people believed them. In other words, you think the world would improve if people changed their minds on a few important topics.
If the goal is to actually change minds, then I don’t believe criticizing the other side is the best approach.
Most people argue to win, not to learn. As Julia Galef so aptly puts it: people often act like soldiers rather than scouts. Soldiers are on the intellectual attack, looking to defeat the people who differ from them. Victory is the operative emotion. Scouts, meanwhile, are like intellectual explorers, slowly trying to map the terrain with others. Curiosity is the driving force.
If you want people to adopt your beliefs, you need to act more like a scout and less like a soldier. At the center of this approach is a question Tiago Forte poses beautifully, “Are you willing to not win in order to keep the conversation going?”
Be Kind First, Be Right Later
The brilliant Japanese writer Haruki Murakami once wrote, “Always remember that to argue, and win, is to break down the reality of the person you are arguing against. It is painful to lose your reality, so be kind, even if you are right.”
When we are in the moment, we can easily forget that the goal is to connect with the other side, collaborate with them, befriend them, and integrate them into our tribe. We are so caught up in winning that we forget about connecting. It’s easy to spend your energy labeling people rather than working with them.
The word “kind” originated from the word “kin.” When you are kind to someone it means you are treating them like family. This, I think, is a good method for actually changing someone’s mind. Develop a friendship. Share a meal. Gift a book.
Be kind first, be right later.
https://jamesclear.com/why-facts-dont-change-minds?utm_source=pocket_collection_story
*
MORE ON WHY FACTS DON’T CHANGE MINDS: COGNITIVE BIASES AND BRAIN BIOLOGY
“Facts First” is the tagline of a CNN branding campaign which contends that “once facts are established, opinions can be formed.” The problem is that while it sounds logical, this appealing assertion is a fallacy not supported by research.
Cognitive psychology and neuroscience studies have found that the exact opposite is often true when it comes to politics: People form opinions based on emotions, such as fear, contempt and anger, rather than relying on facts. New facts often do not change people’s minds.
I study human development, public health and behavior change. In my work, I see firsthand how hard it is to change someone’s mind and behaviors when they encounter new information that runs counter to their beliefs.
Your worldview, including beliefs and opinions, starts to form during childhood as you’re socialized within a particular cultural context. It gets reinforced over time by the social groups you keep, the media you consume, even how your brain functions. It influences how you think of yourself and how you interact with the world.
For many people, a challenge to their worldview feels like an attack on their personal identity and can cause them to harden their position. Here’s some of the research that explains why it’s natural to resist changing your mind – and how you can get better at making these shifts.
Rejecting what contradicts your beliefs
In an ideal world, rational people who encounter new evidence that contradicts their beliefs would evaluate the facts and change their views accordingly. But that’s generally not how things go in the real world.
Partly to blame is a cognitive bias that can kick in when people encounter evidence that runs counter to their beliefs. Instead of reevaluating what they’ve believed up until now, people tend to reject the incompatible evidence. Psychologists call this phenomenon belief perseverance. Everyone can fall prey to this ingrained way of thinking.
Being presented with facts – whether via the news, social media or one-on-one conversations – that suggest their current beliefs are wrong causes people to feel threatened. This reaction is particularly strong when the beliefs in question are aligned with your political and personal identities. It can feel like an attack on you if one of your strongly held beliefs is challenged.
Confronting facts that don’t line up with your worldview may trigger a “backfire effect,” which can end up strengthening your original position and beliefs, particularly with politically charged issues. Researchers have identified this phenomenon in a number of studies, including ones about opinions toward climate change mitigation policies and attitudes toward childhood vaccinations.
Focusing on what confirms your beliefs
There’s another cognitive bias that can get in the way of changing your mind, called confirmation bias. It’s the natural tendency to seek out information or interpret things in a way that supports your existing beliefs. Interacting with like-minded people and media reinforces confirmation bias. The problem with confirmation bias is that it can lead to errors in judgment because it keeps you from looking at a situation objectively from multiple angles.
A 2016 Gallup poll provides a great example of this bias. In just one two-week period spanning the 2016 election, both Republicans and Democrats drastically changed their opinions about the state of the economy – in opposite directions.
But nothing had changed in the economy. What had changed was that a new political leader from a different party had been elected. The election outcome changed survey respondents’ interpretation of how the economy was doing – confirmation bias led Republicans to rate it much higher now that their guy would be in charge, and Democrats to rate it lower.
Brain’s hard-wiring doesn’t help
Cognitive biases are predictable patterns in the way people think that can keep you from objectively weighing evidence and changing your mind. Some of the basic ways your brain works can also work against you on this front.
Your brain is hard-wired to protect you – which can lead to reinforcing your opinions and beliefs, even when they’re misguided. Winning a debate or an argument triggers a flood of hormones, including dopamine and adrenaline. In your brain, they contribute to the feeling of pleasure you get during sex, eating, roller-coaster rides – and yes, winning an argument. That rush makes you feel good, maybe even invulnerable. It’s a feeling many people want to have more often.
Moreover, in situations of high stress or distrust, your body releases another hormone, cortisol. It can hijack your advanced thought processes, reason and logic – what psychologists call the executive functions of your brain. Your brain’s amygdala, which controls your innate fight-or-flight reaction, becomes more active when you feel under threat.
In the context of communication, people tend to raise their voice, push back and stop listening when these chemicals are coursing through their bodies. Once you’re in that mindset, it’s hard to hear another viewpoint. The desire to be right combined with the brain’s protective mechanisms make it that much harder to change opinions and beliefs, even in the presence of new information.
You can train yourself to keep an open mind
In spite of the cognitive biases and brain biology that make it hard to change minds, there are ways to short-circuit these natural habits.
Work to keep an open mind. Allow yourself to learn new things. Search out perspectives from multiple sides of an issue. Try to form, and modify, your opinions based on evidence that is accurate, objective and verified.
Don’t let yourself be swayed by outliers. For example, give more weight to the numerous doctors and public health officials who describe the preponderance of evidence that vaccines are safe and effective than to the one fringe doctor on a podcast who suggests the opposite.
Be wary of repetition, as repeated statements are often perceived as more truthful than new information, no matter how false the claim may be. Social media manipulators and politicians know this all too well.
Presenting things in a non-confrontational way allows people to evaluate new information without feeling attacked. Insulting others and suggesting someone is ignorant or misinformed, no matter how misguided their beliefs may be, will cause the people you are trying to influence to reject your argument. Instead, try asking questions that lead the person to question what they believe. While opinions may not ultimately change, the chance of success is greater.
Recognize we all have these tendencies and respectfully listen to other opinions. Take a deep breath and pause when you feel your body ramping up for a fight. Remember, it’s OK to be wrong at times. Life can be a process of growth.
https://theconversation.com/cognitive-biases-and-brain-biology-help-explain-why-facts-dont-change-minds-186530
*
LITTLE ICE AGE
Winter landscape with ice skaters, by Hendrick Avercamp, c.1608.
Environmental historians and climate scientists now recognize the 17th century as a period of intense climate change, the peak of the Little Ice Age – a period of severe cooling between the 16th and late 18th centuries – in which average yearly temperatures in the northern hemisphere plunged by as much as two degrees Celsius. While such a number might seem small, it had massive local effects. The major goal of the 2015 Paris Climate Accords was to ‘hold global temperature increase to well below 2°C’, an acknowledgement that anything beyond this number represents an irretrievable disaster.
Historical sources from the coldest period of the Little Ice Age give some insight into a time when a similar climate disaster came close. Historians such as Geoffrey Parker have begun to map out the cultural and historical consequences of the Little Ice Age across the hemisphere, from the Americas to Europe and Asia, most notably crop failure, which led to food shortages and widespread social and military conflict. The global tumult of the 17th century was clearly the result of the climax of a period of catastrophic climate change.
For many, these weather phenomena were fundamentally religious events that called for a godly interpretation. The popular religious writings of 17th-century Europe reveal ordinary people’s experiences of their environment and their attempts to make sense of it. Of these, perhaps no author was more popular (at least among Protestants) than Johann Arndt, whose writings went through hundreds of printings during the century and who was rumored to have outsold the Bible in some parts of Germany. Arndt’s writings attended directly to the environmental circumstances of the Little Ice Age, offering a religious explanation for the extreme environmental phenomena that orthodox Lutheranism simply did not mention or account for.
Arndt put forward the obvious interpretation: ‘When one now looks at the darkness of the sun and the moon, one should think that … it is contrary to their nature, and proclaims to us a great wickedness performed on earth.’ The dimming of the skies and the celestial bodies that reside there, he argued, must have been the result of some human moral failure. This was a conclusion that could not have been reached through orthodox Lutheran doctrine, which held that divine knowledge can only be found in the scriptures and not through environmental phenomena.
Similar interpretations of climate change during the period led to tragic instances of scapegoating. In southern Germany in 1626, a spring hailstorm followed by sudden Arctic temperatures prompted the swift and horrific torture and execution of 900 men and women, accused of creating the storm by witchcraft.
Arndt, for his part, did not attempt to blame vulnerable groups. Instead, he presented an ecological vision in which humans and the cosmos were in intimate interrelation, suffering together even as they did so as a result of human moral failure:
'The suffering of the macrocosm, that is, the great world, is subsequently fulfilled in the microcosm, that is, in humanity. What happens to man, nature and the great world suffer first, for the suffering of all creatures, both good and evil, is directed towards man as a center where all lines of the circle converge. For what man owes, nature must suffer first.'
These radical religious writings, and their intense popularity, seem to reveal an early modern reading public intent on interpreting and understanding their changing environment. Arndt’s book permanently transformed Protestant Christianity and its relationship with the physical world by shuttling Hermetic perspectives on the divinity of the cosmos into a Europe desperate for a religious understanding of its changing climate.
https://www.historytoday.com/archive/history-matters/who-blame-early-modern-climate-change?utm_source=Newsletter&utm_campaign=3a81ab296b-EMAIL_CAMPAIGN_2017_09_20_COPY_01&utm_medium=email&utm_term=0_fceec0de95-3a81ab296b-1214148&mc_cid=3a81ab296b
*
RUSSIAN DREAMS OF AN AMERICAN COLONY
Orthodox Church of Holy Trinity, Fort Ross
Fort Ross on California's rocky coast contained an oasis of Russian refinement.
North of San Francisco, I am traveling along the isolated Sonoma coast from Bodega Bay to a place the Indians for thousands of years called Metini. Towering stands of redwoods rise up from the insteps of switchbacks on Highway One. The trees go against the grain of steep brown hills, pine-topped ridges, and rugged seaside cliffs. Cows can be heard lowing in the fog-shrouded meadows, while a raucous crowd of barking sea lions cavorts among the boulders cropping out of the foamy surf. A hummingbird halts its flight and holds on to a mid-air perch, while a single-engine plane drones its way up the coast. Delicate ice plants carpet rocky terraces, their yellow and purple blossoms promising more scent than they deliver, while stalky fronds wave menacingly from the hillsides and rills. It’s all, to quote Jack Kerouac, “just too crazy.”
What also piques the curiosity is a fenced-in quadrangle, three hundred feet to a side, set several hundred yards back from the seaside cliffs. Two blockhouses occupy opposing corners and in another is a chapel sporting a dome of sorts. Numerous other wood buildings stand in the compound, which is called Fort Ross. It was built in 1812 by a couple dozen Russians and about eighty Aleuts for the purpose of supplying wheat and furs for the Russian-American Company’s colony in Sitka, Alaska. Only one of the buildings was here at the time of the Russian settlement—the recently restored Rotchev House, once home to Alexander Rotchev, the last of the managers sent by the Russian-American Company.
The entrance to what is now called Fort Ross State Historic Park lies away from the endless expanse of ocean, and you approach Rotchev House on foot by descending through groves of gigantic eucalyptus trees. The Rotchev family lived here from 1838 to 1841. The single-floor dwelling comprises seven rooms and is equipped with a trap door leading to a garret beneath a dramatically hipped roof. The home is sparsely furnished with sturdy, elegant pieces in the Biedermeier style, which was fashionable among Russian aristocrats, especially in the Siberian dwellings of exiled Decembrists. The structure demonstrates several notable Russian building techniques, including the half-dovetail notching of the redwood logs that were used in all the buildings at Fort Ross.
The gale-force winds that can hit in winter here just bounce off such solid construction. Additionally, the house is literally a window onto an ingenious Russian method of ventilation: A single windowpane, known as a fortochka, was furnished with hinges and a latch and could swing open by itself, serving as a kind of window within a window.
Inside, one sees further evidence of an unexpectedly civilized life here in this outpost in Alta California, a vast region that was officially claimed by the Spanish but difficult for them to control. In the parlor, the Russians’ refined sensibilities are on full view, with a gleaming samovar and a delicate pianoforte. Rotchev and his wife were book lovers. Both were multilingual; he was also a poet, and she would later translate a children’s book. Researchers are still trying to recreate the selection of books on the shelves, but it is a near certainty that it was once the finest collection in Alta California. One French visitor to the house in Rotchev’s day remarked on the house’s “choice library” and scores by Mozart. He was equally impressed by Rotchev’s wife, Elena, who spoke lively French.
What we know today of Fort Ross in the early and mid nineteenth century comes from research. Archaeologists have examined an Orthodox cemetery on a promontory across a ravine from the fort. The ornamental beads that turned up attested to a diversity of residents and a thriving family life at the fort. The material record also includes crosses, shards of pipes, ceramic cups from China and England, and musket balls, as well as tools and, in the fort’s boatyard, evidence of work sheds and banyas, or bathhouses.
John Sutter, founder of New Helvetia and owner of the mill where gold was discovered in 1848, purchased the fort and all its assets in 1841, and the Russians took their personal belongings with them when they decamped. The picture we have today, then, of this Russian outpost—to say nothing yet of the Creole and Kashaya Indian families living in and around the fort—has come into focus only after decades of work to bring the scattered puzzle pieces back together.
The remote village has proven to have had global connections. Researchers have discovered links between the decorative arts in evidence in Rotchev House and the work of craftsmen as far away as Istanbul. Ceramic pipes that were popular among the Russian aristocracy were created in that era in the Turkish capital. The rugs that the Rotchevs were likely to have had in their home, and reproductions of which lie now in the study and the parlor, would have come from Baluchistan, in present-day Pakistan and Afghanistan. Ancient trade routes and the diplomatic necessities that come with empire-building allowed for such exchange and added yet another cosmopolitan dimension to life at the fort.
Fur trade with China lent a further international aspect to the activities of the Russian-American Company. In the Mongolian town of Kyakhta, Russian traders sold sea otter skins as luxury items. A single skin could fetch the equivalent of a hundred dollars, which, by way of comparison, was what a Pennsylvania farmer might hope to earn in a year in the 1790s. Sea otters were overharvested by 1825, however, which was among the reasons the Russians pulled out of California.
Alexander Rotchev himself was a cosmopolitan. He had been a dashing figure in the literary circles of Moscow, where he worked as a journalist and translator. He met and fell in love with the highly cultivated Elena Gagarina, and, against her family’s wishes, they wed. Rotchev was considered by Elena’s family to be beneath her social standing. Once the couple eloped, Elena was disinherited. The newlyweds moved to St. Petersburg, where Rotchev supported Elena and their new son while working for the Imperial St. Petersburg Theaters. He also translated Molière, Schiller, and Shakespeare for the Russian stage.
Rotchev needed greater income to support his wife and growing family. The Russian-American Company, headquartered in St. Petersburg, opened its doors to him and he strode in. After a year of working at the company’s offices in St. Petersburg, Rotchev was appointed commissioner-at-large and traveled along the Pacific coast of North America as well as to India and China.
Following the explorations of the North Pacific by Vitus Bering, undertaken at the encouragement of Peter the Great, the Russian-American Company was founded in 1799 to supply skins for the fur trade. Its primary goal was to increase commerce, but the scientific mission of advancing knowledge was inseparable from early Russian activity in the North Pacific. The company’s main base in Rotchev’s day was in Sitka, Alaska. Owned in large part by the aristocracy, the company had ties to the tsar’s family, and the tsar himself held stock in it.
While the Russians at Fort Ross were engaged in toolmaking and shipbuilding, as well as defense of the compound, the People from the Top of the Land, the Kashaya, more than lent a hand to the agricultural work there. Like the Aleuts in Alaska, the Kashaya were coerced and cajoled into doing the Russians’ bidding. One governor of Russian America, however, Baron Ferdinand Petrovich von Wrangell, saw the value of treating the native people fairly and judging them on their own terms. The Kashaya benefited from the Russians’ relatively progressive colonial attitudes and were relieved that the “Undersea People” didn’t force their Orthodox religious views on them. They were also grateful for the Russians’ muskets and cannons, which protected them from the Spanish and Mexicans and other, more aggressive, native peoples.
The ethnographic accounts of Georg von Langsdorff constitute a major resource for understanding the native coastal peoples. As a translator for the Russian-American Company who sailed to San Francisco in 1806 to explore trade possibilities with the Spanish commandant, he provided some of the earliest portraits of native life. Ilya Voznesensky later recorded the flora and fauna of coastal Alta California for the Imperial Academy of Sciences. This burst of scientific and cultural inquiry was sparked originally by Peter the Great in the early eighteenth century and continued by Catherine the Great, in her effort to bring “to perfection” knowledge of the North Pacific.
Why the Russians ever left such a splendid ecosystem along California’s coast can be explained to a large extent by why they came in the first place. In coming to Alaska, the Russians abandoned many comforts and seeming necessities, but one they couldn’t break with definitively was bread. So, when exploratory voyages happened upon a pair of coves well to the north of Bodega Bay—a major port north of San Francisco—company officials envisioned a secondary settlement, one that could supply furs to the company but also grow wheat and provide flour for Sitka and other Russian settlements in Alaska.
California’s first windmill was thus constructed at what became Fort Ross, and several ranches were established well away from the protective shadow of the fort, including at least one along the meandering Russian River, which lazily enters the Pacific near today’s picturesque and tiny community of Jenner (“pop. 107,” according to a signpost). “It was an error in judgment,” says Susanna Barlow with the Fort Ross Interpretive Association. “The marine climate wasn’t good . . . for growing a lot of wheat.”
By the time Alexander Rotchev arrived in 1838, the sea otter population had long been seriously diminished, in spite of the company’s moratorium on hunting any sea mammals at all. Additionally, the Kashaya people were more accustomed to a seasonal form of food-gathering, planting, and cultivation that was at odds with the industrial-strength harvest required by the Russians. And while Fort Ross itself was impregnable (it had more than forty cannons aimed at any approach from the sea, and an inland attack in such remote and unforgiving terrain was unthinkable), any attempt by the Russians to colonize inland would have been resisted by the Spanish and Mexicans.
Alta California was administered at the time of Rotchev’s arrival by Mexico. After the Mexican government won independence from Spain, it sought diplomatic recognition and made that a condition for granting the Russians permission to stay. The tsar declined and ordered the company to depart from California.
Rotchev looked upon the Russian settlement in California very fondly and was opposed to the withdrawal, but he did his duty and continued to try to find a buyer for Fort Ross, first among the French, then among the American settlers. Enter John Sutter, who, it is no exaggeration to say, bought the place lock, stock, and barrel in 1841.
Russian contributions to California history are few but significant. In addition to building and using the first windmills, they built the first ships and manufactured tools and equipment for settlers. They also planted orchards from saplings they had brought with them from Russia, possibly even introducing a new apple to North America—although which one remains in doubt. You can walk through the orchards just to the north of Fort Ross and stand next to trees planted by the Russian settlers. Generations of American ranchers have cared for the few remnants of Russian civilization left behind at Fort Ross.
Those groves of eucalyptus trees near the fort, though, were not planted by the Russians. They came later, the result of another wave of California dreamers who thought the species of eucalyptus they planted would provide excellent lumber. Just like the Russian dream of cultivating great quantities of wheat along the northern coast, the decision to plant those eucalyptus trees proceeded from an error in judgment—a rather poetic error, but an error nonetheless.
On leaving the fort, I walk along an old battered former section of Coast Highway One that used to cut straight across Fort Ross’s front yard in Jack Kerouac’s day. Its yellow centerline is still faintly visible. Standing there I revel in the beatific experience of just being here, on the road, in such an exquisite place that works on the imagination and expands one’s sense of the possible.
https://www.neh.gov/humanities/2012/marchapril/feature/russian-dreams-american-colony
*
THE PUZZLING VIRUS THAT INFECTS ALMOST EVERYONE
Statistically speaking, the virus known as Epstein-Barr is inside you right now. It is inside 95 percent of us. It spreads through saliva, so perhaps you first caught the virus as a baby from your mother, who caught it as a baby from her mother. Or you picked it up at day care. Or perhaps from a friend with whom you shared a Coke. Or the pretty girl you kissed at the party that cold New Year’s Eve.
If you caught the virus in this last scenario—as a teen or young adult—then Epstein-Barr may have triggered mono, or the “kissing disease,” in which a massive immune response against the pathogen causes weeks of sore throat, fever, and debilitating fatigue.
For reasons poorly understood but not unique among viruses, Epstein-Barr virus, or EBV, hits harder the later you get it in life. If you first caught the virus as a baby or young child, as most people do, the initial infection was likely mild, if not asymptomatic.
Unremarkable. And so this virus has managed to fly under the radar, despite infecting almost the entire globe. EBV is sometimes jokingly said to stand for “everybody’s virus.” Once inside the body, the virus hides inside your cells for the rest of your life, but it seems mostly benign.
Except, except. In the decades since its discovery by the virologists Anthony Epstein and Yvonne Barr in 1964, the virus has been linked not only to mono but also quite definitively to cancers in the head and neck, blood, and stomach.
It’s also been linked, more controversially, to several autoimmune disorders. Recently, the link to one autoimmune disorder got a lot stronger: Two separate studies published in 2022 make the case—convincingly, experts say—that Epstein-Barr virus is a cause of multiple sclerosis, in which the body mistakenly attacks the nervous system.
“When you mentioned the virus and MS 20 years ago, people were like, Get lost … It was a very negative attitude,” says Alberto Ascherio, an epidemiologist at Harvard and a lead author of one of those studies, which used 20 years of blood samples to show that getting infected with EBV massively increases the risk of developing multiple sclerosis. The connection between virus and disease is hard to dismiss now. But how is it that EBV causes such a huge range of outcomes, from a barely noticeable infection to chronic, life-altering illness?
In 2021, my colleague Ed Yong noted that a bigger pandemic is a weirder pandemic: The sheer number of cases means that even one-in-a-million events become not uncommon. EBV is far from novel; it belongs to a family of viruses that were infecting our ancestors before they were really human. But it does infect nearly all of humanity and on rare occasions causes highly unusual outcomes. Its ubiquity manifests its weirdness. Decades after its discovery and probably millennia after those first ancient infections, we are still trying to understand how weird this old and familiar virus can be. We do little to curb the spread of Epstein-Barr right now. As the full scope of its consequences becomes clearer, will we eventually decide it’s worth stopping after all?
From its very discovery, Epstein-Barr confounded our ideas of what a virus can or cannot do. The first person to suspect EBV’s existence was Denis Burkitt, a British surgeon in Uganda, who had the unorthodox idea that the unusual jaw tumors he kept seeing in young children were caused by a then-undiscovered pathogen. The tumors grew fast—doubling in size in 24 to 48 hours—and were full of white blood cells or lymphocytes turned cancerous. This disease became known as Burkitt’s lymphoma. Burkitt suspected a pathogen because the jaw tumors seemed to spread from area to neighboring area and followed seasonal patterns. In other words, this lymphoma looked like an epidemic.
In 1963, a biopsy of cells from a girl with Burkitt’s lymphoma made its way to the lab of Anthony Epstein, in London. One of his students, Yvonne Barr, helped prepare the samples. Under the electron microscope, they saw the distinctive shape of a herpesvirus, a family that also includes the viruses behind genital herpes, cold sores, and chicken pox. And the tumor cells specifically were full of this virus. Case closed? Not yet. At the time, the idea that a virus could cause cancer was “rather remote,” says Alan Rickinson, a cancer researcher who worked in Epstein’s lab in the 1970s. “There was a great deal of skepticism.”
What’s more, the virus’s ubiquity made things confusing. Critics pointed out that sure, children with Burkitt’s lymphoma had antibodies to EBV, but so did healthy children in Africa. So did American children for that matter, as well as isolated Icelandic farmers and people belonging to a remote tribe in the Brazilian rainforest. The virus was everywhere scientists looked, yet Burkitt’s lymphoma was largely confined to equatorial Africa. What if EBV was just an innocent bystander? Why wasn’t the virus causing disease anywhere else?
It was. Scientists just didn’t know where to look until a stroke of luck clued them in. In 1967, a technician in a Philadelphia lab studying EBV and cancer fell ill with symptoms of mono. Because she was one of the few people who had tested negative for EBV antibodies, she had regularly donated blood for lab experiments that needed a known negative sample. When she came back after the illness, she started testing positive, highly positive. The timing suggested what we now know: EBV is the most common cause of mono.
Scientists eventually found more links between the virus and other cancers: nasopharyngeal cancer, stomach cancer, Hodgkin’s lymphoma, and other forms of lymphoma. In all, it plays a role in 1.5 percent of cancers globally. Those first two are cancers in the cells lining the throat and stomach, which EBV can infect. The others are in white blood cells or lymphocytes, which the virus actually specializes in infecting. In particular, EBV infects a type of lymphocyte called a B cell, each of which is born to recognize a different hypothetical enemy. If a certain B cell never finds its matching enemy, it dies as part of the body’s ruthless culling of useless immune cells. If it does find a match, however, the B cell divides and transforms into memory B cells, which will remain to guard against infection for the rest of a person’s life.
EBV’s genius is that it co-opts this normal process. It manipulates infected B cells into thinking they have been activated, so that they turn into long-lasting memory B cells where the virus can hide for decades. (All viruses in the herpes family have this unusual ability to become latent, though they hide out in different types of cells. The chicken-pox virus, for example, uses nerve cells, sometimes coming out to cause shingles.)
Occasionally, EBV emerges from its hiding place, replicating just enough to get by. If it replicates too little, it won’t find another host before getting shut down by the immune system. If it replicates too much, it risks harming its current host. The virus and immune system are in constant balance, each holding the other in check. There’s an “elegance with which this virus has established a long-term relationship with the host,” says Sumita Bhaduri-McIntosh, an Epstein-Barr virologist and infectious-disease doctor at the University of Florida.
When this balance is interrupted, one possible result is cancer. As part of its manipulation of infected cells, EBV seems to suppress their normal dying process. And if the cell that refuses to die has other aberrant properties, then you can get cancers like Burkitt’s lymphoma. “In most cases, when the virus appears in this cancer, and subsequently in other cancers, it is one part of a chain,” Rickinson says. “It’s obviously not the sole driver of growth.” This explains why EBV doesn’t cause cancer in everyone it infects, only in those unlucky enough to have also acquired the wrong set of other mutations. In the case of Burkitt’s lymphoma, the cancerous cells also have a strange rearrangement of chromosomes, which scientists learned is linked to malaria infection. This accounted for the unique geographic pattern that Burkitt had observed. EBV is everywhere, but Burkitt’s lymphoma was common only in places where malaria is also endemic.
Epstein-Barr became known as the first human virus linked to not just an immediate disease but also cancers that can appear years after initial infection. It challenged the traditional paradigm of viruses causing short-term illnesses that resolve and confer immunity. After all, the virus stays inside our bodies and continues to interact with our immune systems for the remainder of our lives.
Over the years, more hints of EBV’s unusual abilities started appearing. The virus or the antibodies to it seemed to be disproportionately found in people suffering from autoimmune disorders such as rheumatoid arthritis, lupus, and multiple sclerosis as well as those suffering from chronic fatigue syndrome, also known as myalgic encephalomyelitis. These chronic conditions, whose biological mechanisms are even more elusive than cancer’s, are particularly hard to study. While the correlations between EBV and these disorders were suggestive, they were in no way definitive. People who have these conditions might almost all have EBV, but then almost all healthy people have EBV too. “That’s not a very good place to start doing epidemiology, when you have 95 percent in the control group,” says Paul Farrell, an EBV researcher at Imperial College London.
The recent study from Harvard’s Ascherio got around this by looking at a massive archive of serum samples taken from people over 20 years. The collection came from the Department of Defense, which stores serum from routine tests for HIV. Among the 10 million adults with samples in the repository, researchers were able to find enough people who were initially negative for EBV but then contracted it during the 20-year period. And those who did get the virus were 32 times as likely to develop multiple sclerosis as those who did not.
A second study from Stanford adds a possible causation to this correlation: Some multiple-sclerosis patients have antibodies that bind both an EBV protein and a protein in the brain, which is erroneously targeted by the immune system in multiple sclerosis. This kind of cross-reaction has long been suspected in MS but only now identified. “It’s just like a great volcano of information,” says Rickinson about the recent studies. As with EBV-associated cancers, though, only a tiny sliver of people infected with the virus end up developing multiple sclerosis, so some other trigger or triggers must also be in play. We’re only at the beginning of understanding this process.
COVID, too, revived interest in Epstein-Barr’s long-term consequences. A long-COVID study found EBV infection to be one of four major risk factors, suggesting that some long-COVID symptoms might be caused by reactivation of EBV when the body is weakened from fighting the coronavirus.
This association is perhaps not surprising. The debilitating fatigue associated with long COVID and other post-viral syndromes does look, in some ways, like the fatigue caused by mono. And in the 1980s, doctors noticing the similarity had begun diagnosing chronic Epstein-Barr virus syndrome in patients whose mono-like symptoms of fatigue and sore throat did not go away for months. Eventually, however, experts took Epstein-Barr out of the name and gave it the more general term of chronic fatigue syndrome, because EBV does not seem to be the sole cause of such symptoms.
Chronic fatigue syndrome may have several different explanations, but the virus may still play a role in some cases even after mild infections, says Hank Balfour, a pathologist at the University of Minnesota. He has also described cases of “chronic mono,” in which a severe initial EBV infection triggers mono symptoms that either linger or recur for months or even years. Mono’s acute phase typically lasts for weeks, which is already unusually long for a virus but is well documented.
There isn’t much research on chronic mono, though, and the diagnosis is not widely accepted among doctors. “It needs, I think, more attention,” Balfour says. Long COVID remains a baffling consequence of the novel coronavirus, but even the long-term consequences of very common viruses like EBV are poorly understood.
As the long-term picture of EBV comes into focus, how do we think about the danger of a virus that is ubiquitous, that rarely causes serious disease but has devastating consequences when it does? We currently have no way of preventing EBV infection, short of avoiding every human interaction that might involve sharing saliva: a mother kissing her baby, a toddler doing almost anything.
Vaccines have been in the works for decades; Epstein himself worked on one. The link to multiple sclerosis, many long-time researchers now hope, will revive interest in an EBV vaccine.
More than a decade ago, a pharmaceutical company abandoned a vaccine candidate that successfully prevented mono but not EBV infection altogether. The result was “discouraging from a pharmacoeconomic point of view,” Balfour says, because there wasn’t a clear demand for a vaccine that prevented only mono. Preventing multiple sclerosis, however, might add an extra incentive.
Two vaccine candidates, from the National Institutes of Health and Moderna, entered clinical trials in 2022. A key question is whether they can do better than the old vaccine. “We would of course like to prevent infection. That’s the ultimate goal, but we think even if we don’t prevent infection, we can still reduce EBV-associated disease,” says Jeffrey Cohen, a virologist at the NIH who works on one of the vaccines. That’s because symptomatic EBV infections—such as mono—are associated with a higher likelihood of developing EBV-associated diseases, adds Balfour, who has also worked on a vaccine.
However, studying how the vaccine might stop diseases that develop years later, such as cancers or multiple sclerosis, will be very hard in a typical vaccine trial. The incidences are so low, and the diseases take so long to appear, that a vaccine trial in hundreds or thousands of people over a few years is unlikely to offer much definitive evidence. Most likely, Cohen says, if the vaccines work against mono, they can be approved to prevent the disease in people who have not yet been infected by EBV. Once it’s on the market and hundreds of thousands of people get it and are followed over years, then the effect on cancer or multiple sclerosis may finally become clear.
All of these advances make it a “fascinating time” for EBV research, says Rickinson, the biologist who once worked with the eponymous Epstein. “Unfortunately,” he says, “I’m unable to pursue it myself.” He recently retired from the University of Birmingham after devoting nearly 50 years to studying this enigmatic virus. It’s up to the next generation now—to figure out EBV’s remaining secrets and perhaps a better way of coexisting with it.
https://getpocket.com/explore/item/the-puzzling-virus-that-infects-almost-everyone?utm_source=firefox-newtab-en-us
*
HOW THE IMMUNE SYSTEM REGULATES BLOOD SUGAR
“For decades, immunology has been dominated by a focus on immunity and infection”, says Henrique Veiga-Fernandes, head of the Immunophysiology Lab at the Champalimaud Foundation. “But we’re starting to realise the immune system does a lot more than that”.
Glucose, a simple sugar, is the primary fuel for our brains and muscles. Maintaining stable blood sugar levels is crucial for our survival, especially during fasting or prolonged physical activity when energy demands are high and food intake is low.
Traditionally, blood sugar regulation has been attributed to the hormones insulin and glucagon, both produced by the pancreas. Insulin lowers blood glucose by promoting its uptake into cells, while glucagon raises it by signaling the liver to release glucose from stored sources.
Veiga-Fernandes and his team suspected there was more to the story. “For example”, he notes, “some immune cells regulate how the body absorbs fat from food, and we’ve recently shown that brain-immune interactions help control fat metabolism and obesity. This got us thinking—could the nervous and immune systems collaborate to regulate other key processes, like blood sugar levels?”.
A New Circuit Uncovered
To explore this idea, the researchers conducted experiments in mice. They used genetically engineered mice lacking specific immune cells to observe how the absence of those cells affected blood sugar levels.
They discovered that mice missing a type of immune cell called ILC2 couldn’t produce enough glucagon—the hormone that raises blood sugar—and their glucose levels dropped too low. “When we transplanted ILC2s into these deficient mice, their blood sugar returned to normal, confirming the role of these immune cells in stabilizing glucose when energy is scarce”, explains Veiga-Fernandes.
Realizing that the immune system could affect a hormone as vital as glucagon, the team knew they were onto something of major impact. But it left them asking: how exactly does this process work? The answer took them in a very unexpected direction.
“We thought this was all being regulated in the liver because that’s where glucagon exerts its function”, recalls Veiga-Fernandes. “But our data kept telling us that everything of importance was happening between the intestine and the pancreas”.
Using advanced cell-tagging methods, the team labelled ILC2 cells in the gut, giving them a glow-in-the-dark marker. After fasting, they found these cells had traveled to the pancreas. “One of the biggest surprises was finding that the immune system stimulates the production of the hormone glucagon by sending immune cells on a journey across different organs”.
Once in the pancreas, those immune cells release cytokines—tiny chemical messengers—that instruct pancreatic cells to produce the hormone glucagon. The increase in glucagon then signals the liver to release glucose. “When we blocked these cytokines, glucagon levels dropped, proving they are essential for maintaining blood sugar levels”.
“What’s remarkable here is that we’re seeing mass migration of immune cells between the intestine and pancreas, even in the absence of infection,” he adds. “This shows that immune cells aren’t just battle-hardened soldiers fighting off threats—they also act like emergency responders, stepping in to deliver critical energy supplies and maintain stability in times of need.”
It turns out this migration is orchestrated by the nervous system. During fasting, neurons in the gut connected to the brain release chemical signals that bind to immune cells, telling them to leave the intestine and go to a new “postcode” in the pancreas, within a few hours. The study showed that these nerve signals change the activity of immune cells, suppressing genes that anchor them in the intestine and enabling them to move to where they’re needed.
Implications for Fasting and Exercise
“This is the first evidence of a complex neuroimmune-hormonal circuit”, Veiga-Fernandes observes. “It shows how the nervous, immune, and hormonal systems work together to enable one of the body’s most essential processes—producing glucose when energy is scarce”.
“Mice share many fundamental biological systems with humans, suggesting this inter-organ dialogue also occurs in humans when fasting or exercising. By understanding the role of ILC2s and their regulation by the nervous system, we can better appreciate how these daily life activities support metabolic health. We’re eavesdropping on conversations between organs that we’ve never heard before”.
He adds that the immune system likely evolved as a safeguard during adversity, pointing out that our ancestors didn’t have the luxury of three meals a day and, if they were lucky, might have managed just one. This evolutionary pressure would have pushed our bodies to find ways to ensure that every cell gets the energy it needs.
“We’ve long known that the brain can directly signal the pancreas to release hormones quickly, but our work shows it can also indirectly boost glucagon production via immune cells, making the body better equipped to handle fasting and intense physical activity efficiently”.
Cancer, Diabetes and Beyond
The findings could open new doors for managing a range of conditions, notably for cancer research. Pancreatic neuroendocrine tumors and liver cancer can hijack the body’s metabolic processes, using glucagon to increase glucose production and fuel their growth. In advanced liver cancer, this process can lead to cancer-related cachexia, a condition marked by severe weight and muscle loss. Understanding these mechanisms could help develop better treatments.
“Balancing blood sugar is also critical, not only for preventing obesity, but also for addressing the global diabetes epidemic, which affects hundreds of millions of people”, remarks Veiga-Fernandes. “Targeting these neuro-immune pathways could offer a new approach to prevention and treatment”.
“This study reveals a level of communication between body systems that we’re only beginning to grasp”, he concludes. “We want to understand how this inter-organ communication works—or doesn’t—in people with cancer, chronic inflammation, high stress, or obesity. Ultimately, we aim to harness these results to improve therapies for hormonal and metabolic disorders”.
https://www.eurekalert.org/news-releases/1070432
*
THE FACTS ABOUT EGGS
If there was such a thing as a perfect food, eggs would be a contender. They're readily available, easy to cook, affordable and packed with protein.
"The egg is meant to be something that has all the right ingredients to grow an organism, so obviously it's very nutrient dense," says Christopher Blesso, associate professor of nutritional science at the University of Connecticut in the US.
Eating eggs alongside other food can help our bodies absorb more vitamins, too. For example, one study found that adding an egg to salad can increase the amount of vitamin E we get from the meal.
But for decades, eating eggs has also been controversial due to their high cholesterol content – which some studies have linked to an increased risk of heart disease. One egg yolk contains around 185 milligrams of cholesterol, more than half of the 300mg daily limit on cholesterol that US dietary guidelines recommended until recently.
Does that mean eggs, rather than being an ideal food, might actually be doing us harm?
Are eggs bad for cholesterol?
Cholesterol, a yellowish fat produced in our liver and intestines, can be found in every one of our body's cells. It is a crucial building block of our cell membranes, and the body also needs it to make vitamin D and the hormones testosterone and estrogen.
We produce all the cholesterol we need on our own, but it’s also found in animal products we consume, including beef, shrimp and eggs, as well as cheese and butter.
Cholesterol is transported around our body by lipoprotein molecules in the blood. Every person has a different combination of various types of lipoproteins, and our individual make-up plays a role in determining our risk of developing heart disease.
Low-density lipoprotein (LDL) cholesterol – referred to as "bad" cholesterol – is transported from the liver to arteries and body tissues. Researchers say that this can result in a build-up of cholesterol in the blood vessels and increase the risk of cardiovascular disease.
But researchers haven’t definitively linked consumption of cholesterol to an increased risk of cardiovascular disease. As a result, US dietary guidelines no longer include a cholesterol restriction; neither do the UK's. Instead, emphasis is placed on limiting how much saturated fat we consume, which can increase the risk of developing cardiovascular disease.
Foods containing trans fats, in particular, increase our LDL levels. Although some trans fats occur naturally in animal products, most are made artificially and are found in highest levels in margarines, snacks, and some deep-fried and baked foods, such as pastry, doughnuts and cake.
Meanwhile, along with shrimp, eggs are among the only foods that are high in cholesterol but low in saturated fat.
While the American Heart Association still recommended in 2020 that we limit ourselves to one egg per day, a 2020 population study found an association between eating more than one egg per day and a lower risk of coronary artery disease.
"While the cholesterol in eggs is much higher than in meat and other animal products, saturated fat increases blood cholesterol. This has been demonstrated by lots of studies for many years," says Maria Luz Fernandez, professor of nutritional sciences at the University of Connecticut in the US, whose 2019 research found no relationship between eating eggs and an increased risk of cardiovascular disease.
The discussion on the health effects of eggs has shifted partly because our bodies can compensate for the cholesterol we consume.
"There are systems in place so that, for most people, dietary cholesterol isn't a problem," says Elizabeth Johnson, research associate professor of nutritional sciences at Tufts University in Boston.
In a 2015 review of 40 studies, Johnson and a team of researchers couldn't find any conclusive evidence on the relationship between dietary cholesterol and heart disease.
"Humans have good regulation when consuming dietary cholesterol, and will make less cholesterol themselves," she says.
And when it comes to eggs, cholesterol may pose even less of a health risk. Cholesterol is more harmful when oxidized in our arteries, but oxidation doesn't happen to the cholesterol in eggs, says Blesso.
"When cholesterol is oxidized, it may be more inflammatory, and there are all kinds of antioxidants in eggs that protect it from being oxidized," he says.
Also, some cholesterol may actually be good for us. High-density lipoprotein (HDL) cholesterol travels to the liver, where it's broken down and removed from the body. HDL is thought to have a protective effect against cardiovascular disease by preventing cholesterol from building up in the blood.
"People should be concerned about cholesterol that circulates in their blood, which is the one that leads to heart disease," says Fernandez.
What matters is the ratio of HDL to LDL in our bodies, as elevated HDL counteracts the effects of LDL.
However, while most of us are able to buffer the cholesterol we consume with the cholesterol we synthesize in our livers, Blesso says around a third of us will experience an increase in blood cholesterol by 10% to 15% after consuming it.
Trials have found that lean and healthy people are more likely to see an increase in LDL after eating eggs. Those who are overweight, obese or diabetic will see a smaller increase in LDL and more HDL molecules, Blesso says. So, if you're healthier to begin with, eggs potentially could have a more negative effect than if you're overweight – but if you’re healthier, you're also more likely to have good HDL levels, so an increase in LDL probably isn’t very harmful.
Eggs may also improve cardiovascular health through another cholesterol-related mechanism. One Chinese study published in 2022 found that people who reported eating a moderate amount of eggs had higher blood levels of an apolipoprotein that is a building block of HDL.
Specifically, they had more large HDL molecules, which help protect against heart attacks and strokes by clearing cholesterol from blood vessels.
There is some research that challenges these findings – but these studies tend to be population studies that can't unravel cause from effect. Research published in 2019, for example, looked at data from 30,000 adults followed for an average of 17 years and found that each additional half an egg per day was significantly linked to a higher risk of heart disease and death. (The researchers controlled for the subjects' diet patterns, overall health and physical activity to try to isolate the effects of eggs.)
"We found that, for every additional 300mg cholesterol person consumed, regardless of the food it came from, they had a 17% increased risk of cardiovascular disease, and 18% increased risk of all-cause mortality," says Norrina Allen, one of the study’s authors and associate professor of preventive medicine at Northwestern University in Illinois, US.
"We also found that each half egg per day led to a 6% increased risk of heart disease and 8% increased risk of mortality.”
But despite the study being one of the largest of its kind at the time to address this specific relationship between eggs and heart disease, it was observational, giving no indication of cause and effect. It also relied upon a single set of self-reported data – participants were asked what they ate over the previous month or year, then followed up their health outcomes for up to 31 years. This means the researchers only got a single snapshot of what the participants were eating, even though our diets can change over time.
A more recent population study of more than half a million Americans over the age of 50 found that egg consumption was linked to a higher risk of death, including from cardiovascular disease, as well as cancer. But it also found that those who reported eating only egg whites, as opposed to the whole egg, had lower rates of death. The study has the same population-study limitations as the 2019 one, but the findings led the researchers to recommend replacing whole eggs with egg whites.
But there are numerous other observational studies that suggest eggs are good for heart health. One analysis of half a million adults in China, published in 2018, even found the exact opposite: egg consumption was associated with lower risk of heart disease. Those who ate eggs every day had an 18% lower risk of death from heart disease and 28% lower risk of stroke death compared to those who didn't eat eggs. Do healthier adults in China simply eat more eggs, or do the eggs make them healthier? This, of course, may be a big part of the confusion.
What about choline in eggs?
While these studies have reignited the debate on the impact of cholesterol in eggs on our health, we do know some ways in which eggs could affect our risk of disease.
One way is through a compound in eggs called choline, which may help protect us against Alzheimer's disease and also supports liver health and cognitive function.
But it may have negative effects, too. Choline is metabolized by gut microbiota into a molecule called TMA [trimethylamine], which is then absorbed into people's livers and converted to TMAO [trimethylamine N-oxide], a molecule associated with an increased risk of cardiovascular disease. Blesso has wondered if eating a lot of choline from eggs could lead to elevations of TMAO: he found studies where people were observed to have elevated TMAO levels up to 12 hours after eating eggs.
Research measuring egg consumption and TMAO has so far only found transient increases in TMAO. However, TMAO is measured as a marker for heart disease only at a baseline level, which can be detected when people are fasting. Blesso likens this to how our blood sugar levels increase temporarily after eating carbohydrates, but elevated blood sugar levels are only associated with diabetes when these levels are continuous.
This may be because when we eat eggs, we might only get choline’s beneficial effects, he says.
"The problem is when, instead of being absorbed into the blood, choline continues to the large intestine, where it can become TMA and then TMAO," says Fernandez.
"But in eggs, choline is absorbed and doesn't go to the large intestine, so it doesn’t increase the risk of heart disease.”
A trial in 2021 found that, despite high choline content in egg yolks, healthy adults who ate four eggs a day showed no significant increase in TMAO.
So, are eggs healthy?
Meanwhile, scientists are beginning to understand other health benefits of eggs. Egg yolks are one of the best sources of lutein, a pigment that has been linked to better eyesight and lower risk of eye disease, for example.
"There are two types of lutein found in the retina of the eye, where it can protect the retina from light damage by working as a blue light filter, as exposure to light makes the eye deteriorate," says Johnson.
But it’s not just different parts of the egg that researchers have looked into — there is also some research to suggest differences in the nutrition profiles of different types of eggs – although data is limited.
One 2021 study, for example, found that free-range eggs from a small family farm in rural Nova Scotia contained less cholesterol than conventionally-farmed eggs from the local supermarket. Also, a 2022 review of three studies looking at the link between organic egg consumption and health found that eating organic eggs was associated with lower levels of some markers of inflammation in the body.
While researchers are a long way from understanding why eggs affect us differently, the vast majority of recent research suggests they pose no risk to our health, and are much more likely to provide health benefits.
https://www.bbc.com/future/article/20190916-are-eggs-good-for-you
Oriana:
If you are concerned about your cholesterol profile, take 1500 mg of berberine a day. Be prepared to be astonished.
*
ending on beauty:
AFTERWORDS
The ruins of the weekend lie
in the mists of early Monday,
the sound of the river echoing
the darkness of your mouth.
Bloodless, misanthropic,
the shadow of nothing,
I watch the gray bird of evening
settle again like a poem
written before machines.
~ Sutton Breiding