The Callanish Stones, Outer Hebrides
*
ULTIMA THULE
With favoring winds, o'er sunlit seas,
We sailed for the Hesperides,
The land where golden apples grow;
But that, ah! that was long ago.
How far, since then, the ocean streams
Have swept us from that land of dreams,
That land of fiction and of truth,
The lost Atlantis of our youth!
Whither, ah, whither? Are not these
The tempest-haunted Orcades,
Where sea-gulls scream, and breakers roar,
And wreck and sea-weed line the shore?
Ultima Thule! Utmost Isle!
Here in thy harbors for a while
We lower our sails; a while we rest
From the unending, endless quest.
~ H.W. Longfellow
Oriana:
Ultima Thule — in antiquity, the northernmost region of the earth, the limit of travel and discovery.
I admit to harboring (ahem) secret affection for some of Longfellow’s classics. Ultima Thule is a contrast between the sunlit dream and the stormy reality where “wreck and sea-weed line the shore.” It could also be the contrast between youth (the "heroic ego ideal") and older age, notorious for its "diminished expectations."
There’s the heroic ego quest of the first half of life — to put it more simply, the dreams of youth, which can be outrageously ambitious (I’ve discovered that it’s more the rule than the exception) — and the “diminished expectations” of the second half of life. One way or another, we try to make the best of what we come to know as reality rather than the fantasy we started out with. And yes, after the school of hard knocks we need some rest and recovery. Then, with luck, new meanings unfold, and the quest, though transformed, does indeed seem endless.
From my personal perspective, there’s something to be said for reaching Ultima Thule: it’s a place of rest. It’s not a “stepping stone” to something else. Once you've reached your Ultima Thule, nothing is a stepping stone to anything else (and there's much to be said in favor of not seeing your life in terms of "stepping stones"). You no longer have to prove yourself. Now you can have a much more mellow relationship with reality. And actually it’s a lot more interesting to have a relationship with reality rather than with a dream.
The first stanza is especially meaningful to me, in a personal sense: the dream of America.
*
WHY ENGLISH IS A STRANGE LANGUAGE
Edward the Confessor, Bayeux Tapestry
English speakers know that their language is odd. So do people saddled with learning it non-natively. The oddity that we all perceive most readily is its spelling, which is indeed a nightmare. In countries where English isn’t spoken, there is no such thing as a ‘spelling bee’ competition. For a normal language, spelling at least pretends a basic correspondence to the way people pronounce the words. But English is not normal.
Spelling is a matter of writing, of course, whereas language is fundamentally about speaking. Speaking came long before writing; we speak much more, and all but a couple of hundred of the world’s thousands of languages are rarely or never written. Yet even in its spoken form, English is weird. It’s weird in ways that are easy to miss, especially since Anglophones in the United States and Britain are not exactly rabid to learn other languages. But our monolingual tendency leaves us like the proverbial fish not knowing that it is wet. Our language feels ‘normal’ only until you get a sense of what normal really is.
There is no other language, for example, that is close enough to English that we can get about half of what people are saying without training and the rest with only modest effort. German and Dutch are like that, as are Spanish and Portuguese, or Thai and Lao. The closest an Anglophone can get is with the obscure Northern European language called Frisian: if you know that tsiis is cheese and Frysk is Frisian, then it isn’t hard to figure out what this means: Brea, bûter, en griene tsiis is goed Ingelsk en goed Frysk. But that sentence is a cooked one, and overall, we tend to find that Frisian seems more like German, which it is.
We think it’s a nuisance that so many European languages assign gender to nouns for no reason, with French having female moons and male boats and such. But actually, it’s us who are odd: almost all European languages belong to one family – Indo-European – and of all of them, English is the only one that doesn’t assign genders that way.
More weirdness? OK. There is exactly one language on Earth whose present tense requires a special ending only in the third‑person singular. I’m writing in it. I talk, you talk, he/she talk-s – why just that? The present‑tense verbs of a normal language have either no endings or a bunch of different ones (Spanish: hablo, hablas, habla). And try naming another language where you have to slip do into sentences to negate or question something. Do you find that difficult? Unless you happen to be from Wales, Ireland or the north of France, probably.
Why is our language so eccentric? Just what is this thing we’re speaking, and what happened to make it this way?
*
English started out as, essentially, a kind of German. Old English is so unlike the modern version that it feels like a stretch to think of them as the same language at all. Hwæt, we gardena in geardagum þeodcyninga þrym gefrunon – does that really mean ‘So, we Spear-Danes have heard of the tribe-kings’ glory in days of yore’? Icelanders can still read similar stories written in the Old Norse ancestor of their language 1,000 years ago, and yet, to the untrained eye, Beowulf might as well be in Turkish.
The first thing that got us from there to here was the fact that, when the Angles, Saxons and Jutes (and also Frisians) brought their language to England, the island was already inhabited by people who spoke very different tongues. Their languages were Celtic ones, today represented by Welsh, Irish and, across the Channel in France, Breton. The Celts were subjugated but survived, and since there were only about 250,000 Germanic invaders – roughly the population of a modest burg such as Jersey City – very quickly most of the people speaking Old English were Celts.
Crucially, their languages were quite unlike English. For one thing, the verb came first (came first the verb). But also, they had an odd construction with the verb do: they used it to form a question, to make a sentence negative, and even just as a kind of seasoning before any verb. Do you walk? I do not walk. I do walk.
That looks familiar now because the Celts started doing it in their rendition of English. But before that, such sentences would have seemed bizarre to an English speaker – as they would today in just about any language other than our own and the surviving Celtic ones. Notice how even to dwell upon this queer usage of do is to realize something odd in oneself, like being made aware that there is always a tongue in your mouth.
At this date there is no documented language on earth beyond Celtic and English that uses do in just this way. Thus English’s weirdness began with its transformation in the mouths of people more at home with vastly different tongues. We’re still talking like them, and in ways we’d never think of. When saying ‘eeny, meeny, miny, moe’, have you ever felt like you were kind of counting? Well, you are – in Celtic numbers, chewed up over time but recognizably descended from the ones rural Britishers used when counting animals and playing games. ‘Hickory, dickory, dock’ – what in the world do those words mean? Well, here’s a clue: hovera, dovera, dick were eight, nine and ten in that same Celtic counting list.
The second thing that happened was that yet more Germanic-speakers came across the sea meaning business. This wave began in the ninth century, and this time the invaders were speaking another Germanic offshoot, Old Norse. But they didn’t impose their language.
Instead, they married local women and switched to English. However, they were adults and, as a rule, adults don’t pick up new languages easily, especially not in oral societies. There was no such thing as school, and no media. Learning a new language meant listening hard and trying your best. We can only imagine what kind of German most of us would speak if this was how we had to learn it, never seeing it written down, and with a great deal more on our plates (butchering animals, people and so on) than just working on our accents.
As long as the invaders got their meaning across, that was fine. But you can do that with a highly approximate rendition of a language – the legibility of the Frisian sentence you just read proves as much. So the Scandinavians did pretty much what we would expect: they spoke bad Old English. Their kids heard as much of that as they did real Old English. Life went on, and pretty soon their bad Old English was real English, and here we are today: the Scandies made English easier.
I should make a qualification here. In linguistics circles it’s risky to call one language ‘easier’ than another one, for there is no single metric by which we can determine objective rankings. But even if there is no bright line between day and night, we’d never pretend there’s no difference between life at 10am and life at 10pm.
Likewise, some languages plainly jangle with more bells and whistles than others. If someone were told he had a year to get as good at either Russian or Hebrew as possible, and would lose a fingernail for every mistake he made during a three-minute test of his competence, only the masochist would choose Russian – unless he already happened to speak a language related to it. In that sense, English is ‘easier’ than other Germanic languages, and it’s because of those Vikings.
Old English had the crazy genders we would expect of a good European language – but the Scandies didn’t bother with those, and so now we have none. Chalk up one of English’s weirdnesses. What’s more, the Vikings mastered only that one shred of a once-lovely conjugation system: hence the lonely third‑person singular –s, hanging on like a dead bug on a windshield. Here and in other ways, they smoothed out the hard stuff.
They also followed the lead of the Celts, rendering the language in whatever way seemed most natural to them. It is amply documented that they left English with thousands of new words, including ones that seem very intimately ‘us’: sing the old song ‘Get Happy’ and the words in that title are from Norse. Sometimes they seemed to want to stake the language with ‘We’re here, too’ signs, matching our native words with the equivalent ones from Norse, leaving doublets such as dike (them) and ditch (us), scatter (them) and shatter (us), and ship (us) vs skipper (Norse for ship was skip, and so skipper is ‘shipper’).
But the words were just the beginning. They also left their mark on English grammar. Blissfully, it is becoming rare to be taught that it is wrong to say Which town do you come from?, ending with the preposition instead of laboriously squeezing it before the wh-word to make From which town do you come? In English, sentences with ‘dangling prepositions’ are perfectly natural and clear and harm no one.
Yet there is a wet-fish issue with them, too: normal languages don’t dangle prepositions in this way. Spanish speakers: note that El hombre quien yo llegué con (‘The man whom I came with’) feels about as natural as wearing your pants inside out. Every now and then a language turns out to allow this: one indigenous one in Mexico, another one in Liberia. But that’s it. Overall, it’s an oddity. Yet, wouldn’t you know, it’s one that Old Norse also happened to permit (and which Danish retains).
We can display all these bizarre Norse influences in a single sentence. Say That’s the man you walk in with, and it’s odd because 1) the has no specifically masculine form to match man, 2) there’s no ending on walk, and 3) you don’t say ‘in with whom you walk’. All that strangeness is because of what Scandinavian Vikings did to good old English back in the day.
Finally, as if all this wasn’t enough, English got hit by a firehose spray of words from yet more languages. After the Norse came the French. The Normans – descended from the same Vikings, as it happens – conquered England, ruled for several centuries and, before long, English had picked up 10,000 new words.
Then, starting in the 16th century, educated Anglophones developed a sense of English as a vehicle of sophisticated writing, and so it became fashionable to cherry-pick words from Latin to lend the language a more elevated tone.
It was thanks to this influx from French and Latin (it’s often hard to tell which was the original source of a given word) that English acquired the likes of crucified, fundamental, definition and conclusion. These words feel sufficiently English to us today, but when they were new, many persons of letters in the 1500s (and beyond) considered them irritatingly pretentious and intrusive, as indeed they would have found the phrase ‘irritatingly pretentious and intrusive’. (Think of how French pedants today turn up their noses at the flood of English words into their language.)
There were even writerly sorts who proposed native English replacements for those lofty Latinates, and it’s hard not to yearn for some of these: in place of crucified, fundamental, definition and conclusion, how about crossed, groundwrought, saywhat, and endsay?
But language tends not to do what we want it to. The die was cast: English had thousands of new words competing with native English words for the same things. One result was triplets allowing us to express ideas with varying degrees of formality. Help is English, aid is French, assist is Latin. Or, kingly is English, royal is French, regal is Latin – note how one imagines posture improving with each level: kingly sounds almost mocking, regal is straight-backed like a throne, royal is somewhere in the middle, a worthy but fallible monarch.
Then there are doublets, less dramatic than triplets but fun nevertheless, such as the English/French pairs begin and commence, or want and desire. Especially noteworthy here are the culinary transformations: we kill a cow or a pig (English) to yield beef or pork (French). Why? Well, generally in Norman England, English-speaking laborers did the slaughtering for moneyed French speakers at table. The different ways of referring to meat depended on one’s place in the scheme of things, and those class distinctions have carried down to us in discreet form today.
Caveat lector, though: traditional accounts of English tend to oversell what these imported levels of formality in our vocabulary really mean. It is sometimes said that they alone make the vocabulary of English uniquely rich, which is what Robert McCrum, William Cran and Robert MacNeil claim in the classic The Story of English (1986): that the first load of Latin words actually lent Old English speakers the ability to express abstract thought. But no one has ever quantified richness or abstractness in that sense (who are the people of any level of development who evidence no abstract thought, or even no ability to express it?), and there is no documented language that has only one word for each concept.
Languages, like human cognition, are too nuanced, even messy, to be so elementary. Even unwritten languages have formal registers. What’s more, one way to connote formality is with substitute expressions: English has life as an ordinary word and existence as the fancy one, but in the Native American language Zuni, the fancy way to say life is ‘a breathing into’.
Even in English, native roots do more than we always recognize. We will only ever know so much about the richness of even Old English’s vocabulary because the amount of writing that has survived is very limited. It’s easy to say that comprehend in French gave us a new formal way to say understand – but then, in Old English itself, there were words that, when rendered in Modern English, would look something like ‘forstand’, ‘underget’, and ‘undergrasp’. They all appear to mean ‘understand’, but surely they had different connotations, and it is likely that those distinctions involved different degrees of formality.
*
Nevertheless, the Latinate invasion did leave genuine peculiarities in our language. For instance, it was here that the idea that ‘big words’ are more sophisticated got started. In most languages of the world, there is less of a sense that longer words are ‘higher’ or more specific. In Swahili, Tumtazame mbwa atakavyofanya simply means ‘Let’s see what the dog will do.’ If formal concepts required even longer words, then speaking Swahili would require superhuman feats of breath control. The English notion that big words are fancier is due to the fact that French and especially Latin words tend to be longer than Old English ones – end versus conclusion, walk versus ambulate.
The multiple influxes of foreign vocabulary also partly explain the striking fact that English words can trace to so many different sources – often several within the same sentence. The very idea of etymology being a polyglot smorgasbord, each word a fascinating story of migration and exchange, seems everyday to us. But the roots of a great many languages are much duller. The typical word comes from, well, an earlier version of that same word and there it is. The study of etymology holds little interest for, say, Arabic speakers.
To be fair, mongrel vocabularies are hardly uncommon worldwide, but English’s hybridity is high on the scale compared with most European languages. The previous sentence, for example, is a riot of words from Old English, Old Norse, French and Latin.
Greek is another element: in an alternate universe, we would call photographs ‘lightwriting’. According to a fashion that reached its zenith in the 19th century, scientific things had to be given Greek names. Hence our undecipherable words for chemicals: why can’t we call monosodium glutamate ‘one-salt gluten acid’? It’s too late to ask. But this muttly vocabulary is one of the things that puts such a distance between English and its nearest linguistic neighbors.
And finally, because of this firehose spray, we English speakers also have to contend with two different ways of accenting words. Clip on a suffix to the word wonder, and you get wonderful. But – clip on an ending to the word modern and the ending pulls the accent ahead with it: MO-dern, but mo-DERN-ity, not MO-dern-ity. That doesn’t happen with WON-der and WON-der-ful, or CHEER-y and CHEER-i-ly.
But it does happen with PER-sonal, person-AL-ity.
What’s the difference? It’s that -ful and -ly are Germanic endings, while -ity came in with French. French and Latin endings pull the accent closer – TEM-pest, tem-PEST-uous – while Germanic ones leave the accent alone. One never notices such a thing, but it’s one way this ‘simple’ language is actually not so.
Thus the story of English, from when it hit British shores 1,600 years ago to today, is that of a language becoming delightfully odd. Much more has happened to it in that time than to any of its relatives, or to most languages on Earth. Here is Old Norse from the 900s CE, the first lines of a tale in the Poetic Edda called The Lay of Thrym. The lines mean ‘Angry was Ving-Thor/he woke up,’ as in: he was mad when he woke up. In Old Norse it was:
Vreiðr vas Ving-Þórr / es vaknaði.
The same two lines in Old Norse as spoken in modern Icelandic today are:
Reiður var Ving-Þórr / es vaknaði.
You don’t need to know Icelandic to see that the language hasn’t changed much. ‘Angry’ was once vreiðr; today’s reiður is the same word with the initial v worn off and a slightly different way of spelling the end. In Old Norse you said vas for was; today you say var – small potatoes.
In Old English, however, ‘Ving-Thor was mad when he woke up’ would have been Wraþmod wæs Ving-Þórr/he áwæcnede. We can just about wrap our heads around this as ‘English’, but we’re clearly a lot further from Beowulf than today’s Reykjavikers are from Ving-Thor.
Thus English is indeed an odd language, and its spelling is only the beginning of it. In the widely read Globish (2010), McCrum celebrates English as uniquely ‘vigorous’, ‘too sturdy to be obliterated’ by the Norman Conquest. He also treats English as laudably ‘flexible’ and ‘adaptable’, impressed by its mongrel vocabulary. McCrum is merely following in a long tradition of sunny, muscular boasts, which resemble the Russians’ idea that their language is ‘great and mighty’, as the 19th-century novelist Ivan Turgenev called it, or the French idea that their language is uniquely ‘clear’ (Ce qui n’est pas clair n’est pas français).
However, we might be reluctant to identify just which languages are not ‘mighty’, especially since obscure languages spoken by small numbers of people are typically majestically complex. The common idea that English dominates the world because it is ‘flexible’ implies that there have been languages that failed to catch on beyond their tribe because they were mysteriously rigid. I am not aware of any such languages.
What English does have on other tongues is that it is deeply peculiar in the structural sense. And it became peculiar because of the slings and arrows – as well as caprices – of outrageous history.
https://getpocket.com/explore/item/english-is-not-normal?utm_source=firefox-newtab-en-us
*
WHAT HAS PUTIN DONE FOR RUSSIANS?
Over the years I’ve worked with a number of Russians, some in quite senior positions. The vast majority of them saw the 1990s not as liberation and the blessed birth of freedom, but as economic and social chaos that allowed a few well-connected folks to make unimaginable fortunes while everyone else struggled to survive.
They see, as does Putin, the fall of the USSR as an embarrassment, an end to their prominence on the world stage and the days when they were one of just two superpowers.
Some friends have told me that while the shops are full and the restaurants are better today, they miss the socialization that collective poverty forced upon them.
To these people, Putin is how they got their mojo back.
Today you’ve got a US president whose strings are clearly being pulled in Moscow, someone who tells anyone in earshot that he and Vladimir get along really well.
You’ve got a US president going against his allies, trying to get Russia back into the G-7/8. (Kicking Putin out for invading Crimea was a brilliant move, as it rubbed salt into those 1990s fears of no longer being powerful or even relevant.) The economy is holding up surprisingly well. India and China are buying Russian oil, albeit at a discount. Western brands that left in protest of the invasion of Ukraine have been replaced by local knock-offs, strengthening the image of self-reliance that is so important to the Russian psyche.
Sure, there are those folks who inexplicably walk out of open 4th floor windows. And the guys who order poisoned tea instead of the usual variety. And yes the elections are a farce and absolutely no one trusts what they read in the media.
But they never did. Older Russians are familiar with the days when officials who’d fallen out of favor were airbrushed out of official photos from years ago (great book on this called The Commissar Vanishes). They NEVER believed what the media reported, and taught their kids the same cynicism.
Conversely, part of our shock and horror today was that the Washington Post used to be so honorable they made a movie about it. We had a long way to fall; the Russians were already waiting at the bottom.
I’ve lost touch with a lot of folks since the invasion of Ukraine. Certainly prior to that, pretty much everyone thought that even if the elections were clean and fair, Putin would still win. I think that’s likely to be the case. Yes, he’s suffered 1 million casualties trying to achieve his dream of bringing the USSR back together, but he carefully drafted the victims from the poorer regions outside of Moscow and St. Petersburg, and paid substantial death benefits.
The war hasn’t touched most ordinary Russians, as the US has blocked Ukraine from using the weapons we provide to target sites inside Russia. The Ukrainians use drones to attack oil refineries and fighter jets, but they have chosen NOT to hit civilian targets.
Last point: there’s no obvious replacement. Opposition politicians have a habit of getting sick or getting arrested. Even some in his inner circle suddenly find themselves in a plane crash. Can you name a single Russian politician who’s likely to step into Putin’s shoes?
It’s always hard to tell how much Trump understands about anything. But it’s clear that he wants what Putin has got: pretty much unlimited power, with elections being more of a show than something to keep you up at night. He wants to surround himself with oligarchs who are loyal because that’s how they get — corruptly, unfairly, illegally — rich. He sees himself as bringing back the glory not of the Soviet Union, but whenever it was that America was previously Great.
If that doesn’t worry you, it should. ~ David W. Rudlin, Quora
*
ELENA GOLD ON RUSSIA
The historical misconception about Russia that still impacts current events the most is not understanding the difference between the Soviet Union and the Russian Federation.
Somehow, Putin managed to implant into the Western informational space the idea that Moscow should be allowed to have a say in the affairs of Russia’s neighbors — based on the fact that international relations during the times of the USSR were dictated by the Kremlin.
But Russia isn’t the Soviet Union. The Soviet Union had twice the population of Russia — so, more former Soviet citizens live outside Russia than inside it.
Many leaders still see Russia as a “mighty power” based on what was happening in the Soviet times.
But today’s Russia is way behind what the USSR was. Today’s Russia can’t even build its own planes without foreign parts. It can’t feed itself. It can’t clothe itself. It can’t even make enough weapons to fight its war. It needs things made abroad.
Putin is a sad old man, dreaming of the glory of Stalin. A former communist, who betrayed the communist ideals and built a corrupt mafia state instead.
Russia is rotten to the core, compared to the Soviet Union. It’s a state that has no idea of the future. Putin’s ideal is the USSR past.
The past that the world has moved on from, decades ago. The past that cannot be recreated.
Putin’s Russia cannot survive, because it clings to the past. Russia’s collapse is inevitable. ~ Elena Gold, Quora
Martin Lux:
Russia takes it for granted that they should keep expanding. 550 years ago Russia was a rather small kingdom around Moscow.
*
HAMAS IS WINNING THE CULTURE WAR

The furious torrent of lies and manipulations emerging from Gaza should surprise no one. This is Hamas’ Battle of Berlin. And the terrorists are winning.
Just look at the image of Evyatar David, who was kidnapped from the Nova festival, crouching in a tunnel, emaciated and forced to dig his own grave.
It is no coincidence that Hamas brazenly chose to publish photographs of the skeletal Jewish prisoners it is deliberately starving in its war tunnels after a week of successfully marketing its blood libel about how the IDF is starving Gazans. If you’ve ever seen a Mafia movie, you already know the trick. Having groomed a new recruit for months, the mobsters have one final test: whack a guy, rob a joint, shoot up an innocent girl with heroin, do something so blatantly immoral and evil that it will tie you down to a life of crime forever. Once you’ve crossed the bright line into blatant immorality, in front of witnesses, on what basis will you object to the boss’s next order?
Hamas is now doing the same thing, with Western liberals in the role of the new recruits. For weeks, the terrorist group passed off photographs of people suffering from genetic disorders as evidence of Israeli cruelty; at the same time, it prevented ample aid supplies from reaching those very people by hijacking more than 90 percent of U.N. aid trucks, selling family aid packages—which enter Gaza for free—at insane markups, and shooting into crowds of people trying to access aid provided by a U.S.-backed humanitarian consortium.
None of these tactics is new or in the least bit original to the current Gaza war. What’s new is how brazenly Hamas is doing it, and how avidly European presidents, editors, intellectuals, activists, and even some Jews are closing their eyes and ears and going along for the ride, parroting the Hamas propaganda line like toddlers.
Which is precisely what Hamas was counting on. The point of the starvation photographs wasn’t to prove anything; to the contrary, the evidence that the photographs did not show children suffering from a lack of food was right there in the picture frame, in the form of the well-fed-looking mothers and other relatives. It doesn’t take any special feat of research or in-depth knowledge of conditions on the ground in Gaza or acquaintance with the medical literature on starvation to know that mothers feed their children first. That’s why in pictures from actual famines—of which there are unfortunately many thousands of examples in the photo departments of every big-city newspaper and major magazine on the planet—you see skeletal mothers along with skeletal babies. Not in Gaza, though.
That’s because the point of the photographs was to provide onlookers with a fig leaf of an excuse to embrace the narrative of a terrorist organization whose aims are, in fact, openly genocidal—and which had done its best to live up to its promises by shooting, gang-raping, and stabbing thousands of Israelis on Oct. 7, 2023, setting entire families on fire in their homes, and live-streaming the proceedings, before dragging off more than 250 captives to their internationally sponsored dungeons in Gaza, to be beaten, tortured, and starved. It’s no contest. Hamas makes the Mafia look like choirboys.
Openly siding with monsters like that is a heavy lift. Except, what else is a good liberal-progressive person to do? All you need to do is look at the pictures and feel empathy with a hungry child. Only a monster could possibly confront such undeniable evidence of human suffering and respond with calorie counts and numbers of aid trucks and arguments about who did what to whom first. Do you really want to forfeit your membership in polite society to defend an army that starves children, regardless of preexisting medical conditions? Nice law office/magazine/newspaper/university/career in scholarship or the arts/personal reputation you’ve got here. It would be a shame if anything bad were to happen to it, because you supported a genocide.
When top brass at The New York Times learned that the emaciated child their photo editors had placed on the cover of the newspaper was not in fact suffering from a lack of food, but rather from a constitutional inability to digest food, they directed the staff to come up with a different photograph, one showing the effects of starvation rather than an underlying disease. Instead, they were presented with more photographs of children with genetic disorders. And yet the inability of Times editors and stringers to come up with even a single photograph of a starving child — images that, according to the Times’ own reporting about famine and mass starvation in Gaza, should have been plentiful — didn’t cause the paper to change its coverage one iota.
There was still mass starvation in Gaza, you see—because of course there is. Because everyone online says there is. Because Israel lies. Because the Hamasniks inside the paper would quit. As it was, the Times had to hunker down as masked “protesters” festooned the paper’s shiny corporate headquarters with ugly, insulting graffiti. Who wants that?
Then came the final initiation: Look at one more photograph, this time of an actual victim of actual forced starvation, which resulted from deliberate and well-documented Hamas policy—and ignore it. Why? Because you don’t really care about starvation or about facts or about justice and peace. You care about us, and you’ll think whatever we tell you to think and do as we command. The world, by and large, has obeyed. Whatever happens in Gaza this week or next month or next year, Hamas’ victory in the war for public opinion is complete.
How is a terror group that fought one of the world’s mightiest armies for nearly two years after livestreaming hundreds of gruesome snuff films and posting them on YouTube possibly winning the war? Why is it ascendant, supported by the leaders of Europe, the Pope, and the editors and writers of nearly every newspaper and magazine and broadcast network on the planet, even when the smallish territory it holds has been decimated, and so many of its masterminds and foot soldiers alike have been eliminated in battle? And what to make of the baffling sympathy of so many who ought to know much better?
The answer is as simple as it is stark: Hamas is winning because Hamas understands that the story polite, educated, kindhearted people in Tel Aviv and Berlin and New York and Paris tell themselves about the world is a flimsy fairy tale to which they’re hopelessly attached, to the exclusion of anything and everything else, obvious atrocities included—and that the only way to maintain that fairy tale is denial.
The fairy tale goes something like this: All people are basically good, and all cultures are basically the same. Sure, they differ in curious and charming ways—this one has a festival involving song and dance, that one flies kites and lights lanterns during the winter solstice—but they share a profound human truth: namely, that human beings value nothing more than their own well-being and that of their children. All of which makes the species a cheerfully rational lot: scratch a human, across all nations and cultures, and you’ll find the same deep desire for a nice home, a decent income, and opportunities for upward mobility.
But every fairy tale needs its ogres, and so we have those baddies we call “extremists.” They, too, come in many shades. There’s the menacing Itamar Ben-Gvir, who wants to chew up Arab babies, or the villainous Yahya Sinwar and his band of Oct. 7 marauders. But the differences don’t really matter much. The only thing to know about the baddies is that they stand in the way of the common dream of peace and prosperity that unites us. Which is why choosing sides, in fairy-tale land, is an intolerable act of prejudice. The global village can have no sides because we are all the same—and all equal.
Look at Israel through this warped prism, and a blissfully simple image emerges: Some Palestinian ogres did something bad, in response to which the Israeli ogres did something even worse. To end this vicious cycle of misunderstanding and hatred, we need to stop the war (war is bad!), engage in diplomacy (diplomacy is good!), and act to the best of our ability to punish the ogres on both sides and check their future excesses.
If only we provide enough foreign aid dollars, the Palestinian people (good and noble and peaceful!) will finally shake off the yoke of Hamas, and the Israeli people (well groomed and technologically competent!) will kick the brutish Bibi to the curb and learn to live in peace with their pleasant Palestinian neighbors.
It’s an inspiring story, swirling with the spirit of the Enlightenment. It’s also patently false.
Israel is not fighting against Hamas, a small and malicious terror group tyrannically oppressing its own population. Israel is at war with the Palestinian people, who have adopted the 20th-century political program of replacing the Jewish nation-state of Israel, through whatever combination of violent and “peaceful” means, and of assuming some form of sovereignty there.
We know this from the accounts of hostages who were held by civilians as slaves, forced to cook and clean and serve entire families while small children and elderly women alike mocked them and added to their suffering. We know this from the accounts of Oct. 7, 2023, showing that the majority of the ghouls who entered Israel to pillage, murder, and rape weren’t members of highly trained terror units but ordinary Gazans who relished the opportunity to inflict pain and suffering on the Jews next door.
And we know this from the stories of IDF soldiers returning from Gaza and reporting that every home in the strip, more or less, prominently features a photograph of Jerusalem emblazoned with the promise to erase the Zionist presence there.
These sentiments aren’t, as some of our self-professed intellectual and moral betters tried to assert, the result of generational trauma or the occupation or the psychological complications of living in a so-called open-air prison, as The Atlantic Monthly once called Gaza prior to Israel’s departure in 2005. They are, quite simply, the tenets of an entirely coherent belief system, one whose adherents repeat and elucidate, clearly and honestly, every chance they get—one that began as an offshoot of postcolonial Pan-Arab nationalism and has since become subsumed into the older and broader currents of the Muslim faith.
Islam favors the forceful conversion and/or subjugation of nonbelievers, as even the quickest glimpse at its history will tell. As such, the idea of a Jewish enclave in Muslim lands—especially one that has successfully resisted its onslaughts for the better part of a century and that now controls one of its holiest cities, while excelling at science and technology and warfare—is unacceptable. The infidels must be killed at any cost; nothing else matters.
This worldview, of course, strikes compassionate Western contemporaries as impossibly grotesque. That like beliefs have guided much of the species through much of the course of human events, in Islamic countries and in the West alike, matters to them not a whit. They’ve no use for history’s mightiest engine, the one fueled by the clash of incompatible ideas, cultures, beliefs, and aspirations. After all, we’re no longer apes. We’re human beings, proudly godless and infinitely perfectible—and, above all, peaceful by nature.
So what if Rome destroyed Carthage and the Barbarians sacked Rome, or if Raynald of Chatillon prevailed in the Battle of Montgisard before being beheaded by Saladin, nearly two decades later, in Hattin? Such unremitting clashes between mutually exclusive forces belong in the yellowing pages of antiquity, not on the front pages of our newspapers. And yet, as we lull ourselves with soothing stories of our shared humanity and settlements on Mars, and possible immortality, thanks to the wonders of gene editing and whatever technological innovations are coming down the pike, our foes march on. They march not only on the Jews—the canaries in the coal mine—but also on London and Paris and Sydney and Frankfurt, demanding acquiescence to ways of living and beliefs that we once rightfully condemned as barbaric. Or else.
Go visit a public park in Birmingham or London or attempt to buy lunch in downtown Athens or Malmö, and it’s obvious that Europe is dying—its native populations, folkways, religions, and languages being replaced by people whose relationship with their host countries is marked most loudly by resentment, mixed with contempt. Terrified European elites, presiding over shrinking populations and dwindling resources, know no other way but to submit, while justifying their submission through ever-more elaborate rituals of pretense and denial.
Israel has no such privilege. To survive, it has just one path forward. First, it must realize that as land and humiliation are the only two viable currencies in the Middle East, it must reoccupy Gaza, reviving President Trump’s proposal to relocate the strip’s inhabitants to Egypt, the Gulf states, Ireland, France, and wherever else will take them. Relocation of populations as the result of war is not a barbaric offense practiced only by Nazis, as opponents shout; it is the common outcome of nearly every war in history.
If shipping Gazans out of Gaza as a consequence of their defeat is somehow Nazi-like, then the list of Nazi states on the planet is long indeed: China, India, Pakistan, Vietnam, Poland, Germany, the Czech Republic, Ukraine, Russia, Austria, and France, for starters. The United States sent hundreds of thousands of Loyalists fleeing to Canada in the aftermath of the Revolutionary War, and not a single one has yet received compensation for their losses. That’s war.
Second, Israel must reject any notion of a future settlement that is absent a complete and total Palestinian surrender, not just in Gaza but in the West Bank as well. A Palestinian state is not the answer to the problems of either Jews or Arabs. It is a way for the world to guarantee a violent and bloody future for everyone in the region, by snatching victory from the jaws of defeat for Hamas.
History has been very clear in its verdict that there is room at best for one state between the river and the sea, as the Palestinians and their Western partisans like to put it. Any sane person deciding between the existence of the State of Israel, a technologically advanced liberal democracy as well as the region’s leading military power, and the various Palestinian principalities that owe their existence to outside charity, should have an easy time deciding which state that should be. If your answer is Palestine, then you are either an Islamist or a nihilist. Either way, your values are not mine—especially given the scale of the murdering that your answer supposes, and the human desert that you propose to build on the resulting pile of bones.
Third, Israel must resist the enormous pressure that will result from European capitals, because the pressure is precisely the point. Britain is already facing a bubbling revolt of citizens enraged by decades-long concealment of Pakistani grooming gangs raping hundreds of defenseless young women, as well as by an even more insidious effort to arrest and silence people who point out the obvious on social media. The more vocal and violent the anti-Israel revolt in Europe grows, the more likely it is to force the continent’s feckless leadership into a reckoning that their policy of welcoming migrants is about to backfire in a very painful way.
It is the hope of European elites that by throwing Israel over the side of the ship, they might buy themselves perhaps another decade or two of relative social peace, during which they can believe whatever they want about human nature while eating gobs of Nutella. I believe these comforting assumptions about the efficacy of sacrificing the Jews will prove mistaken.
Either way, Israel can’t be part of it. Dying for Europe’s delusions of how it might buy peace with its own barbarians was the unavoidable fate of European Jews during World War II, an experience that made the necessity of a Jewish state clear to every sentient Jew and sympathetic or guilt-driven Western person on the planet. I am sad to say that our own century’s barbarians show no signs of being any friendlier to Jews than their European predecessors were.
Thankfully, having a state means that Jews are no longer compelled to sacrifice ourselves for the convenience of Europeans or the global left or deluded right-wing American podcasters or The New York Times or anyone else. Every other consequence of our national existence, however brutal or bloody, is painfully small by comparison.
https://www.tabletmag.com/sections/israel-middle-east/articles/winning-the-culture-war
*
MASS SHOOTINGS AND MENTAL HEALTH PROBLEMS
After a random shooting in Austin, Texas, left three people dead, Austin Police Chief Lisa Davis said that the suspect had past criminal offenses and “serious issues.” The 32-year-old suspect was arrested after police found him naked, holding a Bible and claiming he was Jesus.
“There were some serious failures here,” Davis said.
Days earlier, a 30-year-old man had fired hundreds of shots at the US Centers for Disease Control and Prevention in Atlanta, killing a police officer. The gunman, who took his own life, had spoken about suicide and had reportedly reached out for mental health assistance ahead of the attack.
And just a few days before that, another gunman with a history of mental health problems shot and killed four people in a Manhattan skyscraper before turning the gun on himself.
Suspects in several recent high-profile attacks were described as having mental health problems, but experts say that doesn’t mean their mental health issues are to blame for the killings.
No mental health system is built to catch such rare and explosive crimes, experts said. But the potential solution is one that many politicians won’t have the stomach to address: limiting access to guns.
“Often we tell the mental illness story because it’s the most obvious or fits into our stereotypes and if we focus only on that, then we’re missing all of these other factors which are much more predictive of mass shootings,” said Dr. Jonathan Metzl, the director of the Department of Medicine, Health and Society at Vanderbilt University and author of “What We’ve Become: Living and Dying in a Country of Arms.”
“Having a mental health problem is not predictive of mass shootings,” Metzl said. “Many have symptoms of mental illness, that’s definitely true, but that’s a different argument than saying that mental illness caused the mass shooting.”
Violence is not a listed symptom of mental health issues, including major depression or schizophrenia, Metzl noted. “In fact, there’s no mental illness whose symptoms are violence toward others or shooting other people,” he said.
Many people in the United States have a mental illness, according to the US Centers for Disease Control and Prevention — 1 in 5 adults experience a mental illness in a given year — and only a “microscopic number of them go on to hurt anyone else,” Metzl said.
People with mental illness are much more likely to be the victim, rather than the perpetrators, of violence, studies show. And if a person with mental health issues hurts anyone with a gun, it’s most likely themselves, said Dr. Jeffrey W. Swanson, a professor in psychiatry and behavioral sciences at Duke University School of Medicine who has also written extensively about gun violence and mental health.
Mental illness is a strong causal factor in suicides, studies show, but only about 3% to 4% of violent acts are attributable to serious mental illness alone, Swanson’s research showed. Even within gun violence, mass shootings are unusual: Of the roughly 150,000 people shot in the United States every year, only about 1% to 2% were victims of mass shootings, research shows.
“If you think about it, we certainly have a problem with gun violence in the US. We have a problem with mental illness. Those are two really big public health problems that intersect on their edges, but mental illness, it’s not exactly the place you would start if you just wanted to try to stop so many people from dying in mass shootings,” Swanson said.
Mass shootings generally don’t stem from a single problem, but several factors can increase the risk: a history of violence, access to guns, violent social networks, misogyny and substance abuse all make the list.
“Most perpetrators of mass shootings had domestic violence histories or targeted family or intimate partners,” said Lisa Geller, senior adviser for implementation at the Center for Gun Violence Solutions at Johns Hopkins Bloomberg School of Public Health.
“Domestic violence, more than any other issue really played a critical role in mass shootings,” said Geller, who wrote a 2021 study about the role of domestic violence in mass shootings.
And while some may argue mass shootings would be prevented if attackers had better access to the mental health system, J. Thomas Sullivan, professor emeritus at the University of Arkansas at Little Rock William H. Bowen School of Law, said he doesn’t think that’s right, either.
“Putting the burden on the mental health system to provide the help that would be needed to stop these shootings is an inappropriate way to shift the blame,” said Sullivan, who has written extensively about the topic.
Sullivan said several of the mental health experts he has taught said their patients issue threats all the time. “But not everybody can accurately predict when somebody who is making a threat is actually going to follow through,” Sullivan said.
Sullivan and other experts CNN spoke with said the more effective solution would be to focus on the tools used to harm others.
“I think a lot of people aren’t going to like to hear this, but the real problem is access to guns,” Sullivan said.
There are many responsible gun owners, himself included, Sullivan said, but “it’s very difficult to stop someone from firing a gun if they’ve got one.”
Gun restrictions are what countries such as Australia and New Zealand turned to after a mass shooting. The strongest risk factors for violent behavior in general, Swanson said, are being young and being male. “But you know you can’t round them up, right?” Swanson said. Some countries have decided “that the idea that everyone should have easy access to a firearm is just too dangerous. So they broadly limit legal access to guns,” he said.
“Until neuroscientists come up with the magic molecule to eliminate injurious behavior, in the meantime it’s important to focus on the lethal means issue,” Swanson said.
After Connecticut increased its enforcement of its red flag law, research found it was associated with a 14% reduction in the state’s firearm suicide rate. In California, gun violence restraining orders have been credited with deterring at least 58 potential mass shootings and other types of gun violence in that state, including suicide, research shows.
Polls have shown that a majority of Americans favor these kinds of restrictions, but the political reality does not always reflect that.
In Texas, where three people — a store employee, a 4-year-old and her grandfather — were killed in the Austin Target parking lot on Monday, the legislature recently passed a law that makes such gun restrictions illegal.
https://www.cnn.com/2025/08/13/health/mental-health-shootings
*
“GODFATHER” OF AI ON THE ONLY WAY TO ENSURE AI IS SAFE
Geoffrey Hinton, known as the “godfather of AI,” fears the technology he helped build could wipe out humanity — and “tech bros” are taking the wrong approach to stop it.
Hinton, a Nobel Prize-winning computer scientist and a former Google executive, has warned in the past that there is a 10% to 20% chance that AI wipes out humans. On Tuesday, he expressed doubts about how tech companies are trying to ensure humans remain “dominant” over “submissive” AI systems.
“That’s not going to work. They’re going to be much smarter than us. They’re going to have all sorts of ways to get around that,” Hinton said at Ai4, an industry conference in Las Vegas.
In the future, Hinton warned, AI systems might be able to control humans just as easily as an adult can bribe a 3-year-old with candy. This year has already seen examples of AI systems willing to deceive, cheat and steal to achieve their goals. For example, to avoid being replaced, one AI model tried to blackmail an engineer about an affair it learned about in an email.
Instead of forcing AI to submit to humans, Hinton presented an intriguing solution: building “maternal instincts” into AI models, so “they really care about people” even once the technology becomes more powerful and smarter than humans.
AI systems “will very quickly develop two subgoals, if they’re smart: One is to stay alive… (and) the other subgoal is to get more control,” Hinton said. “There is good reason to believe that any kind of agentic AI will try to stay alive.”
That’s why it is important to foster a sense of compassion for people, Hinton argued. At the conference, he noted that mothers have instincts and social pressure to care for their babies.
“The right model is the only model we have of a more intelligent thing being controlled by a less intelligent thing, which is a mother being controlled by her baby,” Hinton said.
Hinton said it’s not clear to him exactly how that can be done technically but stressed it’s critical researchers work on it.
“That’s the only good outcome. If it’s not going to parent me, it’s going to replace me,” he said. “These super-intelligent caring AI mothers, most of them won’t want to get rid of the maternal instinct because they don’t want us to die.”
Hinton is known for his pioneering work on neural networks, which helped pave the way to today’s AI boom. In 2023, he stepped down from Google and started speaking out about the dangers of AI.
Not everyone is on board with Hinton’s mother AI approach.
Fei-Fei Li, known as the “godmother of AI” for her pioneering work in the field, told CNN on Wednesday that she respectfully disagrees with Hinton, her longtime friend.
“I think that’s the wrong way to frame it,” Li, the co-founder and CEO of spatial intelligence startup World Labs, said during a fireside chat at Ai4.
Instead, Li is calling for “human-centered AI that preserves human dignity and human agency.”
“It’s our responsibility at every single level to create and use technology in the most responsible way. And at no moment, not a single human should be asked or should choose to let go of our dignity,” Li said. “Just because a tool is powerful, as a mother, as an educator and as an inventor, I really believe this is the core of how AI should be centered.”
AI is accelerating faster than expected
Many experts believe AIs will achieve superintelligence, also known as artificial general intelligence, or AGI, in the coming years.
Hinton said he used to think it could take 30 years to 50 years to achieve AGI but now sees this moment coming sooner.
“A reasonable bet is sometime between five and 20 years,” he said.
While Hinton remains concerned about what could go wrong with AI, he is hopeful the technology will pave the way to medical breakthroughs.
“We’re going to see radical new drugs. We are going to get much better cancer treatment than the present,” he said. For instance, he said AI will help doctors comb through and correlate the vast amounts of data produced by MRI and CT scans.
However, Hinton does not believe AI will help humans achieve immortality.
“I don’t believe we’ll live forever,” Hinton said. “I think living forever would be a big mistake. Do you want the world run by 200-year-old white men?”
Asked if there’s anything he would have done differently in his career if he knew how fast AI would accelerate, Hinton said he regrets solely focusing on getting AI to work.
“I wish I’d thought about safety issues, too,” he said.
https://www.cnn.com/2025/08/13/tech/ai-geoffrey-hinton
*
AFTER THE SPIKE: WHAT THE SLOW AND STEADY DEPOPULATION MEANS FOR THE WORLD
In 2012, 146 million children were born. That was more than in any prior year. It was also more than in any year since.
Millions fewer will be born this year. The year 2012 may well turn out to be the year in which the most humans were ever born—ever as in ever for as long as humanity exists.
No demographic forecast expects anything else. Decades of research studying Africa, Asia, Europe, and the Americas tell a clear story of declining birth rates.
The fall in global birth rates has lasted centuries. It began before modern contraception and endured through temporary blips like the post-World War II baby boom. For as far back as there are data to document it, the global birth rate has fallen—unsteadily, unevenly, but ever downward.
So far, falling birth rates have merely slowed the growth in humanity’s numbers. So far.
The view from the top of a Spike
There are quite a lot of people in the world. But that hasn’t been true for long. Ten thousand years ago, there were only about 5 million of us. That’s as many people as today live in the Atlanta metro area, and only a fraction of the number who live in Bangkok, Beijing, or Bogotá. A thousand years ago, our numbers had grown to a quarter billion.
Two centuries ago, we passed 1 billion for the first time. One of every five people who have ever lived was born in the 225 years since 1800. A populous world, on the scale of humanity’s hundred-thousand-year history, is new.
Getting big happened fast. And almost as soon as it happened, it’s about to be over. In the shorter run—soon enough to be seen by people alive today—humanity’s global count will peak. There’s a gap between the year of peak births and the year of peak population—a gap that we now live within—because the annual number of births, though falling, has not yet fallen far enough to reach the annual number of deaths. That will happen within decades.
Different experts predict slightly different timetables for when. The demographers at the UN believe it is most likely to happen in the 2080s. The experts at the International Institute for Applied Systems Analysis in Austria place the peak a little sooner in the same decade. The Institute for Health Metrics and Evaluation at the University of Washington projects a peak even sooner, in the 2060s.
These dates aren’t exactly the same. But on the timeline of humanity, a difference of twenty years is not really a difference. Each group projects that birth rates will keep falling, so each group projects that we peak this century.
What happens after?
Figure 1.1 plots humanity’s path. We call this picture—of humanity’s past, present, and possible future—the Spike.
We first presented the Spike in a pair of publications in 2023: an opinion article in the New York Times and a matching research paper that filled in the scientific details. We asked: What if birth rates stay on their current course? The answer is that if they do, then humanity will depopulate. We do not mean that humanity would stop growing, reach some plateau, and stabilize near our present numbers. Every decade after turning the corner, there would be fewer of us. Within three hundred years, a peak population of 10 billion could fall below 2 billion.
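The arithmetic behind that three-century figure is simple geometric decay: at 1.6 children per two adults, each generation is 80% the size of the one before it. Here is a back-of-the-envelope sketch in Python (my own illustration, not the authors’ model), assuming a flat 30-year generation length and ignoring age-structure momentum, both deliberate simplifications:

```python
# Crude generational projection (illustrative only).
# Assumptions: 1.6 children per 2 adults (ratio 0.8 per generation),
# a flat 30-year generation length, no age-structure momentum.
RATIO = 1.6 / 2          # each generation is 80% the size of the last
GEN_YEARS = 30

population = 10e9        # peak population of 10 billion
years = 0
while population >= 2e9:
    population *= RATIO  # shrink by one generation
    years += GEN_YEARS

print(f"Falls below 2 billion after roughly {years} years")
# → Falls below 2 billion after roughly 240 years
```

Under these crude assumptions the population crosses below 2 billion after eight generations, about 240 years, comfortably inside the “within three hundred years” window. Real cohort-component projections account for age structure and shifting rates, which is why they decline more slowly at first, but the destination is the same as long as the ratio stays below replacement.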
The Spike is not a product of outlandish imagination. The possibility it charts does not assume some shift or reversal in the way people live and behave. The Spike is what would happen if the whole world one day had the sort of birth rates that are already common in many places. In that future, like now, some people would have a few children. Some would have none. And many would have one or two.
We generated the Spike by projecting a future in which, globally, there were 1.6 children per pair of adults, a statistic that matches the current U.S. average. But, as we’ll show soon, something like the Spike will happen as long as the worldwide average stays below two children per pair of adults. Below two children is what matters, because it means that one generation isn’t replacing itself in the next generation. Is that kind of future likely?
The United States’ average of 1.6 kids is not exceptional. The birth rate is below two in Mexico, Canada, Brazil, Russia, Thailand, and many other countries. The European Union as a whole is at 1.5. The two most populous countries, India and China, are both below two. A birth rate below two is found within each U.S. state; when looking only among U.S. Blacks, whites, or Hispanics; and in every Canadian province.
You stand now at the top of the Spike with 8 billion others. The story of the future starts with understanding the fact that most of those 8 billion others don’t (or didn’t, or won’t, once they grow up) aspire to parent very many children.
One of those people is Preeti. In 2022, Preeti had a baby in a crowded government hospital in India. Her baby was born very small. So after a nurse rolled up a cart to weigh and assess her baby girl, Preeti was brought to the hospital’s new program for underweight newborns, called “Kangaroo Mother Care.” Preeti and her baby were assigned one of the program’s ten beds in the next room.
Preeti lives in Uttar Pradesh, a populous, poor state in the north of India. She traveled to the hospital from a half-mud, half-brick home in a small village. The nurses down the hall don’t have neonatal incubators, which are the standard treatment for underweight babies born in the rich places of the world. But they do have proven, low-cost procedures to keep tiny babies warm, fed, and alive.
The baby was Preeti’s first. She expects to have one more. She already loves this girl. But it would be good, Preeti says, if the next one were a boy so she can “get the operation”—meaning sterilization surgery, having done her duty to have a boy.
Preeti’s hope for two children is normal now, even in a poor, disadvantaged state in India. This book tells her story and her nurses’ stories. Their choices, their lives, are also part of a wider story. A story in which women in rural Uttar Pradesh (where many women are poor, haven’t had much schooling, and marry young) choose two children is a story in which many women, everywhere, choose even fewer. Preeti is one eight-billionth of the story that this book tells: Choosing fewer children is becoming normal, everywhere.
Rural India might seem like the middle of nowhere to someone who has never been to Uttar Pradesh. But to an economist or demographer, India is in the middle of the world’s statistics: middle in income, middle in life expectancy, and middle in birth rates. And what happens in India is important for the planet as a whole. At some point between when Preeti’s baby was born and now, India became the world’s most populous country. If there’s one thing that many non-Indians know about India, it’s that there are a lot of people there: in 2025, 1.4 billion.
What fewer people realize is that India is on a path to a shrinking population, a corner that China turned recently and Japan turned in 2010. That’s because many women like Preeti plan to have one or two children. In the most recent national data from India, women were having children at an average rate of two per two adults. Because that data point was from 2020, the average has almost certainly fallen to a little bit less than two by 2025. But even back in 2020, those who had been to secondary school (a growing fraction of girls and women in India) averaged 1.8, which matches the average for all U.S. women in 2016. The hospital where Preeti gave birth is in an especially disadvantaged state of India. But young women there said that they want about 1.9 children, on average. Small families are the new normal.
What’s so normal about normal?
For many people, a society where women average 1.8 or 1.9 children would feel familiar. But such familiarity is deceiving.
Normalcy will create something unprecedented. Birth rates that are normal in most countries today will lead to an unfamiliar future of global depopulation.
If today’s normal stays normal, then big changes are coming.
And yet, looking around, you might not notice the difference between a society on the track toward depopulation and one headed for a stable future. Figure 1.2 diagrams two (of many) possible futures, with different fractions of people choosing zero, one, two, three, or four children. The taller a bar is, the larger the fraction of adults who have that many children. On the right is a distribution of family sizes that would make for a stabilized population, neither growing nor shrinking. On the left is a depopulating future, with 1.6 births per two adults, on average.
How different are the left and the right? It depends on what we’re asking. The bars look only a little different, but their consequences are very different. Their implications are as different as a steady, stable global population, on the right, and a decline toward zero, on the left. Both include some families with a few children, plenty with none or one, and a bunch with two. Both look pretty ordinary if you live in a place like Austin, Texas, where we do. Professional statisticians could tell the difference, if they had all the data. But could you tell the difference on a visit to the park, the grocery store, the pool? Could you see the difference at school drop-off, at the coffee shop, or jogging around the lake? Probably not. And that means the patterns of family life leading to a profoundly different future can slip past our notice.
We may not feel it. We may not see it. But we teeter at the tip of the Spike. Our times, when many people are alive, may prove to be unlike the entire rest of human history, past and future—if what is normal today persists.
Is this story four-fifths over?
Birth rates around the world vary in interesting ways: across countries and provinces, by race and religion, by education and income. In the United States, teen births are most likely to happen in January, but births to married moms are most likely in May. In India, Dalits—the disadvantaged caste group formerly called “untouchable”—tend to have slightly more children than people born into more privileged castes. The varied history is fascinating, too: France’s fertility fell fast in the 1700s, long before its neighbors’ did and long before hormonal birth control or latex condoms were invented. Experts have written thousands of articles about the details in scholarly journals.
But those detailed differences don’t help us understand what is likely to happen. We learn what is likely to happen by seeing what people around the world have in common. Every region on Earth today either has low birth rates, like China, India, or the United States (the three most populous countries), or has falling birth rates, like most African countries. If humanity stays the course it is now on, then humanity’s story would be mostly written. About four-fifths written, in fact. Why four-fifths? Today, 120 billion births have already happened, counting back to the beginning of humanity as a species, and including the births of the 8 billion people alive today. If we follow the path of the Spike, then fewer than 150 billion births would ever happen. That is because each future generation would become smaller than the last until our numbers get very small.
Right about now it would be understandable to think, “But come on! This is all too much confidence about an unsustainable trend! Surely people won’t keep having fewer and fewer children forever.”
Some trends are indeed unsustainable, and it would be a mistake to extrapolate them indefinitely. We’re not making that kind of mistake here. People around the world could continue to have small families. Not smaller than today. Small like today. They could continue, for a long time, to make individual decisions that add up to 1.4 or 1.6 or 1.8 children on average. A depopulating future would arise from steady birth rates at these levels.
How long depopulation could continue depends on what people choose. Our numbers will fall decade by decade, as long as people look around and decide that small families work best for them. That’s all it would take. There would never be more than 150 billion humans, if families continue to have a bit less than two children each, on average. So if—if—humanity stays this course, then there would be only 30 billion more of us for the rest of human history. How exactly might we fizzle out in that future? Should anyone literally expect that humanity will depopulate down to the last two people?
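The arithmetic behind "only 30 billion more" is a geometric series: if each generation is a fixed fraction of the size of the last, total future births converge to a finite sum. Here is a minimal sketch of that logic; the 4-billion-births-per-generation starting point and the steady 1.8 rate are assumed round numbers for illustration, not the book's figures:

```python
def total_future_births(births_this_generation, rate_per_two_adults,
                        floor=1_000_000):
    """Sum births cohort by cohort, where each cohort is
    (rate / 2) times the size of the one before it."""
    ratio = rate_per_two_adults / 2.0
    total, cohort = 0.0, births_this_generation * ratio
    while cohort > floor:  # stop once cohorts become negligibly small
        total += cohort
        cohort *= ratio
    return total

# With ~4 billion births per generation and a steady 1.8 rate, the
# series converges near births * 0.9 / (1 - 0.9), about 36 billion.
remaining = total_future_births(4e9, 1.8)
```

The point of the sketch is only that a sub-replacement rate, held steady, caps the number of humans who will ever be born at a finite total; lower assumed rates shrink that total sharply.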
No. In a world that sheds 8 billion people, something big would eventually break and knock us off this path, for good or for bad. We would not ride the precise math of the Spike down to the last few million of us.
The off-ramp from the Spike could be sharply down. The end could be some catastrophe that a larger population might have survived but a smaller population couldn’t. We have a chapter about this possibility. Or the off-ramp could be up. Maybe birth rates would rebound, after a disaster or disintegration that staggers us.
How? If progress halts or reverses, if life becomes worse, then it would be like we moved toward humanity’s poorer past. People had more babies in the poorer past than they do today and tend to have more babies in poorer countries than in richer countries. So perhaps the off-ramp is some disaster that reverses social, technological, or political progress, knocking backward humanity’s millennia-long history of struggle and growth.
That might mean higher birth rates, and it might even stabilize the population, but it wouldn’t be good.
Might matters reverse automatically, without big changes? The short answer is that that’s unlikely. A reversal would break a centuries-old trend of declining birth rates. That trend is founded on social and economic changes that most of us view as progress and that none of us should expect to disappear.
We can learn about the odds of an automatic rebound from the histories of countries where birth rates have fallen low. Since 1950, there have been twenty-six countries, among those with good enough statistics to know, where the number of births has ever fallen below 1.9 births in the average woman’s full childbearing lifetime.
Never, in any one of these twenty-six countries, has the lifetime birth rate again risen to a level high enough to stabilize the population. Not in Canada, not in Japan, not in Scotland, not in Taiwan. Not for people born in any year. In some of these countries, governments believe they have policies to promote and support parenting. But all of them continue to have birth rates below two. A 0-for-26 record does not mean that things couldn’t change, but it would be reckless to ignore the data. If a reversal happens, it will be because people decided they wanted to reverse it and then worked to make it happen, not because automatic stabilizers kicked in.
It takes two (to ever have a stable global population of any size)
Perhaps even at the end of this book you will not agree that a world of 5 billion flourishing people could be better than a world of 500 million equally well-off people. But do you think the size of the population should ever stabilize at any level—even a level much smaller than today’s—rather than dwindling toward zero?
Some inescapable math. For stabilization to ever happen at any level—even to maintain a tiny, stable global population—the same math applies: For every two adults, there must be about two children, generation after generation.
Wait, two? Exactly 2.0? Two for everybody? No, the next chapter explains. For now, it is enough to see that any population, large or small or tiny, continues to shrink if there aren’t at least two children for each two adults. Dwindling toward zero is neither balance nor sustainability.
Notice what this inescapable math implies: Once the global average falls below two, which is a marker that we are likely to pass in a few decades, stabilizing the world population would require the global birth rate to increase and then to stay higher permanently. That has never happened before in recorded demography.
Maybe you feel confident that someday, somebody good and powerful will figure it out. Maybe you are more optimistic than the projections in the Spike that, after some decades or centuries of depopulation, humanity will manage to pull its birth rates back up to two. Even if you think so, read on.
For one, you might be wrong. This book will show that some popular beliefs about the history of how governments and movements have shaped birth rates are wrong.
For another, even if the global population will eventually recover, it makes a big difference when the recovery begins. Here are the stakes, even in the optimistic case of an unprecedented recovery: Each decade of delay in starting the rebound causes the final, stabilized population size to be 8 percent smaller, ever after.
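The 8 percent figure can be roughly reconstructed from the shrinkage rate itself. If each two adults average r births, each generation is r/2 the size of the last; spread over a generation length, that implies a steady per-decade shrink factor. A sketch, where the ~28-year generation length and the 1.6 rate are illustrative assumptions rather than the authors' model:

```python
def per_decade_shrinkage(rate_per_two_adults, generation_years=28):
    """Fraction of the population lost per decade if each
    generation is (rate / 2) the size of the previous one."""
    factor = (rate_per_two_adults / 2.0) ** (10.0 / generation_years)
    return 1.0 - factor

# At a rate of 1.6 births per two adults, each generation is 0.8x
# the last, which works out to losing roughly 8 percent per decade.
loss = per_decade_shrinkage(1.6)
```

Under these assumptions, every decade of delayed rebound means the population has shrunk by about another 8 percent before stabilization begins, which matches the stakes the authors describe.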
It is time to pay attention
Do you remember when you first understood that climate change is a seriously big deal? Most of us born before 1990 went through school without much awareness. Your authors grew up in a time when schoolchildren learned about the problems of an ozone hole, acid rain, and depleted tungsten supplies, not carbon emissions. The first book about climate change for a general audience, Bill McKibben’s The End of Nature, was not published until 1989. But the basic facts have been known for a lot longer than the social movement has been around. Congress heard scientific testimony in the 1950s.
In 1965, President Johnson told Congress in a speech: “This generation has altered the composition of the atmosphere on a global scale through radioactive materials and a steady increase in carbon dioxide from the burning of fossil fuels.” That year, the White House released a report calling carbon dioxide a pollutant. Progress, such as it is, has only accelerated in recent years. But somebody got started in the 1950s.
Good thing they did, or the climate policy of today would not have the tools, the technologies, and the political awareness to make the progress it is finally making. Scientists in the 1950s and ’60s had recognized the threat of climate change. They did not have a complete map to every solution. But they did not believe it was too early to get started, six decades ago.
The tip of the Spike may be six decades from today. (Or a few decades sooner than that.) Like the climate pioneers of the 1950s, all of us alive and working today are decades away from anyone having all the answers we need. But that does not exempt us from facing up to the facts. It’s time to start learning. The first step is understanding the population today, where it came from, and where it is heading.
https://lithub.com/after-the-spike-what-slow-and-steady-depopulation-means-for-the-world/
Michael:
France has tried the 'pay a mother for each child' model. Paris shells out more than a thousand euros per child per year. As a result, France has the highest reproduction rate in Europe — but it is still just 1.62 babies per woman, well below replacement level. Historically, communities have been successful in boosting birth rates in only two ways: polygamy (the Mormons being an excellent example) and universal child care/crèches (Israeli kibbutzim).
whole grain cat loaf:
Quick question. If you see a mother with six kids, do you treat her like a queen repopulating the world for you or do you treat her like a dirty drain on society and a slut ruining your good time out in public with her brats? Because based on your behavior, you seem to WANT women to have tons of kids, but are also the type of man who looks down on young mothers and mothers of 4+ kids as scum and makes that everybody's problem.
Michael:
Falling population is already a major driver of conflicts. Putin's Russia has a 1.4 reproduction rate. It no longer has sufficient population to assert great-power status. So Putin says Ukraine's 40+ million are actually Russian! That helps him keep whites dominant within the multi-cultural Russian Federation and provides him enough population to claim significant power status.
But, unfortunately for his plans the Ukrainians aren't playing ball.
On a slightly less dramatic level, Beijing has to rely on exports to drive continued economic growth because the aging Chinese population doesn't consume enough.
*
WHO FIRST REALIZED THE EARTH WAS VERY, VERY OLD?
How the father of geology — and biology — James Hutton challenged 18th century beliefs
~ Back in the 18th century, most scholars believed that the Earth had been created about 6,000 years previously and that it hadn’t changed much in that time, with mountains, rocks and lakes still sitting pretty much where God had placed them.
However, Scottish geologist James Hutton disagreed. And in challenging the prevailing orthodoxy, he set the scene for Darwin’s theory of evolution.
Hutton thought that the Earth was more dynamic than theologians would have it, and spent decades traveling Britain in search of evidence. “Lord pity the arse that’s clagged to a head that will hunt stones,” he wrote during one particularly arduous leg on horseback.
He discovered sites where strata meet at right angles, where older rocks overlaid younger ones, and, in Holyrood Park in his home city of Edinburgh, a small cliff that became known as ‘Hutton’s Section’. Here he observed that lava had penetrated the existing sedimentary rock layer.
Hutton did not actually put a figure on the Earth’s age. Instead he talked of “deep time”, in which “we find no vestige of a beginning – no prospect of an end.”
These insights earned Hutton the sobriquet “the father of modern geology”, but he was also an influential uncle of modern biology. He provided the time frame necessary for evolution to work its magic; it was then left to Darwin to work out the mechanism.
https://www.discoverwildlife.com/people/meet-the-scientist/james-hutton-theory
*
ANOTHER LOOK AT MATERIALISTS (THE MOVIE)
Like all love triangles, Celine Song’s new film Materialists places you at a fateful fork in the road, peering at two points in the distance and evaluating the different futures that lie in wait. In Materialists, the first destination looks like this: a glossy Manhattan penthouse; regular dinner dates at five-star restaurants; few if any apparent friends; a lot of money, and being the object of envy of New York’s society women. What you lack in warmth you make up for in status. The second, meanwhile, is much less glamorous: a dingy shared apartment in south Brooklyn with two slob flatmates; arguments about money; takeaway meals from food trucks. But perhaps you’d have a lot more fun. [Oriana: few women would "buy" this argument. One thing that money can buy is fun experiences, e.g. travel]
It’s the question driving many of our romantic stories, the choice animating everything from Jane Austen’s novels to the climax of reality television show The Bachelor: love or money? Song’s films seem to be more interested in love. Her first feature, the double Oscar nominated Past Lives, was a wistful story about star-crossed love that brought audiences to tears. There is a lot less wist in this follow-up, a satire-tinged drama about the indignities of modern dating in our renewed gilded age. Dakota Johnson plays Lucy, an unapologetic materialist and high-end matchmaker who is instantly charmed by Harry (Pedro Pascal), a banker who is what those in her business call a “unicorn”: rich, tall, handsome, smart. At the same time, she reconnects with ex-boyfriend John (Chris Evans), who still looks at her with a puppy-eyed devotion and nurses his inability to provide her the life she wants like a sore wound.
Except for John, the film’s characters tend to talk to each other with the performative coldness of businesspeople. Potential partners are evaluated for their ability to make one feel “valuable”. Harry declares an interest in Lucy’s “immaterial assets”. Lucy’s clients demand their dates have a minimum salary (the women) or a maximum age (the men). Everybody speaks as if they are angling themselves as contestants on The Apprentice, without any of the messily fun theatrics of reality TV.
The marketing of Materialists has placed the film firmly in the elevated world of Harry’s penthouse over John’s grungy flat. There is the cast, drawn from the most in-demand stars in Hollywood; there is its cult US distributor, A24; there is Song’s “syllabus” for the film, replete with the works of Mike Leigh and Merchant Ivory and Martin Scorsese; the understated, quiet luxury wardrobe; the soundtrack featuring the Velvet Underground and Cat Power. Though when I watched it, I thought not so much of Leigh, but rather the less cool big-budget 2000s romcoms that also set out the same fundamental premise of Materialists: an ambitious young woman tries to make it in the big city, makes mistakes in love and in work, and learns hard lessons about life in the process.
Two decades have passed since these films – How to Lose a Guy in 10 Days, Bridget Jones’s Diary, Maid in Manhattan – commanded the box office, and a lot has changed since, not least the collapse of the blockbuster romcom film and the genre’s move to low-budget fare on the streamers. It’s interesting, still, to see how Materialists has reengaged with the genre’s tropes.
It struck me, for example, that John – an avatar of unconditional devotion, unfailingly loyal if devoid of any edge – is a Duckie from Pretty in Pink kind of figure, the prospective love interest the protagonist considers before choosing someone more alpha and more interesting (in effect, a Harry). Perhaps after the post-#MeToo reckoning and the ongoing crisis in masculinity, our view of the ideal man has softened – though it helps, I imagine, if the man in question looks like Chris Evans. [Oriana: I found Chris Evans of zero interest, and actually less handsome than Pedro Pascal.]
Meanwhile, the film’s affect of extremely mannered and self-aware cynicism seems firmly out of our current age rather than the cheery turn-of-the-millennium sugariness of, say, Love Actually. The world has hardened since, our lives are angrier and more isolated. The internet has sharpened individualist hustle culture, and the most powerful man in the world is a status-obsessed dealmaker incapable of seeing anything beyond the lens of his own ego. And so the characters of Materialists scramble to ascend the marketplace, keeping an eye on where they stand in the pecking order. In today’s US, the bottom can be a terrifying place.
Watching these largely rich, largely lonely people talk about love through the language of the market, I thought: what a sad way to see other people, and what a sad way to be. The film thinks this, too, judging from (spoiler warning!) its sudden about-turn ending, in which love wins over money and the heart triumphs over cold, calculating reason. It is a conventional fairytale romcom ending, but perhaps with everything that’s passed since, this retro callback is the point: a bid for a new sincerity after decades of status-conscious cynical individualism. Duckie has finally won over the rich alpha male; the biggest prize today is someone who will love you unconditionally.
I didn’t find this final triumph in Materialists particularly convincing: its characters were too cold, too unspecific and lacking in vitality to really make me root for their final reconciliation. (I did appreciate, though, the film’s contemporary twist on the romcom fantasy: Lucy’s realization that her dream job is ethically murky and of indeterminate value to the world.)
But it did make me want to see more romcoms on the big screen, ones with intellectual curiosity and seriousness that command the space that Materialists – against prevailing movie industry trends – has been given. The beauty of love, after all, is that it can break through our solipsism and radically reshape ourselves. It is a hopeful, radical practice that finds in other people not cause for anger, defense, or hatred, but possibility for mutual wisdom and growth.
“The whole movie is about fighting the way that capitalism is trying to colonize our hearts and colonize love,” Song said recently. Maybe finding space for life outside capitalism’s relentless onward march is increasingly a fantasy – but what a beautiful, frothy fantasy that can be.
https://www.theguardian.com/film/2025/aug/14/materialists-anti-capitalist-romcom-celine-song
Oriana:
I can only repeat what I wrote before: the Chris Evans character isn't attractive enough (not just physically, but also in terms of mentality and personality, not to mention his pathetic career prospects) to be realistically a bigger draw than the handsome rich man with a luxury penthouse apartment. Just the view from those windows!
Another point that hasn't been made about this choice is that marrying the richer man would enable the heroine to quit her dubious job and have children -- and to pursue her interests, take classes, travel, etc. Material abundance is a key that opens many doors. Poverty closes the same doors.
*
A COMET COULD HAVE CAUSED A RAPID CLIMATE SHIFT AROUND 12,800 YEARS AGO
The Younger Dryas event, an ancient climate change catastrophe, is typically attributed to glacial meltwater.
Around 12,800 years ago, the Northern Hemisphere got cold — really cold — in an abrupt climate change crisis called the Younger Dryas event. Now, researchers have found evidence that suggests that the sudden catastrophe may have been caused by a comet.
Reporting their results in a study in PLOS One, the researchers identified geochemical signatures in deep-sea sediments from Baffin Bay, off the coast of Greenland, which indicate that the cooling of the Northern Hemisphere’s air and ocean may have come from a collision with a comet as it disintegrated.
“Collisions of the Earth with comets led to catastrophes, leading to climate change, to the death of civilizations. One of these events was a catastrophe that occurred about 12,800 years ago,” said Vladimir Tselmovich, a study author from the Russian Academy of Sciences, according to a press release. “Having studied in detail the microscopic traces of this disaster in Baffin Bay, we were able to find multiple traces of cometary matter.”
Dropping around 10 degrees Celsius in the span of a single year, then stabilizing at a lower level for around 1,200 years, the temperatures of the Northern Hemisphere were abnormally cold throughout the course of the Younger Dryas, potentially prompting consequences for the plants, animals, and human civilizations that lived there. But whatever the consequences, their cause is typically attributed to an increase in glacial meltwater, which would have weakened the ocean currents that transport warm water throughout the Northern Hemisphere.
While some scientists suspect that this glacial meltwater arose without a comet, others say that one of these celestial ice clumps created the melt. Indeed, the proponents of the “Younger Dryas Impact Hypothesis” posit that the debris from a disintegrating comet destabilized Earth’s ice sheet, causing the increase in glacial meltwater that disrupted the ocean’s circulation.
The hypothesis has found some support in terrestrial sediment and ice cores, though it has lacked evidence from deep-sea ones. Hoping to gather this evidence, the authors of the new study investigated the geochemistry of four deep-sea sediment cores from the time of the Younger Dryas event, taken from the floor of Baffin Bay.
Using an assortment of microscopy and spectroscopy techniques, which offered a closer look at the cores and their chemical compositions, the researchers concluded that the sediments contained material from a comet, based on “the morphology and composition of the microparticles found,” Tselmovich said in the release.
“The amount of comet dust in the atmosphere was enough to cause a short-term ‘impact winter,’ followed by a 1,400-year cooling period,” he added in the release. “The results obtained confirm the hypothesis that the Earth collided with a large comet about 12,800 years ago.”
Metal Particles and Microspherules
Among the evidence that the team found were microscopic particles that were consistent with the metals in comet dust, and microscopic particles that were consistent with extraterrestrial material, containing a large amount of platinum, iridium, nickel, and cobalt.
The team also discovered spherical microscopic particles, or microspherules, that seemed both terrestrial and extraterrestrial in origin, which likely emerged when the debris from the comet exploded above or at the ground, mixing and melting into the material it encountered there.
Together, the findings indicate that a cometary collision occurred at around the same time as the Younger Dryas event, though additional research — perhaps involving deep-sea sediments — is required to establish a causal connection between the two.
“Our identification of a Younger Dryas impact layer in deep marine sediments underscores the potential of oceanic records to broaden our understanding of this event and its climatological impacts,” added Christopher Moore, another study author from the University of South Carolina, in the release.
https://www.discovermagazine.com/a-comet-could-ve-caused-rapid-climate-change-around-12-800-years-ago-47900
from Wikipedia:
The Younger Dryas (YD, Greenland Stadial GS-1) was a period in Earth's geologic history that occurred circa 12,900 to 11,700 years Before Present (BP). It is primarily known for the sudden or "abrupt" cooling in the Northern Hemisphere, when the North Atlantic Ocean cooled and annual air temperatures decreased by ~3 °C (5 °F) over North America, 2–6 °C (4–11 °F) in Europe and up to 10 °C (18 °F) in Greenland, in a few decades. Cooling in Greenland was particularly rapid, taking place over just 3 years or less. At the same time, the Southern Hemisphere experienced warming. This period ended as rapidly as it began, with dramatic warming over ~50 years, the transition from the glacial Pleistocene epoch into the current Holocene.
*
MARIA GRAHAM AND THE VALPARAÍSO EARTHQUAKE
An earthquake in Chile and the observations of eye-witness Maria Graham caused open hostility among 19th-century geologists.
‘Small Earthquake in Chile, Not many dead.’ The journalist Claud Cockburn supposedly won a prize for dreaming up this notoriously dull newspaper headline, although it seems never to have been used. In contrast, the very real Valparaiso earthquake of 19 November 1822 was immensely newsworthy: it killed or injured around 500 people, razed entire villages to the ground and prompted a tsunami.
One on-the-spot eye-witness, Maria Graham, described how a 100-mile stretch of coast had been lifted several feet above its former level so that ‘the ancient bed of the sea laid bare and dry, with beds of oysters, muscles, and other shells adhering to the rocks on which they grew, the fish being dead and exhaling most offensive effluvia.’
https://www.historytoday.com/archive/great-debates/maria-graham-and-valparaiso-earthquake
(shifting to AI:)
Maria Graham, who was present in Chile at the time, meticulously documented the earthquake's effects, including its duration, oscillation, and impact on the terrain. She also noted the raising of the coastline in the area of Quintero, which she measured at 4 feet.
Graham's report, "An Account of Some Effects of the Late Earthquakes in Chili," was published in 1824 in the Transactions of the Geological Society. It was the first female-authored paper in that journal. Her observation about the raised coastline became a point of contention, particularly with George Greenough, the President of the Geological Society.
Greenough questioned the accuracy of Graham's observations, leading to a public dispute between the two. While some historians initially characterized Graham as a layperson or amateur, recent research highlights her considerable interest and expertise in geology, demonstrating that she was a competent fieldworker and knowledgeable about the relevant theories.
Graham's report, despite the controversy, was influential and cited by prominent geologists like Charles Lyell and Charles Darwin. Darwin even visited the site Graham described in 1836, corroborating her observations. Graham's work contributed to the growing understanding of earthquakes and their impact on the Earth's surface during a period of significant development in geological science.
Key observations and insights
Coastal Uplift: Graham recorded that the earthquake had raised a 100-mile stretch of the Chilean coastline by several feet, exposing the former seabed. She noted a specific instance at Quintero Bay where an old shipwreck, previously inaccessible, became reachable from land after the quake.
Evidence of Elevation: She observed beds of dead shellfish, including oysters and mussels, attached to newly exposed rocks, further supporting her claim of land uplift.
Long-term Geological Processes: Graham deduced that the coast had likely been elevated by past earthquakes, linking the immediate event to longer-term geological processes of land formation and mountain building, including the Andes. This perspective was crucial in shifting geological thinking towards concepts of deep time and incremental change.
Challenge by Greenough: Graham's findings became controversial when George Bellas Greenough, President of the Geological Society of London and a proponent of Neptunism (a theory that landforms were primarily shaped by aqueous processes), attacked her observations, suggesting she lacked the necessary calmness and objectivity to accurately record the event. His critique hinted at gender bias, questioning whether a woman could be a reliable scientific observer under such conditions.
Lyell's Support: Charles Lyell, a leading geologist of the time, used Graham's observations to support his theory of vulcanism (which attributed landforms to igneous processes and included elevation caused by earthquakes) in his influential "Principles of Geology”.
Validation by Darwin: Further evidence supporting Graham's claims emerged with the 1835 Concepción earthquake, also in Chile, observed by Charles Darwin during the Beagle expedition. Darwin's observations of raised shell beds in Quintero, inspired by Graham's report, implicitly endorsed her findings.
Empress Maria Leopoldina: Graham forged a strong friendship with Empress Maria Leopoldina, an enthusiastic botanist and natural history enthusiast herself. This connection opened doors to the imperial household and provided access to resources and opportunities otherwise unavailable to women at the time. The Empress shared Graham's passion and reportedly arranged scientific expeditions for her into the surrounding rainforests near Rio de Janeiro.
Ultimately, Maria Graham's observations, despite the initial controversy, were recognized as valuable and helped to shape understanding of the role of earthquakes in geological processes like land elevation and mountain formation.
Besides geology, Maria Graham also made contributions to botany, zoology, and mineralogy.
A BIT MORE ON EMPRESS MARIA LEOPOLDINA
Maria Leopoldina of Austria (22 January 1797 – 11 December 1826) was the first Empress of Brazil as the wife of Emperor Dom Pedro I from 12 October 1822 until her death. She was also Queen of Portugal during her husband's brief reign as King Dom Pedro IV from 10 March to 2 May 1826.
She was born in Vienna, Austria, the daughter of Holy Roman Emperor Francis II, and his second wife, Maria Theresa of Naples and Sicily. Among her many siblings were Emperor Ferdinand I of Austria and Marie Louise, Duchess of Parma, the wife of Napoleon Bonaparte.
In the 21st century, it has been proposed by some historians that she was one of the main articulators of the process of Independence of Brazil that took place in 1822. Her biographer, historian Paulo Rezzutti, maintains that it was largely thanks to her that Brazil became a nation. According to him, the wife of Dom Pedro "embraced Brazil as her country, Brazilians as her people and Independence as her cause."
She was also an adviser to Dom Pedro on important political decisions that shaped the future of the nation, such as the Dia do Fico and the subsequent opposition and disobedience to the Portuguese courts regarding the couple's return to Portugal. Because she governed the country during Dom Pedro's trips through the Brazilian provinces, she is considered the first woman to become head of state in an independent American country.
*
SMALL LONG-NOSED DOGS LIVE LONGEST
Ask any pet parent about their pup’s longevity and they’ll likely tell you they wish dogs could live forever. And while an anti-aging drug for canines is in the works, dog owners still have to grapple with the eventual pain and heartbreak of losing their beloved four-legged friends.
Now, research published in Scientific Reports offers a more nuanced picture of how long dogs typically live, depending on their breed, sex, size and face shape. More specifically, small and long-nosed dogs tend to live the longest, while larger and flat-nosed dogs tend to have shorter lives, the United Kingdom-based study found.
These findings could help pet parents, animal shelters, breeders and policymakers arrive at more informed decisions about the health and welfare of dogs.
“This provides an opportunity for us to improve the lives of our canine companions,” says study lead author Kirsten McMillan, a data scientist at the London-based animal welfare organization Dogs Trust, to the Guardian’s Nicola Davis. “We are identifying groups that desperately need attention, so we can zone in on these populations and work out what the problem is.”
On average, dogs live to be about 10 to 13.7 years old, the study found. But, just as with humans, canine lifespans vary greatly depending on a variety of factors, from genetics to lifestyle to size. Researchers wanted to explore those characteristics on a broad scale and see if any patterns emerged.
To do so, they gathered data from breed registries, veterinarians, pet insurance companies, animal welfare charities and universities. In the end, their dataset included information about 584,734 dogs located within the U.K. Of those, about half—284,734 individuals—were deceased. Their sample included purebred and crossbred dogs.
The median lifespan for all dogs in the sample was 12.5 years, the team found. Female dogs tended to live slightly longer than males, with a median lifespan of 12.7 years for females compared to males’ 12.4 years.
When the team homed in on size and face shape, they found that smaller and long-nosed dogs tended to live longer than larger and flat-nosed dogs. Miniature dachshunds, for instance, which are both small and long-nosed, had a median lifespan of 14 years, compared to just 9.8 years for French bulldogs, which are medium-sized dogs with flat noses.
Flat-nosed, or brachycephalic, dogs have many known health issues, including breathing problems and heat intolerance, but it remains unclear whether or how those factors contribute to their risk of early death.
Among the 155 purebred breeds included in the dataset, Lancashire heelers tended to live the longest, with a median life expectancy of 15.4 years. Behind them were Tibetan spaniels (15.2 years), Bolognese (14.9 years), shiba inus (14.6 years) and papillons (14.5 years), to name a few.
The breeds with the shortest lifespans were Caucasian shepherds (5.4 years), presa canarios (7.7 years) and cane corsos (8.1 years).
One surprising finding was that purebred dogs tended to live longer than crossbreds: The median lifespan for purebreds was 12.7 years, compared to 12 years for crossbreds.
This contrasts with the long-held belief that crossbred dogs are longer-lived than purebred dogs because they have more variation in their genes, says Audrey Ruple, a veterinarian at Virginia Tech who was not involved in the new research, to New Scientist’s Chen Ly. Scientists may want to look more deeply at this difference in the future, she adds.
More broadly, the study did not explore the reasons behind these variations in dogs’ lifespans, which might also be fodder for future research, reports the New York Times’ Emily Anthes. In addition, since the research only included U.K. dogs, the findings may not be representative of all pups worldwide.
To come up with global life expectancy estimates for dogs, the team hopes scientists in other countries will conduct similar studies.
“Once we have those estimates from country to country... that can be hugely helpful in us working toward improving the longevity of some of these [breeds],” McMillan says to Science News’ Erin Garcia de Jesús.
https://getpocket.com/explore/item/which-dogs-live-the-longest-scientists-say-small-and-long-nosed-canines-outlive-others?utm_source=firefox-newtab-en-us
*
SEX DIFFERENCES IN THE IMMUNE SYSTEM
Many diseases affect men and women differently. Asthma tends to strike men earlier in life, yet more women develop asthma as they get older. Parkinson's is more common in men, but Alzheimer's is more common in women.
The differences are even more stark when it comes to autoimmune disease. Women are around two and a half times more likely than men to develop multiple sclerosis and nine times more likely to develop lupus.
Why would some diseases strike one sex more than another? And why do some tissues, such as the lungs and brain, seem especially vulnerable to these sex-based differences?
To answer these questions, scientists at La Jolla Institute for Immunology (LJI) are leading new research into how our immune cells defend specific parts of the body.
In a new Science review, LJI Professor, President & CEO Erica Ollmann Saphire, Ph.D., MBA, and LJI Associate Professor Sonia Sharma, Ph.D., examine how genetics, sex hormones, and environmental factors come together to shape the immune system.
"In just the last two years, LJI scientists have uncovered a whole new body of information about how the immune systems of men and women are very different," says Saphire. "We're looking at what is genetically encoded in our XX or XY chromosomes, and how hormones like estrogen and testosterone affect what is genetically programmed into our immune cells."
In the paper, the researchers define biological sex (in an immunology context) as the presence of XX chromosomes in females and XY chromosomes in males. "Every cell in your body is either XX or XY," says Saphire. That X chromosome has many, many immune-related genes.
Women have two copies of each. That gives them, in a sense, twice the palette of colors to paint from in formulating an immune response. It can also give them a stronger immune response for genes that are active in both copies simultaneously.
Sex hormones are important for much more than reproductive function. Immune cells can also sense hormones such as estrogen and testosterone and use them to determine which genes to turn on or off and which ones to turn on more brightly or dim. This means similar immune cells can do different things, depending on whether that cell is from a male or a female.
Further, female cells vary in which of their two copies of X is "turned on." As a result, women have organs with a collage, or mosaic, of immune cells that work differently in different tissues. This innate "variety" of immune cells appears to be an effective way to ward off infectious disease (women are better than men at fighting off pathogens such as SARS-CoV-2).
But scientists have also found that having more genes from X chromosomes may predispose women to autoimmune disease. This increased X chromosome "dosage" is closely linked to a higher risk for autoimmune diseases such as Sjögren’s syndrome and scleroderma.
New research into sex-based immune system differences is also critical for developing new cancer immunotherapies, Sharma explains.
"We're increasingly understanding how sex-based differences affect disease outcomes. When it comes to medicine, one size doesn't fit everybody," says Sharma, who directs LJI's Center for Sex-Based Differences in the Immune System. "This is leading to new research, particularly in the cancer field, toward precision medicine. We're asking how a person's individual immune system is contributing to controlling that cancer through immunotherapy."
Saphire and Sharma also highlight environmental factors, such as nutrition and chemical exposures, that may add to the complex interplay of chromosomes and sex hormones. Men and women also appear to have some signature differences in their skin and gut microbiomes.
The researchers hope these foundational discoveries can lead to medical advances for all, and they're working with collaborators across the country to move this research forward. "It takes a team to translate these findings," says Sharma.
https://www.eurekalert.org/news-releases/1094031
*
TINY CREATURES THAT GET FAT AND HELP FIGHT GLOBAL WARMING
A tiny, obscure animal often sold as aquarium food has been quietly protecting our planet from global warming by undertaking an epic migration, according to new research. These "unsung heroes" called zooplankton gorge themselves and grow fat in spring before sinking hundreds of meters into the deep ocean in Antarctica where they burn the fat.
This locks away as much planet-warming carbon as the annual emissions of roughly 55 million petrol cars, stopping it from further warming our atmosphere, according to researchers.
This is much more than scientists expected. But just as researchers uncover this service to our planet, threats to the zooplankton are growing.
Female copepods (4mm) with cigar-shaped fat stores in their bodies
Scientists have spent years probing the animal's annual migration in Antarctic waters, or the Southern Ocean, and what it means for climate change.
The findings are "remarkable", says lead author Dr Guang Yang from the Chinese Academy of Sciences, adding that it forces a re-think about how much carbon the Southern Ocean stores. "The animals are an unsung hero because they have such a cool way of life," says co-author Dr Jennifer Freer from British Antarctic Survey.
But compared to the most popular Antarctic animals like the whale or penguin, the small but mighty zooplankton are overlooked and under-appreciated.
This copepod has hair-like arms for feeding
If anyone has heard of them, it's probably as a type of fish food available to buy online. But their life cycle is odd and fascinating. Take the copepod, a type of zooplankton that is a distant relative of crabs and lobsters.
Just 1-10mm in size, they spend most of their lives asleep between 500m and 2km deep in the ocean. In pictures taken under a microscope, you can see long sausages of fat inside their bodies, and fat bubbles in their heads, explains Prof Daniel Mayor who photographed them in Antarctica.
Without them, our planet's atmosphere would be significantly warmer. Globally the oceans have absorbed 90% of the excess heat humans have created by burning fossil fuels. Of that figure, the Southern Ocean is responsible for about 40%, and a lot of that is down to zooplankton.
Millions of pounds is being spent globally to understand how exactly they store carbon.
Scientists were already aware that the zooplankton contributed to carbon storage in a daily process when the animals' carbon-rich waste sinks to the deep ocean.
But what happens when the animals migrate in the Southern Ocean had not been quantified. The latest research focused on copepods, as well as other types of zooplankton: krill and salps.
The creatures eat phytoplankton on the ocean surface which grow by transforming carbon dioxide into living matter through photosynthesis. This turns into fat in the zooplankton. "Their fat is like a battery pack. When they spend the winter deep in the ocean, they just sit and slowly burn off this fat or carbon," explains Prof Daniel Mayor at University of Exeter, who was not part of the study.
"This releases carbon dioxide. Because of the way the oceans work, if you put carbon really deep down, it takes decades or even centuries for that CO2 to come out and contribute to atmospheric warming," he says.
The research team calculated that this process — called the seasonal vertical migration pump — transports 65 million tons of carbon annually to at least 500m below the ocean surface.
Of that, it found that copepods contribute the most, followed by krill and salps.
That is roughly equivalent to the emissions from driving 55 million diesel cars for a year, according to a greenhouse gas emissions calculator by the US EPA. The latest research looked at data stretching back to the 1920s to quantify this carbon storage, also called carbon sequestration.
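As a rough sanity check (not from the article itself), the car-equivalence arithmetic can be reproduced with two hedged assumptions: carbon mass converts to CO2 by the molar-mass ratio 44/12, and a typical passenger vehicle emits about 4.6 metric tons of CO2 per year (the EPA's commonly cited figure).

```python
# Back-of-envelope check of the "~55 million cars" equivalence.
# Assumptions (not stated in the article): 44/12 carbon-to-CO2 mass
# ratio, and ~4.6 tonnes CO2 per typical passenger car per year (EPA).

CARBON_TONNES = 65e6            # tonnes of carbon sunk per year (from the study)
CO2_PER_CARBON = 44.0 / 12.0    # mass of CO2 per unit mass of carbon
CO2_PER_CAR = 4.6               # tonnes of CO2 per car per year (EPA estimate)

co2_tonnes = CARBON_TONNES * CO2_PER_CARBON
cars = co2_tonnes / CO2_PER_CAR
print(f"{co2_tonnes / 1e6:.0f} million tonnes CO2 ≈ {cars / 1e6:.0f} million cars")
```

Under these assumptions the figure comes out near 52 million cars, consistent with the "roughly 55 million" reported.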
But the scientific discovery is ongoing as researchers seek to understand more details about the migration cycle. Earlier this year, Dr Freer and Prof Mayor spent two months on the Sir David Attenborough polar research ship near the South Orkney island and South Georgia.
Using large nets the scientists caught zooplankton and brought the animals onboard. "We worked in complete darkness under red light so we didn't disturb them," says Dr Freer.
Others worked in rooms kept at 3-4C. "You wear a lot of protection to stay there for hours at a time looking down the microscope," she adds.
Antarctic krill (50-60mm) with green guts showing they've recently eaten algae
But warming waters as well as commercial harvesting of krill could threaten the future of zooplankton. "Climate change, disturbance to ocean layers and extreme weather are all threats," explains co-author Prof Angus Atkinson from Plymouth Marine Laboratory.
This could reduce the amount of zooplankton in Antarctica and limit the carbon stored in the deep ocean.
Krill fishing companies harvested almost half a million tons of krill in 2020, according to the UN.
It is permitted under international law, but has been criticized by environmental campaigners including in the recent David Attenborough Ocean documentary.
The scientists say their new findings should be incorporated into climate models that forecast how much our planet will warm.
"If this biological pump didn't exist, atmospheric CO2 levels would be roughly twice what they are at the moment. So the oceans are doing a pretty good job of mopping up CO2 and getting rid of it," explains Prof Atkinson.
The research is published in the journal Limnology and Oceanography.
https://www.bbc.com/news/articles/c628nnz3rp9o?utm_source=firefox-newtab-en-us
*
TRINITY BEFORE JESUS?
There was no “trinity” before the alleged birth of Jesus and there was no “trinity” in early Christianity.
The “trinity” is a bizarre and incomprehensible christian dogma that evolved over four centuries punctuated by acrimonious councils, cries of “heresy!” on all sides, and various excommunications.
Why the acrimony, the accusations of heresy and the excommunications?
No one could explain how there was only one god, and yet Jesus was (allegedly) the “son of god.”
Was Jesus fully human, adopted as a human son by god? Paul said Jesus was adopted at his resurrection, while the author of Mark said it was at his baptism by John the Baptist.
Was Jesus half god and half man, a demigod like other alleged “saviors” such as Dionysus and Romulus?
Was Jesus fully god, meaning there were two gods, not one?
Was Jesus a projection, emanation or “mode” of god?
Was Jesus an angel? Some christians believed (and some continue to believe) that Jesus was Michael the Archangel.
Was Jesus created, or did he preexist creation?
It took four centuries for Jesus to become a member of the “trinity” — which was finally made official when the holy ghost completed the “trinity” in 381 AD.
What a long, strange trip it’s been! ~ Michael Burch, Quora
Allen Parmen:
How does the Trinity explain one member dying, and thus punishing infinite generations of Jews for something they didn’t do and for something that couldn’t happen (god dying but not dying, that is)?
Michael Burch:
I agree.
Also, there was no “hell” in the Old Testament, so why would god create hell, never announce it, then send billions of people there who never knew it existed?
Makes no sense unless god is infinitely evil.
Guy Fiver:
In Mark’s no-frills gospel, there is no miraculous virgin birth, no star of Bethlehem, no wise men, no empire-wide taxations, no angelic announcements, nor tales of precocious young Jesus astounding the rabbis with his knowledge.
In the early church Jesus was believed to have been adopted. The adoptionist meme says that Jesus was born human, not divine, and was later adopted by Yahweh as His son.
Adoptionists believed that Joseph was Jesus' father and Mary was his mother (not a virgin), and that through Jesus' sinless devotion, God was very pleased and took Jesus as His own son, making him divine at Jesus' death (copying what the Roman emperors believed happened to them). The anti-adoptionist meme is the now-familiar (i.e. orthodox) view: that Jesus was born of a virgin, was divine from birth, and that Jesus and God are part of a single Holy Trinity.
When the Council of Nicaea VOTED on this, in 325 CE, the anti-adoptionist meme won the day.
I wonder how many believers know about this? That their savior, who they believe suffered and died on the cross to save their souls from damnation, was actually voted into the position that he holds today.
Dennis Regelspergen:
Before his crucifixion, Jesus was praying in the garden, asking God if the crucifixion could be avoided, and then saying, "not my will but your will be done." Who was he talking to? Himself?
This seems to fly in the face of the trinitarian idea that all three are one.
And it seems especially odd that Jesus asks God if he can get out of the crucifixion when both of them are one all-knowing, all-powerful God.
He knows that, according to the story, he has to die and then be resurrected to save us from the sin our great-great-great-great-grandparents committed by eating the wrong fruit.
This plot line has a few holes in it.
*
THE TURNING POINT WHEN THE BODY STARTS AGING RAPIDLY
Past studies show that human aging doesn’t necessarily happen at the same pace throughout our life.
There is still much to discover about the aging process, especially when it comes to how it impacts the body’s organs.
A new study found that by focusing on aging-related protein changes in the body, there is an acceleration in aging of organs and tissues around the age of 50.
And of these proteins, scientists found that the expression of 48 linked to diseases, such as cardiovascular and liver disease, increased with age.
While we can try to slow it down, human aging is something we currently can’t stop from happening. However, past studies show that aging doesn’t necessarily happen at the same pace throughout our life.
Instead, there are certain ages when a person’s body may experience a burst of aging. Previous studies show that the body may undergo rapid aging around the ages of 44 and 60.
“Aging, as a systemic, degenerative process that spans multiple organs and biological strata, remains one of the most profound unresolved questions in the life sciences,” Guang-Hui Liu, PhD, regenerative medicine researcher at the Chinese Academy of Sciences, explained to Medical News Today.
“Throughout the extended human lifespan, two fundamental issues persist: Do all organ systems adhere to a unified aging rhythm? Does a molecular spatiotemporal hub exist that orchestrates organism-wide senescence? Despite their centrality to understanding the essence of aging, these questions have long lacked systematic, empirical resolution.”
Liu is the corresponding author of a new study recently published in the journal Cell, that has found that by focusing on aging-related protein changes in the body, they can get a clearer picture of how the body’s organs and tissues age over time, including an aging acceleration around the age of 50.
And of these proteins, scientists found that expressions of 48 of them related to diseases — such as cardiovascular disease and fatty liver disease — increased with age.
Creating an aging ‘atlas’
For this study, researchers analyzed 516 samples of 13 types of human tissues collected from 76 organ donors between the ages of 14 and 68 who had passed away from traumatic brain injury.
The tissue samples included cardiovascular, digestive, respiratory, endocrine, and musculoskeletal samples, as well as immune system, skin, and blood samples.
Next, researchers documented the types of proteins found in the organ and tissue samples, allowing them to create what Liu called “a proteomic aging atlas” that spans 50 years of human life.
“Covering seven physiological systems and thirteen pivotal tissues, the atlas presents a panoramic, dynamic portrait of organismal aging from a protein-centric perspective,” Liu explained. “The more than 20,000 proteins encoded by the genome serve as the structural bedrock of cells; their dynamic networks exquisitely orchestrate physiological homeostasis and act as the principal executors of virtually every biological process.”
“Consequently, systematically charting a panoramic, lifespan-wide atlas of proteomic dynamics and dissecting the reprogramming rules of protein networks at organ- and system-level scales are pivotal for accurately identifying the core drivers of aging and for establishing precise intervention targets,” he added.
Body aging accelerates rapidly around age 50
At the study’s conclusion, researchers found that the biggest aging changes in the body’s organs and tissues seems to occur around age 50.
“Ages 45–55 are identified as a landmark inflection point: most organ proteomes undergo a ‘molecular cascade storm,’ with differentially expressed proteins surging explosively, marking this interval as the critical biological transition window for systemic, multi-organ aging,” according to Guang-Hui Liu, PhD.
“Notably, the aortic proteome is reshaped most dramatically; its secretome and the circulating plasma proteome evolve in tight concordance, indicating that senescence-associated secreted factors (senokines) may serve as the hub mechanism broadcasting aging signals throughout the body,” Liu explained.
Additionally, Liu and his team found that expressions of 48 of the proteins linked to diseases, including cardiovascular disease, fatty liver disease, tissue fibrosis, and liver-related tumors, increased with age.
“Organ aging is the essence of human chronic disease; each geriatric illness is merely a specific manifestation of this underlying organ aging,” Liu added.
Aging causes biochemical changes in the body
MNT had the opportunity to speak with Cheng-Han Chen, MD, a board-certified interventional cardiologist and medical director of the Structural Heart Program at MemorialCare Saddleback Medical Center in Laguna Hills, CA, about this study.
“This study found that protein changes in the body associated with aging seem to accelerate roughly around age 50, depending on the type of body tissue. This is an interesting finding that helps us better understand the types of biochemical changes that underlie aging and potentially provide targets for therapy at different stages of someone’s life,” stated Cheng-Han Chen, MD.
“Science is only beginning to understand the biological mechanisms involved in aging,” Chen said. “Studies like this help us to identify the basis of normal aging, and in turn provides insight into how deviations in normal biology lead to diseases such as cardiovascular disease and fatty liver disease. Ultimately, this will help us understand how to keep our patients healthy and aging well. It may also help us to develop new therapies for diseases that result from accelerated aging.”
“Future research should attempt to expand on these findings in more diverse demographic groups, as well as in other important organs such as the brain and kidneys,” he added.
Transforming medicine from reactive to proactive
MNT also talked to Manisha Parulekar, MD, chief of the Division of Geriatrics at Hackensack University Medical Center in New Jersey, about this research.
“The idea that our cells lose the ability to maintain a healthy and functional proteome (the collection of proteins) is a cornerstone of modern aging theory. The accumulation of misfolded proteins is the classic example, best known in neurodegenerative diseases like Alzheimer’s disease. This study’s finding of widespread amyloid accumulation across many tissues confirms that this isn’t just a brain-specific problem but a systemic feature of aging,” said Manisha Parulekar, MD.
“This research is about transforming medicine from a reactive, disease-focused model to a proactive, health-focused one,” she continued. “By understanding the what and the when of aging, we can develop the tools to compress morbidity — allowing people to live not just longer, but healthier and more vibrant lives.”
“A longitudinal study, following the same individuals over decades will be helpful,” Parulekar added when asked what she would like to see as next steps for this research. “This would track their personal proteomic changes over time, allowing us to study genetic and lifestyle differences between people and providing additional confirmation for the ‘age 50 inflection point’.”
While changes occur every year, past research shows that, at the protein level, the most notable changes take place around ages 34, 60, and 78.
Additionally, the scientists found that the most noteworthy age-related molecule and microbe changes were linked to potential health concerns.
For example, in people in their 40s, Snyder and his team discovered significant changes in the number of molecules related to alcohol, caffeine, and lipid metabolism, as well as to cardiovascular disease and to skin and muscle.
At the age of 60, the biggest molecule changes were related to cardiovascular disease, immune regulation, kidney function, carbohydrate and caffeine metabolism, and skin and muscle.
Snyder said it is important for researchers to continue to examine what happens to the body during biological aging because we can then take action to reduce many of the problems associated with aging.
“It is unclear why there are such large changes specifically around the ages of 40 and 60. Further research will be necessary to identify the mechanisms and potential biological rationale for the changes around those time periods,” stated Cheng-Han Chen, MD.
https://www.medicalnewstoday.com/articles/aging-human-body-dramatic-molecular-changes-40s-60s
*
BENEFITS OF KALE
Due to its high levels of antioxidants, fiber, and anti-inflammatory compounds, kale supports immune function, gut health, and disease prevention.
Kale provides vitamin C, vitamin K, calcium, and magnesium—all of which help strengthen bones, reduce inflammation, and protect against age-related diseases.
If you want to eat for longevity, look no further than vegetables. They’re packed with nutrients that support overall health, helping you live a long and thriving life. The good news? All vegetables can benefit your lifespan, so there are plenty of options to choose from. But if one vegetable came out on top, what would it be? To find out, we asked registered dietitian Nisha Melvani, MS, RD, to share the best vegetable for longevity and living well.
The Best Vegetable for Longevity
When asked to name the top vegetable for longevity, Melvani called out kale, a popular and versatile leafy green. Thanks to its rich content of essential nutrients—particularly antioxidants and fiber—kale tops other vegetables in the category of longevity.
“Kale contains several important antioxidants,” Melvani says. These include carotenoids (like lutein and beta-carotene) and vitamin C, which help neutralize harmful free radicals and reduce oxidative stress in the body, she says. It’s a noteworthy effect for longevity, as chronic oxidative stress can contribute to chronic diseases, which can shorten your lifespan.
Additionally, vitamin C “supports immune function by promoting the production and function of white blood cells, helping the body fight infections more effectively,” Melvani adds. “Vitamin C also aids in wound healing and helps maintain healthy skin by supporting collagen production.” Healthy skin, in turn, acts as a barrier that keeps illness-causing germs from entering the body. Finally, kale provides glucosinolates, compounds linked to a lower risk of chronic disease thanks to their potent anti-inflammatory and antioxidant properties.
Plus, kale is a stellar source of fiber. According to Melvani, this includes insoluble fiber, which adds bulk to stool and encourages regular bowel movements, and soluble fiber, which feeds beneficial gut bacteria. The latter also supports the production of short-chain fatty acids in the gut, which reduce inflammation and strengthen the immune system, Melvani says. Soluble fiber also manages blood cholesterol and blood sugar, which is key for reducing the risk of heart disease and type 2 diabetes, respectively. “By supporting gut health, lowering cholesterol, controlling blood sugar, and reducing chronic inflammation, the fiber in kale may help prevent age-related diseases and promote a longer, healthier life,” Melvani says.
The longevity-boosting benefits of kale don’t stop there. “Kale is rich in nutrients that are important for strong, healthy bones,” Melvani says. “It provides vitamin K, which helps the body make proteins needed for bone mineralization and strength.” Kale also offers calcium, which builds and maintains bone density, and magnesium, which supports bone structure and helps the body use calcium effectively. Together, these nutrients can strengthen bones, potentially reducing the risk of falls that can lead to serious fractures and ultimately, a poorer quality of life.
https://www.realsimple.com/best-vegetable-for-longevity-11776794
*
ending on beauty:
LATE AUTUMN IN VENICE
Already the city no longer drifts
like a bait, catching the days as they surface.
The glassy palaces ring more brittle
against your gaze. And from the gardens
the summer hangs like a heap of marionettes,
headfirst, exhausted, done.
But from the ground, out of old forest skeletons,
volition rises: as if overnight
the commander of the sea had to double
the galleys in the sleepless arsenal,
in order to tar the next morning breeze
with a fleet, which pushes out rowing
and then suddenly, all its flags dawning,
seizes the high wind, radiant and fatal.
~ Rainer Maria Rilke
Ezra Pound in Venice, 1971