Saturday, February 18, 2023

WHY THE NAZIS ACCELERATED THE HOLOCAUST WHEN DEFEAT WAS NEAR; ERNAUX ON THE WESTERN REACTION TO THE END OF COMMUNISM; HOW MUSHROOMS FIGHT AGING; CHILDREN IN THE PALEOLITHIC; INSIDE THE RUSSIAN ARMY; AN OLIGARCH’S LIFE; FRIEDRICH BOLLNOW’S PHILOSOPHY OF HOPE

 Yosemite’s Horsetail Fall becomes a “firefall” in winter, when the light strikes it at just the right angle

*
MUSHROOMS

They lured us from the path’s
slender curtain of light.
Guidebooks called them
the “fruiting bodies” —

those trumpets and umbrellas
glistening with raindrops and sap;
gilled, primeval creatures
webbing the underworld.

My father, a forest gnome,
would call us to point out
the sinister decoys, betrayed
by the loose, slovenly

pleating of the underskirt:
the stigmata of poison.
Newspapers would report
yet another household

found silent in the morning.
But no one would give up
those raids on huddled
mushroom beds,

under centuries of oaks
feeding on rich death.
Oh, they were smarter than we
who were pale, unrooted,

we who needed the light.
But we’d return and return,
the air soft after rain, trees in fog
and the first scent of autumn —

Birds called feathery warnings,
the woodpecker drummed,
the cuckoo counted off
the years of our lives —

Veined with dusk, we’d
kneel and pick, our hands
growing fibrous, musky
with the smell of home.

~ Oriana 


*
ANNIE ERNAUX ON THE FALL OF COMMUNISM:

“What followed was a short-lived epoch when tyrants were executed after an hour’s trial, and soil-covered corpses were exposed in mass graves. What was happening defied the imagination — so we really had believed that Communism was immortal — and our emotions were at odds with reality. We felt left out, and envied the people in the East for experiencing such moments. Then we saw them crowding into the shops of West Berlin, and they moved us to pity with their awful clothes and bags of bananas. Their inexperience as consumers was touching. Now the spectacle of their collective hunger for material goods, which showed no restraint or discrimination, antagonized us. These people weren’t worthy of the pure and abstract freedom we had devised for them. The sense of affliction we’d been accustomed to feeling about those who lived ‘under Communist yoke’ gradually turned into a disapproving observation of the use they made of their freedom. We liked them better when they were lining up for sausage and books, deprived of everything, so we could savor the luck and superiority of belonging to the ‘free world’.”

~ Annie Ernaux, The Years  


*
BORGES: I’M NOT SURE I EVEN EXIST

~ “I’m not sure I even exist, actually. I am all the authors I've read, all the people I've met, all the women I've loved, all the cities I've visited, all my ancestors... “

—So, Borges, do you exist or...? “Nothing, nothing, my friend; it’s what I told him: I’m not sure about anything, I know nothing. Imagine, I don’t even know the date of my death.”

~ Jorge Luis Borges (interview in the print edition of El País, September 26, 1981).

*
INSIDE THE RUSSIAN ARMY

~ The brutal harassment and bullying of recruits is called dedovshchina (дедовщина); it infested the Soviet Army beginning in the late 1960s.

Two major changes happened back then that caused this phenomenon — originally found in prisons and concentration camps — to infest the Red Army:

1) The abolition of the starshina (professional NCO) class.

2) Shrinking cohorts and shrinking conscript intake numbers, and the introduction of previously disqualified men, such as gang members, criminals and drug abusers, into the conscript intake cohorts.

Change 1) meant that there was no longer anybody to shepherd young men of 20 to 22, full of testosterone and with little social maturity to speak of; change 2) meant that the convicts brought prison culture with them into the military.

All kinds of hazing and bullying exist in all militaries; young human males are vicious bastards, not least to each other. But dedovshchina turns the dial up to eleven. It is the result of brutal external discipline and no internal discipline.

The name “dedovshchina” comes from “ded” [I prefer the “dyed” transliteration], grandfather. The Russian military has no boot camps; recruits are assigned directly to their units. As the tour of duty is 12 months, intakes happen twice a year, and so do discharges. The oldest cohort, the one next in turn for discharge, call themselves “deds,” “grandfathers,” and consider it their right to terrorize the young and inexperienced recruits.

Since the Russian army is known as a brutal institution that stresses numbers instead of quality, and since Russian male culture is ultra-macho, brutal and ruthless, with bullying and terror considered the normal way of settling the pecking order, any sensible male does anything to avoid military service. The result is that the military gets only the bottom dregs — the dullest, stupidest, most immature and least talented boys — who are often former schoolyard bullies, antisocial cases, or boys with drug or alcohol problems.

The result is dedovshchina — bullying and terror beyond all measure.

We have all watched the news and seen the abysmally bad performance of the Russian army in Ukraine. When Russia attacked, I was XOing a yacht at Guadeloupe. Hunny called me and said Russia had attacked and Ukraine was resisting. I asked: “Has Kyiv fallen?” No. “If it doesn’t fall in 48 hours, Ukraine will win the war.”

Since Sir Basil Liddell Hart insisted that warfare is never about material vs. material, weapons vs. weapons, or numbers vs. numbers, but about human will vs. human will, this pathetic and pitiable performance of the Russian military did not come as a surprise to me. When the military is the place where you least want to be, the result is abysmally bad morale and no will to fight.

Sunzi insists: when the officers are strong and the enlisted men are weak, the result is collapse. This is exactly what has happened. The Russians, despite their numerical and material superiority, have demonstrated terribly bad morale. Their external discipline is brutal and their internal discipline is non-existent — as demonstrated by the rapes, murders and abductions. Terrorizing civilians is a sign of bad internal discipline.

The Ukrainians are fighting for their physical existence. Ukrainian men escorted their loved ones to safety abroad, then took the next plane or train back to Ukraine to join the military. This is a sign of excellent morale. It is this contest of wills that has tipped the scales.

Perhaps the nastiest effect of dedovshchina in the Russian military has been the abandonment of comrades, and fraggings. We have seen in the videos how the Russians abandon their wounded and injured comrades and do nothing to rescue them — they care only about their own lives. Why should they rescue their former tormentors? And likewise, what better way to get rid of a bully than to murder him and pass the death off as a battle casualty?

When a negative legacy such as dedovshchina infests a military, it is almost impossible to extirpate. Militaries are goddamn conservative, even stagnant, institutions, and they really do not care much what happens at the bottom of the totem pole. Dedovshchina is tolerated because it is seen as a way to determine who is NCO material. This is why the Russian military has inherited the Soviet legacies as well. ~ Susanna Viljanen, Quora

Thomas Jorgensen:
And it’s only worsened by Russian officers regarding their soldiers as less than human and totally expendable. This is a remnant not of the Communist USSR, as some believe, but of the Czarist Russian social pecking order, in which officers were mostly of noble background, while the common soldiery were recruited from the serfs over whom that nobility exercised absolute power. This mentality survived the abolition of the monarchy and of noble privilege, into the Soviet era and beyond. So modern Russian officers, though no longer of noble background, treat their soldiers the way the nobility of the Czarist period treated their serfs. Russian soldiers are frequently used as a source of cheap labor, spending some of their service time patching up officers’ private property rather than receiving military training. Military service in Russia is truly a sort of corvée, the forced unpaid labor a serf owed his feudal lord, except that in Russia the “feudal lord” is the almighty Russian State.

Angel Lopez Simon:
That's why a professional military is always better than a bunch of conscripts. You can do ten times more with a quarter of the people… and all of them can be much better equipped and trained.

Ken H:
And this was known as far back as the time of Caesar.

a soldier’s funeral

Mary:

When you build your army with gangsters and convicts, when many of your soldiers are career criminals, then the culture of the army will be a criminal culture, organized on the principles of violence, maintained by abuse and degradation.

A good or even serviceable morale is impossible under such brutal and brutalizing conditions, where no one is valued, no one's individual survival matters to any one else. This is not a "band of brothers" but "dog eat dog." It is no wonder such soldiers abandon the wounded and do not attempt rescues...no one cares who is left behind, no one depends on, or expects any help, any mercy, or anything but more brutality.

This made me think of our own problem with a culture of violence in the US. So many of our stories involve a man with a gun... with no gun, no story. We grew up on Davy Crockett and cowboys, sheriffs and outlaws, all conflicts defined by and resolved with... guns. This has become a core mythology, a culture that led us to the present situation, where there are more mass shootings than days in the year, where no segment of the population is unthreatened, and where there is no place at all you can assume safe.

The Wild West occupies a place in our collective psyche far larger and more enduring than the historical conditions it mythologizes, and it exerts enormous influence. Both the lawman and the lawless cast huge shadows as heroes, their weapons fetishized as instruments of individual power... as arguments both deadly and irrefutable.

In the criminal culture of the Russian army, the conscripts are disposable, easily sacrificed, abandoned. We seem much like them in our own culture of violence, in that we protest the slaughter occurring here daily but do nothing to stop it. The victims, young, old, innocent, random, are left abandoned before the gunman who uses them as a brutal sacrifice for his own exit in a “blaze of glory.”

Oriana:

True, we have mass shootings and other tragedies caused by the gun culture. The difference is the prevalence of the rule of law in the society at large, including the military — which is answerable to the civil authority. And all of this is based on the respect for human life, for human dignity.

The strength of the U.S. lies in the power of its institutions. We’ve had bad presidents, but not one tried to poison or defenestrate his political opponents. Or imprison them in a hard labor camp. We take that basic decency for granted — it’s only the example of Russia (or, say, North Korea) that makes us realize that no, this is not automatically “normal,” but rather a cultural achievement to be cherished and preserved, no matter the effort and expense it takes.

At the same time, alas, we have the gun culture — this is a country of paradoxes.
 
*
A RUSSIAN OLIGARCH’S LIFE (Misha Firer)

~ From today’s news on Telegram channels, Russians learned that Vladimir Plotnikov, a pudgy United Russia deputy from Perm, has three families. The names of the deputy’s young wives are Svetlana, Anastasia, and Elena.

Vladimir has two sons with Svetlana, two children with Anastasia, and one daughter with Elena. The wives know about each other and allegedly have good relationships.

Perm 36’6, the outlet that conducted the investigation, alleges that Vladimir has fathered still more children.

Vladimir is best known for asking that censorship be authorized so that Russians would “stop looking at the West, turn to traditional Christian family values, and have more love for the Motherland.”

The deputy spends his vacations in Italy, France, the Maldives, and the United Arab Emirates. He takes the first family abroad with him, by seniority. Then that family is flown back home, and the next one is brought in on a private jet. And then the third one.

The three wives like to wear lookalike fur coats and Louis Vuitton perfume, and they have plastic surgery done by the same specialist.

This ensures that none of them stands out and there’s no competition. If Vladimir got tired of one of them, the other two would be thrown out as well, because they have identical looks and wear identical clothes. One for all and all for one.

Now, it should be noted that Perm, where Vladimir and his families are from, has appeared in every ranking of the worst regions of Russia for years, if not decades. In 2023, Perm took 5th place among the poorest cities, with over 50% of the population living below the poverty line.

What Russians don’t realize, or rather don’t want to think about, is that between them and the ruling class lies an abyss as great as in pre-revolutionary Russia between the Tsar’s nobility and peasants.

The two classes literally have nothing in common except the language, and children of the new nobility often don’t even speak Russian [because they get their education in private schools abroad, especially in the UK].

Vladimir Plotnikov

Mobilized soldiers are not fighting Nazis for the Motherland in Ukraine, as they’re told on TV. The contrivance is ridiculous. Like a century ago, they are poor and ignorant peasants, killed and maimed for the sole purpose of keeping the tsar and the ruling class in power indefinitely.

A few years ago, my wife and I visited her father’s university friend, a retired FSB agent, at his birthday party.

In the late 1990s and early 2000s he provided a protection racket for mobile phone providers and other tech companies that would later dominate the Russian market, and he got hold of stakes in their businesses. He had more money than he knew what to do with.

At the time of our visit he was building a house the size of the Palace of Versailles, only constructed of logs in the medieval Russian style. The servants’ quarters alone were a massive three-story house.

There was a lawn the size of a football pitch in front of the mansion under construction, and when we came out and headed to the tent with a barbecue and a pile of presents from guests, we saw two dozen children and young adults playing on the playground and running around.

It turned out these were all the FSB dude’s children, from different women. He was a down-to-earth guy, though taciturn. He was doing the barbecue by himself, American style (he’d lived in the US for a few years after the breakup of the Soviet Union). I helped him out, complimented him on his big family, and asked how many wives he had.

“Marriage is like a mortgage for the poor. When you’re rich, you pay the whole sum upfront. It’s the same with women. When you’re rich and powerful they stand in line for a privilege to have sex and children with you, and you pick and choose.”

And what struck me, as with the deputy’s wives, is that ruling-class men are surrounded by women, and men, and children who feed off them like those red-billed oxpeckers that pick parasites off rhinos’ backs. It’s still not a great number of people, even counting the servants and extended families.

And below them are the great masses of the unwashed, the untouchables, the peasants with their mortgages, their marriage certificates and their one point two children; and the one thing that keeps them going is the firm belief that the little KGB man is absolutely right when he tells them the manly truth from the TV screen: that Russians are the greatest people on earth.

And you believe him wholeheartedly, one hundred percent, because that’s all you’ve got in terms of self-confidence, hopes and dreams.

A compliment oft-repeated can go a long way, albeit sometimes straight to hell. ~ Quora

Curro B:
The guy's place should be called SPerm

Vasily Ivanov:
Love for the Motherland means love for the Masters. Putin is a big Master and he IS the whole Motherland. Plotnikov is a smaller Master and he IS a part of the Motherland (Perm, in particular). The idea was provided by V. Volodin: “Putin is Russia”. By the way, ‘Volodin’ might be translated as ‘Vladimir’s’.

Mary:

So many things in the description of Russian life have a bizarre, surrealistic feel. The description of the oligarch with his several wives, who dressed similarly and had plastic surgery done to make them look more alike (so no one would have an advantage), is just creepy. So you have a country where all the messages you hear are lies, and that's considered normal; where corruption is a way of life, expected, even depended on; and where people in "high places" routinely fall out of windows or "commit suicide" by shooting themselves repeatedly in the head. It sounds like some kind of macabre circus you would laugh at if it weren't so dangerous.

Oriana:

I wonder: has the lawlessness gotten worse, or is it that we are learning more about it? The Soviet Union was a master of lies and secrecy. Now there is a much greater access to information. Still, we are dealing with a vicious bully. I can't pretend to know the solution, except for the one certainty: appeasement doesn't work. You couldn't negotiate with Hitler, and now you can't negotiate with Putin. This is tragic, but at least the situation is familiar, and the West isn't as clueless as it was in the 1930s.

*
WEALTH INEQUALITY IN RUSSIA

~ Russia is a perfect textbook example of how a country goes from industrialized to developing through corruption, demoralization and deindustrialization — and it proves Marx wrong in his insistence that the material features of a society are what is essential.

The main reason why Russia has become a Third World country (with nukes) is moral rottenness. The behavior of the Russian troops in Ukraine has demonstrated perfectly how corrupt, rotten and amoral Russian society is. It is as if the majority of Russians have barely reached the first of Lawrence Kohlberg’s six stages of moral development.

The whole conundrum begins to unravel with the fact that Russia is an Authoritarian Patrimonialism, where all power and all ownership are concentrated in one single man. But nobody rules it all alone. Since there is no rule of law, the inner circle of the ruler (Czar, Premier, President, you name it) consists of sycophants and yes-men. There is no protection of ownership in Russia, only an unlimited right to expropriate and possess. The only way to guarantee the favor of the sycophants and cronies is gift-giving and terror. The result is that corruption is well tolerated. Russia is basically a one-man kleptocracy, where the biggest thief owns everything.

An ordinary Russian is no better than a peasant. Not a serf any more, but not a citizen either. He is dirt poor, has no rights and freedoms other than what he can guarantee with his fists, and lives at the mercy of the state and the criminal gangs. But he lives with the hope that one day he will be on the receiving end of the corruption. Since corruption is the custom of the land, nobody questions it. And the state treats Russians as peasants: it deliberately keeps them poor so that the status of the leader and his cronies is not questioned. Nothing is more dangerous to the ruler than a strong middle class.

The century of Communism and force-fed Marxism and atheism basically destroyed Russian morals and ethics. This was well observed and discussed by Aleksandr Solzhenitsyn. The result is a very low degree of trust in the society.

Since there is no protection of ownership, there is no real Capitalism in Russia either: the Russian economic system is strikingly similar to Third World economies. The moral collapse, the low degree of trust and the absence of protection of ownership mean there is no accumulation of capital in Russia — all fortunes are either put immediately into conspicuous consumption or offshored to safety elsewhere. There is no point in investment and even less point in entrepreneurship. The criminals and/or the state can seize your property, your company, your wealth at any moment, and you are lucky if you escape with your life. The result has been low willingness for entrepreneurship and a low degree of both domestic and foreign investment.

What kind of fool would invest in Russia when you can get a far better return on investment in India or Indonesia?

The result has been deindustrialization, the collapse of production and investment, and a reversion to an extractive economy — that is, the growth of extraction (agriculture, food production, mining, energy production and scrap metal) at the cost of other industries. The Russian economy is based on extraction, just as Third World economies usually are.

Russia has always been technologically backward, and the current war in Ukraine is certainly not going to help the situation. This technological backwardness is due to the rottenness of the society, and it is not going to change in the near future.

So the answer to the question is this: the horrible gap between the filthy rich and the dirt poor is perfectly normal for Third World economies, and Russia has managed to revert to a Third World economy by sliding back into Authoritarian Patrimonialism, and because of the moral rottenness and ubiquitous corruption of Russian society. ~ Susanna Viljanen, Quora

Bill Smith:
Another country that de-developed is Argentina. Prior to WW1, it was one of the world’s richest countries.

Chris Otter:
Russia ran out of German prisoners of war some time ago, and its ability to create technology plummeted. Its highly educated population moved to Israel or other parts of the developed world en masse. In its late stages, the Soviet Union relied on economic espionage to advance its technology, but it wasn't as capable as today's China in that regard.

Andrew Travkin:
The part about deindustrialization is not really true. The reason for it happening is the crash of the Soviet planned economy and the loss of skilled labor and manufacturing means, not some grand moral degradation.

Also, it is not as if it were totally impossible to have industry in a corrupt country. Knowing which official to bribe and which connections to have simply becomes another variable to consider when running a business. I’ve heard that corruption is also heavily ingrained in Chinese society, but that does not stop China from being the world’s workshop.

Oriana:
I think Susanna has a point about the bad economy, moral corruption, and low work ethic. Under a morally corrupt government, even manners deteriorate across the social spectrum. On my first trip back to Poland, I was amazed to see not only an improvement in work ethic, but also the more respectful way people addressed one another, including strangers — without the shouting and rudeness I remember from the “communist” days (I put “communist” in quotation marks because not even the Soviet Union was a communist [or a socialist] country — as Misha Iossel observed, it was always a fascist country, and the same held for its Eastern European satellites). In the bad old times, the illegitimacy of the government and the omnipresent lying in the media seemed to corrupt all human relations.

Surprising as it may seem, the elimination of gross corruption can lead to greater politeness. How we treat others in part reflects the moral state of the whole society.

Democracy is based on the radical idea that an individual human being has value. 

Another radical idea is that hard work and honesty should be rewarded, and corruption punished.

Joe:

The article presents the myth that the Russian poor accept the state’s corruption because they hope that one day they will be on the receiving end of greed instead of on the victims’ side.

Journalists in the United States are fond of writing that the poor support corporate greed because they hope to be the head of a corporation.

The poor don’t believe that lie in Russia or the United States. They might repeat it, but in their hearts, they know that the rich rigged the system against them. By middle school, students become aware that it is nearly impossible to receive or afford an education that would make them competitive with the Richie Rich child.

Their knowledge that the powerful have designed educational costs to be prohibitive for the average citizen fuels their hopelessness. In the US, wealthy CEOs went to jail for paying universities to award their non-athletic children a sports scholarship. If education is too expensive for the upper class, how costly is it for the middle and low-income classes?

Adding the absence of social justice to the lack of educational opportunities cripples the prospects for achieving economic prosperity. When these two conditions are present, the result leads to a feeling of hopelessness in the average person. By maintaining the lack of prospects, the powerful promote the idea that the poor accept their poverty.

In both countries, the powerful view a campaign against the greed of oligarchical and religious leaders as an assault on themselves and a war against the truth. Leaders would have to admit to the carnage they have inflicted upon their nations in order to change the direction of their countries. Therefore, the possibility of justice in Russia and the United States is hard to imagine.

In this country, the change occurred when Ronald Reagan became president. His presidency started a decline in the funding of public education and an increase in social injustice. By the 90s, it became common for American journalists to write that the poor accepted their condition because they thought that one day they would be rich.

The poor feel powerless against the wealthy because the rich own exclusive possession of the justice and education systems. The reporters become their accomplices when they write the myth that the Russian/American poor accept the corruption of the state/corporation because they think that eventually they will be rich.

Oriana:

You hit on something fundamental here. There are endless advantages that stem from being born to rich parents. I don’t mean the super-rich; by “rich” I mean those parents who can afford to send their children to a good college without saddling their kids with large debt and/or forcing them to work at least part-time — which drastically reduces the time students can devote to study.

And the difference starts before college. A public school in Beverly Hills or La Jolla can attract better teachers and organize more attractive after-school activities. I knew of a high school in La Jolla which set up a field trip not to, say, a local museum or nearby nature preserve, but to . . . Paris. And I’ll never forget hearing a fellow student bewail not having gone to a prep school like his roommate. “He studied Latin, French, and calculus. For field trips, they went to New York and abroad.” I forget the rest, but there was more: some fancy electives and distinctly “elite” experiences. Yes, money can buy a better education, conferring lifelong benefits.

The apologists for the current system also cite the examples of individuals who didn’t have any of these privileges and either never went to college or dropped out — but still managed to open a business or succeed by some other means. Yes, there are such exceptional individuals, but they are exceptions. A great deal depends on luck: a bright child of poor parents may attract the attention of a teacher who’s willing to become a mentor. And thank goodness the public libraries are free! Still, the probability is greater that the child of poor parents will end up becoming a gang member. A faculty colleague once said to me, “Most of the kids I went to school with are either in prison or working at a car wash.” He himself had luck with mentors, and developed good social skills. Even so, he never got to go to the fancy schools; consequently, his job chances were limited.


*
RUSSIAN GENERAL FOUND SHOT DEAD IN HIS HOUSE

~ Major General Vladimir Makarov, 67, who was fired by Vladimir Putin a month ago, was found shot dead at his home near Moscow on Monday.

He was in charge of ‘combatting extremism’ in Russia, reportedly leading the witch hunt to oust the President’s rivals, as well as journalists, per The Sun. ~ Emmanuel Ikechukwu, Quora

Mario Correa:
Yes, of course, he committed suicide by shooting himself three times in the head.

Otto Matsch:
Why not go full Stalin and bump off a few hundred at a time? C’mon Vlady, you can do it.

Major general Vladimir Makarov

*
TOP RUSSIAN MILITARY OFFICIAL FALLS OUT OF A HIGH-RISE WINDOW

~ Marina Yankina, head of the department of financial provisions for the Western Military District, was found dead on a sidewalk on Wednesday morning, according to multiple local reports. She is just the latest in a growing list of Russian military officials, defense industry figures, war critics, and gas and oil execs to die suddenly and mysteriously since the start of the full-scale invasion last year.

The 58-year-old’s belongings and documents were found on a balcony on the 16th floor of the building, Mash reports.

Russia’s Investigative Committee is looking into the circumstances of the deadly plunge, with their preliminary conclusion being suicide, according to Fontanka.

Prior to joining the Western Military District, Yankina worked in the Federal Tax Service.

The Western Military District has incurred some of the heaviest losses in Russia’s war against Ukraine — and been blamed for a string of humiliating battlefield losses.

Colonel-General Alexander Zhuravlyov was sent packing as commander of the district in October, following huge losses in Kharkiv. His successor, Lieutenant-General Roman Berdnikov, was sacked after lasting less than three months.

Colonel-General Sergey Kuzovlev then took the helm, only to be replaced a few weeks later by Lieutenant General Yevgeny Nikiforov.

https://www.thedailybeast.com/top-russian-military-official-marina-yankina-dead-after-fall-from-16th-floor

John Bastan:
You would think by now some of these people would move into a 1st floor apartment.

*
In November 2022, Col. Vadim Boiko, 44, deputy head of the Makarov Pacific Higher Naval School in Vladivostok, was found dead from multiple gunshot wounds in what has been described as a suicide.

https://www.emmanuelsblog.com.ng/2023/02/another-top-putin-official-plunges-to-her-death-from-high-rise-building.html


*
WHY THE NAZIS CONTINUED TO ACCELERATE THE HOLOCAUST AS DEFEAT WAS CLEARLY INEVITABLE

~ To you, WWII was about the US, the UK, the USSR, and France on one side, and Germany, Italy and Japan on the other. And the Nazis were losing big time by 1943.

But to the Nazis, WWII was an epic showdown, the final battle in a war that had been going on for generations: the war between Aryans and Jews.

And the thing was: the Nazis genuinely feared that the Jews were winning.

And that's how you understand why they still kept the genocide up for as long as they could. They were losing the war against the Allies. But they still had a chance to destroy the Jews, to exterminate them completely. To win what was, to them, the more important war.

This was more akin to a religious article of faith. They had convinced themselves of this so utterly, that even thinking about objecting to it was sacrilegious to them. Murder as a sacred duty, the worship of killing.

You are trying to apply rational thought to this. It won't work. ~ Matts Anderson, Quora

Jonathan Chin:
Anderson’s explanation is the one put forth by historians of the Holocaust over the past couple of decades. The historian Geoffrey Megargee of the United States Holocaust Memorial Museum noted that the Nazis tended to react to battlefield reverses by intensifying the genocidal campaign against their racial enemies.

*
WHY THE PUNJABI SIKHS ARE SO BELOVED BY THE BRITISH

Three things I know about Sikhs:

1) They wear turbans. Irrelevant. A lot of people wear a lot of things.

2) They run kitchens to feed literally everyone who comes to them. No means testing. No ID. No waiting. No compulsory prayer meeting. No pamphlets. If you’re hungry, the Sikh community will feed you.

3) They carry knives. Hold your horses, there. A lot of them carry tiny, ceremonial knives tucked into their turbans, but they all carry knives, not because their religion says they have a right to defend themselves but because their religion says they have a sacred duty to defend those unable to defend themselves.

Sikhism says all Sikhs have a duty to feed and defend all people who need it, and Sikhs actually follow those tenets of their religion.

We could be here a very long time listing examples of worse people. ~ Sableagle


Peter Knobel:
I found on a 1973 visit to India, bus companies owned and operated by industrious Sikhs. I asked one driver what the big sword behind his seat was for. He said, ‘That is to deal with bandits in the country.’ Any bandit should have thought twice about taking that driver on!

Danny Peck:
can’t remember who said it but someone once said “God created Sikhs to teach Christians how to be Christlike” and that's stuck with me even as an atheist.

*
LET IT BE MORNING (movie) — ABSURDIST BLEAKNESS

~ “Let It Be Morning” begins with a vision of prison bars, which turn out to be the metal of a cage holding wedding doves. Although the first scene is indeed set during nuptial celebrations, it’s an undeniably ominous sign when the door is opened and the birds refuse to fly.

There are, in fact, bars everywhere in Eran Kolirin’s Palestinian drama, though few others are as visible (or unsubtle). His protagonist, Sami (Alex Bakri), is confined by his marriage, his family, his town. Some of these imprisonments, like his unhappy relationship with his sharply intelligent wife (an excellent Juna Suleiman), are at least partially of his own making. Others, like a stubbornly closed checkpoint to Jerusalem, are not.

Sami’s instinct to escape immediately after his brother’s village wedding is, he insists, purely practical: he’s got to get back to work in the city before he gets fired. But his emotional itchiness is obvious to everyone around him, even as he tries—if halfheartedly—to hide it. He left this parochial life behind, with its underachieving neighbors, needy relatives, and dusty roads leading to nowhere. He’s urban and urbane now, a sophisticated success with a good job, a modern sensibility, and a mistress who can’t wait to see him.

Unfortunately, the Israeli army couldn’t care less about any of this. In an effort to capture unregistered Palestinians, soldiers have fenced off the town so that no one can go in or out. As much as Sami wants to get away, he’s got no escape. And if he can’t run from his hometown, he’ll have no choice but to revisit the self he long ago left behind.

Kolirin has made several features but is best known for the 2007 dramedy “The Band’s Visit.” This is a far more melancholy affair, relying less on absurdist laughs and more on frank despair. Bakri and his supporting cast tap into their characters’ emotions with palpable depth. Cinematographer Shai Goldman reflects their increasingly jittery energy with a sharply observant lens that jumps from one perspective to another. And music supervisor Habib Shadah adds layers with a fast-switching soundtrack that does the same.

Even so, Kolirin unearths dark humor in unexpected spots, from those recalcitrant birds to the single, utterly clueless Israeli soldier tasked with keeping an entire, outraged town at bay. (The filmmaker’s mordant sensibility bled into reality when his mostly Palestinian cast protested the movie’s Cannes premiere, because it was submitted as an Israeli release.)

It’s an impossible situation for everyone, and Kolirin—a Jewish Israeli who adapted his script from a book by Palestinian author Sayed Kashua—never pretends otherwise. This does lead to a climax that we can see coming as soon as we meet Abed (Ehab Salami), a former friend who once again follows Sami around like a puppy dog. And Sami himself is on a journey that feels increasingly foreseeable as circumstances force his hand.

But Kolirin has a sense for the bleakly surreal, and an ability to balance even the darkest experiences with empathetic shades of gray. Everyone here is bound by bars of some sort, and everyone has the freedom to make certain choices within them. Ultimately, his interest is less about the roadblock in front of Sami than about the paths he wants to forge—or doesn’t. ~


https://www.thewrap.com/let-it-be-morning-review-palestinian/

from another source:
Sami (Alex Bakri) is a sad man, so sad that he seldom smiles or warmly engages with others. Sami’s mood colors nearly everything in Let It Be Morning, the story of an Arab Israeli citizen who returns to his small village for his brother’s wedding.


Don't be misled: Let It Be Morning is no nostalgic chronicle of a homecoming. After the Israeli army seals off the village, Sami becomes stranded in a political and personal limbo: he's uncomfortable being away from his fast-paced life in Tel Aviv and from the mistress who seems to exemplify a typical midlife crisis.

Sami and his wife Mira, played by Juna Suleiman, have a much-loved young son, but Suleiman makes it clear that Mira understands the reality of a life that has stagnated.

Directed by Eran Kolirin (The Band's Visit), Let It Be Morning relies on fine performances from Bakri and Suleiman to enhance its low-key, character-driven approach.

The Arab community depicted in Let It Be Morning, adapted from a novel by Palestinian writer Sayed Kashua, isn't unified. Some villagers want to cooperate with the Israelis, who have blockaded the village as part of a campaign to identify illegal West Bank Palestinians who are seeking work. Others want to protest.

Ehab Salami portrays Abed, a newly minted cabbie who has accumulated crippling debt to buy his vehicle. Once a friend, Abed has become a source of embarrassment for Sami. His insistent presence pushes Sami to face a background he thought he had shed.

Simple on its surface, Let It Be Morning leaves viewers with much to digest; the movie stands as a quietly realized counterpoint to current newspaper accounts of the Israeli/Arab conflict.

That's not to say that the conflict is ignored, but that Kolirin takes a humane approach to tension as his characters struggle to find their footing.

Let It Be Morning makes it impossible to overlook the humanity of people whom we might otherwise meet only in news reports. Kolirin tells a story about complex characters living in a complex situation over which they don't always have control. In other words, his movie mirrors life. ~

http://denersteinunleashed.blogspot.com/2023/02/a-deeply-human-drama-in-arab-village.html


and one more:

~ “We gotta get out of this place if it’s the last thing we ever do.” – The Animals, 1965

Those lyrics are exactly how Sami (Alex Bakri) feels. Sami, his wife Mira (Juna Suleiman), and his young son are trapped in purgatory. No, not in a religious netherworld due to past sins, but in a tangible place: his hometown.

While visiting this tiny, remote village to attend his brother Aziz’s (Samer Bisharat) wedding, a small band of military types blocks the one road in and out of town. 

The one road!

Unfortunately, Sami, his wife, and his son face this manned obstruction after sundown, when attempting to drive home to Jerusalem. They’re forced to turn around, head back to his parents’ house, and spend the night, hoping this unexpected inconvenience will subside in the morning.

Well, come morning, it…does not. 


For an unknown swathe of time, Sami and his family are stuck in this isolated community, and no one can definitively circle a date on the calendar for when the roadway will open again.

Writer/director Eran Kolirin’s “Let It Be Morning” was released in 2021, and the film became Israel’s submission for Best International Feature at the 2022 Academy Awards. It didn’t land on the Oscar shortlist, but Kolirin’s movie does offer an anthropological study over its 101-minute runtime.

This unnamed parish is a primarily Arab-populated community located in Israel, so tensions are built into the narrative. However, the script – based on Sayed Kashua’s 2006 novel – does not delve into massive combative tactics between the Jewish and Arab populaces. Some mentions of the ever-present geopolitical, religious, and cultural differences occur, and yes, the road is blocked, an obvious point of contention. Still, the film’s messages pertain to a couple of universal aspects of human nature, whether the movie is set in Phoenix, Shanghai, Sydney, Nairobi, Buenos Aires, or Jerusalem.

Granted, the given municipality isn’t one of the planet’s largest commerce centers. Here, “everyone” knows Sami’s name, and the movie touches upon family conflicts, but not in a cliché-driven sense. Disagreements are presented and explored matter-of-factly.

Rather than showcase the screaming matches and verbal outbursts we might expect from forced, paint-by-numbers American dramedies in which extended families are cooped up in a home over the holidays (see also “The Family Stone” (2005)), general apathy is the “winning” emotion of the day in “Let It Be Morning”.

Living with regret is a common theme among the inhabitants and visitors of this anonymous settlement. Will anyone break free? Break free of their invisible chains, as songwriter/drummer Neil Peart famously called the emotional reasons for remaining in unwanted life circumstances.

In most (but certainly not all) cases, listless tones and the characters’ general indifference carry the production, and the motion picture’s deliberate, lingering pace can create struggles for moviegoers. Subdued and isolated discourse repeatedly transpires within quiet rooms or on empty street corners as Sami trudges through his given, forced circumstances.

Occasionally, words of wisdom resonate with him and us, especially from Sami’s mom and his childhood friend Abed (Ehab Salami), but we wade through lengthy, dreary stretches to get to these Promised Land exchanges. 

Meanwhile, sad-sack Sami carries an everyman suburban white-collar worker’s fate. The man has it all but doesn’t appreciate his blessings, including Mira, whom he sadly neglects.

Don’t cry over “Let It Be Morning”; it’s not a worthless time at the movies. Still, the film is deliberately downtrodden, as broken dreams aren’t forgotten and disappointments forge an ever-present malaise. Some moments of levity temporarily raise spirits, but not often enough to consider this cinematic adaptation a comedy.

In addition to the last scene, the moment that stuck most with this critic is when Sami’s household triad briefly steps away from the urban center, and Kolirin captures lovely rocky hills and buttes, complete with olive trees, fresh air, and the chance at new beginnings. The scene makes one appreciate the natural beauty of the region. Perhaps “getting out of this place” shouldn’t be a ubiquitous proclamation. ~


https://www.phoenixfilmfestival.com/blog/2023/2/let-it-be-morning-movie-review

Oriana:

I remember with pleasure a wonderful Palestinian comedy, Tel Aviv on Fire. Alas, “Let It Be Morning” isn’t even ten percent as good. At first I thought it was my ignorance of the subtleties of Arab-Israeli relations that made it difficult for me to comprehend the movie, much less enjoy it. But once I remembered how much I enjoyed “Tel Aviv on Fire,” that explanation evaporated.

“Morning” simply doesn’t have the brilliant comic scenes, the energy, the zest that made “Tel Aviv on Fire” a near-masterpiece — no special knowledge of the nuances of Arab-Israeli politics required.

The situation of waiting for the closure to be lifted reminded me of “Waiting for Godot” — but without Beckett’s all-out brilliance of super-bleakness. Sometimes less really is less. The wedding could have been a fiesta of music, dance, and wonderful food — but the movie misses this obvious opportunity to present some colorful aspects of Palestinian culture.

The only scene that worked for me both as comedy and as heavy symbolism was the attempt to release the doves. After a grand announcement, the cage door is opened — and the beautiful white doves, a symbol of peace, refuse to leave the cage.

The finale is also quite good from the point of view of absurdist comedy. The people of the village finally, finally unite and march toward the roadblock — only to find the gate open. The one Israeli soldier who used to sit in a folding chair guarding the road — sometimes strumming his guitar, but mostly napping — is gone, along with the chair. If this were Beckett, the empty chair would probably have remained, teasing the audience with all sorts of possible symbolism — but we don’t get even that.

Still, I must admit that the ending works as absurdist comedy. The problem is that it cannot compensate for the rest of this non-brilliant movie. Nevertheless, I would still recommend it: it puts human faces on the interminable Jews-and-Arabs question. It is a political movie, but it’s even more a family drama, a community drama, and an altogether “we are all human” kind of drama, a movie that unites us rather than divides us.



*
HAS VISITING ISRAEL CHANGED YOUR MIND ABOUT ARAB-ISRAELI RELATIONS?

~ To be quite frank, yes it has.

I’m an Arab, born into a Muslim family. However, I was born and raised in London, UK. The anti-Zionist sentiment reigns quite strongly here.

I visited Israel in June 2015, and again in 2018, spending a month there overall. It’s not much time in the grand scheme of things, though sometimes it felt like reality had slapped me in the face and told me to “grow up.” It took me a full 23 years of life to realize that reality itself is a mosaic of everything we know and do not know. There is no black-or-white truth; no single group or individual has a monopoly on justice.

I was convinced by some acquaintances I had met in London — messianic Christians — to travel to Israel. I found the idea of setting foot in Israel abhorrent, but I don’t know what to tell you.

Curiosity got the better of me, and I am glad it did.

I stayed in Tel Aviv/Jaffa for a couple of days, visited the Negev desert, Ramallah in the West Bank, Haifa in the North, Beer Sheva, and of course Jerusalem.

It’s a majestic place indeed. Something felt surreal about being in a place I thought I’d never see for as long as I lived.

I got to see firsthand the ‘Startup Nation’ at work. I toured Tel Aviv’s co-working spaces, accelerators, and VCs, met some amazing people, learned a lot, and went back home inspired.

I saw people from all walks of life: people from Ghana, Ethiopia, Poland, Brazil, Japan, America, Egypt, Morocco, Iraq, you name it. This detail was especially important for me because I had genuinely believed Israel was, for the most part, so “white.” People treated me with kindness and hospitality, and I really appreciated that. Not to mention what amazing cooks they are! I found it terribly hard not to like them once I began acquainting myself with people on a personal level, as they shared bits of their lives with me, their traumas, their fears for the future. It was really hard to find the hate and anger I had carried with me all the way from London. It is so easy to hate what you do not know, or what you do not understand. I think that was my lesson number one.

My lesson number two was this: there is no unified Arab entity. It doesn’t exist; forget about it.

Looking back, further back than my travels to Israel, it seems that in my mind I had been living in the Middle East of the 1950s, when Arab Nationalism was most rampant. Except it was the 21st century, and I was living in London. My level of disconnect from reality astounds me to this day, but this is the cold truth. I stayed in Jerusalem for a little while, and spent some time in the Arab quarter. I had some interesting discussions about Iraq with some Palestinians. As an Iraqi, I have no admiration for Saddam — or any dictator for that matter — however, I understood that Palestinians in general disagree with me on that, and that’s fine. But I spent a few days thinking about this.

Iraqis around the world demonstrate for Palestine and support Palestinian resistance, sometimes even if they disagree with certain principles and/or are aware of the hostility towards Shiites that a lot of Palestinians share. Yet Palestinians seem to have a hard time supporting Iraq’s right to self-determination, only because the process doesn’t involve their “second leader, Saddam,” to quote a young Palestinian man I met.

Whether the process has been successful or not is irrelevant here — democracy does not happen overnight. It is a difficult process that involves trial and error. The hostility I got for expressing these very thoughts was disappointing, to say the least, and in many ways hypocritical. Considering that Arab nationalism rests primarily on Palestinian resistance, this came as a blow. There is no Arab nationalism; there is only politics, and politics is as brutal as a predator preying on its meat. That is all.

I suppose being a Zionist doesn’t actually mean excusing every policy decision the state of Israel has made or will make in the future. Israelis have been some of the harshest critics of their own government, it turns out.

In the same way that we all criticize the policies of other countries, I have my reservations about some of Israel’s. For example, I will remain consistently firm in my opposition to illegal settlements.

I made friends I will keep for life, learnt so much about a people I grew up resenting and despising, and most importantly, I came back a little wiser. :) ~ Roqaia, Quora


Tel Aviv apartment building

Alan Sargeant:
I am glad that you traveled there with an open mind. I have also been twice to Israel and on both occasions I was with a young Israeli doing their military service. It certainly changed my preconceptions about Israel.

I discovered that there are places reserved in the Knesset for Arabs (notionally Palestinians — although the real Palestinians have long since gone, supplanted by Jews centuries ago). However, I am not going to argue over the ethnicity of non-Jews, who are mostly Muslim, with a significant Christian minority in the West Bank and in Jerusalem.

The second thing that surprised me was that Arabic was actually an official language.

I also discovered that while military service was compulsory for Jews, it was optional for other minorities, and some chose to do it.

Douglas Gray:
In a nutshell: while the Israeli treatment of those in Palestine and the West Bank is not perfect, the Israelis are not nearly as bad as many of the corrupt people who make up the Palestinian leadership.

Brian Milliner:
The problem as I see it is that there are two religions in the Middle East that have too much power and too much sway over their believers: the purveyors of extremism. They should have little say in the way the government works. They should leave it to the taxpayers, the majority, the ones who keep their bellies full, to have the say. It is taxes and donations that support religious institutions. The problem is that politicians use the religious as pawns in the fight to keep themselves in power. The politicians also forget who pays their keep: not the rabbis nor the imams. It is, unfortunately, an ever-spiraling vicious circle.


*
FRIEDRICH BOLLNOW’S PHILOSOPHY OF HOPE

~ ‘We have a new image of society … and then out of this we have a new image of religion … I feel more grandiose than I did then because now I think I’d call it the basis for a [new] universalism …’ This bold prediction of unity and renewal comes neither from a bearded prophet nor a New Age guru. The idea that society and religion are heading toward a new universalism comes instead from the psychologist Abraham Maslow speaking in 1972. Maslow is famous for his idea of self-actualization and his hierarchy of needs. This hierarchy leads the individual from lowly physiological and safety needs (eg, food, shelter) through love, belonging and esteem, all the way to self-actualization. Today, Maslow’s ideas are back in fashion, covered in innumerable self-help books such as Transcend: The New Science of Self-Actualization (2020) by Scott Barry Kaufman, or The Brother’s Handbook: Abraham Maslow’s Hierarchy of Needs Revised for the Black Man (2020) by Byron Cowan.

Maslow’s theories are based on the idea that, in our ‘very essence’, humans are a ‘perpetually wanting animal’. Our wants are interconnected and interdependent: in a gradual progression, fulfilling one want or need allows for the pursuit of another, higher want or need. As the highest, the pursuit of self-actualization is an end in itself. ‘Self-actualizers,’ Maslow says, can realize the desire ‘to become more and more’ what they are, ‘to become everything’ that they’re ‘capable of becoming’. In focusing in this way on human potentiality rather than on abnormality, Maslow established what he called a ‘positive psychology’: the study and support of optimal mental health. In emphasizing self-actualization and a new human ‘universalism’, Maslow is not alone. Other thinkers – from Francis Bacon to Francis Fukuyama – have celebrated similar utopian visions of fulfillment and universality.

The idea that life’s ultimate goal is the achievement of self-fulfillment is not limited to philosophy and pop-psychology. It is given expression in everyday slogans such as ‘Be the best version of yourself you can be,’ or warnings that ‘Too many of us are not living our dreams because we are living our fears.’ However, in 1954 Maslow discovered that true self-actualizers are not that easy to find. In screening 3,000 college students for his research, he could identify ‘only one immediately usable subject and a dozen or two possible future ones (“growing well”).’ Among the general public, Maslow found he had ‘to stop excluding a possible subject on the basis of single foibles, mistakes, or foolishness … no subject was perfect.’ In fact, Maslow ended up with only two historical figures (Thomas Jefferson, and Abraham Lincoln in his last years) for whom self-actualization appeared ‘fairly sure’.

This striking absence of subjects ‘who have developed … to the full stature of which they are capable’ is not just a question of researcher selectivity and human imperfection. It stems from the discontinuities, errors, failures, setbacks and crises that are a part of all our lives. The COVID-19 health crisis likely represents the latest ‘universal’ example. During the first waves of the pandemic, we all experienced discontinuities and disruptions ranging from minor inconveniences (mask-wearing, vaccination, self-isolation) to full-blown crises (job loss, severe illness, death). How do these fit into our life story? What do they contribute to our growing, to our becoming ‘everything’ we ‘are capable of’? In Maslow’s positive psychology, these disruptions can represent only impediments and deprivations. Like an existential game of snakes and ladders, they imply a forced descent down the hierarchy of needs.

But there is a very different way of understanding growth. It is articulated not by a psychologist, but by a student of the German existentialist Martin Heidegger. This is Otto Friedrich Bollnow, who started off in 1925 with a PhD in physics, and whose pedagogical and philosophical writing continued well into the 1980s. Bollnow struggled with Heidegger’s claim that we discover our authentic selves through the experience of angst – a crisis of anxiety and dread – which arises only when we face up to the reality of our own death. Bollnow came to believe that it is not always death and angst that play this crucial developmental role. Other kinds of crises are just as important, and so are other emotions and experiences. Bollnow observed that

~ It seems to belong to the nature of the human life that it does not proceed as a unitary and continuous process of progress and development. Rather, [one] must run through successive and distinct phases which are separated from one another by breaks, and according to which life from time to time commences again with a new beginning. ~

A brush with death in a car accident, a serious medical diagnosis, or an upheaval in one’s immediate family can all constitute a crisis. Such events do not represent a brief break after which we return unchanged to our earlier pursuits. These crises also cannot be created ‘artificially’ or planned in advance, and one cannot undergo someone else’s crises on their behalf.

The crises brought on by the COVID pandemic have meant new beginnings for many of us. In early 2020, as COVID spread across the globe, none of us could envision what our lives would be like afterwards. And very few of us simply returned to exactly who or what we were before. ‘A crisis represents a liquidation … the demise of an old order [after which] a new order begins,’ as Bollnow put it in 1959.

Instead of facilitating a gradual climb up a hierarchy, crises often lead to a kind of starting over, a striking out in a new direction. ‘The new beginning now no longer means the taking over of a new task in the continuing line of progressive movement, but rather the going back in time, in order to take up once again what was earlier begun.’ Crises change us profoundly; they contribute to our maturation, and in so doing, shape who we ultimately become: ‘Only by passing through crises, does life assume its genuine being,’ Bollnow concludes in 1987. Crises have a way of (sometimes ruthlessly) reminding us of our own limits and vulnerability. Years after Bollnow, the US educator and psychologist Jack Mezirow saw the ‘crisis event’ as a key moment in his understanding of the ‘transformative learning’ of adults.

Our development occurs through a kind of dialectic – a movement between periodic crises and moments of renewal and rejuvenation. ‘This forward and backward’ movement, Bollnow emphasizes, is not about ‘accidental and avoidable mishaps, but rather belong[s] to the nature of life.’ Such events must also be seen as redirecting not only how we develop and progress, but also what direction and purpose that growth takes. In this sense, whatever self-actualization we might realize in our lives depends on the setbacks and losses we have suffered. Unfortunately, these sorts of insights do not attract the kind of interest that, say, a new science of self-actualization does. Maslow’s grandiose optimism is still celebrated today, while Bollnow’s more realistic insights are quietly forgotten.

Maslow’s idea of sustained self-actualization atop a hierarchy of fully satisfied human needs is a myth. Being gainfully employed, being involved in healthy social, familial and intimate relations, enjoying physical health, self-esteem, strength and freedom: all of these are worthy pursuits precisely in the sense that they demand ongoing attention and effort. Moreover, we are living in a time of multiple, simultaneous crises, in what Britons have called a ‘permacrisis’. There is an energy crisis; China still faces COVID-19 and maybe also a financial crisis; in the United States, it is inflation and perhaps even the future of democracy itself that is in crisis.

Undertaking much of his mature work in the final years of the Second World War and during Germany’s rebuilding, Bollnow was no stranger to crises and new beginnings. Instead of Heidegger’s angst, Bollnow believed that a rather different inner orientation and frame of mind was important. This is hope, which Bollnow saw as the touchstone of human emotion and existence: ‘Hope thus points to the deeper ground in which the feelings of patience and security are rooted, and without which [we] would never be able to relax [our] attention or go to sleep tranquilly.’

Hope ‘comes to us without any effort on [our] part, as a sort of gift or grace,’ Bollnow adds. It is a frame of mind that connects us to the future, not as the inevitability of our own death, but as ‘an infinite source of new possibilities’. Hope can provide firm ground as crises batter, challenge and change us. What is needed is neither a new universalism nor perpetual satisfaction or self-actualization, but the openness that only hope can give. ~



https://psyche.co/ideas/our-age-of-crises-needs-bollnows-philosophy-of-hope?utm_source=Aeon+Newsletter&utm_campaign=0362ef0c88-EMAIL_CAMPAIGN_2023_02_15_11_56&utm_medium=email&utm_term=0_-0362ef0c88-%5BLIST_EMAIL_ID%5D

*
CHILDHOOD DURING THE ICE AGE

~ The sun rises on the Palaeolithic, 14,000 years ago, and the glacial ice that once blanketed Europe continues its slow retreat. In the daylight, a family begins making its way toward a cave at the foot of a mountain near the Ligurian Sea, in northern Italy. They’re wandering across a steppe covered in short, dry grasses and pine trees. Ahead, the cave’s entrance is surrounded by a kaleidoscope of wildflowers: prickly pink thistles, red-brown mugworts, and purple cornflowers.

But before entering, this hunter-gatherer family stops to collect the small, thin branches of a pine tree. Bundled together, covered with resin and set alight, these branches will become simple torches to illuminate the cave’s darkened galleries. The group is barefoot and the path into the cave is marked by footprints in the soft earth and mud. There are traces of two adults, a male and female, with three children: a three-year-old toddler, a six-year-old child, and an adolescent no older than 11. Canine paw prints nearby suggest they may be accompanied by pets.

Carrying pine torches, they enter the base of the mountain. At around 150 meters inside, the family reaches a long, low corridor. Walking in single file, with only flickering firelight to guide them, they hug the walls as they traverse the uneven ground. The youngest, the toddler, is at the rear. The corridor soon narrows into a tunnel as the ground slopes upward, leaving less than 80 cm of space to crawl through. Their knees make imprints on the clay floor. After a few meters, the ceiling reaches its lowest point and the male adult stops, pausing to evaluate whether the next section is too difficult for the littlest in the group. But he decides to press on, and the family follows, with each member pausing in the same spot before continuing. Further into the cave, they dodge stalagmites and large blocks, navigate a steep slope, and cross a small underground pond, leaving deep footprints in the mud. Finally, they arrive at an opening, a section of the cave that archaeologists from a future geological epoch will call ‘Sala dei Misteri’ (the ‘Chamber of Mysteries’).

While the adults make charcoal handprints on the ceiling, the youngsters dig clay from the floor and smear it on a stalagmite, tracing their fingers in the soft sediment. Each tracing corresponds to the age and height of the child who made it: the tiniest markings, made with a toddler’s fingers, are found closest to the ground.

Eventually, the family accomplishes what it set out to do, or perhaps simply grows bored. Either way, after a short while in the chamber, they make their way out of the cave, and into the light of the last Ice Age.

This family excursion in 12,000 BCE may sound idyllic or even mundane. But, in the context of anthropology and archaeology, small moments like these represent a new and radical way of understanding the past. It wasn’t until 1950, when the cave was rediscovered and named ‘Bàsura’, that the story of this family’s excursion began to be uncovered. Decades later, scientists such as the Italian palaeontologist Marco Avanzini and his team would use laser scanning, photogrammetry, geometric morphometrics (techniques for analyzing shape) and a forensic approach to study the cave’s footprints, finger tracings and handprints. These little traces paint a very different prehistorical picture to the one normally associated with life 40,000 to 10,000 years ago, toward the end of the last Ice Age, during a prehistoric period known as the Upper Palaeolithic.

Asked to imagine what life looked like for humans from this era, a 20th-century archaeologist or anthropologist would likely picture hunting and gathering done almost exclusively by adults, a blind spot that eventually prompted researchers to write journal articles with titles such as ‘Why Don’t Anthropologists Like Children?’ (2002) and ‘Where Have All the Children Gone?’ (2001). We forget that the adults of the Palaeolithic were also mothers, fathers, aunts, uncles and grandparents who had to make space for the little ones around them. In fact, children in the deep past may have taken up significantly more space than they do today: in prehistoric societies, children under 15 accounted for around half of the world’s population. Today, they’re around a quarter. Why have children been so silent in the archaeological record? Where are their stories?

As anyone who excavates fossils will tell you, finding evidence of Ice Age children is difficult. It’s not just that their small, fragile bones are hard to locate. To understand why we forget about them in our reconstructions of prehistory, we also need to consider our modern assumptions about children. Why do we imagine them as ‘naive’ figures ‘free of responsibility’? Why do we assume that children couldn’t contribute meaningfully to society? Researchers who make these assumptions about children in the present are less likely to seek evidence that things were different in the past.

But using new techniques, and with different assumptions, the children of the Ice Age are being given a voice. And what they’re saying is surprising: they’re telling us different stories, not only about the roles they played in the past, but also about the evolution of human culture itself.

Human bones are fragile things, but some are more fragile than others. The larger, denser bones of adults tend to be better preserved in the archaeological record than those of children, whose bones are more like a bird’s than an elephant’s: they are smaller, more porous and less mineralized, lack tensile and compressive strength, and, in the case of long bones, their ends may not yet be fully fused to the shafts. These skeletons are more vulnerable to both sedimentary pressure (when buried underground) and erosion from acidic soil and biodegrading organic matter. This is one of the main reasons why telling the stories of prehistoric children has been so difficult.

But they’re not only poorly preserved. The small size of some remains means they can be easily missed. I experienced this when I worked at an archaeological site called Drimolen in South Africa, 40 km north of Johannesburg. Archaeologists working here have dated the site to between 1.5 and 2 million years old, and have uncovered the remains of more than 80 individuals who are early members of the Homo genus and the Paranthropus robustus species, another ancient human lineage.

Close to 50 per cent of the uncovered remains were identified as children under 10 years old – far more than has been recovered from similar sites in the region (but a figure that is more in line with estimates of Ice Age demographics). One reason for this difference is the rigorous screening protocol followed by the project’s team. Every cubic centimeter of earth that is excavated at Drimolen is dry-sieved to pick out larger pieces, then placed on medium- and fine-mesh screens and cleaned with running water. The remaining sediment is spread out on a table, ready to be sorted. During my time at Drimolen, I spent countless hours searching for even the tiniest human tooth, hidden among stones and the skeletal remains of rodents and other small animals.

Burial practices also account for the disappearance of children from the archaeological record. Children throughout time have often been buried in remote locations or shallow graves, without a coffin or grave marker. These practices reflect the different ways societies treat the dead based on age, sex, social status and other factors. They do not mean these children were unloved or unmourned, but they can make locating their final resting places difficult for archaeologists.

However, not all children were buried in unmarked graves. Some were given spectacular burials, and these exceptions help us further understand the lives of children in the past. A case in point comes from an estimated 10,000-year-old overhanging rock shelter in France called Abri de la Madeleine. Here, a young child of three to seven years of age was laid to rest surrounded by three limestone slabs, which formed a protective barrier around their head. Hundreds of white shell beads, produced by carving (or snapping) tusk-shaped Dentalium shells to around 6-7 mm in length, were found at the child’s head, elbows, wrists, knees and ankles and around their neck. According to the archaeologists Francesco d’Errico and Marian Vanhaeren, an unbroken Dentalium shell could produce around two tiny, tube-shaped beads, which means it would likely have taken 15-20 hours to produce the nine meters of beads found embroidered onto the child’s clothing (long since rotted away). D’Errico and Vanhaeren believe that, depending on the skill of the person making it, this garment would have required 30-50 hours to complete. Burying the child in clothing that took so long to make speaks to the grief the community must have felt at their passing.
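A quick back-of-the-envelope check of those figures (a sketch only, assuming the nine meters refers to beads laid end to end, at an average bead length of 6.5 mm):

\[
\frac{9{,}000\ \text{mm}}{6.5\ \text{mm/bead}} \approx 1{,}400\ \text{beads},
\qquad
\frac{1{,}400\ \text{beads}}{2\ \text{beads/shell}} \approx 700\ \text{shells}
\]

On those assumptions, the beadwork alone would have consumed roughly 700 Dentalium shells, a scale of effort that makes the researchers’ estimates of 15-20 hours for the beads and 30-50 hours for the garment easy to credit.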

CHILDREN AND WORK

A growing body of ethnographic and archaeological research is revealing the ways children have always contributed to the welfare of their communities and themselves. Herding, fetching water, harvesting vegetables, running market stalls, collecting firewood, tending animals, cleaning and sweeping, serving as musicians, working as soldiers in times of war, and caring for younger siblings are all common examples of tasks taken on by children around the world and across time. These tasks leave their mark in the archaeological record.

Palaeolithic children learning to make stone tools produced hundreds of thousands of stone flakes as they transitioned from novice to expert. These flakes overwhelm the contributions of expert tool-makers at archaeological sites around the world. Archaeologists can recognize the work of a novice because people learning to produce stone tools make similar kinds of mistakes. To make, or ‘knap’, a stone tool, you need a piece of material such as flint or obsidian, known as a ‘core’, and a tool to hit it with, known as a ‘hammerstone’. The goal is to remove flakes from the stone core and produce a sharp blade or some other kind of tool. This involves striking the edge of the core a glancing blow with the hammerstone. But novices, who were often children or adolescents, would sometimes hit too far towards the middle of a core, and each unskilled hit would leave material traces of their futile and increasingly frustrated attempts at flake removal. At other times, evidence shows that they got the angle right but hit too hard (or not hard enough), resulting in a flake that terminates too soon or doesn’t detach from the core.

At a roughly 15,000-year-old site called Solvieux in France, the archaeologist Linda Grimm uncovered evidence of a novice stone-knapper, likely a child or adolescent, working on a tool. Sitting to the side of the site, the novice began hitting a core with the hammerstone. After encountering some difficulty, they brought the core they were working on to a more experienced knapper sitting in the centre of the site near a hearth. We know this because the flint flakes produced by the novice and the expert were found mixed together. After receiving help, the novice continued knapping in this central area until the core was eventually exhausted and discarded. While the tools made by the expert knapper were taken away for some task, those made by the novice were left behind. At other sites, novices sat closer to expert knappers as they practised, presumably so they could ask questions and observe the experts while they worked, or just share stories and songs.

Of course, not all novices were children. But in the Palaeolithic, when your very survival depended on being able to hunt and butcher an animal, process plants, make cordage, and dig up roots and tubers, making a stone tool was essential. Everyone would have had to learn to knap from a young age and, by the time novices were eight or nine years old, they would have developed most of the cognitive and physical abilities necessary to undertake more complex knapping, increasing in proficiency as they entered adolescence. By making stone tools, children provided not only for themselves but for their younger siblings too, contributing to the success of their entire community.

CHILDREN AND PLAY

All work and no play? Not quite. Other studies of footprints, this time from 13,000-year-old sites in Italy and France, document children and teens running around playing tag, making ‘perfect’ footprints the way kids do today at the beach, and throwing clay balls at each other and at stalagmites – some of the pellets missed their targets and remain on the cave floor. Skills were honed through play in other ways: at Palaeolithic sites in Russia, researchers found 29 clay objects that, by analyzing traces of fingerprints, were determined to be made by children between the ages of six and 10, and adolescents between 10 and 15.

Ethnographically, we know that children often begin to learn ceramics by first playing with clay, making toy animals and serving bowls. Another way to see children at play in the Palaeolithic is to look for them in secret or small spaces too tiny for an adult body. Near Étiolles in northern France, archaeologists uncovered a Palaeolithic settlement. However, below this occupied area, out of view of the settlement, they also found stone tools made by novices, as well as animal bones. This may have been a Palaeolithic clubhouse, with everything an Ice Age child would need: privacy, things to do, and snacks. And at Las Chimeneas cave in Spain, the archaeologist Leslie Van Gelder, seeking evidence of children, documented lines drawn by tiny fingers in the soft sediment on the underside of a low, narrow overhang.

Ironically, though this playful behavior has given us a window into the lives of Palaeolithic children, it has also been a reason why children have been understudied by archaeologists. For some archaeologists, this behavior appears so random and unpredictable that it renders Ice Age children not only unknown but unknowable. There is a joke among archaeologists that we label an artefact ‘ceremonial’ if its purpose is not readily discernible. Similarly, an artefact found in an unusual location is often explained away as the remains of a child’s play. Children of the past, the argument goes, ‘distort’ the archaeological record by playing. As a result, ethnographic data and personal anecdotes are often used as cautionary tales.

However, a growing number of archaeologists have argued that children distort the archaeological record only if we think that our task as scientists is to reconstruct the behavior of adults. If we think our goal is to reconstruct human behavior more broadly, then children’s use and modification of objects simply adds to the rich history of an artefact’s ‘life’ or its ‘biography’.

For more than 200 years, children were neglected by archaeologists, part of a disciplinary bias towards adult men in archaeological interpretations. This began to change in the 1970s and ’80s with the rise of feminist archaeology and the archaeology of gender, led by archaeologists from the University of California at Berkeley such as Margaret Conkey, Ruth Tringham and Rosemary Joyce. The approaches advocated by these female scholars critically examined the roles of women in the past and, by extension, children started to become ‘visible’ too. But it is only in recent years that youngsters have truly emerged from the shadows. This emergence is part of a growing movement within archaeology to diversify voices in the past by exploring cultural constructs of age, gender, sexuality, and identity (although it should be noted that the elderly remain understudied).

*
The cultural anthropologist Sheina Lew-Levy and her colleagues studied tool innovation among children and teens in modern foraging societies. ‘Tool innovation’ means using new tools, or old tools in new ways, to solve problems. The team observed that adolescents seek out adults they identify as innovators to learn tasks such as basketry, hide working, and hunting. Furthermore, these adolescents are the main recipients and transmitters of innovations. Those of us of a certain age can remember helping our parents program their first VCRs in the same way that teens now introduce their parents to the latest apps.

As our species spread across the globe in the Palaeolithic, the way that children and adolescents adopted innovations would have been a key factor in how well humans solved problems as we adapted to new environments. Child-sized tools have been found at many Palaeolithic sites, and even full-sized spears were included in the burials of children in Russia.

Tools are important, but so are relationships. The social context of how children and adolescents learn is at the heart of what archaeologists call ‘cumulative culture’ and one of the main reasons why it is so important to focus on the lives of youngsters from the Palaeolithic. Knowing how to survive in this challenging environment – learning what plants are poisonous, how to avoid dangerous animals, where to find food in times of drought, and how to maintain alliances with your neighbors – would have been beyond the capacity of any one mind. Instead, it took the collective knowledge of many minds working together and augmented over time for human society to flourish. The children of prehistory played a central role in this flourishing.

In speaking about cumulative culture, researchers often use the metaphor of a ratchet. A ratchet is a tool with angled teeth that allows movement in only one direction. This is an appropriate metaphor for cumulative culture because each generation builds on the knowledge of the generation that came before, and, like a ratchet’s teeth, faithful social learning keeps that accumulated knowledge from slipping backward.

The evolutionary psychologist Michelle Scalise Sugiyama has argued that one of the most powerful vehicles of cumulative culture is oral storytelling. The human ability to live vicariously through the experiences of others is particularly important in situations that are dangerous or occur only rarely. For example, when the Indian Ocean tsunami hit the Indonesian island of Simeulue in 2004, only seven of its 75,000 inhabitants died, because the vast majority fled to higher ground, remembering stories about a tsunami in 1907 that their grandparents had told them as children.

Lions in Chauvet Cave

Scenes in Palaeolithic art were likely visual components of oral stories that would have passed from one generation to the next, and Palaeolithic children would have learned from the images around them. Across southwest France and northern Spain, more than 300 caves have been found containing paintings and engravings dating between 40,000 and 10,000 years ago. Think of the 35,000-year-old painting of a pack of lions hunting a mammoth on the walls of Chauvet cave in France, or an unusual 17,000-year-old painting of a bison with its entrails hanging out, or a man with a bird’s face, or a defecating rhinoceros, or the spear-thrower at Lascaux.

In Rouffignac cave, archaeologists observed two tectiforms, one of many symbols found in cave art whose shape roughly resembles an arrow pointing upward with a horizontal line along its base. While some researchers have suggested that it may symbolize a shelter of some kind, the truth is that its specific meaning is likely lost to us now. Nonetheless, what is special about the two found in Rouffignac is that one tectiform is large, drawn with adult-sized fingers at the height of an adult, while the other is smaller, drawn with a child’s fingers and located much closer to the ground. We know the smaller tectiform was drawn second because its lines cross those of the larger symbol. Many other parts of the cave are filled with lines drawn by tiny hands. This is not just ‘child’s play’. It’s an example of a novice engaging in the symbolic world of adults in their community.

Shared knowledge and cumulative culture, expressed through complex technologies, cultural institutions and symbol systems, are what made our way of life possible. For example, under glacial conditions, plants and animals can, over generations, migrate southward. Humans working together and pooling their knowledge can also migrate south but, in addition, they can alter their hunting strategies, create fire, sew warm clothing and build shelters. These adaptations, which can occur within a lifetime, can be passed on to future generations and then adapted for further learning opportunities. Cumulative culture, often driven by children, has allowed humans to transform landscapes and inhabit virtually every continent on the planet.

Children had a critical and defining (but often underappreciated) influence on how human culture developed and continues to develop. As the ones carrying cumulative culture forward into the next generation, they are the primary drivers of human cultural evolution. Not only are they early adopters of innovations, they are also responsible for the selective loss or winnowing of cumulative culture – not all technologies, strategies or even stories remain relevant or compelling.

The future of studying the past may well lie in our continued efforts to reconstruct the lives of these Ice Age influencers. They were children who loved and were loved; who experienced hunger and pain, but also joy; who played games, made art and occupied ‘secret spaces’; who listened to stories and made music; who learned to hunt, gather and fish; and who produced ceramics, stone tools and, sometimes, little footprints in soft mud.


https://aeon.co/essays/what-was-it-like-to-grow-up-in-the-last-ice-age?utm_source=Aeon+Newsletter&utm_campaign=0362ef0c88-EMAIL_CAMPAIGN_2023_02_15_11_56&utm_medium=email&utm_term=0_-0362ef0c88-%5BLIST_EMAIL_ID%5D

Chris Banal:
The insight that children are the curators of our society through time — that they get to choose which new innovations are carried on and which old ideas are left behind — made me realize that this is why older generations often feel frustration or disdain toward the younger ones.
They are powerless as the new generation assesses, chooses, and culls all the ideas and innovations that the older generation helped to curate.

Ron Gambrel:
Children are often ignored in their societal roles. Even today. I would say that in the past, children were more like other animal species in that their childhood would have been relatively short. Reproduction would likely have occurred almost as soon as the body was capable, otherwise we as a species would not have survived. That said, learning to survive and contribute came early.

*
WE HAVE ALWAYS BEEN DISTRACTED

~ If you suspect that 21st-century technology has broken your brain, it will be reassuring to know that attention spans have never been what they used to be. Even the ancient Roman philosopher Seneca the Younger worried about new technologies degrading his ability to focus. Sometime during the 1st century CE, he complained that ‘The multitude of books is a distraction’. This concern resurfaced again and again over the following millennia. By the 12th century, the Chinese philosopher Zhu Xi saw himself living in a new age of distraction thanks to the technology of print: ‘The reason people today read sloppily is that there are a great many printed texts.’ And in 14th-century Italy, the scholar and poet Petrarch made even stronger claims about the effects of accumulating books:

Believe me, this is not nourishing the mind with literature, but killing and burying it with the weight of things or, perhaps, tormenting it until, frenzied by so many matters, this mind can no longer taste anything, but stares longingly at everything, like Tantalus thirsting in the midst of water.

Technological advances would make things only worse. A torrent of printed texts inspired the Renaissance scholar Erasmus to complain of feeling mobbed by ‘swarms of new books’, while the French theologian Jean Calvin wrote of readers wandering into a ‘confused forest’ of print. That easy and constant redirection from one book to another was feared to be fundamentally changing how the mind worked. Apparently, the modern mind – whether metaphorically undernourished, harassed or disoriented – has been in no position to do any serious thinking for a long time.

In the 21st century, digital technologies are inflaming the same old anxieties about attention and memory – and inspiring some new metaphors. We can now worry that the cognitive circuitry of the brain has been ‘rewired’ through interactions with Google Search, smartphones and social media. The rewired mind now delegates tasks previously handled by its in-built memory to external devices. Thoughts dart from idea to idea; hands drift unwittingly toward pockets and phones. It may seem that constant access to the internet has degraded our capacity for sustained attention.

This apparent rewiring has been noticed with general uneasiness, sometimes with alarm, and very often with advice about how to return to a better, supposedly more ‘natural’ way of thinking. Consider these alarming headlines: ‘Is Google Making Us Stupid?’ (Nicholas Carr, The Atlantic, 2008); ‘Have Smartphones Destroyed a Generation?’ (Jean M Twenge, The Atlantic, 2017); or ‘Your Attention Didn’t Collapse. It Was Stolen’ (Johann Hari, The Observer, 2022). This longing to return to a past age of properly managed attention and memory is hardly new. Our age of distraction and forgetting joins the many others on historical record: the Roman empire of Seneca, the Song Dynasty of Zhu, the Reformation of Calvin.

Plato would have us believe that this double feeling of anxiety and nostalgia is as old as literacy itself, an inescapable problem that is inherent in the technology of writing. In one of his dialogues, the Phaedrus, he recounts how the ancient inventor of writing, an Egyptian god named Theuth, presents his work to the king of the gods. ‘This invention, O king,’ says Theuth, ‘will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom.’ The Egyptian king of the gods, Thamus, predicts the opposite:

For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practise their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise.

The gods’ predictions contradict one another, but they share an underlying theory of cognition. Each assumes that human inventions like writing can alter thought, and even create new methods of thinking. In 1998, the philosophers Andy Clark and David J Chalmers called this interactive system, composed of the inner mind cooperating with the outer world of objects, ‘the extended mind’. Our ability to think, they claimed, could be altered and extended through technologies like writing. This modern idea expresses a much older notion about the entanglement of interior thought and exterior things. Though Clark and Chalmers wrote about this entanglement with a note of wonder, other scholars have been less sanguine about the ways that cognition extends itself. For Seneca, Zhu and Calvin, this ‘extension’ was just as readily understood as cognitive ‘degradation’, forerunning the alarm about smartphones and Google ‘making us stupid’ or ‘breaking’ our brains.

For as long as technologies of writing and reading have been extending the mind, writers have offered strategies for managing that interaction and given advice for thinking properly in media environments that appeared hostile to ‘proper’ thought. It’s not hard to find past theories of how technologies such as printed books or writing shaped thought. However, these theories don’t give us a sense of exactly how minds were being shaped, or of what was gained by thinking differently. To understand the entanglement of books and minds as it was being shaped, we might turn to readers and writers in Europe during the Middle Ages, when bookshelves swelled with manuscripts but memory and attention seemed to shrivel.

Writing during the 13th century, the grammarian Geoffrey of Vinsauf had plenty of advice for writers overwhelmed with information. A good writer must not hurry; they must use the ‘measuring line of the mind’ to compose a mental model before rushing into the work of writing: ‘Let not your hand be too swift to grasp the pen … Let the inner compasses of the mind lay out the entire range of the material.’ Geoffrey expresses an ideal here, but his handbook gives us little access to thinking as it really happened while seated at a medieval desk before a blank page with quill in hand. In navigating this problem, the intellectual historian Ayelet Even-Ezra pursues one route toward an answer in Lines of Thought (2021). For her, ‘lines of thought’ are the lines of connection structuring the many branching diagrams that fill the pages of medieval manuscripts. One such horizontal tree can be seen crawling across the book’s cover.

Follow these branches on the book’s cover to the root, and you will see that the diagram grows from a neuron. This union of nervous system and diagram-tree suggests the book’s argument rather directly: for Even-Ezra, these horizontal trees written by medieval scribes did not simply record information – they recorded pathways for thinking that were enabled by the branching form of the tree itself. Branching diagrams reveal the medieval extended mind at work in its interactions with pen, ink and the blank space of the page.

Pay close attention to these diagrams, and sometimes they will reveal a medieval cognitive process as it played out. Here is one 13th-century diagram examined by Even-Ezra:

This diagram, mapping out the branches of medicine, does not seem to have gone as the scribe planned. The first branch sprawls evenly and comfortably. However, the second branch is awkwardly diverted; an offshoot that seems to have occurred to the scribe only later has been grafted on. The lowest branch is a thicket of revision and deviating thought-lines. Even-Ezra notes the obvious: this scribe did not gauge the available space properly at the start. That was part of the problem. But it is also evident that the exact structure of this information had ‘emerged during the process of drawing’; written diagrams like this one facilitated complex, abstract thinking. These new abstract thoughts could surprise the thinker, who accommodated them wherever possible as the diagram took shape. Even-Ezra suggests that the horizontal-tree format made concepts ‘easier to manipulate’, abstracting them from the linearity of language. Filling out the many branches of these diagrams ‘paved the way for new questions’.

*
Remember the story of Thamus, that skeptical Egyptian god who predicted that young minds would be ruined by writing? The branching diagram, in Even-Ezra’s account, represents one good outcome of the invention of writing. These diagrams could facilitate deeper reflection, especially of an abstract kind, during sessions of intensive reading. They could also be aids to memory, rather than its substitutes, because they repackaged information in formal patterns that could stick in the mind. Medieval note-takers filled the margins of their books with these diagrams, and many are evidence of careful attention and a desire to crystallize new knowledge. Even-Ezra describes how the rise of these diagrams – a new kind of writing technology – reshaped cognition.

We can see the effects drawn out on the page. Geoffrey of Vinsauf might have looked on in horror as medieval diagrammers, against his best advice, took up the pen to draw out abstract ideas not yet fully composed. But, like Even-Ezra, we can watch these developments without anxiety or alarm. From a safe historical distance, Lines of Thought proposes that the medieval vogue for branching trees subtly rewired the medieval mind. But today, neither we nor Even-Ezra worry about the old ways of thinking that may have been lost in the process.

We might follow a similar line through the long history of technology-induced media anxiety. There have been thousands of years of analogous fears of broken, distracted, stupefied brains – whatever metaphors are invented to express them. Our present worries are a novel iteration of an old problem. We’ve always been rewired, even before media technologies went electric and metaphors of ‘wiring’ became ubiquitous (the metaphor itself is older, but it really caught on in the telegraph era).

Consider another example: have indexes in printed books made us more distracted readers? In Index, A History of the (2022), the English historian Dennis Duncan makes Plato’s anecdote about the Egyptian gods Theuth and Thamus the ancient point of origin for a long historical arc of tech anxiety bending towards Google. At points between Plato and search engines, Duncan plots the rise of the index as a necessary piece of search equipment for readers. Compilers and users of early indexes in the 16th century, such as the Swiss physician Conrad Gessner, saw great potential in them, but also had reservations. Gessner used this technology in many of his books, creating impressive indexes of animals, plants, languages, books, writers and other people, creatures and things. He thought that well-compiled indexes were the ‘greatest convenience’ and ‘absolutely necessary’ to scholars. Yet he also knew that careless scholars sometimes read only indexes, instead of the whole work.

The index invited a kind of misuse that was an affront to the honest scholarship Gessner believed it was supposed to serve. Erasmus, that intellectual giant of the Renaissance, was another critic of the misuse of the index, yet he was less concerned about lazy, index-first readers than the writers who exploited this tendency. Since so many people ‘read only titles and indexes’, writers began to put their most controversial (even salacious) material there in search of a wider audience and better sales.

The index, in other words, had become the perfect place for early modern clickbait. It was up to the good reader to ‘click through’ – to read the whole book and not just the punchy index entries – before rushing to judgment. Erasmus did not expect many readers to put in the legwork. But he made no argument against indexes themselves, any more than he argued against giving books title pages (for title pages, too, were newfangled, time-saving additions to printed books). For Erasmus, the index was a tool that was only as good as its readers. Duncan gives us a history of the anxious controversy around the index and how people have used it, taking this unremarkably familiar feature of every book’s back-matter and revealing its early career as the latest technological threat to proper thought.

Should we look back on these changing interactions between books and minds, and worry that some ‘Great Rewiring’ was taking place centuries ago? Obviously not. Even if we believe that a commonplace way of writing down ideas on the page really was changing the way medieval minds worked, as Even-Ezra argues, we don’t look back with regret. Even if the new multitudes of books, and the indexes mapping them, caused some alarm among those who witnessed their proliferation and the demise of careful and attentive reading, we raise no alarms in retrospect. New regimes of memory and attention replace the old ones. Eventually they become the old regimes and are replaced, then longed for.

That longing now takes shape as a nostalgia for the good old days when people were ‘voracious readers’ of books, especially novels. Johann Hari, in his book Stolen Focus (2022), introduces us to a young bookseller who cannot finish any of the books by Vladimir Nabokov, Joseph Conrad or Shirley Jackson that she picks up: ‘[S]he could only get through the first chapter or two, and then her attention puttered out, like a failing engine.’ The would-be reader’s mind just runs out of steam. Hari himself retreats to a seaside town to escape the ‘pings and paranoias of social media’ and thus recover the lost experience of attention and memory. Reading Dickens was part of his self-prescribed cure: ‘I was becoming much more deeply immersed in the books I had chosen. I got lost in them for really long stretches; sometimes for whole days – and I felt like I was understanding and remembering more and more of what I read.’ To Hari, and many others, re-focusing on reading fiction is one obvious method to return the mind to some previous and better state of attention and remembering. This novel cure is a method so obvious, occurring to so many, that it often goes unexplained.

Getting lost in books, in novels, has been recast as a virtuous practice in modern life: the habit and the proof of a healthy mind. The same practice, however, has looked to others like a pathology. The ‘voracious reader’ presents as the mind of that intellectually malnourished, overstimulated junkie diagnosed by Petrarch, strung out on a diet of flimsy texts: ‘frenzied by so many matters, this mind can no longer taste anything’. Don Quixote epitomized the pathological reader, so enthralled by his fictitious books of romance that his mind forgets reality. In Jane Austen’s England, around the turn of the 19th century, as more women and a growing middle class began to read novels, warnings were issued against their unhealthy effects. Concerned observers in the early 1800s wrote that a ‘passion for novel reading’ was ‘one of the great causes of nervous disorders’ and a threat to the ‘female mind’. Watch out, one wrote in 1806, for ‘the excess of stimulus on the mind from the interesting and melting tales, that are peculiar to novels’.

Later, in the 20th century, Walter Benjamin theorized that the urbanite’s solitary reading of mass-produced novels had made it almost impossible for them to achieve the state of mind required for storytelling. For him, novels – in tandem with newspapers and their drip-drip-drip of useful information – made the true mental relaxation that comes from boredom much harder to find. (He saw boredom as the natural incubator for storytelling.) It is remarkable how two different eras could both say something like: ‘We live in a distracted world, almost certainly the most distracted world in human history,’ and then come to exactly opposite conclusions about what that means, and what one should do.

Hari’s seaside idyll of properly managed attention (ie, getting lost in books) would have been taken as a tell-tale sign of a pathologically overstimulated mind in another age. That irony of history might be instructive to us. New technologies will certainly come along to vie for our attention, or to unburden our memory with ever-easier access to information. And our minds will adapt as we learn to think with them. In Stolen Focus, Hari quotes the biologist Barbara Demeneix, who says that ‘there is no way we can have a normal brain today’. There’s a yearning here, after some lost yesterday, when the mind worked how it was meant to. When was that, exactly? Seneca, Petrarch, and Zhu would all like to know.

Tech nostalgia tends to look wrong-headed eventually, whether it longs for the days before Gutenberg, or before daily newspapers, or before Twitter. Hari makes a good case that we need to work against the ways that our minds have been systematically ‘rewired’ to align with the interests of tech giants, polluters and even a culture of overparenting. He doesn’t believe we can truly opt out of the age of distraction by, say, ditching the smartphone. Indeed, we will still worry, as we should, about how our minds interact with external things. But together we should imagine a future of more conscientious thinking, not a past. ~

https://aeon.co/essays/weve-always-been-distracted-or-at-least-worried-that-we-are

Frank Brown:
yes — my first thought was like the observation that young people today have become lazy and disrespectful to their elders (society is doomed) — written 2000 years ago …
second may be a more modern observation about the computer age — something like — what was invented before we are born is just an assumed part of the environment, what is invented before we are 16 is seen as incredibly exciting and rewarding, and what is invented after we are 35 is seen as just wrong and a threat to society!


*

HOW MUSHROOMS PROTECT AGAINST AGING

~ Human studies have found an association between mushroom consumption and lower risk of chronic diseases and premature death.

One study of more than 15,000 Americans found that those who consumed mushrooms had a 16% lower risk of mortality than those who did not eat mushrooms.

Replacing just one serving a day of red or processed meat with mushrooms was associated with a 35% lower risk of all-cause mortality.

Research has identified an amino acid in mushrooms, L-ergothioneine, that may be responsible for these health-promoting effects.

One of the world’s preeminent nutritional biochemists, Dr. Bruce Ames, published a seminal review proposing that L-ergothioneine should be classified as a “longevity vitamin.”

L-ergothioneine is not produced in the body. It must be obtained through diet.

Typical American diets are low in mushrooms. L-ergothioneine levels in the body also tend to decline with age.

For those who don’t eat many mushrooms, direct oral intake of L-ergothioneine is an easy way to obtain its benefits.

WHAT IS ERGOTHIONEINE?

L-ergothioneine is an amino acid found in high concentrations in mushrooms and other fungi. High levels are found in edible mushrooms such as porcini, oyster, shiitake, and maitake.

The amount of L-ergothioneine in mushrooms varies with the species and is affected by conventional agricultural practices. It would take about 2-5 cups of common white button mushrooms to get 5 mg of dietary L-ergothioneine. That’s why supplements are a better choice for maintaining daily intake.
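For scale, here is a rough reading of the article’s own numbers (an illustration only, not a measured value). If 2-5 cups supply 5 mg, each cup of white button mushrooms carries only about

\[
\frac{5\ \text{mg}}{5\ \text{cups}} = 1\ \text{mg/cup}
\quad\text{to}\quad
\frac{5\ \text{mg}}{2\ \text{cups}} = 2.5\ \text{mg/cup}
\]

of L-ergothioneine, which is why richer species such as porcini or oyster mushrooms, or a supplement, are the more practical routes to a daily 5 mg dose.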

A growing body of evidence has found that mushrooms may help prevent chronic diseases and premature death.

In 2005, a major scientific discovery found that humans produce a transporter protein responsible for taking up L-ergothioneine from the diet and delivering it to cells throughout the body.

This transporter is distributed widely across the body’s tissues, and it takes up L-ergothioneine about 100 times more efficiently than it does other compounds.

Clinical studies suggest most tissues of the body contain L-ergothioneine. This finding helped drive the scientific investigation into how this amino acid works in the body, and the suggestion that it be classified as a longevity vitamin.

REDUCED TELOMERE SHORTENING

Several studies have pointed out how L-ergothioneine may promote longevity.

One contributor to the aging process is the loss, or shortening, of telomeres, the protective caps on the ends of chromosomes. Telomere shortening is a marker of advanced cellular aging, loss of function, and eventual cell death.

A 2022 study found that L-ergothioneine significantly reduced the rate of telomere shortening and the number of short telomeres in cells exposed to oxidative stress.

Wood ear salad. I usually add them to stir-fry.

Another area being studied is L-ergothioneine’s ability to protect cellular DNA.

For example, ultraviolet-induced DNA damage in the skin accelerates skin aging and risk of skin cancer. L-ergothioneine protects against this DNA damage in the skin, which is one reason it is an ingredient in many anti-aging creams.

Oxidative stress is a driver of disease and accelerated aging. L-ergothioneine is closely related to glutathione, one of the most powerful antioxidants produced in the body. 

L-ergothioneine concentrates in the mitochondria, which are vulnerable to oxidative damage.

Preclinical evidence shows that L-ergothioneine can help neutralize damaging oxidizing compounds before they damage mitochondria. It can also protect against free radicals that damage DNA and proteins.

Experimental evidence has also shown that L-ergothioneine can inhibit the synthesis of pro-inflammatory cytokines, which are abundant in many chronic inflammatory diseases associated with aging.

Together, these effects may help ward off chronic disease and promote longer life.

PROTECTING THE BRAIN

The concentration of L-ergothioneine is particularly high in several major regions of the brain, including those responsible for cognitive function, learning, and memory.

In mice, L-ergothioneine promotes nerve cell maturation, resulting in enhanced memory. Cell studies show it helps promote the formation of new neurons, which is vital to learning and also to memory formation.

In animal models, it is protective against oxidative-stress-induced deficits in learning and memory, and against learning deficits induced by beta-amyloid accumulation. Beta-amyloid buildup is seen in the brains of patients with Alzheimer’s disease, making L-ergothioneine an intriguing candidate for clinical studies looking at neuroprotective agents.

In humans, lower blood levels of L-ergothioneine have been noted in patients with both mild cognitive impairment and dementia, compared to healthy subjects, suggesting that low L-ergothioneine could be a risk factor for these conditions. Low levels of L-ergothioneine are also seen in patients with Parkinson’s disease and brain matter atrophy.

In a clinical trial of adults with mild cognitive impairment, taking a mushroom extract containing 5 mg of L-ergothioneine daily for 12 weeks led to significant improvements in verbal memory, working memory, sustained attention, and other measures of cognitive function compared to those taking a placebo.

CARDIOVASCULAR HEALTH

Diseases of the heart and blood vessels remain the leading causes of death and disability.

Dysfunction of the vascular endothelium is central to a wide range of cardiovascular disorders, including hypertension, atherosclerosis, chronic heart failure, coronary artery disease, and diabetes.

L-ergothioneine has been found to be protective against different types of oxidative and inflammatory damage in endothelial cells, which form the inner lining of blood vessels. It also protects against cell stressors that impair vascular relaxation, and prevents the binding of monocytes (a type of white blood cell) to endothelial cells, an early event in cardiovascular disease.

A large population study published in 2020 showed that higher levels of L-ergothioneine in the body are associated with a 15% reduction in cardiometabolic disease, a 21% reduction in cardiovascular mortality, and a 14% reduction in overall mortality.

Other studies revealed that L-ergothioneine protects the endothelium from cell death.

L-ergothioneine is an amino acid found predominantly in mushrooms.

Its potent antioxidant and anti-inflammatory effects may help slow the cellular aging process and protect the body against age-related disorders, including neurodegenerative and cardiovascular diseases.

This may explain why, in population studies, people who eat mushrooms have a reduced risk of mortality. ~


https://www.lifeextension.com/magazine/2023/3/mushrooms-protect-against-aging?sourcecode=CVB301E&utm_content=article4&utm_medium=email&utm_source=newsletter-zmag&utm_campaign=CVB301E&CID=807670de-ff8a-433a-8eb3-2c588f48fd86

Oriana:

Ergothioneine is only one of the beneficial compounds found in mushrooms. Mushrooms are best known for their beta-glucans, polysaccharides built from beta-D-glucose units. Beta-glucans enhance immune function, helping prevent and defend against bacterial, viral, fungal, and parasitic infections.
I am lucky to have an Asian market nearby. I love wood ears (gelatinous, algae-like little brown cups, quite tasty too), shiitake, and oyster mushrooms. I grew up in a mushroom-loving culture (see the opening poem), and miss the mushrooms of my childhood; fortunately, Asian mushrooms are easily available to me.

But even the ordinary white button mushrooms have health benefits. If you are interested in health and longevity, mushrooms can be a tasty part of your daily diet.

Personally, I believe in eating mushrooms in order to obtain the whole spectrum of benefits. Sure, if you have a specific condition, such as neuropathy, you may need to take a lion’s mane supplement to get enough of the medicinal compounds. But eat the real thing too — why deprive yourself of the unique taste and texture, not to mention other benefits, some of which still remain unknown to us?


Lion's mane mushroom is supposed to improve brain function and help with neuropathy

*
SUPER-SHROOM IN CALIFORNIA (LIKE THE DESERT SUPER-BLOOM)

~ Stu Pickell, mushroom hunter, is elbow-deep in dirt.

Lying flat on the ground in a narrow canyon under a towering bigcone Douglas fir, he paws up handfuls of gently fragrant humus. Occasionally he sniffs it, searching for the scent of his quarry: a unique type of black truffle that has never before been observed in drought-parched Southern California. This tiny pocket of trees in the Santa Ana Mountains could be its ideal habitat, he thinks—and now could be the best time ever to actually find it.

Normally at this time of year, Pickell might be hunting for chanterelles in Mendocino or searching for slime molds in Alaska. But today, the fungi aficionado and photographer is searching just a few minutes from home, near Los Angeles’s suburbs—barely out of sight of the city, but in a whole different world.

That’s because over the past few months, a remarkable series of deluges, unexpected for drought-stricken California, have sparked an epic, once-in-a-blue-moon mushroom season, giving mushroom hunters across the state a chance to pursue their passion in their own backyards.

“It’s a super-shroom!” says Justen Whittall, a botanist at Santa Clara University—the mushroom equivalent of a wildflower “superbloom.” Years this good are rare in Southern California; some hunters say the last was in 1997, more than 20 years ago.

The abundance is astounding, he says. Citizen scientists are plucking dozens of never-before-described mushroom species from the soil. Mushroom hunters have been selling truckloads of dinner plate-size chanterelles, often worth thousands of dollars, to fancy restaurants. About 150 people attended a recent morning walk at the Los Angeles Mycological Society’s annual fair to see mushrooms popping up in city parks and sidewalk cracks.

Even for those new to the fungi world, this is a perfect time to get in on the action, says Bat Vardeh, a mushroom enthusiast and founder of the naturalist group Women Forage Socal.

In dry years, “mushrooms can be everywhere around you, and invisible at the same time,” she says—hiding in soil and waiting for rain. But this year, they’re right out in the open, ready to be seen, admired, and, in many cases, eaten.

The magic of mushrooms

Fungi may represent one of the greatest outstanding scientific mysteries of the modern world. There are an estimated three or more million species—but only about 150,000 have been described scientifically. We’ve also only scratched the surface of their potential as food, medicine, environmental cleanup agents, and more.

“In your own backyard you can find something that has never been seen before,” says Else Vellinga, a mycologist at the University of California, Berkeley.

Many species lurk undetected as mycelium, a root-like network of often underground fungi. Genetic analyses of DNA fragments found in single handfuls of soil often uncover dozens of fungal species.

That richness was on display in the Santa Anas. In just one small patch of forest floor, Pickell and his friend Christian Schwarz (who literally wrote the book on California mushrooms) found species after species: tiny chartreuse cup fungi, delicate red pinwheels, lavender blewits, metallic-capped pluteus mushrooms, and many more (but no black truffles, Pickell’s quarry of the day).


Pluteus, or deer mushroom

Such a mushroom explosion is a boon for the fungi-curious, from experts like Pickell and Schwarz to anyone with a smartphone and the iNaturalist app. But it also underscores a real problem: Because we don’t know what species exist and where, it’s hard to see how climate and ecological change affect fungi.

“We know we’re losing species,” says Gordon Walker, scientist and host of the popular Fascinated by Fungi Instagram, TikTok, and podcast. “We just don’t know how many or how fast.”

But fungi are also consummate survivors. Give a mushroom an inch, conditions-wise, and it’ll take a mile.

Bob Cummings, a mycologist at Santa Barbara City College, has been mushroom hunting for 60 years. His first mind-blowing season was 1982 to 1983, when one of the rainiest seasons on record hit California. Chanterelles were practically popping out of the woodwork. Another epic season came along in 1997. But he’d almost given up hope for another exceptional year, since Southern California, along with much of the U.S. Southwest, has been stuck in a 20-year drought.

Those rainy years tend to come during an El Niño event, a climatic phenomenon that usually brings wet weather to the Pacific coast, from California to Chile. But recent years have seen the opposite, a La Niña pattern characterized by dry winters. “The last few years have been terrible,” says Jess Starwood, a forager from the Los Angeles area. “I didn’t see a single chanterelle.”

Scientists had forecast that the dry pattern would prevail once again this winter. But instead, a series of remarkable atmospheric rivers streamed across the skies, dumping as much as 600 percent of the normal rainfall in some parts of California, overtopping dams and destroying homes.

And mushrooms thrive on moisture. Dampness plus the right environmental conditions, like chilly night temperatures, triggers mycelium to fruit, sending up the forms we more readily recognize as mushrooms.
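
As a back-of-the-envelope sketch only (the thresholds below are invented for illustration, not taken from the article), the trigger amounts to a conjunction of conditions:

# Hypothetical rule of thumb for when mycelium is likely to fruit.
# Real triggers vary widely by species, soil, and habitat.
def likely_to_fruit(recent_rain_mm: float, night_low_c: float) -> bool:
    enough_moisture = recent_rain_mm >= 25   # a soaking rain in recent weeks
    chilly_nights = 2 <= night_low_c <= 12   # cool but not freezing
    return enough_moisture and chilly_nights

print(likely_to_fruit(recent_rain_mm=80, night_low_c=7))  # True: damp and chilly
print(likely_to_fruit(recent_rain_mm=5, night_low_c=7))   # False: too dry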

In February, several weeks after the most recent rain, Starwood crunches through thick oak-leaf duff in the foothills near L.A. We’re barely out of sight of homes when she darts toward a barely visible hump in the leaves—a “shrump,” in mushroom parlance. She gently dusts away the leaves and just like that, the prize appears. She’s found a dense, fragrant, valuable chanterelle (later, she’ll sell it to n/naka, an upscale restaurant in L.A.).


“Oooh, yay, yay, yay,” she says gleefully as she carves it from the soil with a flip-out blade. “It’s just so thrilling to find them, every time.”

How to develop your “mushroom eyes”

Even if you’re not in the middle of the supershroom, it’s always a good time to learn the thrills of mushroom hunting—and be part of remarkable citizen science efforts to understand and protect them.

The first step is to go out into the world and develop your “mushroom eyes,” says Cummings. During the damp season, find a natural spot, preferably one with mushroom-hosting trees such as oaks or pines, damp swales, or mulch and decomposing wood. Then, just start looking closely. You might be surprised how easy it is to tune your vision to fungi. “They appear like from nowhere,” he says.

Starwood compares learning to read a landscape for mushrooms with learning a new language.
“First you’re looking out at a wall of green; how do you even start breaking it apart?” she says. “Then you learn a word, or two words—seasons, timing, shade conditions, where water goes, tree species—and soon it comes together, and you can speak fluent nature.” That’s when you stumble across remarkable finds.

Mycological associations often host walks and events to help people learn to identify finds and develop their eyes. Local naturalist groups are also invaluable, says Pickell—as is a good guidebook.

Most important? Never eat anything that you haven’t identified with full certainty.

Mushroom hunting will change the way you look at the world, says Vardeh. Suddenly, you’ll see all the places where their host trees—like majestic old oaks—are threatened by wildfire or development, or where mycelium-rich soil is disrupted by plowing.

“It feels like going from walking into a room of strangers to a room full of friends,” she says.

https://www.nationalgeographic.com/environment/article/how-to-hunt-mushrooms-during-californias-epic-supershroom

*
SOME 40 PERCENT OF AMERICANS BELIEVE THAT JESUS WILL “DEFINITELY” OR “PROBABLY” RETURN BEFORE 2050

(Millions of Christians see humanity headed not toward peaceful progress, but toward annihilation. That’s why I bother posting: we need to start whispering [a whisper is sometimes more effective than a shout]: Jesus is never coming back. Never, never, never, never.)

“George W. Bush apparently believed that Iraq and Afghanistan were singled out in end times prophecy. “Gog and Magog are at work in the Middle East,” he told French President Jacques Chirac in a 2003 phone call, appealing to their common Christian faith as a basis for the invasion. “This confrontation is willed by God, who wants to use this conflict to erase His people’s enemies before a new age begins.” Chirac, a Roman Catholic, promptly asked his staff to call the French Federation of Protestants and find out what Bush was talking about.”

“We now live in a world shaped by evangelicals’ apocalyptic hopes, dreams, and nightmares,” Matthew Avery Sutton writes in his new book American Apocalypse: A History of Modern Evangelicalism. As the title suggests, Sutton is interested in Christian apocalypticism not as a fringe movement but as a political and cultural force that transformed America, a thesis that will likely provoke skepticism. It is one thing to marvel at the prevalence of biblical literalism—some 40 percent of Americans believe that Jesus will “definitely” or “probably” return before 2050—but it is quite another to suggest that biblical prophecy has been a central force in our nation’s history. Yet Sutton, who has written two previous books about evangelicalism, possesses a quality shared by the best historians: the ability to make his subject integral, a sun around which everything else orbits.

That Bush and Reagan managed to become leaders of the free world speaks to decades of fundamentalist political ambition. This is one of the most baffling aspects of the movement. One might expect the anticipation of apocalypse would go hand-in-hand with apathy or social withdrawal: If you believe the world is on the brink of destruction, why bother trying to transform it? But fundamentalists became more politically engaged than their liberal Protestant counterparts. Sutton explains this paradox via Christ’s parable of the talents. A wealthy man goes on a journey, entrusting each of his servants with a number of talents, a unit of money. When he returns, he assesses what each man has done with his portion—whether he hid it in the ground or invested it—and judges him accordingly. The parable, which is today the lodestar of the Christian financial planning industry, has long been interpreted in terms of a more tenuous kind of stewardship. Believers see themselves as guardians of earthly virtue, charged to “occupy” the earth until Christ’s return.

One fascinating story line documents the role fundamentalists played in the early Zionist movement. During the 1890s, William E. Blackstone, a Chicago real estate developer who wrote the bestseller Jesus is Coming, became one of the first advocates for the reestablishment of Israel. Convinced that Christ would not return until this prophecy had been fulfilled, in 1891 he created the Blackstone Memorial, a petition for the reestablishment of Israel signed by a number of powerful premillennialist Americans, including John D. Rockefeller and J. P. Morgan, and a few Jewish leaders. When World War I broke out, he wrote to Secretary of State William Jennings Bryan to express his belief that the United States was “the instrument which God had prepared” to establish the State of Israel. Later Blackstone shared similar sentiments with President Woodrow Wilson. Blackstone continued pressing the cause until his death in 1935. Because of this early work, which predated that of Theodor Herzl, Louis Brandeis recognized Blackstone as the “father of Zionism.”

In a hilarious anecdote, missionaries Ralph and Edith Norton meet with Mussolini in the early 1930s to interview him for the Sunday School Times. Like a lot of fundamentalists of that era, the missionary couple believed Mussolini was a strong candidate for the Antichrist—the dictatorial leader who would resurrect the Roman Empire. As the Nortons quizzed Mussolini about his political intentions and explained the basics of biblical prophecy, Il Duce became fascinated. “Is that really described in the Bible?” he asked. “By the time the Nortons were through with him,” Sutton writes, “Mussolini apparently believed—and maybe even hoped—that he was the long-awaited world dictator prophesied in the book of Daniel.”

Confident assertions notwithstanding, Sutton seems ambivalent about the future of apocalypticism. He mentions the emerging church—a new generation of believers who have adopted a postmodern approach to scripture and reject premillennial ideas—and alludes to the bland therapeutic messages of megachurch pastors who no longer “spend time exploring doomsday scenarios.” At one point, he suggests that premillennial theology has largely been exported to the developing world. Perhaps Sutton’s predictions are informed, more subtly, by the prevailing wisdom about declining religiosity in wealthy countries; polls indicate that even the United States is home to a rising number of “nones,” who, when surveyed, do not pick a conventional religion.

John Martin: The Great Day of His Wrath

Progress and panic have always been two sides of the same coin, and if we dismiss the rants of televangelists, or snicker at the megaphone insanity of street preachers, it is at least in part because they embody an unflattering reflection of our own obsession with apocalypse, because their worldview is the most obvious distillation of the modern death wish. Sutton’s book demonstrates that the history of evangelicalism, cynical and fatalistic as it may be, is very much our own.

http://bostonreview.net/book-ideas/meghan-ogieblyn-american-apocalypse-evangelicalism


*
ARE RELIGIOUS JEWS STILL WAITING FOR THE MESSIAH?

~ Messiah (Mashiach) literally means the anointed one. Every Jewish king was a Messiah because the ritual of becoming king involved being “anointed”. To desire there to be a Messiah is to desire the rebirth of the Jewish national identity. The assumption throughout history has been that for the Jews’ national identity to be restored, we required a leader who would return us to our homeland.

Waiting implies inaction. It also implies hope or yearning. The Zionists among us came to the realization over a century ago that in the modern age restoring our national identity does not require a single powerful individual leader to lead us home, but rather for us to take the initiative and return home of our own accord. We’ve had many prominent people who contributed to the successful reconstitution of our nation. All of us together are participants in what some may view as the beginning of the Messianic era. Perhaps one day we will look back and choose retroactively the person who was the most vital in making our redemption a reality, and “anoint” that person as the Messiah. But we’ve been done “waiting” for over a century.

The non-Zionists among us — the Assimilationists — most of whom are Reform or unaffiliated Jews living in the diaspora, no longer subscribe to the idea of redemption and the rebirth of the Jewish nation. They wish only to integrate into the societies within which they reside. They too are done waiting.

Thus the only Jews “waiting” for the Messiah to appear from the heavens riding a donkey are the Ultra-Orthodox sects who refuse to adapt to contemporary reality. Mainstream “modern” Orthodox, like myself, believe in the Messiah as Maimonides would, as a metaphor for a new reality, a rebirth. Not that we discount the hope that some day such a “Mashiach” may appear, but we acknowledge the miraculous events of the recent past: the Land of Israel is not only once again sovereign Jewish territory, but fully half of the Jews of the world have been gathered and returned to our land. Only fools can ignore the obvious realization of the prophecies of redemption that have taken place, against all odds.

There is still a long way to go on the road to redemption. But we aren’t waiting anymore. We are proudly marching down the road to redemption, undaunted and motivated to forge ahead, facing every obstacle with faith and determination. ~ Chaim Handler

*
PEOPLE WITH A COLLEGE DEGREE LIVE UP TO THREE YEARS LONGER

~ A new study published in the Proceedings of the National Academy of Sciences journal suggests that individuals who have earned a bachelor’s degree or higher have seen increasingly longer life expectancies than those who do not have at least a bachelor’s.

The most recent data sourced from 2018 suggests that individuals who hold at least a bachelor’s degree will outlive their counterparts by approximately three years.

Specifically, of the 50 possible years between the ages of 25 and 75, those with at least a bachelor’s degree could expect to live 48.2, whereas those without lived closer to 45.1, a difference of roughly three years.
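
The figure being compared is “expected years lived between ages 25 and 75,” which tops out at 50. Here is a minimal sketch of how such a number falls out of survival data, using invented survival probabilities rather than the study’s:

# Toy computation of "expected years lived between ages 25 and 75".
# survival[age] = fraction of a cohort still alive at that age;
# the mortality slopes below are hypothetical, not the study's data.
def expected_years_25_to_75(survival):
    # Each year of age from 25 to 74 contributes up to one expected year.
    return sum(survival[age] for age in range(25, 75))

lower_mortality  = {age: 1.0 - 0.0008 * (age - 25) for age in range(25, 75)}
higher_mortality = {age: 1.0 - 0.0030 * (age - 25) for age in range(25, 75)}

print(round(expected_years_25_to_75(lower_mortality), 1))   # ~49.0 of 50 years
print(round(expected_years_25_to_75(higher_mortality), 1))  # ~46.3 of 50 years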

Researchers analyzed mortality data from 48.9 million people between the ages of 25 and 84, spanning the years 1990 to 2018. From there, they controlled for deaths outside of old age, such as suicides or drug overdoses.

Results showed a mortality gap between those with a degree and those without that has widened over time.

“The widening educational differences have meant that education is now a sharper differentiator of expected years of life between 25 and 75 than is race, a reversal of the situation in 1990,” researchers write.

This trend continues when adjusting the data to incorporate race into the analysis. While mortality for Black Americans, for instance, is still higher than that of their white counterparts, education has become a larger factor in mortality discrepancies.

“Throughout all of recorded history in the U.S., Black people have more frequently died earlier and younger than whites — that remains true,” Sir Angus Deaton, an economics professor at Princeton University, told CNBC. “But what has happened is the gaps by race have narrowed and the gaps by education have widened within both groups. Differences by race, which have been there forever, are still there, but they’re becoming smaller than differences in education.”

As for the explanation behind this trend, researchers believe that a combination of declining wages and increasingly automated work has reduced the supply of jobs with strong benefits for workers.

While the data used in this analysis is pre-pandemic, the authors anticipate that the educational divide in mortality could widen further, as those without higher education are less likely to have jobs that permit remote work and are therefore more exposed to layoffs and COVID-19 infection. ~

https://thehill.com/changing-america/well-being/longevity/544120-earning-a-bachelors-degree-could-add-an-average-of/

Oriana:

Education correlates with a higher IQ, and IQ is one of the best predictors of longevity.

*
DRUG SEEMS TO REVERSE MENTAL DECLINE IN MICE

~ An experimental drug reversed age-related declines in memory and mental flexibility in old mice after just a few doses, according to a study by researchers from the University of California, San Francisco (UCSF).

The drug, ISRIB, has previously been shown in other studies to restore normal cognitive function in mice after traumatic brain injury, enhance memory in healthy mice and mice with Down syndrome, and prevent noise-related hearing loss.

The experimental drug works by interfering with the body’s Integrated Stress Response (ISR). The ISR slows down protein production in cells if something is wrong, such as a viral infection or injury, to give the cells time to heal. But as both memory and learning depend on active protein production, the ISR can lead to problems if it remains in the “on position” in the brain.

ISRIB, short for ISR InhiBitor, reboots the cell’s protein-synthesis machinery after it’s been stopped by the stress response.
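
In caricature (the rates below are invented placeholders; the real biology is far more complex), the mechanism just described behaves like a throttle with a release:

# Toy model of the Integrated Stress Response (ISR) described above.
# Hypothetical rates for illustration only.
def protein_synthesis_rate(isr_on: bool, isrib_present: bool) -> float:
    if isr_on and not isrib_present:
        return 0.2   # ISR throttles protein production while cells "heal"
    return 1.0       # normal production, which memory and learning require

print(protein_synthesis_rate(isr_on=True, isrib_present=False))  # 0.2: stuck "on"
print(protein_synthesis_rate(isr_on=True, isrib_present=True))   # 1.0: brake released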

For the recent study published this month in the journal eLife, researchers trained older mice to escape from a watery maze by finding a hidden platform, a task researchers said is typically more difficult for older mice to learn. But the study found mice that received small doses of ISRIB over a three-day training process were able to complete the task as well as younger mice and much better than mice of the same age who weren’t given the drug.

“ISRIB’s extremely rapid effects show for the first time that a significant component of age-related cognitive losses may be caused by a kind of reversible physiological ‘blockage’ rather than more permanent degradation,” Susanna Rosi, UCSF professor in the departments of Neurological Surgery and of Physical Therapy and Rehabilitation Science, said.

The mice were then tested several weeks after the initial treatment to determine how long the cognitive rejuvenation lasted. Researchers trained the same mice to find their way out of a maze with an exit that changed daily, a test of mental flexibility. The mice who had received the treatment still performed at higher levels than untreated mice.

Researchers analyzed the anatomy of cells in the animals’ hippocampus just one day after giving the mice a single dose of ISRIB and found electrical activity between neurons was more lively and cells were better able to form stable connections with one another.

Scientists behind the study believe a drug like ISRIB could at some point treat a number of conditions such as Alzheimer’s and other dementias, Parkinson’s, and amyotrophic lateral sclerosis (ALS).

“The data suggest that the aged brain has not permanently lost essential cognitive capacities, as was commonly assumed, but rather that these cognitive resources are still there but have been somehow blocked, trapped by a vicious cycle of cellular stress,” Peter Walter, a professor from the UCSF Department of Biochemistry and Biophysics and a Howard Hughes Medical Institute investigator, said. ~

https://thehill.com/changing-america/well-being/medical-advances/531885-drug-reportedly-reverses-age-related-mental/


*
OUR INFLAMMATION-DRIVEN AGING CLOCK

~ According to biologist David Furman of Stanford University, “Every year, the calendar tells us we’re a year older. But not all humans age biologically at the same rate. You see this in the clinic — some older people are extremely disease-prone, while others are the picture of health.”

Now, Stanford scientists have discovered a different way of ascertaining our future expiration dates: the “inflammatory-aging clock” (iAge). Determining someone’s iAge is not only far simpler than performing epigenetic tests; it can also help individuals and their physicians anticipate and confront health issues before they happen.

One of the things that distinguishes people who remain healthy longer from others is the strength of their immune system.

One of the immune system’s primary tools is acute inflammation. This is a “good” process because it is the body’s localized, protective response to things like tissue damage, invasive microbes, or metabolic stress. Importantly, it is a short-term response that lasts only as long as needed for the immune system to finish its job.

On the other hand, long-term, system-wide inflammation is “bad.” This form of inflammation causes organ damage and is associated with aging. It makes a person vulnerable to a whole range of conditions, including heart attacks, cancer, strokes, arthritis, cognitive decline, depression, and Alzheimer’s.

INFLAMMATION AS A MEASURE OF AGING

Furman is the director of the 1000 Immunomes Project, “the world’s largest longitudinal population-based study of immunology and aging.” As such, he had access to blood samples taken from 2009 to 2016 from 1,001 healthy people aged 8-96.

Artificial intelligence (AI) analysis of the samples allowed the researchers to identify protein markers in the blood that most reliably indicated a person’s inflammation age. They identified a specific cytokine, CXCL9, as being especially useful. When processed by an algorithm the team devised, it produced a simple inflammation-age value. Comparing this to the patients’ histories, it turned out to align with the health of their immune systems and subsequent encounters with age-related disease. CXCL9, produced by the inner lining of blood vessels, is associated with heart disease.
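
The study’s actual pipeline is far richer, but the core idea of such a clock can be sketched as a regression from blood-protein levels to age. Here is a minimal Python illustration with synthetic data; the dominant CXCL9 weight is contrived to mirror the finding, not derived from real measurements:

# Toy "inflammatory age" clock: fit weights mapping marker levels to age,
# then read off iAge for a new blood sample. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
chron_age = rng.uniform(8, 96, n)

# Pretend CXCL9 tracks age while a second marker is mostly noise.
cxcl9 = 0.8 * chron_age + rng.normal(0, 5, n)
other = rng.normal(50, 10, n)
X = np.column_stack([np.ones(n), cxcl9, other])

# Ordinary least squares gives the clock's weights.
w, *_ = np.linalg.lstsq(X, chron_age, rcond=None)

def iage(cxcl9_level, other_level):
    return float(w @ np.array([1.0, cxcl9_level, other_level]))

# A sample with low CXCL9 for its bearer's years reads biologically "younger".
print(round(iage(cxcl9_level=30.0, other_level=50.0), 1))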

The researchers verified the validity of their system by measuring the iAge of people 65 and older who had had their blood drawn in 2010. When they followed up with these people in 2017, the scientists found that their 2010 iAge turned out to be a more accurate predictor of their health than their chronological age.

Finally, the researchers tested their iAge algorithm on 29 long-lived people from Bologna, Italy — all but one of whom had already turned 100 — comparing them to a control group of 18 individuals aged 50 to 79. The inflammatory ages of the Bolognese participants averaged about 40 years younger than their chronological ages. Furman reports that one 105-year-old man had an inflammatory age of 25.

CXCL9 is easy to measure and may have significant clinical applications. Specifically, it highlights the value of addressing chronic inflammation as a way to increase longevity.

Most promisingly, a person’s iAge could serve as an important early-warning system. Furman notes, “Our inflammatory aging clock’s ability to detect subclinical accelerated cardiovascular aging hints at its potential clinical impact.” He adds, “All disorders are treated best when they’re treated early.” ~

https://bigthink.com/health/iage-stanford/#Echobox=1676301523

Oriana:

Meanwhile, we need to use our knowledge of natural anti-inflammatories. Among the most potent ones is fish oil.

Coffee with milk (make it goat milk) is twice as anti-inflammatory as plain coffee.

Then there are curcumin, leafy greens, berries, beets, sweet potatoes, and many other items we would be wise to include in our diet. More on this in the next blog.

*
ending on beauty:

FOR PENELOPE

Do not wait for me Penelope

I am not the Navigator
I am the great pink boar
wallowing at Circe’s trough

O Penelope
these are
my last human thoughts

~ Sutton Breiding

