Saturday, April 29, 2023

GEOGRAPHY OF VIOLENCE; ULTIMATE FATE OF THE UNIVERSE; TECHNO-OPTIMISM: THE CASE FOR FLOURISHING; THE DESTRUCTIVENESS OF CONCRETE; KKK DIDN’T ALWAYS DRESS IN POINTED WHITE HOODS; HOW EXERCISE DURING PREGNANCY AFFECTS THE OFFSPRING; WARNINGS AGAINST LASIK

One of the most beautiful Baroque churches in the world: the Church of the Gesù, also known as Casa Professa, is located in the historic center of Palermo and dates back to the second half of the 1500s.

*
IN A CLOCK SHOP

Temporal, temporary,
I listen for the clocks to say,
but what they say is round
They say return, return

to blank zodiac, the zero noon.
They say Again, again
on the pulsing inner shield,
an ant-like run of seconds.

They say the same, the same
those hands that point but hold
nothing, cuckoos that repeat
their own echoes.

In my uncle’s house,
the grandfather clock
woke me up at night,
its brass gong spidered

in a calibrated cage —
and my life
ticked in the narrow dark.
The heart that beats

I am, I am,
is a clock, not a valentine.
Everything that exists
is a clock; here it is only refined,

fitted with weights and chains,
faces and numbers that shine
like the eyes of something
closing in —

watching us to the end,
without end as the stars.

~ Oriana

*
“Lord, how the day passes! It’s like a life—so quickly when we don’t watch it and so slowly when we do.” ~ John Steinbeck

*
THE MOST CONTROVERSIAL RUSSIAN WRITER, ACCORDING TO DIMA VOROBIEV

Must be Alexander Solzhenitsyn, the author of the anti-Soviet magnum opus “The Gulag Archipelago”.

Consider the following: The man got the highest seal of approval from the liberal West (the Nobel Prize in Literature, 1970), and from the great admirer of Russia’s imperial legacy, President Putin.


His work is a searing indictment of Soviet rule as a destroyer of the glorious tradition of Imperial Russia. However, in post-Soviet Russia the USSR is considered an heir to the millennial tradition of Russian civilization, indeed its pinnacle.

Solzhenitsyn viewed the secret police, the NKVD/KGB, as the core element of Communist rule, the Devil incarnate. Meanwhile, the ongoing nation-building in Russia is headed by a proud alumnus of the KGB, with thousands of his colleagues forming the core of our ruling class.

Together, they carry on the torch of Stalin’s police state, forever vigilant, never repentant.

The man found refuge from the KGB in America, the flag bearer of liberal democracy — the system he despised almost as much as Communism.

In his fight against the USSR, Solzhenitsyn found himself side-by-side with Soviet dissidents, a tiny minority in which ethnic Jews were over-represented. However, he later went on to author a treatise, “Two Hundred Years Together,” a text that would become a major reference work for overt and covert anti-Semites among our ethnic nationalists.

He wrote “The Gulag Archipelago” under a thick blanket of secrecy covering everything concerning Stalin’s purges and the dirty work of the NKVD. His research was based on many oral testimonies of Gulag survivors, not hard statistics or archival documents. This made it possible for his many detractors to dismiss his work as “campfire tales” in their efforts to whitewash Stalinism. And yet, under Putin, it made it into the school curriculum.

Solzhenitsyn was a prominent champion of Russian ethnic nationalism. Meanwhile, hardcore nationalists have always been the primary source of trouble for our multi-ethnic Derzháva (“the mighty State”).

Below, a piece by Elliott Banfield, “After The Eclipse”. According to the artist, Soviet rule was a deviation, a devastating hiccup in the history of Russian civilization. This was exactly Solzhenitsyn’s point of view.

However, modern Russia treats Soviet rule as a dramatic but magnificent continuation of our imperial tradition. The USSR’s victory over Nazi Germany is now celebrated as a preeminent event of our history. According to the new canon, the two disks you see here are in effect two sides of the same shiny medal.

*
“In the period of dictatorship, surrounded on all sides by enemies, we sometimes manifested unnecessary leniency and unnecessary soft-heartedness.” Krylenko, speech at the Promparty trial ~ Solzhenitsyn, opening of The Gulag Archipelago

*
Bill Smith:
Solzhenitsyn never admired the West and never saw it as an alternative to Communism. He had a low opinion of Gorbachev and Yeltsin, but praised Putin.

Shrinivas S:
To say he hated the decadent and degenerate West would be an understatement. He did return to Russia before his death.

Robert Hansen:
Solzhenitsyn was a devout adherent of the idea of seeking a “Third Way” that rejected both liberalism and communism from the start. In slightly different circumstances he’d have been an out-and-out fascist.

Roxolan Tonix:
Does it really matter? Russians have the innate ability to salivate over both the empire and the Soviet Union, and all the tyrants that came along with them, in the same breath. If you had a mummy of Tsar Nicholas and put it right next to Lenin’s, nobody would bat an eye.

Michal Šturz:
Russia has no separation of church and state — thus the state was able to take on the mantle of infallibility. The government in Moscow is always right, because what’s right is defined by the government.

William Lebotzschy:
Solzhenitsyn was certainly able to convey the Soviet system to the West, as most were ignorant of it until then. His “Cancer Ward” was an excellent clarification of the inhumanity of the system, and of the disposable nature of Soviet citizens. We all knew about the NKVD, as it had been reported on by many writers since the 1930s.

Dima Vorobiev:
For Solzhenitsyn, the ideal Russia was something in line with Iberian Fascism. Benevolent autocracy with Orthodoxy as the State ideology.

Alexandr Dugin professes a kind of psychedelic Fascism. Solzhenitsyn is a plain-vanilla 19th-century fundamentalist.

Vaselin Romanov:
Having lived abroad all my life, I can say that Solzhenitsyn was never considered controversial at all among émigrés. I agree that Gulag is very poorly written and an extremely tough slog, but anecdotal oral stories have formed the basis of many a book about WW2.


Oriana:

“Bolshevism committed the greatest human slaughter of all time.” One may argue that Hitler is still #1 in terms of body count, but an accurate body count is not the point. The point is that the world is well aware of Nazi crimes, but is still confused about Communism (not that pure Communism was ever achieved, except in small religious communities). There are still neo-Nazis, but my guess is that they are vastly outnumbered by neo-Communists. Communism as an ideology continues to attract misguided idealists, with their poor grasp of history, especially the history of crimes against humanity.

*
“To President Putin,

As we write, and as you are well aware, Alexei Navalny is being held in IK-6, one of the harshest penal colonies in your country. He has been consistently returned to solitary confinement, squeezed into a concrete cell the size of a dog kennel, with no ventilation. Visits from relatives and phone calls are forbidden; his attorney-client privileges have been canceled. Despite running a fever, he is required to stand all day.

We add our voices to those of the 600 Russian doctors requesting urgent and immediate independent medical help. A further 100 Russian lawyers and 100 regional deputies are demanding that the torture of Navalny cease and, again, that medical assistance be provided.

Navalny is serving prison sentences based on charges which would never have been upheld under any independent legal system. We support the call of the German government, the U.S. authorities, and the European Union demanding his immediate release. It is in your power.”

Michael Abramowitz (president of Freedom House)

Svetlana Alexievitch (journalist, writer, Nobel Prize winner)

Anne Applebaum (journalist, writer)

Costa Gavras (filmmaker)

Richard Dawkins (biologist, writer)

and many others, including Margaret Atwood and Misha Baryshnikov

*
MUD AND THE UKRAINIAN COUNTEROFFENSIVE

The Ukrainian military is watching and waiting for the conditions to improve before it launches its large-scale counteroffensive.

Conditions on the ground are still not ideal for large-scale mechanized warfare. The ground continues to be soft after the thawing of the winter snow, and mud makes off-road driving precarious.

“However, Russian online outlets are likely exaggerating the overall impact of mud on Ukrainian forces as part of an information operation aimed at raising Russian morale, and undermining Ukraine’s supporters, in light of an anticipated Ukrainian counteroffensive,” the British Military Intelligence assessed in its latest estimate of the war.

The Russian military and intelligence services have continued to rely on information operations to bolster their narrative and disrupt Ukrainian operations. The Kremlin has used disinformation and misinformation both to cover up humiliating Russian defeats on the ground and to portray the Ukrainians in a bad light. The information operations are directed at several different audiences, including the Russian people, the Ukrainian public, the West, and pro-Russian countries.

“Surface conditions can be expected to improve in the coming weeks. The threat from mines probably continues to be a more important factor in limiting the combatants’ off-road maneuvers,” the British Military Intelligence added.

https://www.19fortyfive.com/2023/04/putins-nightmare-comes-true-russias-war-in-ukraine-is-falling-apart/

Oriana:

Russian capacity to wage war depends on its ability to produce (or procure) weapons. If the most important factor in war is the troops’ morale, then the second is “ammunition” — I chose to put that term in quotation marks because I mean all types of weaponry.

*
TOP-DOWN CHAIN OF COMMAND HAMPERS THE RUSSIAN MILITARY

~ They’re riding speedboats like fucking Cuban drug lords and we’re watching them helplessly, unable to do anything!

How the hell is this happening?! I'll tell you how!

When I detect a target, I transmit it to my superiors. I ask for artillery because we don’t have direct access to it. Everything must be agreed with headquarters.

And by the time it is agreed — IF it is agreed — the target has safely fucked off into the fog, because the fucking coordinates became useless a long time ago.

I beg you at the command center — let the fighters make those decisions!

Because of idiocy, and the fear of being punished for making wrong moves, nothing gets decided at many levels of command.

Or reduce the communication steps to a minimum, so that time is not wasted on coordinating an artillery strike.

Fathers-commanders, here are the coordinates of the place where the enemy hangs out.
46.66XXXX, 32.701XXX

Take action!

Now you know why the Kremlin is obsessed with “decision-making centers.” They believe that, just like in the Russian military, every tiny decision on the battlefield in Ukraine has to be vetted by the Pentagon. Soldiers and officers have zero say.

And if they strike the Pentagon and the White House with a prototype missile, America will immediately cease to exist, because 350 million US citizens won’t know what to do, the Ukrainian Army and NATO will surrender, and Russia will win the war.

~ Misha Firer, Quora


*
“I know enough of the world now to have almost lost the capacity of being much surprised by anything.” ~ David Copperfield, Charles Dickens, 1850

Poor children in London, 1850.

*
MORE ON DEMOGRAPHIC DECLINE IN RUSSIA AND IN GENERAL

~ When industrialized countries reach a certain income level, the people in them seem to stop having babies. In the Soviet Union, ethnic Russians weren’t the ones having the babies.

The trend at the time was for the “minorities” to be having children, in huge numbers, from the 1960s on. Russia’s population never quite recovered from WWII, with the death of so many men (an entire generation, even more). Worse, what little effective health care existed before 1991 evaporated afterwards; in truth, it had been in sharp decline for more than two decades.

Combine that with heavy alcohol use, smoking, and an extremely poor diet. As one visitor in the 1980s noted, Russians were extremely fat (before the West was fat, another story there), mostly because all they had to eat was lots of bread. They ate bread with everything. It was cheap. The diet was renowned for being incredibly crappy, even among Russian health officials.

The Soviet economy was in a slow state of collapse throughout the 1970s and ’80s, and in truth from the 1960s.

Whole towns looked like they did in the 19th century. A tiny fraction of the population lived well. I know a whole lot of Russians who experienced those days. A lot. A few cities did well, basically as “communist” parasite cities — Moscow, Leningrad, etc. This remains true — Moscow and St Petersburg are basically parasitic. The rest of the country is treated as colonies to fund the wealth of the cities. It’s why there were so many jokes about the trains and such things — what does the Moscow-Tver train smell like? Sausages (people went to Moscow for them; none were available in Tver).

The moment the artificial sops were removed, the entire system just collapsed, rotting from the inside.

All this exacerbated the demographic collapse in Russia. But it had always been bad.

The places that had huge demographic growth were places like Chechnya, Uzbekistan, non-Russian ethnic parts of the RSFSR prior to 1991. Their growth rates were just enormous. It’s not a coincidence that these places were also super crappy and extremely poor, starved of development and attention.

Meanwhile, in Moscow, there was in-migration of ethnic Russians, etc., pushing the population of the city up. But this was draining the poor, backward countryside, which couldn’t do anything but stagnate. This meant an overall negative growth rate.

Russia continued to function as an empire throughout. It’s still trying to, hence Ukraine. Hence the use of ethnic minority soldiers in meat grinders.

Russia before communism, during communism, and after communism was constructed as a state-empire. It was an empire. Culturally, it’s imperialist to the core. If you listen to any of the state propagandists who now totally dominate all official media (no substantial dissident media is allowed in Russia any more), their rhetoric is rooted in nothing but imperial fantasies and dreams.

The nostalgia for the USSR in Russia is unlike anywhere else in the former Soviet Union, because for Russians, it was their empire. Sure, it funded these smaller places, but always in the service of the “Russian World”, a phrase that’s been current for 60 years and which Putin is making heavy use of.

But this imperial structure is terrible for demographics.

NONE OF THIS has anything to do with communism or socialism or capitalism or anything else.

All of these countries face the same declines in the face of industrial society. It doesn’t matter if it’s communist or totalitarian or fascist (which is what Russia essentially is, now), or democratic or whatever.

China’s population is in freefall. Its fertility rate is on par with Italy’s. In fact, it’s among the very worst demographic nightmares on the planet. It’s literally the subject of entire departments of research in the CCP-funded and -backed institutes in China. It’s among the highest-priority subjects, among all subjects, in the country.

Nothing the CCP has done has budged the rate upwards; in fact, it keeps trending lower by the day.

No country in the world has faced such dramatic population aging and losses, absent war, in all of human history. It’s stunning and shocking and just astounding to watch. ~ Craig Urquhart

https://www.businessinsider.com/discoveries-change-our-understanding-of-human-origins-2018-7

Oriana:

For me, the most subversive phrase ever written is “the pursuit of happiness” — especially if understood as one of unalienable human rights.

But once you assume that you have a right to pursue happiness, you may start wanting luxuries such as traveling to beautiful places, riding horses, growing orchids, or whatever else happiness means to a particular person. Sure, parenthood too can be a source of happiness, particularly to fathers, but to women it often means endless sacrifice and self-erasure. Hearing statements to the effect that having children is the most difficult thing in the world does not help motivate anyone to undertake the task.

Financial incentives don’t seem to be particularly effective. A network of quality child care centers, on the other hand, just might ease the burden enough so that parenthood would be seen mainly as joy rather than self-sacrifice.

(Parenthetically, I’m not surprised that only one country in the world has enshrined “the pursuit of happiness” as a human right. Nor that the phrase was made immortal by someone who had never experienced poverty.)

*
COMMUNISM VERSUS NATIONALISM

Communists see Capitalism and Capitalists as the main enemy. But Nationalism and Nationalists are the ones who have consistently broken Communism.

Global Marxism split almost immediately along national lines — this would be the “Socialism in _____ Country” and “[insert country here]-Style Socialism” and so on and so on. A lot of these countries and ideologies didn’t even get along.

The Soviet Union itself collapsed along national lines, along with other powers like Yugoslavia. Interestingly enough, the Soviets practically accepted nationalism after WW2, when they let their new puppet states stay at least semi-independent rather than absorbing them directly into the USSR.

The Eastern Bloc countries all rejected Communism/Socialism in favor of nationalism, essentially. One might say capitalism, but I think if the West had another kind of economic system they would’ve gone with that too, so long as it came without the USSR.

Workers across the planet tend to associate far more strongly with their nation or state than with their class. Few are as fervently patriotic, nationalistic, or ethnocentric as the world’s “workers and peasants” — the exact class the Communists want on their side.

I confess to not really keeping up with developments in Communist spaces. But it does seem to me that Communists put way too much emphasis on the Capitalist enemy, and not enough on Nationalism, when it’s the Nationalists that have consistently kicked Communists in the face over the past… century?

Of course, I’m not saying that Communists should ignore Capitalism — otherwise what’s even the point of their ideology.

But it does seem to me that Communism, as we’ve seen it, has been a very poor tool to use against Nationalism. Much to the woe of the former.

At best it gets nationalized and we end up with a sort of left-wing nationalism, which seems especially common in Europe. ~ Quora

Walt Kubilius:
Yes, but… the Soviet Union wouldn’t have collapsed if communism worked for them. It was not nationalist pressure that made it collapse, it was economic and intellectual failure.

Joe Moorman:
I somewhat disagree. I think the end of the USSR still speaks to the primacy of nationalism. The fissure lines of nationalism were a relief valve that broke open before ideological reform could actually be implemented in the USSR. Of course this wasn't necessarily a bad thing, because the union broke apart into its constituent SSRs as independent states in a clean manner, without protracted civil wars. It was like breaking a graham cracker along the perforations.

I think it's instructive to look at China, as that nation very gradually reformed from diehard communists who were even more gung-ho about collectivization than the Soviets were in the postwar era — but it all started to turn around in the 1970s, and by the late 1990s it was mostly state capitalism.

Chinese communism was a failure much as Soviet communism was. But the destitution China endured under the CCP during much of this time did not cause the people to rise up in revolution against their leaders; the 1989 revolt did not catch on, unlike in other communist states. That is because Chinese nationalism served the state, rather than undermining it as the individual SSRs' national movements did.

No matter how bad times got, the people did not have an obvious alternative form of government, as the Chinese state's unity was historically much more established than the USSR's, which split easily along more or less the same national borders that had existed a scant two or three generations before. It was an even easier task for the Eastern Bloc states outside the USSR to overthrow their foreign-backed pro-Soviet governments.

If the ethnic and sub-national entities had been destroyed during the USSR (a big task, to be sure!), the whole union might have successfully pursued the same course as China did and today would be a communist state in name only.

Philip L:
They have good ideas and good intentions, but it’s a giant social experiment (which is bound to fail a lot before getting things right), turning society upside down. Marx was an observer, not an administrator.

And the worst is that nearly every single communist country was formed through violence, overthrowing an existing government; they paid only lip service to democracy, and hence all ended up with dictatorships.

André Louis:
It doesn’t work simply because it can only be fully implemented by a dictatorial government, and a particularly totalitarian and oppressive one, which strips away the kinds of freedom a society needs to function properly and prosper. Under communism, and in many a socialist nation, much of the GDP is seized by the ever-growing State, which distributes literal crumbs to the masses who actually produced it. And such a self-serving state machine can only run on brutal oppression and intense manipulation of the masses.

Oriana:

I've long thought of the Soviet Union as a "giant social experiment." This experiment had to be performed so we could see if such a system works. We needed to see how much the collective farms produced as opposed to the tiny private plots where people were still allowed to grow their own potatoes and other basic vegetables. I think that now we can say yes, communism works, but only in small religious communities; it's fascinating to read about the early Christians and certain self-sustaining monastic communities. On the level of nation-states, attempts at communism have universally failed, keeping people in poverty until the communist project was abandoned (China is a spectacular example of that).

In spite of this, there are still people who can't seem to digest the verdict of history and keep coming up with excuses, e.g. that communism collapsed because the "communist" countries were poor and backward, precisely the opposite of the most advanced capitalist states that, according to Marx, were destined to achieve communism first. Also, communism was always imposed by force, a disastrous beginning that doesn't bode well for the outcome of any giant social experiment.


*
WHAT THE SPACE RACE MEANT FOR THE SOVIET PEOPLE (Dima Vorobiev)

I can’t say enough how significant the space race was for us.

In the Soviet crusade for supremacy, this was our Jerusalem.

In the late 1950s, Khrushchev finally decided to take better care of the people, not only of industry and the military. Before him, for the great majority of us, life was a pretty joyless affair. The Communist rulers kept telling us how great and prosperous our life was going to be. But that life kept eluding us somehow. We badly needed proof that all the poverty and suffering had meaning.

The Vostok-1 breakthrough came on the back of our missile program. It became a true Wunderwaffe for the guys in the Kremlin. The whole Communist project suddenly made sense.

This was high-tech! And we were ahead of everyone else in it!

The sky used to be the realm of God. Now, our guys are up there!

Just a few years earlier, who knew we could do it? We certainly have lots and lots of other — even more amazing — stuff our secret scientists are preparing to roll out right now! Wait and see, everyone!

Gagarin touring the world! Western tabloids flashing his million-ruble smile on their covers! Everyone was signing up for Russian language courses. We knew it, we knew it: the whole world secretly loved and admired us all along!

To this day, Gagarin’s space flight ranks as the pinnacle of Russian civilization, along with the victory in WWII. I can remember how the space race in the 1960s was everyone’s measure of how successful we were.

Soviet breakthroughs in space came back-to-back in the 1960s. Was it too naive to dream of interstellar travel in the not-so-far future, when it all went so swimmingly? Our homes were poor and our days uneventful — but the space news held a promise: another world was possible!

And then came the reckoning.

In 1969, the Americans landed on the Moon, while our cosmonauts remained forever stuck in Earth orbit. This was when the soul left the body of the Soviet project.

“Sire, we lost Jerusalem.”

In retrospect, from that moment on, it was a dead empire walking. ~ Dima Vorobiev, Quora

Jan Smuga:
The Soviet space program was awesome PR on every level because it was a real achievement. It was important, it was very difficult, it was measurable, it was practical, it was dreamy, it was adventurous. It proved that the system which created it had, surprisingly, some undeniable merit. Even if the overall results of communism proved disappointing, the space race became part of humankind’s legacy. Without the efforts of the USSR, even the US space program would be far more modest.

*

~ In Moscow, a mother reported her own daughter to police: Tatyana from Ochakovo-Matveevsky supports Putin’s “special operation”, and her daughter doesn’t.

Tatyana, 59, decided to discuss Russia’s involvement in Ukraine with her daughter Irina, who had already had heated arguments with other family members about Vladimir Putin’s international politics.

The attempt to resolve the conflict backfired. Tatyana and Irina got into such a quarrel that 3 days later the mother, holding a grudge, went to the police and filed an official report on her daughter.

There is no information on whether a case has been opened against the girl, but that’s not the point.

The bottom line is that for the "greatness of the state", the mother denounced her own daughter.

This case highlights the biggest misconceptions about Russian society.

1. Family values

The mother is reporting her own child to the police for a “crime” that could see her daughter thrown into prison for 15–25 years: “discrediting Russia’s army” gets you up to 15 years, and “treason” might see you locked up for life.

2. Kindness, soulfulness

Knowing what will happen to a fellow human being, Russians eagerly denounce each other, including filing false reports on competitors or as payback. The FSB and police are flooded with reports about citizens speaking against the “special operation”.

3. “All Russians support Putin”

There are plenty of Russians who don’t. It’s dangerous to speak against the “special operation”, but people still do it, even now – 14 months after its start, in the oppressive political climate of 2023 Russia.

A new leaked conversation between, allegedly, billionaire Roman Trotsenko (the “wallet” of Igor Sechin, an oil magnate, a close ally and “de facto deputy” of Vladimir Putin) and Nikolai Matushevsky (creator of popular art spaces in Moscow) is sending shock waves through the Russian elites once again.

The two men talk about the need to withdraw all funds from Russia by the end of 2023, or it will be too late: “People will be slaughtered in the streets of Moscow,” they grimly predict.

“Everything is crashing in Russia,” they feel.

The processes happening in Russia are about a civilizational divide. The civilizations of freedom and unfreedom are locked in an existential battle.

The old generation is scared of freedom, because freedom means taking responsibility for one’s own choices and taking action.

The young generation, which has tasted freedom and seen the world, feels suffocated in the atmosphere of tightening oppression.

That’s one of the most disastrous aspects of Putin’s regime: all Russians suffer from PTSD as individuals and as a nation. They are psychologically deformed.

The life of ordinary Russians was so hard, so full of traps, and the dangers to each citizen so real, that it brought out the worst in people.

People's usual lives literally collapsed; their souls and hearts were smashed. They became easy to manipulate psychologically; it could be done with a snap of the fingers.

This largely explains the phenomenon of mass Stockholm syndrome among Russians.

In Soviet times, people believed in the “bright future”, trying to overcome “bourgeois backwardness”.

In Putin’s times, they believe in “the bright past”, which can be brought back by locking the future into prison.

Russian society and the Russian people of today aren’t the same as they were 2 years ago. Or a year ago. Or even 2 months ago. They are being broken weekly, daily, into deeper submission.

But slaves don’t love their master. Loyalty based on fear and enforcement, loyalty purchased by money, is false loyalty.

Russia isn’t North Korea. It will never be North Korea. Nor can it plunge back into Stalin’s times.

Hitler promised his people the Third Reich, a bunch of slaves and the prosperity of the German people.

The Soviets promised a sweet life under communism, when everything would be free.

Putin’s propagandists can only promise Stalin’s repressions and electronic draft notices to sacrifice your life for Putin’s oligarchs.

And this is why everyone in Russia feels that life as they knew it is crashing. That’s why they are filled with anxiety and fear.

That’s why they all, from the regular folk to high-ranking officials, want an end to this giant uncertainty, any end that brings back common sense and predictability. They are more ready for Russia’s defeat than they are ready for a victory, because “victory” brings more issues, expenses and deaths.

The decision to withdraw Soviet troops from Afghanistan, after 9 years, was due to the Red Army’s failure to suppress the Afghan fighters, and to the high cost in Russian lives and resources.

It’s been 9 years since Putin attacked Ukraine in 2014. The cost in Russian lives and resources is mounting.

The end result is predictable, and when it happens, everyone will say, “I knew it! It just couldn’t be any different.” ~ Elena Gold, Quora

*
CRIME WAVE HITS RUSSIA AS RELEASED CONVICTS RETURN FROM UKRAINE

Russia is suffering a spate of murders by convicts released to fight in Vladimir Putin’s war who then return home to continue their life of crime.

Other fighters, unhinged by the brutality of the frontline, have killed after returning to their families.

Still others have been arrested for sexual violence against children.

Platoon commander Alexander Mameyev, 44, who returned home after the war, was held on suspicion of stabbing his wife Ekaterina to death in front of their sons, aged seven and eight. His children had to be hospitalized in a state of shock. Private Sergei Batuyev, 39, a father of three, was detained and accused of strangling his friend Denis Shustov, 41, a Chechen war veteran, during a brawl after returning to the Udmurtia region from the war.

The crazed fighter planned to feed the victim’s remains to pigs, prosecutors say. Batuyev is currently on trial and could face up to 15 years in prison if convicted. Wagner private army fighter Ivan Rossomakhin, 28, recruited from the prison where he was serving a sentence for murder, killed his village neighbor Yulia Buiskykh, 85, after Putin pardoned him following six months at the front. Another of Wagner’s recruits, Georgy Siukayev, 33, is suspected of murdering Soslan Valyiev, 38, a former convict, less than a month after Putin pardoned him. Andrei Yakimov, 49, a volunteer combatant, killed his wife Viktoria Yakimova, 47, a secretary, on his way home from the war.

Another notorious case was that of 23-year-old Alexander Ionkin, an injured volunteer, who used a flare gun to cause an inferno at a nightclub in November.

Serviceman Sergei Korshunov was sentenced to eight years in prison after stabbing his friend to death during a drinking session on his return from Putin’s war. The judge imposed a lenient sentence because the defendant was a ‘participant in a special military operation’ in Ukraine. A 19-year-old soldier, Vladislav A., is accused of raping a 17-year-old girl during his first leave from the war. He was detained in Taganrog. In the Orlov region, a 36-year-old man who fought for Putin in Ukraine faces up to 20 years in prison after being detained for sexually abusing his 10-year-old stepdaughter while on leave from the war. ~ Emmanuel Ikechukwu

https://www.emmanuelsblog.com.ng/2023/04/crime-wave-hits-russia-as-prisoners-freed-to-fight-in-ukraine-return.htm


Mary:

The news from Russia just gets more and more nightmarish. Each act on Putin's part increases the brutality and suffering of his own people. Not that he would care. But using criminals as soldiers and then pardoning them when they return leads to consequences beyond all the war crimes they commit in Ukraine. They come home to commit atrocities on wives and children as well as strangers — raping, shooting, stabbing mothers in front of their children...horror upon horror. It is hard to imagine the world that will be left afterward.

 *
PUTIN’S PSYCHOLOGICAL PROFILE

Ten years ago, Putin’s critic and a well-known supporter of democracy in Russia, Valeria Novodvorskaya, published her last letter to the Russian people, after which she was promptly disposed of by the guardians of the Putin regime. This is how she described the “dictator in the making”:

Sadist.

Paranoid psychopath with delusions of grandeur and overvalued ideas.

Pathological liar.

Self-esteem is inadequate.

Intelligence is low.

His personality is rapidly deteriorating. He lives in a parallel world, divorced from reality, completely out of touch with it. He doesn't understand how people live, how they feel, what they dream about, why they suffer.

A moral cretin who can't even imagine it. And he doesn't want to.

He created an artificial, tightly sealed world around himself, where he slowly stewed for years in a cauldron of his own lies and caveman prejudices.

As a result, he completely lost his mind, but not his conscience — conscience was never there.

Not capable of compassion or sympathy, yet at the same time very vulnerable and suspicious himself.

He is incapable of love.

Very touchy and vindictive, probably due to his sense of inferiority and a desire to get even, to cause as much suffering as possible to everyone.

Spineless and untalented errand boy.

Cowardly and resourceful.

Eternal cowardice gave rise in him to excessive, often meaningless cruelty in actions and transcendent, monstrous cynicism.

His view of the world is the view of an evil microbe from a test tube.

He thinks narrowly, in stereotypes, attitudes, and slogans. His judgments about the surrounding world and people are crude and limited. Like any tyrant, he is prone to hoaxes, sacred meanings, and symbolism.

He compensates for his inferiority in all spheres of life by suppression and destruction of people.

His value system is cropped, narrowed down to a primitive formula, "friend or foe," and a set of ideological clichés: Russia über alles; the West, enemies and strangers.

The people of Russia are, to him, potentially dangerous cattle to be kept on the reins, periodically teased with the gingerbread of national exclusivity. Power is the lot of the elite; the meaning of power lies in itself and in personal enrichment.

He absorbed all the mocking misanthropy of the breed of the "Office" (the KGB). Absolutely unprincipled. ~ Sgt. Carey Mahoney, Quora

Oriana:

At this point, paranoia has become a very important trait. Putin knows that a lot of people would love to see him dead; being a killer himself, he worries whether there’s some undetectable poison in his tea.

Sadist and paranoid psychopath are right on. And note that this appraisal was written long before the war in Ukraine, which revealed, among other things, just how delusional he is. Delusional and deluded: who’d dare tell him an unpleasant truth?

Putin, second from left, in the 1990s.

*
SURVIVAL IN THE EVENT OF A US-RUSSIA NUCLEAR WAR

The war wouldn't be noticeable in the vast majority of the world, as less than half a percent of Earth's landmass would be involved. While the damage caused by nuclear weapons is indeed severe, the idea of an apocalypse is greatly exaggerated by the press and perpetuated by people who react without making any effort to actually understand.

A few things to consider…

The world's nuclear arsenals are a small fraction of what they were in the 1980s. Between the USA and Russia, over 50,000 nuclear weapons have been dismantled, and another 7,000 are awaiting dismantling.

The USA and Russia each have fewer than 2,000 warheads considered strategic weapons on high alert. These weapons are much smaller in yield than they were in the 1980s. Multi-megaton weapons are obsolete and no longer deemed useful militarily. This is the result of higher-accuracy delivery systems and of ground-penetrating warheads, which are 30 times more destructive than a surface burst, so larger yields are no longer necessary.

In a war scenario, not all strategic weapons would be used. Perhaps two-thirds would be launched in an all-out first strike; the rest would be held in reserve.

Both the USA and Russia have policies of not targeting civilians, and due to the number of weapons available and an oversupply of military targets, the weapons that would be used would all be aimed at military assets. You cannot win a war by bombing civilians. It did not work when the Germans did it to Britain, and it did not work when Britain tried it on the Germans (and to a lesser extent the USA, though the USA tried to limit its bombing to military targets, just not very accurately).

It did not work when the USA bombed Japan (it wasn't the atomic bomb that ended the war; it was the USSR declaring war on Japan and attacking Japanese forces in Manchuria), and it has not worked in places like Vietnam or the Middle East. You win wars by taking out your opponent's ability to make war, not by targeting its civilians. Both Russia and the USA have agreed, in the event of war, not to target civilians and not to target things like civilian nuclear power plants.

Airbursts leave little radiation: almost zero.

Ground bursts and earth-penetrating rounds leave radiation that is safe to be around after 2–3 weeks and that returns to background levels after a few months. Bikini Atoll, which took a lot of dirty bombs, has a lower radiation reading today than what you would read from the granite rocks found in NYC's Central Park, and it is also less than half the background radiation the city of Denver gets from natural sources. Modern nuclear weapons are designed to minimize the longer-lasting radioactive side effects. The Chernobyl accident released almost the same amount of radiation as all the above-ground nuclear weapons tests in history, over 500 bombs. Chernobyl, while severe, wasn't the end of the world or even a long-lasting regional catastrophe. The press blows everything out of proportion because terror and tragedy sell.

Nuclear weapons destruction would be concentrated on strategic military targets. Most of the country would remain untouched. There just aren't enough bombs to rain wholesale destruction across the country. If Russia launched 1,300 weapons, and each weapon had a destructive diameter of 10 miles (so, on average, 100 square miles per bomb), that equals 130,000 square miles. Now, targeting assets requires at least 2 warheads sent to each target… at least. So divide that area in half: 65,000 square miles. The USA is 3.7 million square miles. That means the total area of destruction in the USA in an all-out nuclear war is about 1.7% of the US land area. That's it! Take it a step further and realize that most of that destruction would be aimed at strategic assets in remote locations; we put them there on purpose.
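A quick back-of-envelope check of that arithmetic, as a Python sketch. All inputs are the author's stated figures, not independent data (strictly, a 10-mile destructive diameter gives about 78.5 square miles; the 100 is the author's rounding):

# Sanity check of the destruction-area estimate above.
warheads = 1300            # Russian strategic warheads assumed launched
area_per_bomb = 100.0      # sq mi per bomb (the author's rounded figure)
warheads_per_target = 2    # at least two warheads sent to each target
us_land_area = 3_700_000   # approximate US land area, sq mi

destroyed = warheads * area_per_bomb / warheads_per_target
print(f"Destroyed area: {destroyed:,.0f} sq mi")            # 65,000 sq mi
print(f"Share of US land: {destroyed / us_land_area:.2%}")  # 1.76%, i.e. the ~1.7% above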

Mutually Assured Destruction does not exist in the 2020s. MAD is a relic of the 1980s; we no longer have the assets, and neither does Russia, to assure anyone's complete destruction.

Nuclear winter calculations were based on bombs greater than 1 megaton and on cities with heavy loads of flammable materials. Neither exists today. Modern cities are significantly below the minimum load of flammable materials required to start the massive firestorms that are the premise of the theory.

In 1949 the Soviet Union exploded its first nuclear weapon. The emerging nuclear arsenal of the USSR raised an overriding new requirement for U.S. doctrine. Although the Joint Chiefs of Staff continued to plan for an attack against Soviet cities, destroying enemy nuclear weapons became the priority for American nuclear forces and remains so to this day. At the same time, U.S. leaders seriously debated whether to wage a preventive war in order to destroy Soviet nuclear forces before they could be used. In 1950, President Truman rejected preventive war as inconsistent with American values.

During the Kennedy administration, Secretary of Defense Robert McNamara developed plans that limited U.S. nuclear attacks to only one or two of the three traditional categories of targets: nuclear forces, other military, and urban-industrial. Under the revised declaratory doctrine, known as the "no cities" or "city hostage" doctrine, U.S. forces would, in the event of Soviet aggression, first strike military targets (categories one and two) and simultaneously threaten next to hit cities (category-three targets), in order to deter Moscow from retaliating against American population centers.

The "no-cities" doctrine represented a shift away from massive retaliation and towards a more calibrated response to Soviet aggression. Indeed, this increased targeting flexibility was adopted by NATO in 1967, when it formally approved the declaratory doctrine of flexible response.

During the early 1960s, deterrence was discussed in countervalue terms. For example, Jerome Wiesner, science adviser to President John F. Kennedy and President Lyndon B. Johnson, testified before Congress that the U.S. could establish deterrence based on a threat to destroy six of the 10 largest Soviet cities. However, by the mid-1980s, U.S. officials began to publicly explain that the U.S. did not target civilian populations and instead targeted Soviet military assets, including nuclear forces.

After the Korean War, the U.S. Army's revised field manual on the law of land warfare introduced a new statement that expressed as doctrine the growing importance of intention. The revised 1956 manual said, "It is a generally recognized rule of international law that civilians must not be made the object of attack directed exclusively against them."

Previous army manuals had left this rule unexpressed. As a subculture, military professionals may have placed even more emphasis on their intentions not to harm noncombatants even in the face of widespread civilian deaths. While the sources make it difficult to assess the personal sentiments of officers and soldiers about civilian casualties during the Korean War, it is not hard to believe that many in private did not want to think of themselves as waging war against defenseless civilians.

Current attack plans integrate nuclear and conventional weapons to minimize civilian casualties. The Bush administration's Nuclear Posture Review ordered the military to integrate nuclear and conventional weapons into the strike plans; some of these "New Triad" targeting strategies began to look more like countervalue than counterforce targeting, except that strikes on cities no longer needed to be nuclear.

Yes, many will die, and it will be ugly and very messy, but we will live on. It isn't going to be the end of the world, or even of this nation, if it ever happens.

~ Allen E. Hall, Quora

Bikini, 1946

*
IT’S CHEAPER TO ERADICATE POVERTY THAN TO SUSTAIN IT — UNIVERSAL BASIC INCOME REVISITED

~ Five years ago, when I first heard about it, the idea had been all but forgotten. Most people I talked to had never heard of it either.

Now, suddenly, it’s everywhere.

Finland conducted a major trial, Canada has just launched an even bigger experiment, and a test in Kenya is the mother of them all.

What I’m referring to is, of course, basic income. This is an unconditional cash transfer that is enough to cover your basic needs. It is guaranteed to everyone, whether young or old, rich or poor, overworked or out of work.

From Scotland to India, and from Silicon Valley to Kenya, policymakers all over the world have become interested in basic income as an answer to poverty, unemployment and the bureaucratic behemoth of the modern welfare state.

The idea is also attracting growing popular support. A large survey conducted in 28 European countries revealed that 68% of Europeans would vote in favor of basic income in a public referendum (up from 64% the previous year).

Faster than I could have dared to hope, the discussion has catapulted into a new phase. In 2017, I published my book Utopia for Realists. Since then, I have seen the focus of the discussion shift, from utopian dreams to real first steps. We have reached the point where it is no longer enough to philosophize about what could be. The time has come to start putting together concrete plans.

I realize that is easier said than done. First, we have to establish what we actually mean by basic income. Proponents differ widely on how much it should be, how we should fund it, and who should be eligible to receive it.

By now, I have talked to many people on the other side of the debate: the opponents of basic income. Their objections, I have discovered, consistently hinge on two fundamental concerns.

Their first concern is mainly practical. How would we pay for it? How can we afford to simply give everyone free cash? Wouldn’t that be astronomically expensive?

Their other main concern is ethical and centers on universality. What do you mean, Bill Gates and Richard Branson would get cash handouts too?


Don’t make it universal

Both of these objections, I believe, can be overcome with a simple solution.

Don’t give a basic income to everybody – yet.

By that, I mean we shouldn’t start out with a universal basic income for poor and rich alike. This would eliminate concerns over affordability, and Mr Gates and Mr Branson would know to keep their day jobs.

I know that there are many excellent arguments for a universal form of basic income. Since everyone would get it, it would remove the stigma that dogs recipients of assistance and ‘entitlements'.

However, in recent months, I have also become convinced that the practical concerns still loom too large. A universal basic income means not only that millions of people would receive unconditional cash payments, but also that millions of people would have to cough up thousands more in taxes to fund it. This will make basic income politically a harder sell.

Not only that, it would also inflate marginal tax rates, that is, the tax you pay on every additional pound you earn. I know that sounds technical, but bear with me, because it’s crucial. Imagine you decide to work one extra hour every day, and that you earn £10 an hour. Under a marginal tax rate of 40%, you would take home £6. In other words: working more pays off.

Introducing a universal basic income would change that, significantly inflating not average but marginal tax rates, and leaving you with only £3-4 of the original £10. Understandably, a lot of people would probably think "forget it — it’s not worth the extra work”.
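To make those numbers concrete, here is a tiny sketch in Python. The 40% baseline is the author's own example; the 65% rate is my assumption, chosen only because it lands in the £3-4 take-home range quoted above:

# Take-home pay from one extra hour worked, at a given marginal tax rate.
def take_home(hourly_wage: float, marginal_rate: float) -> float:
    return hourly_wage * (1 - marginal_rate)

print(take_home(10.0, 0.40))  # 6.0 -> the £6 in the author's example
print(take_home(10.0, 0.65))  # 3.5 -> inside the £3-4 range above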

But make it a guarantee

The good news? There is an alternative.

Instead of a universal basic income, we could have a basic income guarantee. Or, as economists prefer to call it, a negative income tax.

Again, this sounds technical, but it’s really just basic maths. In the current system, everybody who works pays taxes. A negative income tax flips that around. If you work, but your wages still leave you below the poverty level, you don’t have to pay taxes. Instead, the taxman pays you.

Think of it as building a massive floor underneath the economy. Anyone who falls below the poverty line, employed or not, is lifted back to security, no conditions attached. Protection against poverty would be a right, not a privilege. Meanwhile, working would always pay off, because above the poverty line, basic income would be stepped down incrementally, instead of cut off.
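A minimal sketch of that mechanism in Python. The £15,000 guarantee and the 50% phase-out rate are hypothetical figures of mine, chosen for illustration; the article specifies neither:

# Hypothetical negative income tax (basic income guarantee).
# The taxman tops up low earners; the top-up phases out gradually,
# so net income never falls below the floor and extra work always pays.
POVERTY_LINE = 15_000.0  # hypothetical annual guarantee (the "floor")
PHASE_OUT = 0.50         # hypothetical: top-up shrinks 50p per £1 earned

def top_up(earnings: float) -> float:
    """Payment from the taxman; zero past the break-even point."""
    return max(0.0, POVERTY_LINE - PHASE_OUT * earnings)

def net_income(earnings: float) -> float:
    """Earnings plus top-up (ordinary income tax above the floor ignored)."""
    return earnings + top_up(earnings)

for e in (0, 10_000, 20_000, 30_000, 40_000):
    print(f"earn {e:>6,} -> net {net_income(e):>8,.0f} (top-up {top_up(e):,.0f})")

Each extra pound earned still raises net income (by 50p here) until the top-up runs out at £30,000: that is the "working always pays off" property described above.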

Imagine what a massive leap this would be.

For example, in Great Britain, more than 14 million people, including four million children, would be freed from the prison of poverty. To be clear: 60% of those people work in paid jobs.

This is an idea that could rally voters across the board, with something to please both the left and the right:

For the left, a world without poverty.
For the right, no more nanny state.

For the left, livelihood security for all.
For the right, an economy that always rewards hard work.

Here’s the kicker: in terms of costs, there is absolutely no difference between a basic income guarantee and a universal basic income. The net expense of both amounts to exactly the same.

When it comes to selling it, however, I think the guarantee has a big advantage. It is no coincidence that just such a scheme was once almost enacted in the US. In the 1970s, President Nixon got his basic income bill through the House of Representatives twice before it ultimately became stranded in the Senate. It was voted down by the Democrats, not because they hated the idea, but because they felt the basic income guarantee wasn’t high enough!

At this point, there will be readers who will object, arguing that handing out cash is an invitation to mass laziness. In reality, nothing could be further from the truth. Large-scale experiments have already been done in Canada and the US. The data show that people hardly work less. Rather, healthcare costs plummet and children’s school performance soars.

At what price?

The million-dollar question, of course, is: how much will it cost?

Now, this is where it gets really interesting. In a ground-breaking study, three US economists calculated what a negative income tax (a.k.a. basic income guarantee) would cost their country. After crunching the numbers, they found that — surprise! — it would be amazingly cheap.

A negative income tax system that totally eliminated poverty would cost, at most, $336 billion, the researchers found. That is a measly 1% of US GDP. To put this into perspective, the costs of child poverty alone, with its effects such as higher healthcare expenditure, more crime, and worse performance at school, were pegged at $500 billion.

Yes, you read that right.
It is cheaper to eradicate poverty than to sustain it.

A basic income guarantee is brilliantly affordable: so affordable that implementing it would be less expensive than not implementing it.

Basic security

Finally, I believe there is something else that has to change. We need a new term.

I have been struck time and again by the unjustified associations attached to the term 'basic income'. Whereas the word 'income' is something we associate with a conditional payment that has to be 'earned', what we are talking about here is the right to livelihood security.

Therefore, I would like to propose that we call this variant simply what it is: basic security. A trampoline that you can always fall back on, whatever else happens.

One thing is certain: the time for philosophizing is past.

Every milestone of civilization begins with a crackpot idea once dismissed as unreasonable and unrealistic. But there comes a time when Utopian dreams become ripe enough to turn them into real-world policy. For basic income, that time is now. ~ Rutger Bregman

https://www.weforum.org/agenda/2018/05/how-we-make-basic-income-reality-Rutger-Bregman

Oriana:

The right name has a lot of psychological power. "Basic Security" sounds a lot like Social Security, so it's already partly familiar and definitely not un-American.

Mary:

The idea of a basic income, of that basic security, makes sense and would be a wonderful thing...in terms of health and sanity alone. But here, where you can't get people to even think about universal healthcare, it may never come.

Oriana:

I have a faint hope that the idea of a “negative income tax” might prove successful. It would have to be very carefully interwoven with the existing tax system, an overcomplex monstrosity.

Of course everyone knows that the US is going to lag behind all the rest of the first world on this. But it’s significant that at least the idea is under discussion again. Reading about the pilot programs, I was especially impressed with the health benefits and with the young staying in school. Just these two benefits are enough to convince me.

It seemed like Social Security had no chance, but here we are . . . So there is some hope, though probably not within our lifetime, unless it begins as some "basic pittance." But once the benefits of even a bit of extra income become apparent, then yes, I think people will get used to the idea. And then there'll always be those who will object to this "socialist" idea that some child, somewhere, will get a "free" hamburger. Let the kid go hungry — that's apparently more acceptable to some. Sad.

Will it take another Great Depression and another President as daring and popular as FDR? The future generations will find out. But the idea that it's cheaper to eliminate poverty than to maintain it will eventually have to register in people's minds.

*
SUCCESSFUL PSYCHOPATHS

~ What explains the stark differences in the life outcomes of psychopaths? Consider that one in five CEOs is a psychopath. That same rate also applies to prisoners. Of course, not all psychopaths are business leaders and criminals. But researchers have long struggled to explain why some psychopaths are relatively “successful,” and others engage in antisocial behavior that ruins lives.

Now, a new study — set to be published in the journal Personality Disorders: Theory, Research, and Treatment — aims to shed light on the factors that control psychopathic behavior.

“Psychopathic individuals are very prone to engaging in antisocial behaviors but what our findings suggest is that some may actually be better able to inhibit these impulses than others,” lead author Emily Lasko, a doctoral candidate in the Department of Psychology at Virginia Commonwealth University, told VCU News. “Although we don’t know exactly what precipitates this increase in conscientious impulse control over time, we do know that this impulse control does occur for individuals high in certain psychopathy traits who have been relatively more ‘successful’ than their peers.”

The study notes that psychopathy exists on a spectrum in society, and it can manifest through a variety of personality traits, such as interpersonal manipulation, impulsivity, callousness, grandiosity, and boldness. Some traits may help psychopaths become “successful,” defined as those who adapt to social norms and avoid incarceration.

For example, the psychopathic trait fearlessness may help a psychopath become a good first-responder, while interpersonal manipulation might help a psychopath become an effective lawyer. In contrast, the psychopathic trait impulsivity may make a psychopath more likely to commit crime.

The researchers hypothesized that psychopaths who are able to develop impulse-control skills are more likely to be successful. The team suggested that successful psychopaths develop a mechanism that gives them greater control over their behavior, helping them thwart their heightened antisocial impulses.

Conscientiousness is the trait that predicts whether a psychopath will develop this mechanism, according to the study.

“The compensatory model posits that people higher in certain psychopathic traits (such as grandiosity and manipulation) are able to compensate for and overcome, to some extent, their antisocial impulses via increases in trait conscientiousness, specifically impulse control,” Lasko said.

To test the hypothesis, the researchers examined data from a seven-year longitudinal study on adolescent criminals in Arizona and Pennsylvania.

“Although these participants are not objectively ‘successful,’ this was an ideal sample to test our hypotheses for two main reasons,” the researchers wrote. “First, adolescents are in a prime developmental phase for the improvement of impulse control, allowing us the longitudinal variability we would need to test our compensatory model. Second, offenders are prone to antisocial acts, by definition, and their rates of recidivism provided a real-world index of ‘successful’ versus ‘unsuccessful’ psychopathy phenotypes.”

The study found that adolescents who scored high in grandiose-manipulative psychopathic traits early in the study were more likely to develop better impulse control and less aggression over time. Psychopaths who scored higher in impulsivity didn’t see as much of an increase.

The findings support the idea that psychopathy isn’t just about personality deficits, but rather a combination of heightened and diminished traits, some of which compensate for each other over time.

“Our findings support a novel model of psychopathy that we propose, which runs contradictory to the other existing models of psychopathy in that it focuses more on the strengths or ‘surpluses’ associated with psychopathy rather than just deficits,” she said. “Psychopathy is not a personality trait simply composed of deficits — there are many forms that it can take.”

The findings also suggest that the Big Five model of personality — of which conscientiousness is part — is an important tool for understanding psychopathy.

“Together, these results point to the consequential nature of the development of conscientious traits for psychopathic individuals, which may promote adaptive re-entry into society. Indeed, even ‘unsuccessful’ offenders in our sample exhibited associations between their grandiose-manipulative traits and greater impulse control (albeit to a substantially lesser degree).” ~

https://bigthink.com/neuropsych/psychopath-traits/


Oriana:

With so many news personalities, including psychiatrists, saying that Putin is a psychopath, it's important to understand the term. 

Also, I've read that surgeons are another occupation that attracts successful psychopaths.

*
THE SURPRISING GEOGRAPHY OF GUN VIOLENCE

Gun violence is actually worse in Red States. It’s not even close.

~ Listen to the southern right talk about violence in America and you’d think New York City was as dangerous as Bakhmut on Ukraine’s eastern front.

In October, Florida’s Republican governor Ron DeSantis proclaimed crime in New York City was “out of control” and blamed it on George Soros. Another Sunshine State politico, former president Donald Trump, offered his native city up as a Democrat-run dystopia, one of those places “where the middle class used to flock to live the American dream are now war zones, literal war zones.” In May 2022, hours after 19 children were murdered at Robb Elementary in Uvalde, Texas, Republican Gov. Greg Abbott swatted back suggestions that the state could save lives by implementing tougher gun laws by proclaiming “Chicago and L.A. and New York disprove that thesis.”

In reality, the region the Big Apple makes up most of is far and away the safest part of the U.S. mainland when it comes to gun violence, while the regions Florida and Texas belong to have per capita firearm death rates (homicides and suicides) three to four times higher than New York’s. On a regional basis it’s the southern swath of the country — in cities and rural areas alike — where the rate of deadly gun violence is most acute, regions where Republicans have dominated state governments for decades.

Gun deaths far less common in NYC area than in US overall

If you grew up in the coal-mining region of eastern Pennsylvania, your chance of dying of a gunshot is about half what it would be if you grew up in the coalfields of West Virginia, three hundred miles to the southwest. Someone living in the most rural counties of South Carolina is more than three times as likely to be killed by gunshot as someone living in the equally rural counties of New York’s Adirondacks or the impoverished rural counties facing Mexico across the lower reaches of the Rio Grande.

The reasons for these disparities go beyond modern policy differences and extend back to events that predate not only the American party system but the advent of shotguns, revolvers, ammunition cartridges, breech-loading rifles and the American republic itself. The geography of gun violence — and public and elite ideas about how it should be addressed — is the result of differences at once regional, cultural and historical. Once you understand how the country was colonized — and by whom — a number of insights into the problem are revealed.

To do so, you need to delineate America’s regional cultures more accurately. Forget the U.S. Census divisions, which arbitrarily divide the country into a Northeast, Midwest, South and West using often meaningless state boundaries and a willful ignorance of history. The reason the U.S. has strong regional differences is that our swath of the North American continent was settled by rival colonial projects that had very little in common, often despised one another and spread without regard for today’s state boundaries.

Those colonial projects — Puritan-controlled New England; the Dutch-settled area around what is now New York City; the Quaker-founded Delaware Valley; the Scots-Irish-led upland backcountry of the Appalachians; the West Indies-style slave society in the Deep South; the Spanish project in the southwest and so on — had different ethnographic, religious, economic and ideological characteristics. They were rivals and sometimes enemies, with even the British ones lining up on opposite sides of conflicts like the English Civil War in the 1640s. They settled much of the eastern half and southwestern third of what is now the U.S. in mutually exclusive settlement bands before significant third-party in-migration picked up steam in the 1840s.

In the process they laid down the institutions, symbols, cultural norms and ideas about freedom, honor and violence that later arrivals would encounter and, by and large, assimilate into. Some states lie entirely or almost entirely within one of these regional cultures. Others are split between them, propelling constant and profound disagreements on politics and policy alike in places like Pennsylvania, Ohio, Illinois, California and Oregon.

Places you might not think have much in common, southwestern Pennsylvania and the Texas Hill Country, for instance, are actually at the beginning and end of well-documented settlement streams; in their case, one dominated by generations of Scots-Irish and lowland Scots settlers moving to the early 18th century Pennsylvania frontier and later down the Great Wagon Road to settle the upland parts of Virginia, the Carolinas, Georgia, and Tennessee, and then into the Ozarks, north and central Texas, and southern Oklahoma. Similar colonization movements link Maine and Minnesota, Charleston and Houston, Pennsylvania Dutch Country and central Iowa.

I unpacked this story in detail in my 2011 book American Nations: A History of the Eleven Rival Regional Cultures of North America, and you can read a summary here. But, in brief, the contemporary U.S. is divided among nine large regions — with populations ranging from 13 to 63 million — and four small enclaves of regional cultures whose centers of gravity lie outside the U.S. For space and clarity, I’m going to set aside the enclaves — parts of the regions I call New France, Spanish Caribbean, First Nation, and Greater Polynesia — but they were included in the research project I’m about to share with you.

Understanding how these historical forces affect policy issues — from gun control to Covid-19 responses — can provide important insights into how to craft interventions that might make us all safer and happier. Building coalitions for gun reform at both the state and federal level would benefit from regionally tailored messaging that acknowledged traditions and attitudes around guns and the appropriate use of deadly violence are much deeper than mere party allegiance.

“A famous Scot once said ‘let me make the songs of a nation, and I care not who makes its laws,’ because culture is extremely powerful,” says Carl T. Bogus of Roger Williams University School of Law, who is a Second Amendment scholar. “Culture drives politics, law and policy. It is amazingly durable, and you have to take it into account.”

I run Nationhood Lab, a project at Salve Regina University’s Pell Center for International Relations and Public Policy, which uses this regional framework to analyze all manner of phenomena where regionalism plays a critical role in understanding what’s going on in America and how one might go about responding to it. We knew decades of scholarship showed there were large regional variations in levels of violence and gun violence and that the dominant values in those regions, encoded in the norms of the region over many generations, likely played a significant role.

But nobody had run the data using a meaningful, historically based model of U.S. regions and their boundaries. Working with our data partners Motivf, we used data on homicides and suicides from the Centers for Disease Control for the period 2010 to 2020 and have just released a detailed analysis of what we found. (The CDC data are “smoothed per capita rates,” meaning the CDC has averaged counties with their immediate neighbors to protect victims’ privacy. The data allow us to depict geographical patterns reliably, but not to state the precise rate for a given county.) As expected, the disparities between the regions are stark, but even I was shocked at just how wide the differences were and also by some unexpected revelations.
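
As an illustration of that kind of neighbor smoothing, here is a minimal Python sketch. The county names, counts, populations and adjacency map are invented for illustration; this is not the CDC’s data or code, just the general idea of pooling a county with its immediate neighbors before computing a per capita rate.

# Hypothetical counties: gun deaths over a study period, populations,
# and which counties border which. All numbers are made up.
deaths = {"A": 12, "B": 3, "C": 25}
population = {"A": 40_000, "B": 15_000, "C": 90_000}
neighbors = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

def smoothed_rate(county):
    # Pool the county with its immediate neighbors, then compute
    # deaths per 100,000 residents for the whole pool.
    pool = [county] + neighbors[county]
    total_deaths = sum(deaths[c] for c in pool)
    total_pop = sum(population[c] for c in pool)
    return total_deaths / total_pop * 100_000

for county in deaths:
    print(county, round(smoothed_rate(county), 1))

Pooling blurs any single county’s exact figures, which is what protects privacy while preserving the broader geographic pattern.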

Deep South has highest rate of gun deaths among major regions

The Deep South is the most deadly of the large regions at 15.6 gun deaths per 100,000 residents, followed by Greater Appalachia at 13.5. That’s roughly quadruple and triple, respectively, the rate of New Netherland — the most densely populated part of the continent — which has a rate of 3.8, comparable to that of Switzerland. Yankeedom (the Northeast) is the next safest at 8.6, about half that of the Deep South, and Left Coast follows closely behind at 9. El Norte, the Midlands, Tidewater and the Far West fall in between.

Greater Appalachia has highest rate of gun suicides among major regions

For gun suicides, the most common type of gun death, the pattern is similar: New Netherland is the safest big region with a rate of just 1.4 deaths per 100,000, which makes it safer in this respect than Canada, Sweden or Switzerland. Yankeedom and Left Coast are also relatively safe, but Greater Appalachia surges to become the most dangerous, with a rate nearly seven times higher than the Big Apple’s. The Far West becomes a danger zone too, with a rate just slightly better than its libertarian-minded Appalachian counterpart.

Deep South has highest rate of gun homicides among major regions

When you look at gun homicides alone, the Far West goes from being the second worst of the large regions for suicides to the third safest for homicides, a disparity not seen anyplace else, except to a much lesser degree in Greater Appalachia. New Netherland is once again the safest large region, with a gun homicide rate about a third that of the deadliest region, the Deep South.

We also compared the death rates for all these categories for just white Americans — the only ethno-racial group tracked by the CDC whose numbers were large enough to get accurate results across all regions. (For privacy reasons the agency suppresses county data with low numbers, which wreaks havoc on efforts to calculate rates for less numerous ethno-racial groups.) The pattern was essentially the same, except that Greater Appalachia became a hot spot for homicides.

The data did allow us to do a comparison of white and Black rates among people living in the 466 most urbanized U.S. counties, where 55 percent of all Americans live. In these “big city” counties there was a racial divergence in the regional pattern for homicides, with several regions that are among the safest in the analyses we’ve discussed so far — Yankeedom, Left Coast and the Midlands — becoming the most dangerous for African-Americans.

Big urban counties in these regions have Black gun homicide rates that are 23 to 58 percent greater than those in the big urban counties of the Deep South, and 13 to 35 percent greater than those in Greater Appalachia. Propelled by a handful of large metro hot spots — California’s Bay Area, Chicagoland, Detroit and Baltimore metro areas among them — this is the closest the data comes to endorsing Republican talking points on urban gun violence, though other large metros in those same regions have relatively low rates, including Boston, Hartford, Minneapolis, Seattle and Portland. New Netherland, however, remained the safest region for both white and Black Americans.

In a classic 1993 study of the geographic gap in violence, the social psychologist Richard Nisbett of the University of Michigan noted that the regions initially “settled by sober Puritans, Quakers and Dutch farmer-artisans” — that is, Yankeedom, the Midlands and New Netherland — were organized around a yeoman agricultural economy that rewarded “quiet, cooperative citizenship, with each individual being capable of uniting for the common good.”

Much of the South, he wrote, was settled by “swashbuckling Cavaliers of noble or landed gentry status, who took their values . . . from the knightly, medieval standards of manly honor and virtue” (by which he meant Tidewater and the Deep South) or by Scots and Scots-Irish borderlanders (the Greater Appalachian colonists) who hailed from one of the most lawless parts of Europe and relied on “an economy based on herding,” where one’s wealth is tied up in livestock, which are far more vulnerable to theft than grain crops.

These southern cultures developed what anthropologists call a “culture of honor” tradition, in which males treasure their honor and believe it is diminished if an insult, slight or wrong is ignored. “In an honor culture you have to be vigilant about people impugning your reputation and part of that is to show that you can’t be pushed around,” says University of Illinois Urbana-Champaign psychologist Dov Cohen, who conducted a series of experiments with Nisbett demonstrating the persistence of these quick-to-insult characteristics in university students. White male students from the southern regions lashed out in anger at insults and slights that those from northern ones ignored or laughed off. “Arguments over pocket change or popsicles in these Southern cultures can result in people getting killed, but what’s at stake isn’t the popsicle, it’s personal honor.”

Pauline Grosjean, an economist at Australia’s University of New South Wales, has found strong statistical relationships between the presence of Scots-Irish settlers in the 1790 census and contemporary homicide rates, but only in Southern areas “where the institutional environment was weak” — which is the case in almost the entirety of Greater Appalachia. She further noted that in areas where Scots-Irish were dominant, settlers of other ethnic origins — Dutch, French and German — were also more violent, suggesting that they had acculturated to Appalachian norms. The effect was strongest for white offenders and persisted even when controlling for poverty, inequality, demographics and education.

In these same regions this aggressive proclivity is coupled with the violent legacy of having been slave societies. Before 1865, enslaved people were kept in check through the threat and application of violence including whippings, torture and often gruesome executions. For nearly a century thereafter, similar measures were used by the Ku Klux Klan, off-duty law enforcement and thousands of ordinary white citizens to enforce a racial caste system.

The Monroe and Florence Work Today project mapped every lynching and deadly race riot in the U.S. between 1848 and 1964 and found over 90% of the incidents occurred in those three regions or El Norte, where Deep Southern “Anglos” enforced a caste system on the region’s Hispanic majority. In places with a legacy of lynching — which is only now starting to pass out of living memory — University at Albany sociologist Steven Messner and two colleagues found, for their 1986-1995 study period, a significant increase in one type of homicide that isn’t explained by other factors: the argument-related killing of Blacks by whites.

Those regions — plus Tidewater and the Far West — are also those where capital punishment is fully embraced. The states they control account for more than 95 percent of the 1,597 executions in the United States since 1976. And they’ve also most enthusiastically embraced “stand-your-ground” laws, which waive a person’s obligation to try to retreat from a threatening situation before resorting to deadly force. Of the 30 states that have such laws, only two, New Hampshire and Michigan, are within Yankeedom, and only two others — Pennsylvania and Illinois — are controlled by a Yankee-Midlands majority. By contrast, every one of the Deep South or Greater Appalachia-dominated states has passed such a law, and almost all the other states with similar laws are in the Far West.

By contrast, the Yankee and Midland cultural legacies featured factors that dampened deadly violence by individuals. The Puritan founders of Yankeedom promoted self-doubt and self-restraint, and their Unitarian and Congregational spiritual descendants believed vengeance would not receive the approval of an all-knowing God (though there were plenty of loopholes permitting the mistreatment of indigenous people and others regarded as being outside the community). This region was the center of the 19th-century death penalty reform movement, which began eliminating capital punishment for burglary, robbery, sodomy and other nonlethal crimes, and today none of the states it controls permit executions. 

The Midlands were founded by pacifist Quakers and attracted likeminded emigrants who set the cultural tone. “Mennonites, Amish, the Harmonists of Western Pennsylvania, the Moravians in Bethlehem and a lot of German Lutheran pietists came who were part of a tradition which sees violence as being completely incompatible with Christian fellowship,” says Joseph Slaughter, an assistant professor at Wesleyan University’s religion department who co-directs the school’s Center for the Study of Guns and Society.

In rural parts of Yankeedom — like the northwestern foothills of Maine where I grew up — gun ownership is widespread, and hunting is a habit and passion many parents instill in their children. But fetishizing guns is not part of that tradition. “In Upstate New York where I live there can be a defensive element to having firearms, but the way it’s engrained culturally is as a tool for hunting and other purposes,” says Jaclyn Schildkraut, executive director of the Rockefeller Institute of Government’s Regional Gun Violence Research Consortium, who formerly lived in Florida. “There are definitely different cultural connotations and purposes for firearms depending on your location in the country.”

If herding and frontier-like environments with weak institutions create more violent societies, why is the Far West so safe with regard to gun homicide and so dangerous for gun suicides? Carolyn Pepper, professor of clinical psychology at the University of Wyoming, is one of the foremost experts on the region’s suicide problem. She says here too the root causes appear to be historical and cultural.

Far West has one of the lowest homicide rates but the second-highest suicide rate
Greater Appalachia leads in gun-suicide rates

“If your economic development is based on boom-and-bust industries like mineral extraction and mining, people come and go and don’t put down ties,” she notes. “And there’s lower religiosity in most of the region, so that isn’t there to foster social ties or perhaps to provide a moral framework against suicide. Put that together and you have a climate of social isolation coupled with a culture of individualism and stoicism that leads to an inability to ask for help and a stigma against mental health treatment.”

Another association that can’t be dismissed: suicide rates in the region rise with altitude, even when you control for other factors, for reasons that are unclear. But while this pattern has been found in South Korea and Japan, Pepper notes, it doesn’t seem to exist in the Andes, Himalayas or the mountains of Australia, so it would appear unlikely to have a physiological explanation.

As for the Far West’s low gun homicide rate? “I don’t have data,” she says, “but firearms out here are seen as for recreation and defense, not for offense.”

You might wonder how these centuries-old settlement patterns could still be felt so clearly today, given the constant movement of people from one part of the country to another and waves of immigrants who did not arrive sharing the cultural mores of any of these regions. The answer is that these are the dominant cultures newcomers confronted, negotiated with and which their descendants grew up in, surrounded by institutions, laws, customs, symbols, and stories encoding the values of these would-be nations.

On top of that, few of the immigrants arriving in the great and transformational waves of the late 19th and early 20th centuries went to the Deep South, Tidewater, or Greater Appalachia, which wound up increasing the differences between the regions on questions of American identity and belonging. And with more recent migration from one part of the country to another, social scientists have found the movers are more likely to share the political attitudes of their destination rather than their point of origin; as they do so they’re furthering what Bill Bishop called “the Big Sort,” whereby people are choosing to live among people who share their views. This also serves to increase the differences between the regions.

Gun policies, I argue, are downstream from culture, so it’s not surprising that the regions with the worst gun problems are the least supportive of restricting access to firearms. A 2011 Pew Research Center survey asked Americans what was more important, protecting gun ownership or controlling it. The Yankee states of New England went for gun control by a margin of 61 to 36, while those in the poll’s “southeast central” region — the Deep South states of Alabama and Mississippi and the Appalachian states of Tennessee and Kentucky — supported gun rights by exactly the same margin.

Far Western states backed gun rights by a proportion of 59 to 38. After the Newtown school shooting in 2012, not only Connecticut but also neighboring New York and nearby New Jersey tightened gun laws. By contrast, after the recent shooting at a Nashville Christian school, Tennessee lawmakers ejected two of their colleagues (young Black male Democrats) for protesting for tighter gun controls on the chamber floor. Then the state senate passed a bill to shield gun dealers and manufacturers from lawsuits.

When I turned to New York-area criminologists and gun violence experts, I expected to be told the more restrictive gun policies in New York City and in New York and New Jersey largely explained why New Netherland is so remarkably safe compared to other U.S. regions, including Yankeedom and the Midlands. Instead, they pointed to regional culture.

“New York City is a very diverse place. We see people from different cultural and religious traditions every moment and we just know one another, so it’s harder for people to foment inter-group hatreds,” says Jeffrey Butts, director of the research and evaluation center at the John Jay College of Criminal Justice in Manhattan. “Policy has something to do with it, but policy mainly controls the ease with which people can get access to weapons. But after that you have culture, economics, demographics and everything else that influences what they do with those weapons.” ~ Colin Woodard

https://www.politico.com/news/magazine/2023/04/23/surprising-geography-of-gun-violence-00092413
 

Mary:

The study of the "geography of violence" in the US is both fascinating and enlightening. Examining gun violence in terms of the cultural norms of areas settled by very different cultural groups leads to an understanding of "gun culture"...which makes more sense of things almost immediately. Because that is exactly the root of the problem, not the number of guns or the regulation of guns, but attitudes, beliefs and behaviors by gun users — what guns mean within a particular regional culture.

For me this was particularly enlightening in terms of the Deep South, the slave economy states, where the white upper-class plantation owners lived in an "honor culture"...one that has violent acts in defense of one's "honor" at its essential core. Curious in its fierce dedication to the idea of honor in such an ugly and brutal system. This was satirized by Twain in Huck Finn — these slave masters modeling their ideas of chivalry and honor on the likes of Walter Scott's medieval romances, while dependent on the brutalization of human beings to maintain their wealth and lifestyle.

"Honor" requires defense by eliminating any who would threaten, question, or insult it. The traditional way was the duel, but really, murder was always an easy, and definitive, reply. This is very like what we are recently seeing with innocents triggering homicidal responses from homeowners , neighbors, or strangers who feel their rightful territory has been infringed on, they have been insulted or "disrespected." Ring my doorbell, enter my driveway, complain about my shooting practice noise, and I'll kill you. Easy and quick, with my powerful weapon.

Oriana:

The Second Amendment was a tragic mistake. Gun laws and the lack of universal healthcare are arguably the two worst things about America, the areas where this country lags far behind the rest of the industrialized world.

*
THE DESTRUCTIVENESS OF CONCRETE

In the time it takes you to read this sentence, the global building industry will have poured more than 19,000 bathtubs of concrete. By the time you are halfway through this article, the volume would fill the Albert Hall and spill out into Hyde Park. In a day it would be almost the size of China’s Three Gorges Dam. In a single year, there is enough to patio over every hill, dale, nook and cranny in England.
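
Out of curiosity, the scale of that opening claim can be sanity-checked with a rough back-of-envelope calculation. Every number below is my own round-figure assumption (global cement output, the cement share of a typical mix, concrete density, bathtub volume, reading time), not a figure from the article:

# Back-of-envelope check of the "bathtubs per sentence" claim.
# All inputs are rough assumptions, not data from the article.
cement_tonnes_per_year = 4.1e9            # assumed global cement output
cement_share_of_concrete = 0.12           # assumed ~12% cement by mass
concrete_density_t_per_m3 = 2.4           # typical density of concrete
bathtub_m3 = 0.2                          # assumed volume of one bathtub
seconds_to_read_sentence = 8              # assumed reading time

concrete_m3_per_year = (cement_tonnes_per_year / cement_share_of_concrete
                        / concrete_density_t_per_m3)
m3_per_second = concrete_m3_per_year / (365 * 24 * 3600)
bathtubs = m3_per_second * seconds_to_read_sentence / bathtub_m3
print(f"{bathtubs:,.0f} bathtubs")        # lands in the low tens of thousands

With these inputs the estimate comes out around 18,000 bathtubs, the same order of magnitude as the sentence above; nudge any assumption and the answer shifts, as is the nature of such estimates.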

After water, concrete is the most widely used substance on Earth. If the cement industry were a country, it would be the third largest carbon dioxide emitter in the world with up to 2.8 bn tons, surpassed only by China and the US.

The material is the foundation of modern development, putting roofs over the heads of billions, fortifying our defenses against natural disaster and providing a structure for healthcare, education, transport, energy and industry.

Concrete is how we try to tame nature. Our slabs protect us from the elements. They keep the rain from our heads, the cold from our bones and the mud from our feet. But they also entomb vast tracts of fertile soil, constipate rivers, choke habitats and – acting as a rock-hard second skin – desensitize us from what is happening outside our urban fortresses.

Our blue and green world is becoming grayer by the second. By one calculation, we may have already passed the point where concrete outweighs the combined carbon mass of every tree, bush and shrub on the planet. Our built environment is, in these terms, outgrowing the natural one. Unlike the natural world, however, it does not actually grow. Instead, its chief quality is to harden and then degrade, extremely slowly.

All the plastic produced over the past 60 years amounts to 8bn tons. The cement industry pumps out more than that every two years. But though the problem is bigger than plastic, it is generally seen as less severe. Concrete is not derived from fossil fuels. It is not being found in the stomachs of whales and seagulls. Doctors aren’t discovering traces of it in our blood. Nor do we see it tangled in oak trees or contributing to subterranean fatbergs. We know where we are with concrete. Or to be more precise, we know where it is going: nowhere. Which is exactly why we have come to rely on it.

This solidity, of course, is what humankind yearns for. Concrete is beloved for its weight and endurance. That is why it serves as the foundation of modern life, holding time, nature, the elements and entropy at bay. When combined with steel, it is the material that ensures our dams don’t burst, our tower blocks don’t fall, our roads don’t buckle and our electricity grid remains connected.

Solidity is a particularly attractive quality at a time of disorientating change. But – like any good thing in excess – it can create more problems than it solves.

At times an unyielding ally, at times a false friend, concrete can resist nature for decades and then suddenly amplify its impact. Take the floods in New Orleans after Hurricane Katrina and Houston after Harvey, which were more severe because urban and suburban streets could not soak up the rain like a floodplain, and storm drains proved woefully inadequate for the new extremes of a disrupted climate.

It also magnifies the extreme weather it shelters us from. Taking in all stages of production, concrete is said to be responsible for 4-8% of the world’s CO2. Among materials, only coal, oil and gas are a greater source of greenhouse gases. Half of concrete’s CO2 emissions are created during the manufacture of clinker, the most energy-intensive part of the cement-making process.

But other environmental impacts are far less well understood. Concrete is a thirsty behemoth, sucking up almost a 10th of the world’s industrial water use. This often strains supplies for drinking and irrigation, because 75% of this consumption is in drought and water-stressed regions. In cities, concrete also adds to the heat-island effect by absorbing the warmth of the sun and trapping gases from car exhausts and air-conditioner units – though it is, at least, better than darker asphalt.

It also worsens the problem of silicosis and other respiratory diseases. The dust from wind-blown stocks and mixers contributes as much as 10% of the coarse particulate matter that chokes Delhi, where researchers found in 2015 that the air pollution index at all of the 19 biggest construction sites exceeded safe levels by at least three times. Limestone quarries and cement factories are also often pollution sources, along with the trucks that ferry materials between them and building sites. At this scale, even the acquisition of sand can be catastrophic – destroying so many of the world’s beaches and river courses that this form of mining is now increasingly run by organized crime gangs and associated with murderous violence.

This touches on the most severe, but least understood, impact of concrete, which is that it destroys natural infrastructure without replacing the ecological functions that humanity depends on for fertilization, pollination, flood control, oxygen production and water purification.

Concrete can take our civilization upwards, up to 163 stories high in the case of the Burj Khalifa skyscraper in Dubai, creating living space out of the air. But it also pushes the human footprint outwards, sprawling across fertile topsoil and choking habitats. The biodiversity crisis – which many scientists believe to be as much of a threat as climate chaos – is driven primarily by the conversion of wilderness to agriculture, industrial estates and residential blocks.

For hundreds of years, humanity has been willing to accept this environmental downside in return for the undoubted benefits of concrete. But the balance may now be tilting in the other direction.

*

The Pantheon and Colosseum in Rome are testament to the durability of concrete, which is a composite of sand, aggregate (usually gravel or stones) and water mixed with a lime-based, kiln-baked binder. The modern industrialized form of the binder – Portland cement – was patented as a form of “artificial stone” in 1824 by Joseph Aspdin in Leeds. This was later combined with steel rods or mesh to create reinforced concrete, the basis for art deco skyscrapers such as the Empire State Building.

Rivers of it were poured after the second world war, when concrete offered an inexpensive and simple way to rebuild cities devastated by bombing. This was the period of brutalist architects such as Le Corbusier, followed by the futuristic, free-flowing curves of Oscar Niemeyer and the elegant lines of Tadao Ando – not to mention an ever-growing legion of dams, bridges, ports, city halls, university campuses, shopping centers and uniformly grim car parks. In 1950, cement production was equal to that of steel; in the years since, it has increased 25-fold, more than three times as fast as its metallic construction partner.

Debate about the aesthetics has tended to polarize between traditionalists like Prince Charles, who condemned Owen Luder’s brutalist Tricorn Centre as a “mildewed lump of elephant droppings”, and modernists who saw concrete as a means of making style, size and strength affordable for the masses.

The politics of concrete are less divisive, but more corrosive. The main problem here is inertia. Once this material binds politicians, bureaucrats and construction companies, the resulting nexus is almost impossible to budge. Party leaders need the donations and kickbacks from building firms to get elected, state planners need more projects to maintain economic growth, and construction bosses need more contracts to keep money rolling in, staff employed and political influence high. Hence the self-perpetuating political enthusiasm for environmentally and socially dubious infrastructure projects and cement-fests like the Olympics, the World Cup and international exhibitions.

The classic example is Japan, which embraced concrete in the second half of the 20th century with such enthusiasm that the country’s governance structure was often described as the doken kokka (construction state).

At first it was a cheap material to rebuild cities ravaged by fire bombs and nuclear warheads in the second world war. Then it provided the foundations for a new model of super-rapid economic development: new railway tracks for Shinkansen bullet trains, new bridges and tunnels for elevated expressways, new runways for airports, new stadiums for the 1964 Olympics and the Osaka Expo, and new city halls, schools and sports facilities.

This kept the economy racing along at near double-digit growth rates until the late 1980s, ensuring employment remained high and giving the ruling Liberal Democratic party a stranglehold on power. The political heavyweights of the era – men such as Kakuei Tanaka, Yasuhiro Nakasone and Noboru Takeshita – were judged by their ability to bring hefty projects to their hometowns. Huge kickbacks were the norm. Yakuza gangsters, who served as go-betweens and enforcers, also got their cut. Bid-rigging and near monopolies by the big six building firms (Shimizu, Taisei, Kajima, Takenaka, Obayashi, Kumagai) ensured contracts were lucrative enough to provide hefty kickbacks to the politicians. The doken kokka was a racket on a national scale.

But there is only so much concrete you can usefully lay without ruining the environment. The ever-diminishing returns were made apparent in the 1990s, when even the most creative politicians struggled to justify the government’s stimulus spending packages. This was a period of extraordinarily expensive bridges to sparsely inhabited regions, multi-lane roads between tiny rural communities, cementing over the few remaining natural riverbanks, and pouring ever greater volumes of concrete into the sea walls that were supposed to protect 40% of the Japanese coastline.

In his book Dogs and Demons, the author and longtime Japanese resident Alex Kerr laments the cementing over of riverbanks and hillsides in the name of flood and mudslide prevention. Runaway government-subsidized construction projects, he told an interviewer, “have wreaked untold damage on mountains, rivers, streams, lakes, wetlands, everywhere — and it goes on at a heightened pace. That is the reality of modern Japan, and the numbers are staggering.”

He said the amount of concrete laid per square meter in Japan is 30 times the amount in America, and that the volume is almost exactly the same. “So we’re talking about a country the size of California laying the same amount of concrete [as the entire US]. Multiply America’s strip malls and urban sprawl by 30 to get a sense of what’s going on in Japan.”

Traditionalists and environmentalists were horrified – and ignored. The cementation of Japan ran contrary to classic aesthetic ideals of harmony with nature and an appreciation of mujo (impermanence), but was understandable given the ever-present fear of earthquakes and tsunamis in one of the world’s most seismically active nations. Everyone knew the grey banked rivers and shorelines were ugly, but nobody cared as long as they could keep their homes from being flooded.

Which made the devastating 2011 Tohoku earthquake and tsunami all the more shocking. At coastal towns such as Ishinomaki, Kamaishi and Kitakami, huge sea walls that had been built over decades were swamped in minutes. Almost 16,000 people died, a million buildings were destroyed or damaged, town streets were blocked with beached ships and port waters were filled with floating cars. It was a still more alarming story at Fukushima, where the ocean surge engulfed the outer defenses of the Fukushima Daiichi nuclear plant and caused a level 7 meltdown.

Briefly, it seemed this might become a King Canute moment for Japan – when the folly of human hubris was exposed by the power of nature. But the concrete lobby was just too strong. The Liberal Democratic party returned to power a year later with a promise to spend 200tn yen (£1.4tn) on public works over the next decade, equivalent to about 40% of Japan’s economic output.

Construction firms were once again ordered to hold back the sea, this time with even taller, thicker barriers. Their value is contested. Engineers claim these 12-meter-high walls of concrete will stop or at least slow future tsunamis, but locals have heard such promises before. The area these defenses protect is also of lower human worth now that the land has been largely depopulated and filled with paddy fields and fish farms. Environmentalists say mangrove forests could provide a far cheaper buffer. Tellingly, even many tsunami-scarred locals hate the concrete between them and the ocean.

“It feels like we’re in jail, even though we haven’t done anything bad,” an oyster fisherman, Atsushi Fujita, told Reuters. “We can no longer see the sea,” said the Tokyo-born photographer Tadashi Ono, who took some of the most powerful images of these massive new structures. He described them as an abandonment of Japanese history and culture. “Our richness as a civilization is because of our contact with the ocean,” he said. “Japan has always lived with the sea, and we were protected by the sea. And now the Japanese government has decided to shut out the sea.”

There was an inevitability about this. Across the world, concrete has become synonymous with development. In theory, the laudable goal of human progress is measured by a series of economic and social indicators, such as life-expectancy, infant mortality and education levels. But to political leaders, by far the most important metric is gross domestic product, a measure of economic activity that, more often than not, is treated as a calculation of economic size. GDP is how governments assess their weight in the world. And nothing bulks up a country like concrete.

That is true of all countries at some stage. In the early stages of development, heavyweight construction projects are beneficial, like a boxer putting on muscle. But for already mature economies, they are harmful, like an aged athlete pumping ever stronger steroids to ever less effect.

During the 1997-98 Asian financial crisis, Keynesian economic advisers told the Japanese government the best way to stimulate GDP growth was to dig a hole in the ground and fill it. Preferably with cement. The bigger the hole, the better. This meant profits and jobs. Of course, it is much easier to mobilize a nation to do something that improves people’s lives, but either way concrete is likely to be part of the arrangement. This was the thinking behind Roosevelt’s New Deal in the 1930s, which is celebrated in the US as a recession-busting national project but might also be described as the biggest ever concrete-pouring exercise up until that point. The Hoover Dam alone required 3.3m cubic meters, then a world record. Construction firms claimed it would outlast human civilization.

But that was lightweight compared to what is now happening in China, the concrete superpower of the 21st century and the greatest illustration of how the material transforms a culture (a civilization intertwined with nature) into an economy (a production unit obsessed by GDP statistics). Beijing’s extraordinarily rapid rise from developing nation to superpower-in-waiting has required mountains of cement, beaches of sand and lakes of water. The speed at which these materials are being mixed is perhaps the most astonishing statistic of the modern age: since 2003, China has poured more cement every three years than the US managed in the entire 20th century.

Today, China uses almost half the world’s concrete. The property sector – roads, bridges, railways, urban development and other cement-and-steel projects – accounted for a third of its economy’s expansion in 2017. Every major city has a floor-sized scale model of urban development plans that has to be constantly updated as small white plastic models are turned into mega-malls, housing complexes and concrete towers.

But, like the US, Japan, South Korea and every other country that “developed” before it, China is reaching the point where simply pouring concrete does more harm than good. Ghost malls, half-empty towns and white elephant stadiums are a growing sign of wasteful spending. Take the huge new airport in Luliang, which opened with barely five flights a day, or the Olympic Bird’s Nest stadium, so underused that it is now more a monument than a venue. Although the adage “build and the people will come” has often proved correct in the past, the Chinese government is worried. After the National Bureau of Statistics found 450 sq km of unsold residential floor space, the country’s president, Xi Jinping, called for the “annihilation” of excess developments.


The Three Gorges Dam on the Yangtze River, China, is the largest concrete structure in the world.

Empty, crumbling structures are not just an eyesore, but a drain on the economy and a waste of productive land. Ever greater construction requires ever more cement and steel factories, discharging ever more pollution and carbon dioxide. As the Chinese landscape architect Yu Kongjian has pointed out, it also suffocates the ecosystems – fertile soil, self-cleansing streams, storm-resisting mangrove swamps, flood-preventing forests – on which human beings ultimately depend. It is a threat to what he calls “eco-security”.

Yu has led the charge against concrete, ripping it up whenever possible to restore riverbanks and natural vegetation. In his influential book The Art of Survival, he warns that China has moved dangerously far from Taoist ideals of harmony with nature. “The urbanization process we follow today is a path to death,” he has said.

Yu has been consulted by government officials, who are increasingly aware of the brittleness of the current Chinese model of growth. But their scope for movement is limited. The initial momentum of a concrete economy is always followed by inertia in concrete politics. The president has promised a shift of economic focus away from belching heavy industries and towards high-tech production in order to create a “beautiful country” and an “ecological civilization”, and the government is now trying to wind down from the biggest construction boom in human history. But Xi cannot let the construction sector simply fade away, because it employs more than 55 million workers – almost the entire population of the UK. Instead, China is doing what countless other nations have done: exporting its environmental stress and excess capacity overseas.

Beijing’s much-vaunted Belt and Road Initiative – an overseas infrastructure investment project many times greater than the Marshall Plan – promises a splurge of roads in Kazakhstan, at least 15 dams in Africa, railways in Brazil and ports in Pakistan, Greece and Sri Lanka. To supply these and other projects, China National Building Material – the country’s biggest cement producer – has announced plans to construct 100 cement factories across 50 nations.

*

As elsewhere, the craze for concrete in South America’s biggest nation started benignly enough as a means of social development, then morphed into an economic necessity, and finally metastasized into a tool for political expediency and individual greed. The progress between these stages was impressively rapid. The first huge national project in the late 1950s was the construction of a new capital, Brasília, on an almost uninhabited plateau in the interior. A million cubic meters of concrete were poured on the highlands site in just 41 months to encase the soil and erect new edifices for ministries and homes.

This was followed by a new highway through the Amazon rainforest – the TransAmazonia – and then from 1970, South America’s biggest hydroelectric power plant, the Itaipu on the Paraná river border with Paraguay, which is almost four times bulkier than the Hoover Dam. The Brazilian operators boast the 12.3m cubic meters of concrete would be enough to fill 210 Maracanã stadiums. This was a world record until China’s Three Gorges Dam choked the Yangtze with 27.2m cubic meters.

With the military in power, the press censored and no independent judiciary, there was no way of knowing how much of the budget was siphoned off by the generals and contractors. But the problem of corruption has become all too apparent since 1985 in the post-dictatorship era, with virtually no party or politician left untainted.

For many years, the most notorious of them was Paulo Maluf, the governor of São Paulo, who had run the city during the construction of the giant elevated expressway known as Minhocão, which means Big Worm. As well as taking credit for this project, which opened in 1969, he also allegedly skimmed $1bn from public works in just four years, part of which has been traced to secret accounts in the British Virgin Islands. Although wanted by Interpol, Maluf evaded justice for decades and was elected to a number of senior public offices. This was thanks to a high degree of public cynicism encapsulated by the phrase most commonly used about him: “He steals, but he gets things done” – which could describe much of the global concrete industry.

Although the dangers are increasingly apparent, this pattern continues to repeat itself. India and Indonesia are just entering their high-concrete phase of development. Over the next 40 years, the newly built floor area in the world is expected to double. Some of that will bring health benefits. The environmental scientist Vaclav Smil estimates the replacement of mud floors with concrete in the world’s poorest homes could cut parasitic diseases by nearly 80%. But each wheelbarrow of concrete also tips the world closer to ecological collapse.

Chatham House predicts urbanization, population growth and economic development will push global cement production from 4bn to 5bn tonnes a year. If developing countries expand their infrastructure to current average global levels, the construction sector will emit 470 gigatons of carbon dioxide by 2050, according to the Global Commission on the Economy and Climate.

The dangers are recognized. A report last year by Chatham House calls for a rethink in the way cement is produced. To reduce emissions, it urges greater use of renewables in production, improved energy efficiency, more substitutes for clinker and, most important, the widespread adoption of carbon capture and storage technology – though this is expensive and has not yet been deployed in the industry on a commercial scale.

Architects believe the answer is to make buildings leaner and, when possible, to use other materials, such as cross-laminated timber. It is time to move out of the “concrete age” and stop thinking primarily about how a building looks, said Anthony Thistleton.

But many engineers argue that there is no viable alternative. Steel, asphalt and plasterboard are more energy intensive than concrete. The world’s forests are already being depleted at an alarming rate even without a surge in extra demand for timber.

Phil Purnell, a professor of materials and structures at Leeds University, said the world was unlikely to reach a “peak concrete” moment.

“The raw materials are virtually limitless and it will be in demand for as long as we build roads, bridges and anything else that needs a foundation,” he said. “By almost any measure it’s the least energy-hungry of all materials.”

Instead, he calls for existing structures to be better maintained and conserved, and, when that is not possible, to enhance recycling. Currently most concrete goes to landfill sites or is crushed and reused as aggregate. This could be done more efficiently, Purnell said, if slabs were embedded with identification tags that would allow the material to be matched with demand. His colleagues at Leeds University are also exploring alternatives to Portland cement. Different mixes can reduce the carbon footprint of a binder by up to two-thirds, they say.

Arguably more important still is a change of mindset away from a developmental model that replaces living landscapes with built environments and nature-based cultures with data-driven economies. That requires tackling power structures that have been built on concrete, and recognizing that fertility is a more reliable base for growth than solidity. ~

https://www.theguardian.com/cities/2019/feb/25/concrete-the-most-destructive-material-on-earth

*

THE ULTIMATE FATE OF THE UNIVERSE


~ What is time? That is a question no one knows how to answer. Before we can approach it, we must ask another question: What kind of time do we mean? The time clocks measure? That is not time, but a representation of time by a machine. Clock time is a human construct, something we came up with to make sense of what we feel deep inside about the nature of time: that time is a measure of change, that things do change, and that if we want to have some measure of control over this change, we had better learn to quantify it. It is, however, the feeling of time passing that is so mystifying. We know it is there, we feel it, we see it in the mirror as we watch ourselves age. Yet we cannot get a hold on it.

Time’s essence dominates our existence and yet escapes our comprehension.

Time’s universal narrative

Time is not like space. We are free to move in space, left, right, backward, forward. But time is something of a prison. We cannot control it, and we cannot move around it. It goes forward and that’s it.

Yes, the theory of relativity has changed our perception of the steadiness of time’s flow. But time still flows, even if it does so differently for observers moving with respect to one another. Einstein showed that a moving clock ticks slower than one at rest. But they both tick, and time moves forward for each.
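
The slowing of a moving clock has a precise textbook form worth quoting, since the essay leans on it. In special relativity, a clock moving at speed v relative to an observer logs less time than the observer’s clock by a fixed factor (a standard formula, not taken from this essay):

\Delta t_{\text{moving}} = \Delta t_{\text{rest}} \sqrt{1 - \frac{v^2}{c^2}}

At v = 0.87c the square root is about one half, so the moving clock records roughly half the elapsed time; at highway speeds the factor differs from 1 by less than a part in a trillion, which is why we never notice.
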
One of the cornerstones of relativity is that light is the fastest thing in the Universe. Traveling at the speed of light is impossible for anything that has mass. Only light itself can reach the speed of light. (So might gravitons, hypothetical particles that carry the gravitational attraction and are supposed to be massless.) So, maybe for light, time does not exist. But since light doesn’t have an awareness of existing, it probably does not care about time’s passing.

But we do care. And what modern science has shown us in spectacular ways is that everything that exists has a history. From living things to rocks to the Universe itself, there is always an embedded time narrative that tells the story of that thing. When it comes to the master of all time narratives, the story that contains all stories, we need to go to the Universe itself.

What we do know is that time started with the Big Bang some 13.8 billion years ago. That event marked the beginning of time. We understand time as a measure of change. What changed at the beginning of time is that the Universe as we know it came into being in ways we still do not (or maybe cannot) understand. What changed scientifically speaking, however, was that the extremely high density of matter and energy started to be diluted by the expansion of space itself — an expansion that unfolded through time and is still going on.

The details of this narrative depend on the kinds of stuff the Universe contains. The cosmic recipe determines how the Universe changes in time and what kind of future it will have. There are essentially two possibilities. In one, the Universe will keep on expanding forever. Because stars have finite lives, at some point far in the distant future they will be extinct. Stellar corpses will dot the Universe, from slowly smoldering white dwarfs to black holes of different sizes. 

But drama can be added here, depending on what kinds of matter the Universe contains. If the current recipe remains viable, there are three main ingredients: dark matter, dark energy, and the stuff we are made of — the normal matter of protons (quarks actually) and electrons.

A cold death, or destruction by fire

Assuming normal matter and dark matter are stable in the long term, dark energy controls the future of the Universe. If dark energy, this ether-like substance of unknown composition, is a constant — that is, if its density does not change as space expands — then the Universe’s expansion will keep on accelerating. In extreme scenarios, it may have so much negative pressure that it will rip everything apart, decomposing matter back into its basic ingredients. Instead of “from dust to dust,” the cosmic epitaph would be “from particles to particles.”
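
The link between constant dark-energy density and runaway expansion can be made explicit with one standard equation of cosmology; this is textbook material rather than anything stated in the article. The acceleration equation reads

\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right)

where a is the cosmic scale factor, \rho the density and p the pressure. A component whose density stays constant as space expands acts like a cosmological constant, with pressure p = -\rho c^2; the parenthesis then equals -2\rho, the right-hand side becomes positive, and the expansion accelerates instead of slowing.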

But it is also possible that dark energy is not like that, and that its negative pressure will fade with time, no longer fueling the Universe’s rapid expansion. Acceleration will lose pace, and the Universe will retain the faded stars and their corpses, all resting far from each other — the ultimate cosmic loneliness. The sad part is that in a cosmos without room for entropy to grow, matter cannot reorganize into anything interesting. This is the cold cosmic death scenario. If nothing changes, time itself loses its function and reaches an end.

Another possibility is that the expansion will slow down. If there is enough matter out there, gravity may reverse the expansion and push the Universe from expansion into contraction. Eventually, matter that was dispersed for billions of years will compact back into a small volume, heat up, reach crushing densities and…well, it depends. It may go into a Big Crunch, the inverse of the Big Bang; or it may reach a point of maximum contraction and then bounce outward into a new phase of expansion. This is the bounce universe model, where the Universe alternates between periods of expansion and contraction, never quite reaching the point of infinite density or initial singularity. Time keeps on ticking, although each cycle needs new clocks. Every end of time marks the beginning of a new time, a new cycle of existence. This is a bit more comforting a view than cold death, even if each cycle involves destruction by fire.

What we learn from our current cosmological models is how fortunate we are to exist precisely when it is possible to exist. Of course, there is no luck involved here. We exist now because this is when it is possible for matter to agglomerate into thinking blobs like us. In other eras there would be no stars capable of sustaining life long enough for it to conjecture about its fate. So if time will end, it is because creatures like us will also end. In a Universe without sentient beings aware of time’s passage, without the awareness of past and future, the very concept of existence is meaningless. That should give us pause when we consider how small we are in the vastness of space. Small, yes, but as far as we know, we are the ones who hold the whole cosmic history in our minds. ~

https://bigthink.com/13-8/what-is-time-2/

Oriana:

It’s difficult for me to think about the fate of the universe without remembering one of Robert Frost’s little gems:

FIRE AND ICE

Some say the world will end in fire,
Some say in ice.
From what I’ve tasted of desire
I hold with those who favor fire.
But if it had to perish twice,
I think I know enough of hate
To say that for destruction ice
Is also great
And would suffice.

~ Robert Frost

*
THE DEAD BUT NOT DEAD TIBETAN MONKS

~ It’s definitely happening, and it’s definitely weird. After the apparent death of some monks, their bodies remain in a meditating position without decaying for an extraordinary length of time, often as long as two or three weeks. A fascinating account of the phenomenon was written by Daniel Burke for the publication Tricycle.

Tibetan Buddhists, who view death as a process rather than an event, might assert that the spirit has not yet finished with the physical body. For them, thukdam begins with a “clear light” meditation that allows the mind to gradually unspool, eventually dissipating into a state of universal consciousness no longer attached to the body. Only at that time is the body free to die.

Whether you believe this or not, it is a fascinating phenomenon: the fact remains that their bodies don’t decompose like other bodies. (There have been a handful of other unexplained instances of delayed decomposition elsewhere in the world.)

The scientific inquiry into just what is going on with thukdam has attracted the attention and support of the Dalai Lama, the highest monk in Tibetan Buddhism. He has reportedly been looking for scientists to solve the riddle for about 20 years. He is a supporter of science, writing, “Buddhism and science are not conflicting perspectives on the world, but rather differing approaches to the same end: seeking the truth.”

The most serious study of the phenomenon so far is being undertaken by The Thukdam Project of the University of Wisconsin-Madison’s Center for Healthy Minds. Neuroscientist Richard Davidson is one of the founders of the center and has published hundreds of articles about mindfulness.

Davidson first encountered thukdam after his Tibetan monk friend Geshe Lhundub Sopa died, officially on August 28, 2014. Davidson last saw him five days later: “There was absolutely no change. It was really quite remarkable.”

The Thukdam Project published its first annual report this winter. It discussed a recent study in which electroencephalograms failed to detect any brain activity in 13 monks who had practiced thukdam and had been dead for at least 26 hours. Davidson was senior author of the study.

While some might be inclined to say, well, that’s that, Davidson sees the research as just a first step on a longer road. Philosopher Evan Thompson, who is not involved in The Thukdam Project, tells Tricycle, “If the thinking was that thukdam is something we can measure in the brain, this study suggests that’s not the right place to look.”

In any event, the question remains: why are these apparently deceased monks so slow to begin decomposition? While environmental factors can slow or speed up the process a bit, usually decomposition begins about four minutes after death and becomes quite obvious over the course of the next day or so.

As the Dalai Lama said:

“What science finds to be nonexistent we should all accept as nonexistent, but what science merely does not find is a completely different matter. An example is consciousness itself. Although sentient beings, including humans, have experienced consciousness for centuries, we still do not know what consciousness actually is: its complete nature and how it functions.”

CONSCIOUSNESS

As thukdam researchers continue to seek a signal of post-mortem consciousness of some sort, it’s fair to ask what — and where — consciousness is in the first place. It is a question with which Big Think readers are familiar. We write about new theories all the time: consciousness happens on a quantum level; consciousness is everywhere.

So far, though, says Tibetan medical doctor Tawni Tidwell, also a Thukdam Project member, searches beyond the brain for signs of consciousness have gone nowhere. She is encouraged, however, that a number of Tibetan monks have come to the U.S. for medical knowledge that they can take home. When they arrive back in Tibet, she says, “It’s not the Westerners who are doing the measuring and poking and prodding. It’s the monastics who trained at Emory.” ~

https://bigthink.com/health/thukdam-study/#Echobox=1682223303

Oriana:

I'll never forget my mother's uncontrollable burst of laughter when she first heard the phrase "non-brain-based consciousness." But the phenomenon of the dead/undead Buddhist monks certainly deserves study.

*
THE CASE FOR TECHNO-OPTIMISM

~ We have evidence, hidden in plain sight, that the world is poised for another of history’s rare “mass flourishings,” to use the expression by Nobel economist Edmund Phelps. That evidence is not found in any single headline-grabbing invention, or the stock value of any single company, but is visible instead in the pattern of technological revolutions.

It’s the same pattern that ignited the great economic acceleration of the 20th century, which was not the consequence of any one invention. It was not just the car, the telephone, the radio, the electric light, or the electric motor alone that so radically changed the world of that century. Instead, it was the incendiary effect of all of those arriving simultaneously.

And, critically, it was the contemporaneous maturation — not just the invention — of those technologies, and the fact that those advances occurred in each of the three foundational spheres of technology that make civilization possible: the means for gathering and propagating information, the means (machines) of production, and the class of materials available to do everything.

On the information front: the 20th century saw the arrival not only of the telephone and radio (and then television) for information dissemination, but also (and often ignored in popular accounts) of new information-acquisition tools, from spectroscopy and X-ray crystallography to precision clocks, of which the atomic clock and, derivatively, GPS were the pinnacle. Superior measurement and monitoring improved production capabilities as well as expanded our understanding of underlying phenomena.

On the machine front: the early 20th century saw the advent of high-speed, highly controllable, electrically powered manufacturing machines yielding not only mass production but also far greater control. And, of course, there were the new machines of transportation (automobiles and aircraft) and of power production.

And on the materials front: The 20th century saw the advent of a profoundly different character and greater variety of materials available to make things, not least the arrival of chemistry (polymers and pharmaceuticals) and high-strength concrete. For eons, the majority of things in the built environment were made from a relatively small set of materials: mankind used only a fraction of the 92 (original) elements in the periodic table. The material suite used to make automobiles in early days included mainly wood, rubber, glass, iron, copper, vanadium, and zinc. Today a car is built using at least one-third of all the elements in the periodic table, and computers and communications gear employ over two-thirds of all the elements.

Marked from roughly 1920 — when the maturation occurred, contemporaneously, for all of those different technologies — the next 80 years constituted history’s greatest overall expansion of wealth and wellbeing. The average lifespan of an American increased by 30 years, and average per capita wealth rose 700 percent (in inflation-adjusted terms).
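
A back-of-envelope check (my arithmetic, not the author’s): a 700 percent rise means wealth multiplied roughly eightfold, and spread over 80 years that is a compound growth rate of

\[
8^{1/80} \approx 1.026,
\]

that is, about 2.6 percent per year in real per capita terms, compounded year after year.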

Yes, we know that the entirety of the 20th century wasn’t a time of wine and roses. Notably, the Roaring Twenties would end with the searing 1929 stock market crash, followed by a Great Depression and the tragedy of another “great” war. America survived the crises and chaos, but if it were not for that confluence of revolutions in technology’s three spheres, the great expansion leading to the modern life we now know would not have happened.

It has been said that history doesn’t repeat but rhymes. The rhyme, or pattern, that shaped the 20th century is again at play in the 21st. The evidence is visible in the same three spheres of technology — information, machines, materials — with, again, the same pattern of revolutionary technologies in each sphere contemporaneously reaching useful maturity.

This time, disrupting the information sphere, we find the microprocessor, a general-purpose information tool. Not only are microprocessors giving rise to a new class of software in the form of the inaptly named “artificial intelligence,” but they constitute the building blocks of data centers, the massive “cathedrals of commerce” at the core of the sprawling Cloud infrastructure.

The AI-infused Cloud is not a communications system but an information infrastructure that uses communications networks. It is as different from the Internet as the Internet was different from the telephone network. And in the information domains, the Cloud’s capabilities are transforming and amplifying the means for observing and measuring — in digital terms, the means for harvesting data — and thus advancing the tools of basic discovery as much as did the invention of the microscope and telescope. The Cloud also transforms the means of measurement and detection at scales and at a granularity never before possible, in factories and on farms as well as the services used in daily life.

In the reboot of the second domain, the machines, we find the maturation of the 3D printer as a new means of production, one that can “magically” convert a computer image directly into a final product. 3D printers not only make mass craft production possible but also make it possible to essentially “grow” components, mimicking nature in ways that are impossible using conventional machine tools, including fabricating artificial skin (or skin-like material) or even artificial organs.

The machine domain is also witness to a revolution in manufacturing tools derived from the microprocessor supply chain, tools that can fabricate at molecular scale. Half a century ago, such an idea existed only in the imagination of science fiction writers like Isaac Asimov. And, despite decades of hype and hope, 21st century machines now include not only autonomous drones but also a path to untethered, anthropomorphic robots that can collaborate with humans in many tasks.

And the third in the triad of technology’s spheres is the nature of materials available to build everything. Here, just as happened in the early 20th century, we are witness to a materials revolution, but this time it includes computationally designed and synthesized materials that can exhibit “unnatural” properties, such as invisibility. Humanity is on the cusp of realizing the long dream of alchemists in literally conjuring materials.

Instead of relying on a fixed catalogue of available materials or undergoing trial-and-error attempts to come up with new ones, engineers can turn to algorithms running in supercomputers to design unique materials, based on a “materials genome,” with properties tailored to specific needs. Among the new classes of emerging materials are “transient” electronics and bioelectronics that portend applications and industries comparable to the scale that followed the advent of silicon-based electronics.

In each of the three technological spheres, we find the Cloud increasingly woven into the fabric of innovation. The Cloud itself is, synergistically, evolving and expanding from the advances in new materials and machines, creating a virtuous circle of self-amplifying progress. It is a unique feature of our emerging century that constitutes a catalyst for innovation and productivity, the likes of which the world has never seen.

Of course, given the patterns of human nature, it would be naïve to think the 21st century’s great lurch forward will be free of turmoil, political strife, and even (sadly) more wars. But, as with that same pattern circa 1920s, such deep technological transformations ultimately show up in the three key metrics that matter to society: greater per capita wealth, improved health and well-being, and more conveniences as well as more time to enjoy them in the forms of entertainment and leisure. ~

https://www.freethink.com/hard-tech/flourishing-mark-mills?utm_source=facebook&utm_medium=social&utm_campaign=secondary-pages&utm_content=ap_ssoazmlrxs&fbclid=IwAR1l2smlWtNrsAB1D4icJepsgkhWKJZs0hUeHH1rg-QPWyn9JNIn6CXQI6M

*
THE KKK DIDN’T ALWAYS WEAR ROBES AND HOODS

~ Together, a pointed white hood and robe create the distinctive outfit worn by America’s oldest and most infamous hate group, the Ku Klux Klan. But members of the terrorist organization donned very different costumes for much of the group’s early history. It took the influences of Hollywood and a mail-order catalogue to establish the white supremacists’ garb of choice, Alison Kinney writes in her book Hood (Object Lessons), excerpted in the New Republic.

While the white robes—which were later mythologized by Klan members as depictions of Confederate ghosts—did show up in early costumes, it was initially more common for members to don costumes that came from a wide variety of folk traditions and pageants. Kinney writes:

Klansmen wore gigantic animal horns, fake beards, coon-skin caps, or polka-dotted paper hats; they imitated French accents or barnyard animals; they played guitars to serenade victims. Some Klansmen wore pointed hats suggestive of wizards, dunces, or Pierrots; some wore everyday winter hoods, pillowcases, or flour sacks on their heads. Many early Klansmen also wore blackface, simultaneously scapegoating and mocking their victims.

During the Reconstruction era (1865-1877), this variety helped keep early versions of the Klan a secret. While testimonies from witnesses referenced the outlandish costumes, people in power denied that these attacks were evidence of a coordinated hate group. In 1890, with the ushering in of the Jim Crow laws, the Klan’s first iteration mostly disbanded, as its prejudices had been successfully codified into law — meaning there was no need for lynch mobs to hide their faces and identities.

A nostalgia for the Reconstruction-era Klan surfaced among white Southerners around the turn of the 20th century. Thomas F. Dixon, Jr. wrote a trilogy of books that depicted Klansmen as heroes, including his most infamous piece, The Clansman. The 1905 novel, which featured illustrations by Arthur I. Keller, depicted Klansmen in the white hood-and-mask combo—a made-up uniform that became the Klan's ubiquitous attire once D.W. Griffith adapted the book into his blockbuster 1915 film, The Birth of a Nation. 

"The critics were raving. People were on their feet cheering at the climax of the film, when the Klan is seen as a healing force—restoring order to the chaos of the South during Reconstruction," Dick Lehr, who wrote a book on the film, tells NPR.

The exact version of the hood seen in the film might have been influenced by the Paris-trained costumer Clare West, who worked on the production, Kinney suggests. That would explain its similarity to the outfits worn by penitents during some Holy Week processions in Europe; alternatively, the resemblance may be nothing more than a coincidence.

So how did all the Klan members get their hoods? A traveling organizer for several fraternal orders, including the Klan, saw an opportunity in the commercial success of the movie, and started selling hoods and robes in 1920. By 1921, the Klan began mass producing the costume, even publishing a “sumptuous, full-color, mail-order” catalogue, Kinney reports. 

They were tapping into a big market: by the 1920s, the Klan had once again become “a powerful political force in both the North and the South,” notes the National Museum of American History.

The costume was less a disguise and more of an in-group identifier. As the Anti-Defamation League points out, the uniform hood and white robes served as a symbol that gave the hate group "a sense of power and belonging, as well as a quick way of identifying others who share[d] their beliefs." 

While financial difficulties and charges of tax evasion would cause the Klan to splinter and dissolve again, it reemerged as a smaller, violent presence during the Civil Rights Movement. The hood, however, remains part of the group to this day, as does the hate.

https://www.smithsonianmag.com/smart-news/ku-klux-klan-didnt-always-wear-hoods-180957773

*
HAS SCIENCE MADE RELIGION USELESS?

FRANS DE WAAL: Well, religion is an interesting topic because religion is universal. All human societies believe in the supernatural. All human societies have a religion one way or another.

REZA ASLAN: Religion has been a part of the human experience from the beginning. In fact, we can trace the origin of religious experience to before Homo sapiens. We can trace it with some measure of confidence to Neanderthals. We can trace it with a little less confidence all the way to Homo erectus. So we're talking hundreds of thousands of years before our species even existed.

ROBERT SAPOLSKY: Essentially there has been no culture on Earth that has not invented some form of what could be termed meta-magical thinking, attributing things that cannot be seen, faith-based belief systems, things of that sort. It's universal.

ASLAN: Religious thinking is embedded in our cognitive processes. It is a mode of knowing. We're born with it. It's part of our DNA. The question then becomes why. There must be some evolutionary reason for it. There must be a reason, some adaptive advantage to having religious experience or faith experience. Otherwise it wouldn't exist.

SAPOLSKY: It makes perfect sense why they've evolved because they're wonderful mechanisms for reducing stress. It is an awful, terrifying world out there where bad things happen and we're all going to die eventually. And believing that there is something, someone responsible for it at least gives some stress reducing attributes built around understanding causality.

ALAIN DE BOTTON: Religion starts from the view that we are torn between good and evil. There is definitely a good core, but it's permanently tempted. And so what the individual needs is a structure which will constantly try and tug a person back towards the best of themselves.

DE WAAL: Our current religions are just 2,000 or 3,000 years old, which is very young, and our species is much older. And I cannot imagine that, for example, 100,000 or 200,000 years ago our ancestors did not have some type of morality. Of course they had rules about how you should behave, what is fair, what is unfair, caring for others. All of these tendencies were already in place, so they had a moral system.

And then at some point we developed these present-day religions, which I think were sort of tacked onto the morality that we had. In societies with a thousand, several thousand, or millions of people, we cannot all keep an eye on each other, and that's maybe why we installed religions in these large-scale societies, where a god kept watch over everybody; maybe religions served to codify the rules, to enforce them, or to steer morality in a particular direction that we prefer. So instead of saying that morality comes from god, or that religion gave us morality: for me, that's a big no-no.

PENN JILLETTE: People are good. If you look at the seven billion people on this planet, just about seven billion of them are really good. We can really trust them. Can we please learn something from Las Vegas? Learn something about gambling, right? We know how the odds work. We know the house always wins. In this case, the odds are always on someone being good.

BILL NYE: When it comes to ethics and morals and religion, see if there's anything different between what religions want you to do and what you feel you should do, what you think is ethically innate within you. Most people are not inclined to murder people, but certain religions quite reasonably have rules against that; it's antisocial. See if that comes from within you or from outside of you, from without you.

ROB BELL: My understanding of spirituality is that this life that we've each been given, the very breath that we took and we're about to take is a gift. That life is a gift and how you respond to it, what you do with it matters.

PETE HOLMES: It's not about literal facts or the unfolding of what happened in the life of Jesus of Nazareth. It's a story because sometimes you need an explanation and sometimes you need a story. And a story is going to transform you and symbols are going to transform you. You see this in our culture. Batman is a symbol. Go out on the street and look at how many men, especially, are wearing Batman shirts. It's a symbol. It's something that speaks to our psyche about the pain of a boy who lost his parents using his wound to become a superhero and try and change his reality. That's a symbol. That's a Christ story. That's a hero story and we need those because it's not about at the end of the day winning a televised debate or finding DNA on the Shroud of Turin or proving his burial was here.

I've been to Israel. I studied in Jerusalem. They're like he was crucified here and then they're like well, he was crucified here. Guess what? We didn't start writing that down until 150 years later because nobody gave a shit. It wasn't about that. It was about your inner transformation. You. Yours. I don't care how you get there. It can be photos from the Hubble telescope. It can be Buddhism, atheism, agnosticism, Catholicism. It doesn't matter. Who fucking cares. Whatever gets you there because we're talking about something. An energy that you can feel and be quiet to and respect, but most importantly you can flow with and dance with and feel and listen to and attune to.

BELL: This idea somehow that faith and science are in opposition I've always found to be complete insanity. Both are searching for the truth. Both have a sense of wonder and an expectation and exploration. They're each simply naming different aspects of the human experience. One thrives in naming exteriors – height, weight, gravitational pull, electromagnetic force. The other is about naming interiors – compassion, kindness, suffering, loss, heartache. They're both simply different ways of exploring different dimensions of the human experience.

FRANCIS COLLINS: Science is about trying to get rigorous answers to questions about how nature works, and it's a very important process that's actually quite reliable if carried out correctly: generating hypotheses, testing them by accumulating data, and drawing conclusions that are continually revisited to be sure they're right. So if you want to answer questions about how nature works, how biology works, for instance, science is the way to get there. But faith, in its proper perspective, is really asking a different set of questions, and that's why I don't think there needs to be a conflict here. The kinds of questions that faith can help one address are more in the philosophical realm. Why are we all here? Why is there something instead of nothing? Is there a God? Isn't it clear that those aren't scientific questions, and that science doesn't have much to say about them?

NYE: So the question is, if you have a religious tenet, if you hold a point of view that excludes something about modern science, I don't think the burden is on scientists or engineers to provide you a comfortable link. The link is for you. You have to reckon the facts, as we call them, with a belief system that is incompatible with them. An example that I think everybody eventually finds themselves discussing would be geology, the age of the Earth. A couple of years ago I debated a guy who insists that the Earth is 6,000 years old. That's completely wrong. It's obviously wrong. And the way we know it is wrong was the result of centuries of study. People found layers of rocks and figured out where the layers came from. People found radioactive elements that chemically substitute into certain crystals, like rubidium and strontium substituting for potassium and calcium, and argon, and so on. This led us to an understanding of the age of the Earth.

So if you have a belief system that is incompatible with modern geology, really the problem is for the person trying to argue the Earth is extraordinarily young, not for the people who have studied the world around us and understand it. There's nothing I've seen in the Bible that informs modern science, with one possible exception: in some translations that I've read, there's a reference to 22/7 being the ratio of the distance around a circle to its diameter, the value of pi. And that's pretty close. It doesn't go past three digits, but it's pretty close. Okay, so the people who wrote the Bible were literate, but they were not literate in the modern scientific sense. So you have to reckon with that, man. I can't get in there. The Earth is not 6,000 years old. Never going to be.

COLLINS: My study of genetics certainly tells me incontrovertibly that Darwin was right about the nature of how living things have arrived on the scene by descent from a common ancestor under the influence of natural selection over very long periods of time. Darwin was amazingly insightful given how limited the molecular information he had was. Essentially it didn't exist. Now with the digital code of DNA we have the best possible proof of Darwin's theory that he could have imagined. So that certainly tells me something about the nature of living things. But it actually adds to my sense that this is an answer to a how question and it leaves the why question still hanging in the air. Why is it, for instance, that the constants that determine the behavior of matter and energy, like the gravitational constant, for instance, have precisely the value that they have to in order for there to be any complexity at all in the universe.

That is fairly breathtaking in its lack of probability of ever having happened and it does make you think that a mind might have been involved in setting the stage. At the same time that does not imply necessarily that that mind is controlling the specific manipulations of things that are going on in the natural world. In fact, I would very much resist that idea. I think the laws of nature potentially could be the product of a mind. I think that's a defensible perspective, but once those laws are in place then I think nature goes on and science has the chance to be able to perceive how that works and what its consequences are.

BELL: Everything is driven by the desire to know the truth. There's an exploration. There's a wide-eyed sense of wonder. If you talk to the best scientists they have this sort of gleam in their eye like 'This is what we're learning. And we don't know what's actually around the corner.' And if you talk to the best theologians and poets and scholars they—ideally—have the same gleam in their eye which is 'Look what we're learning. Look what we're exploring.' And so to me they're not enemies. They're long lost dance partners.

COLLINS: Part of the problem is I think the extremists have occupied the stage. Those voices are the ones we hear. I think most people are actually kind of comfortable with the idea that science is a reliable way to learn about nature, but it's not the whole story and there's a place also for religion, for faith, for theology, for philosophy. But that harmony perspective doesn't get as much attention. Nobody is as interested in harmony as they are in conflict I'm afraid.

NYE: As you may know I'm not a believer. I'm a nonbeliever. I spent a lot of time trying to understand my place in the cosmos and I've reached my own conclusions but I'm the first to say that ultimately we are all agnostic. This is to say you can't know whether or not there is a giant entity running the show or choosing to not run the show. You can't know. So we all are I believe best served by just living good lives. Trying to leave the world better than we found it.

ASLAN: The truth of the matter is we just don't know. But what is a fact is that there is something in the way our brains work that compels us to believe we are more than just the sum of our material parts. That thing is either an echo, or an accident, or it's deliberate and purposeful. And which you decide is surely a matter of choice, because there is no proof either way.

https://bigthink.com/thinking/has-science-made-religion-useless/#Echobox=1682243889

Oriana:

I've met people who seem to need religion, and people who don't. Sometimes I find religion simply odious, anti-human (especially anti-woman), and just overall idiotic, but I try to remind myself of the need for tolerance. 

I can't help but note that the drive to know the truth isn't all that common. Mostly, it seems that people want to have their beliefs confirmed. But that too calls for tolerance.

*
Atheists, have you considered that maybe God is a malevolent monster who enjoys seeing us all suffer? Maybe that is the answer to the problem of evil.

Yes. That is indeed the logical conclusion.

This conclusion leads into dystheism — the concept that God is evil, some kind of a sadistic cosmic bully, and we are only his playthings because it amuses him.

Dystheism is indeed a very old concept — it is documented as early as the Viking sagas. Many pagan Norse saw their gods as nothing but supernatural bullies, bossing them around, demanding sacrifices, and giving nothing in return. ~ Susan Viljanen, Quora

Oriana:

This took my breath away. In childhood, I thought I was the only one who saw god as evil; he amused himself by watching people suffer. And it turns out to be an ages-old concept.

Of course it was a "he," hence the excitement about war or any kind of conflict and slaughter. And the infinite vanity that required praise 24/7. God out-Hitlered Hitler.

And, sure, now that I think about it, the Gnostics believed that the god who created this world was an evil demiurge. But there was also another, more distant, benevolent deity. Now that was beyond my child’s imagination. There was only one evil god, who threw children into hell. And not just children, of course — some 90% of humanity, since not even being a Catholic offered absolute protection.

Nevertheless, I don’t understand why the question is directed to atheists. To me, atheism was a liberation from believing in the God of Punishment. “The monster really doesn’t exist!” something sang out in my mind. And I whirled around in delight.

But not right away — not at fourteen, when I decided it was all “just another mythology” and stopped going to church. It took more years of intermittent nightmares of being in hell (actually interesting ones, since the longer we live, the more kinds of hell we can imagine).

And then, after a long while, I felt my recovery was as complete as possible (it’s never 100%). That’s why the singing and dancing.


*
EXERCISE DURING PREGNANCY MIGHT REDUCE THE RISK OF OBESITY IN OFFSPRING

~ There’s an ongoing joke among members of Gen X that our mothers smoked cigarettes and drank alcohol while pregnant and we turned out just fine. Well, sort of. Research has shown that obesity levels among my peers are astonishingly high. Mental health issues are also on the rise. Of course, this cannot all be pinned on maternal habits—we can’t blame everything on our parents—though a new study shows that those habits play a role.

Washington State University professor Min Du and his PhD student, Jun Seok Son, discovered that female mice that exercised during pregnancy had healthier offspring than mothers that got no wheel time. Offspring of the exercising mothers were less likely to grow obese and exhibited better metabolic health.

Exercising while pregnant stimulates the production of brown adipose tissue, otherwise known as brown fat. Its primary function is thermoregulation; fans of Dutch athlete Wim Hof are well aware that he has an inordinate amount of brown fat, which is in part why he can thrive in freezing ice baths, meditate in subzero temperatures, and scale Mt Kilimanjaro wearing only shorts.

Newborns have a lot of brown fat, as do hibernating mammals. This tissue decreases as we age. Brown fat is much healthier than white fat; we don’t want to carry the latter around. Whereas the accumulation of white fat leads to all of the metabolic and cardiovascular issues we associate with obesity, brown adipose tissue activation has been shown to promote bone health and density; increase levels of irisin, which helps build lean muscle mass; improve insulin sensitivity; and aid in longevity by increasing levels of the protein hormone adiponectin.

Du and Son’s study might be the first to demonstrate the possible benefits of exercising while pregnant. Previous research has linked maternal obesity to poorer health in infants; this study shows the benefits of exercise, one of which is better glucose tolerance, meaning children have a reduced likelihood of developing type 2 diabetes. Son says, “These findings suggest that physical activity during pregnancy for fit women is critical for a newborn’s metabolic health. We think this research could ultimately help address obesity in the United States and other countries.”

Still, myths persist regarding the safety of exercising while pregnant. According to NYU OB-GYN Jennifer Aquino, as long as women stay hydrated while working out, they are unlikely to experience ill effects. Overheating is a real concern, however, so avoid exercising in hot environments. Eating a snack before working out is also a good idea.

The current guidelines for exercising while pregnant are similar to those for everyone else: 150 minutes of moderate exercise per week, split between cardiovascular and strength training. Pregnant women generally want to choose low-impact options, such as swimming and indoor cycling. Of course, every woman’s approach should be tailored to her needs and pre-pregnancy fitness level.

As a general guideline, my advice as a fitness instructor (who has taught hundreds of pregnant women over the last 16 years) has been that they keep up their regimen as best they can, provided they are healthy enough to do so, and with modifications. I don’t advise learning anything new during this time, as that could increase the risk of injury. If an expecting mother does want to take up a new exercise routine, medical professionals advise adopting it slowly.

Again, anecdotally, I’ve seen a range of responses. Some women choose to scale back their routines, or even stop working out if adverse reactions begin (in some cases because they have been put on bed rest). I’ve also seen an instructor friend teach kickboxing and perform handstands while nine months pregnant. I even had a woman in her fortieth week take my class to try to “get the baby out already.” (He was born the next day, though I take no credit for that.)

It should not surprise anyone that healthier mothers have healthier babies. We are well aware of the genetic traits our parents pass on to us and that we pass to our offspring. We also know well the behavioral imprints our forebears leave on us. A guy named Freud wrote a few books about that. Of course, parental behavior affects our development in every capacity, fitness levels included. Thanks to this team in Washington, we now have evidence of that. ~

https://bigthink.com/health/exercise-while-pregnant/#Echobox=1682223310


*
WHY MEMORIES OF LATE CHILDHOOD AND EARLY YOUTH PERSIST IN ALZHEIMER’S

~ During an activity designed for people with dementia, I had a conversation with an older woman who had been diagnosed with Alzheimer’s disease. We were sitting next to one another at a traditional Danish coffee table in a museum setting, and everything around us was furnished and decorated like a 1950s apartment, matching the period of her young adulthood. ‘Isn’t it strange,’ she said, ‘all the things that happened to me when I was young stand so vividly in my memory, but if you ask me what happened yesterday, I wouldn’t have the faintest idea.’ She then began telling me stories from her youth.

Increased longevity has rendered dementia tragically common: with prevalence rates increasing with age, it is estimated that about 10 per cent of people aged 65 and older currently live with Alzheimer’s disease, the most prevalent type of dementia. Alzheimer’s disease causes irreversible changes in the brain, and the damage begins in areas that are especially important to memory – notably, the hippocampus and surrounding areas in the medial temporal lobes (MTL). This is why one of the earliest complaints in Alzheimer’s disease concerns difficulties with remembering.

Autobiographical memory is especially compromised, even in the early stages of the disease. Autobiographical memory is the kind of memory that enables each of us to remember our own past, from a recent meeting at work to experiences we had when growing up. It is central to everyday functioning, including problem-solving, maintaining social relationships and having a sense of self-continuity over time. There is evidence that autobiographical memory deficits in Alzheimer’s disease are associated with identity impairments (not quite knowing who you are) and apathy. Although normal aging also involves some reductions in the ability to remember the personal past, the deficits exhibited by individuals with Alzheimer’s disease are different and much more severe.

Yet, as illustrated by the woman I spoke with in the museum, not all periods of one’s life are equally hard to remember for someone with dementia. Evidence suggests that events and knowledge stemming from earlier in life are often more easily retrieved than more recent experiences. For example, a person with Alzheimer’s disease may be able to recount a memory of their wedding day, even with some details (eg, my bridal bouquet was lilies of the valley), while not being able to remember a family visit that happened yesterday. This possible sparing of distant memories in dementia has received substantial attention from researchers – and generated competing theoretical explanations.

One of the most prominent accounts derives from the ‘standard model of memory consolidation’, introduced by Larry Squire and Pablo Alvarez. According to this theory, memories are only temporarily dependent on the MTL structures in the brain. After a period of years, a gradual reorganization takes place, and memories that were originally dependent on those structures are instead stored elsewhere in the brain, such as in the neocortex. This account provides a possible explanation of the relative sparing of distant memories in Alzheimer’s disease: since memories of the remote past have been consolidated elsewhere in the brain, they are less affected by damage to the MTL than are memories of recent events.

An alternative model, originally called the ‘multiple trace theory’, was introduced by Morris Moscovitch and Lynn Nadel. According to this framework, the expression of detailed, perceptually rich memories of specific events – which are known as episodic memories – always requires the involvement of MTL structures. Given the degeneration of these structures in Alzheimer’s disease, patients should show no relative sparing of episodic memories of long-ago events. In cases where patients nonetheless report memories for such events, this could reflect a transformation of the memory to a more abstract and gist-like (or semantic) form – for example, a frequently recalled memory of one’s wedding day with few, if any, concrete details.

To test theories such as these and better understand autobiographical memory in Alzheimer’s disease and other brain disorders, researchers have used structured interview procedures, in which participants are asked to produce memories from a series of specific periods in their life. One of these methods, the Autobiographical Memory Interview, asks for a number of memories and facts from each of three life periods – childhood, early adult life and recent adult life. Another method called the Autobiographical Interview asks for one memory from each of five time periods, and every memory is assessed for the amount of details generated.

Findings from these methods are mixed. Some studies show a progressively increasing frequency and elaboration of memories as they stretch further back into the past – consistent with predictions derived from the standard model of memory consolidation. Others show a flat distribution, consistent with predictions derived from the multiple trace view; it appears that events from any period of the lifetime are equally well (or poorly) remembered. How should these discrepancies be resolved?

Both of the frameworks I have described center on how memories are consolidated in the brain. A third, alternative approach takes a different starting point, looking at how autobiographical memories are distributed across the lifespan when people are asked to talk freely about their past.

Research with healthy adults has shown a distinct pattern: there is an increased frequency of memories recalled from late childhood to early adulthood, relative to the surrounding periods. If you chart the number of memories that people recall from different parts of the lifespan, the memories from late childhood and early adulthood form a ‘bump’ on the curve. This phenomenon, known as the reminiscence bump, was first identified by David C Rubin and colleagues in the 1980s.

It has been replicated many times using different ways of sampling memories. The reminiscence bump has also been found in memories for semantic information, such as memories for public events or memorable books. Rather than pointing to purely biological mechanisms, most theories attribute the reminiscence bump to social and cultural factors, such as the formation of adult identity during early adulthood, or cultural norms attaching greater importance to this period in life, compared with other periods.
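
To make the “bump” concrete, here is a minimal sketch (my illustration, not the researchers’ code) of the charting described above: bin each freely recalled memory by the person’s age at the time of the event and count memories per decade. The numbers are invented for illustration.

from collections import Counter

# Invented example data: each number is the rememberer's age at the
# time of a freely recalled life event (illustration only).
event_ages = [7, 9, 12, 15, 16, 17, 18, 19, 21, 22, 23, 25, 26, 28, 34, 42, 58, 67]

# Count memories per decade of life.
memories_per_decade = Counter((age // 10) * 10 for age in event_ages)

# Print a crude text histogram; with data shaped like these,
# the 10-29 decades stand out: the reminiscence bump.
for decade in sorted(memories_per_decade):
    bar = "#" * memories_per_decade[decade]
    print(f"ages {decade:2d}-{decade + 9:2d}: {bar}")

Real studies do this counting over many memories and participants, which is why the open-ended methods described above, with their finer-grained sampling, can reveal a bump that a three-period interview misses.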

There is accumulating evidence that the reminiscence bump also appears in people with Alzheimer’s disease. The first study to find evidence of this asked older adults with Alzheimer’s disease and healthy older adults to talk about the events that had been important in their life in a 15-minute, open-ended narrative. After they had told their story, participants dated the events they had mentioned. Not surprisingly, participants with Alzheimer’s disease produced fewer memories overall from all life periods. However, they demonstrated a peak of memories in adolescence and early adulthood.

A reminiscence bump has also been found when Alzheimer’s patients are asked to retrieve autobiographical memories in response to word cues, and during free conversations about the past. These findings may help to clarify the nature of memory loss in those affected. For people with this disease, autobiographical memories are likely neither equally impaired across the lifespan nor simply worsening progressively from one’s earliest years to the present; instead, they follow the irregular shape of the reminiscence bump.

Methodological differences likely explain why standard structured interviews typically have not identified a reminiscence bump, since they probe only a few time periods, or ask for only one memory per time period, which may not be sufficiently sensitive to detect a bump. More open-ended methods enable the detection of a reminiscence bump by allowing for a more granular charting of memories across the lifespan.

However, a critical limitation of methods such as open-ended life stories is exactly that they allow participants to sample their own memories freely. How do we know that the tendency for memories to cluster in late childhood and early adulthood doesn’t just reflect a preference to talk about those ‘good old days’, rather than an actual inability to remember other times?

To address this concern and continue pursuing the hypothesis of a reminiscence bump in Alzheimer’s disease, my colleagues and I conducted a recent study on the topic. Our study used the Autobiographical Memory Interview, but we expanded the number of lifetime periods we would ask about from the standard three to seven. In this way, we ensured that participants tried to recall memories from periods across the whole lifespan, while the increased number of periods would allow for the detection of a reminiscence bump if there was one.

As expected, the distribution of memories produced by the Alzheimer’s patients showed a dominance of autobiographical memories from ages six to 30, followed by a steep drop for events that occurred after age 30. These results provide corroborating evidence of a reminiscence bump in people with Alzheimer’s disease.

Although this finding raises a number of questions that call for more research, it also has important implications. It suggests that the relative sparing of older memories in dementia is not solely an effect of biological mechanisms related to memory consolidation in the brain. Like the reminiscence bump observed in healthy older adults, it likely also reflects social and cultural mechanisms, such as identity formation in young adulthood and culturally sanctioned age norms. These norms, known as cultural life scripts, include things such as knowing when in life a person is expected to get married, have their first child, or settle on a career, and they affect how we encode, rehearse and search for our memories.

Work with both older and younger adults has shown that such schematized expectations over-represent events in young adulthood, in line with the reminiscence bump. Recent research has also shown that cultural life scripts are largely preserved in older adults with mild to moderate Alzheimer’s disease, suggesting that they may indeed guide memory retrieval, and thus at least partly account for the bump.

The findings also have implications for therapeutic interventions that aim to revive autobiographical memory in people with dementia by using prompts such as music, photos, objects, movie clips and the like. Given the findings, it is important that such prompts are selected to make connections to the time covered by the reminiscence bump – roughly the time before the person turned 30. Choosing prompts from this salient period of life could maximize the likelihood that they will match memories that are still accessible. ~

https://psyche.co/ideas/how-memories-of-late-childhood-and-early-adulthood-resist-alzheimers?utm_source=Psyche+Magazine&utm_campaign=e7dd6df194-EMAIL_CAMPAIGN_2023_04_26&utm_medium=email&utm_term=0_-a9a3bdf830-%5BLIST_EMAIL_ID%5D


*
SCIENTISTS EXTEND WORM LIFESPAN BY 500%

~ A new study shows that altering two cellular pathways in a species of roundworm can extend lifespan by a staggering 500 percent. The discovery could help scientists develop anti-aging therapies for humans, considering that humans have the same cellular pathways featured in the research.

Scientists have spent decades trying to solve the mysteries of aging by experimenting on a tiny nematode species called C. elegans. These microscopic roundworms are ideal for aging research because they live for only two to three weeks, meaning researchers are quickly able to distinguish which alterations or mutations are related to lifespan. In 1993, a famous paper revealed that C. elegans with a specific single-gene mutation lived twice as long as roundworms without it. This discovery helped to spawn a new era of research on aging.

The new study, published in Cell Reports, shows that altering the insulin signaling (IIS) and TOR pathways yields a lifespan extension of about 500 percent. This surprised the researchers. After all, past research on the IIS and TOR pathways shows that altering them (through a process called gene knockdown) usually yields a 100 percent and a 30 percent lifespan increase, respectively. So the researchers expected that altering them together would boost lifespan by about 130 percent. But the effect was far greater than the sum of its parts.

“The synergistic extension is really wild,” Jarod A. Rollins, Ph.D., who is the lead author with Jianfeng Lan, Ph.D., of Nanjing University, told Phys.org. “The effect isn’t one plus one equals two, it’s one plus one equals five. Our findings demonstrate that nothing in nature exists in a vacuum; in order to develop the most effective anti-aging treatments we have to look at longevity networks rather than individual pathways.”
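
To spell out the arithmetic behind “one plus one equals five” (my back-of-envelope, not the paper’s): a 100 percent extension doubles lifespan, and a 30 percent extension multiplies it by 1.3, so even a multiplicative interaction would predict only

\[
2.0 \times 1.3 = 2.6\times \ \text{lifespan, a 160 percent extension,}
\]

while the observed result was roughly six times normal lifespan, a 500 percent extension, far beyond both the additive (130 percent) and the multiplicative expectation.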

The findings suggest that future anti-aging therapies might involve a combination of treatments, similar to how combination treatments are sometimes used for cancer and HIV.

Scientists have so far failed to pinpoint a specific gene that explains why some humans live mostly disease-free into old age. Why? In addition to environmental factors that affect aging and health, the answer might be that aging is primarily regulated not by single genes, but by a so-called “longevity network,” comprised of seemingly unrelated systems in the body. For years, scientists have been trying to demystify the aging process by mapping out possible connections within the longevity network. The new study suggests that scientists are beginning to understand a bit of how this complex network operates.

Specifically, the new study focuses on the role that mitochondria, which are organelles that generate chemical energy in cells, might play in the longevity network. Recent research suggests that mitochondria may play a key role in the aging process, as described in a 2017 overview published in the journal Genes:

“Among diverse factors that contribute to human aging, the mitochondrial dysfunction has emerged as one of the key hallmarks of aging process and is linked to the development of numerous age-related pathologies including metabolic syndrome, neurodegenerative disorders, cardiovascular diseases and cancer.”

It’s unclear what effect manipulating the IIS and TOR pathways might have in humans. But a growing body of research suggests that promoting mitochondrial health could be a reliable way to increase lifespan. Interestingly, another recent aging study found that putting C. elegans on an intermittent-fasting diet helped to keep the roundworms’ mitochondria in a “youthful” state, which seemed to extend lifespan.

“Low-energy conditions such as dietary restriction and intermittent fasting have previously been shown to promote healthy aging. Understanding why this is the case is a crucial step towards being able to harness the benefits therapeutically,” Heather Weir, lead author of the study, told Harvard News. “Our findings open up new avenues in the search for therapeutic strategies that will reduce our likelihood of developing age-related diseases as we get older.”

https://mdibl.org/in-the-media/biologists-extend-worm-lifespan-by-500-in-surprising-discovery-on-aging/


*
THE PATIENTS WHO REGRET LASER EYE SURGERY: ‘MY LIFE’S STOOD STILL SINCE THEN’

Surgeons view Lasik as routine, but patient advocates and some experts say the complication rate is far higher than reported

~ Until last year, Robin Kyle Reeves lived an active life in Laurel Hill, Florida. She made lace gowns for children to wear during baptisms or family portraits. It was intricate work that required precision, and Reeves’ glasses kept getting in the way. So her doctor recommended Lasik.

The procedure, which uses lasers to cut in and reshape a patient’s eye, was billed as simple and quick, usually done in under 30 minutes. “It was supposed to be zip, zap, and within a couple of weeks you’re healed and life goes on,” Reeves said. “But my life has stood still since July 12 of last year.”

According to Reeves, the procedure left debris behind her corneal flap, which ruined her eyesight and now causes double vision, intense migraines and eye strain. She finds it impossible to stare at screens for extended periods, and can no longer enjoy her hobbies. She quit her job and had to repay deposits when she realized she could no longer focus on sewing.

“It puts a big dent on our household income,” Reeves said. “My head hurts all the time, and I can’t do normal activities. Simple things, like reading a box of mac and cheese, or putting on the same makeup I’ve applied for 40 years – I just can’t do that.”

Reeves is one of the 500,000 Americans who undergo Lasik every year to correct their vision (about 100,000 people in the UK have the surgery annually). Surgeons who perform Lasik view it as routine, touting surveys promising a patient satisfaction rate of 90% to 95%. Surgeons who perform Lasik must have the standard board certification in ophthalmology, and it is recommended that patients choose one who has completed a one-year accredited fellowship in refractive and cornea surgery. That extra training is not necessary to perform Lasik, but specialists who have it are more likely to get referrals from generalists.

The American Refractive Surgery Council says the procedure’s complication rate is less than 1% (though 30% of people may see short-term side effects like dry eyes). Doctors also say that using new lasers significantly decreases complications, compared with the older-generation models that were used in the early 2000s.

But patient advocates and some experts say that is not the full picture. Dr Morris Waxler, a retired FDA adviser who voted to approve Lasik in the 1990s, is now one of its biggest critics. He says he regrets his role in bringing the procedure to the public.

According to his own analysis of industry data, the complication rate of Lasik falls between 10% and 30%. One investigation of an FDA database by the reporter Jace Larson found more than 700 complaints of severe pain, described as “worse than childbirth” or as if “their eyeballs would stick to their eyelids almost every night”.
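
The gap between these figures matters at scale. A quick calculation (mine, not the article’s) against the 500,000 procedures performed on Americans each year:

\[
500{,}000 \times 1\% = 5{,}000 \qquad \text{versus} \qquad 500{,}000 \times (10\text{–}30\%) = 50{,}000\text{–}150{,}000
\]

patients per year with complications, depending on whose rate you believe.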

Last year, the FDA released draft guidance telling doctors that prospective patients should be warned they might be left with double vision, dry eyes, difficulty driving at night, and persistent eye pain.

In one clinical trial from 2017, the FDA found that nearly half of participants reported experiencing “new visual symptoms” after undergoing surgery. Those effects can show up as the presence of glare, halos, or starbursts, especially at night.

The guidance is not final; a spokesperson for the FDA told the Guardian that the agency was currently in the process of considering hundreds of public comments submitted on the draft guidance. Those comments include strongly worded rebuttals from ophthalmologists who say the procedure changes lives for the better, and regretful patients who wish they had never gone through with it.

If the FDA ends up issuing this draft guidance as final, it will provide a recommendation – not enforced – on how surgeons should inform patients of potential risks.

According to the draft, doctors would be advised to share with their clients that their corneal nerves “may never fully recover, resulting in dry eyes and/or chronic pain”, and that there have been reports of some patients who have experienced depression or suicidality they believe to be a result of the fallout from Lasik. (The FDA notes that “a definitive causal link between Lasik and these reported psychological harms has not been established”.)

Paula Cofer, from Tampa, Florida, who testified in front of the FDA in 2008 and 2018, said her experience had started with visual symptoms. Cofer paid about $1,000 in 2000 for the procedure, which she knew next to nothing about at the time, other than that it was her ticket to a life in which she no longer needed glasses.

But she started to notice complications almost immediately: the first night she went out and looked at the moon, Cofer saw eight overlapping circles smeared with a “ghastly” halo around it. “It looks like something out of a horror movie,” she said. Now, she lives with severe dry eye and bad night vision. She owns four pairs of glasses to make up for the eyesight she’s lost.

Cofer is one of the loudest critics of Lasik. She runs a Facebook support group with over 8,000 members who swap stories of their post-op ailments. “There is an epidemic of Lasik complications,” Cofer said. A number of people in the group claim Lasik has caused them severe mental health issues.

In 2019, a Florida car dealership comptroller named Gloria McConnell asked her eye doctor for a new glasses prescription. According to her son Kingsly Alec McConnell, she had recently undergone Lasik surgery, which had fixed her problems seeing at a distance. But she still dealt with farsightedness, and thought a pair of readers might help. During the appointment, McConnell’s doctor talked her out of the idea, and she decided on one more attempt at Lasik to fix everything.

Four years later, after experiencing debilitating complications from the procedure that left her unable to leave bed most days, McConnell died by suicide at the age of 60. Her son said that in a note she left to her family, McConnell wrote that the pain of the botched Lasik surgery had contributed to her decision to end her life.

Kingsly describes his mother as fun and youthful before she had the surgeries.

The complications crept into McConnell’s life a few weeks after her surgery, but she tried to stay positive. Still, as things continued to grow worse, she became a shut-in, Kingsly said.

Her main issue was chronic dry eyes, so severe that she told people it felt like her lids were burning. She also suffered from corneal neuralgia, pain caused by damage to the nerves of the cornea. She had mites and ingrown hairs in her eyelashes and inflammation of her eyelids, and spent most of her day lying in bed with her eyes closed.

“The pain affected her whole head,” her son said. “She did not take her life randomly or in the heat of the moment. In a way, it was a rational choice for her: why would you live a life so full of suffering?”

McConnell submitted a comment on the FDA’s draft guidance in November. “[Lasik] has destroyed my life,” she wrote. “My doctor told me I was the perfect candidate for Lasik and never talk[ed] at all about the risk … please help people like me.”

Would a stronger warning from doctors help reduce complications from Lasik? Waxler, the former FDA adviser, thinks that the FDA’s proposed guidelines are “very mild”.

“After 30 years, the FDA has finally decided that maybe they should require refractive surgeons and manufacturers to tell their customers a little more about the downsides of Lasik,” he said. “If surgeons told people of all of the possibilities of getting complications, they wouldn’t have any customers.”

Dr Cynthia MacKay, a retired clinical professor of ophthalmology at Columbia University medical school, worked alongside Stephen Trokel, who was the first ophthalmologist to recognize the significance of the laser used in corneal refractive surgery.

“I thought it would never catch on,” MacKay said. “If you slice into the cornea to change its shape, you’re going to cut through all of the nerves that feed the cornea and keep it healthy, which will result in terrible pain.”

Giving patients consent forms, as the FDA’s draft guidance recommends, is not enough for MacKay. She has worked as an expert witness in Lasik malpractice cases and has seen shady behavior from surgeons who hand patients such forms right before the procedure, when they are sedated and trying to read the paperwork with dilated pupils.

“I think Lasik should be banned,” MacKay said. “It’s a public health hazard. There’s an epidemic of pain and blindness all over the world [because of this procedure].”

Most ophthalmologists who perform Lasik say that is simply not the case: the surgery is not without complications, but extreme issues are rare. One paper found that the majority of Lasik recipients were happy with their results, with only 1.2% reporting dissatisfaction. (But as the New York Times has reported, most such studies are written by surgeons who perform the procedure themselves and may be biased.)

“I had Lasik on my own eyes about 20 years ago and it was one of the best things I have ever done for myself,” said Dr Sidney Gicheru, a spokesperson for the American Academy of Ophthalmology and medical director of LaserCare Eye Center in Dallas.

“A large majority of people who have had Lasik report being satisfied with their improved vision and the ease they now enjoy in their day-to-day lives,” Gicheru added.

All of which only adds to the confusion for prospective patients. Will Lasik change your life for the better, or for the worse? Concerned patients may find themselves sifting through opposing data and online forums before making their decision.

Gicheru, who performs the procedure, said it helps to know whether a patient is a “good candidate.” It is not for everyone. There are a few boxes to check: you must be over 18, with an eye prescription that has not changed in the past year. Patients with severe dry eye, corneal disease, advanced glaucoma, or poorly controlled diabetes should seek other options. Those with astigmatism, or who are near- or farsighted, should also consult their ophthalmologists first, and may see better outcomes with other forms of surgery.

Dr Edward Manche, a professor of ophthalmology at Stanford University, recommends that patients seek one or two opinions from surgeons before deciding to go through with the procedure, and advises staying away from clinics that advertise Lasik aggressively. “The vast majority of centers that do the surgery are extremely ethical and try to do the right thing,” he said. “But if a center feels like it’s giving you a sales pitch, and seems more like it’s doing business rather than looking out for your best interest, that’s a red flag.”

Reeves, one of the Facebook group members, no longer drives a car. When she needs to go somewhere, she enlists a family member to get behind the wheel. “I can’t take myself to do something as simple as getting bread, milk, cheese, or eggs,” she said.

Shortly after her botched surgery, Reeves returned to the clinic for an appointment to discuss her complications. The doctor told her he could try another round of Lasik, but she refused. Reeves still remembers sitting in the waiting room of the clinic, watching a revolving door of patients go in and out for their procedures.

“It took everything in my being to sit there, be quiet, and not tell them, ‘This is not an easy, in-and-out thing the way they make it out to be,’” Reeves said. “I wanted to say, ‘This could change your life for ever.’” ~

https://www.theguardian.com/us-news/2023/apr/18/lasik-laser-eye-surgery


Oriana:

I came very close to having LASIK eye surgery. It was already scheduled. But then, in the nick of time, I came across an article in a professional journal on “irreversible corneal scarring after LASIK.” I called off the procedure.

This was before the Internet. To this day I bless myself for having had the will and endurance to go to the UC Biomedical Library and do old-style research.

The funny thing is that my optometrist tells me I have pretty good vision. I am near-sighted, so I have “built-in reading glasses.” And I feel blessed.

*
ending on beauty:

I need my secret place the Upper Peninsula near Lake Superior, my dark thicket covered by winter. It is night in there but I can watch passing animals, a deer, bear, even possums, which I love for their humility. The thicket is flooded with birds, a few inches from my good eye. Saint Francis would love this thicket. Maybe I’ll take him there someday. And best of all a stump in a gully that I can crawl into and sit up. My place of grace on earth, my only church. The gods live there.

~ Jim Harrison, Hospital, from Dead Man’s Float