Saturday, January 20, 2018


Indra. Note Ganesha the Elephant God in the bottom part. It's such an exhilarating image. Imagine if instead of the crucifix we had a dancing god.



hangs a curtain of pearls
threaded with infinite skill:
each pearl reflects every other pearl,
suspended in the moon gleam.

We too are interlaced
more than we dare believe.
We dream of heaven
because we have known hell. 

My mother, already unconscious,
lifted her arm and reached out
as if to lace her hand with the hand
of someone waiting on the other side.

Then she went into that love.

~ Oriana

I knew that gesture so well. My parents used to hike a lot. My father would be the first one to cross a stream, then wait for my mother to catch up. Then he’d stretch out his arm to her, and she’d take his hand before crossing.

~ “Did a secret society bring about the French Revolution? In the classic fictional version of this widely believed conspiracy theory, Alexandre Dumas’s novel “Joseph Balsamo,” a Masonic society known as the Illuminati gather in a ruined castle in 1770 and plot the overthrow of the French monarchy. Their leader, called the “Great Copt,” speaks of the day when “the monarchy is dead…religious domination is despised…social inferiority is extinguished.”

Dumas would have found a great deal to appreciate in Jonathan Israel’s Revolutionary Ideas. Israel, a much-respected professor of history at the Institute for Advanced Study in Princeton, does not present the French Revolution of 1789 as the result of a literal conspiracy. But he repeatedly characterizes it as the work of a “small minority” or “unrepresentative fringe” of disaffected Frenchmen who, in his view, consciously and deliberately sought to bring about the greatest political upheaval the Western world had ever seen. Israel does not contend that they belonged to a secret society. But he does argue that they shared a common creed, which they acted deliberately to realize. It is very much the same creed outlined by the Great Copt, although Israel would add sexual and racial inequality to the list of injustices his heroes sought to overthrow.

Rising literacy rates, declining patterns of religious observance, and a consumer revolution that put books within the reach of millions do not concern him. He takes no interest in the common people’s culture, and never considers the possibility that they might have conceived and articulated revolutionary political ideas on their own. “Most ordinary folks did not read their books and would have scarcely understood them had they tried.” . . . Only ideas matter for understanding how the Revolution came about, and what course it took. A particular set of ideas was its “sole fundamental cause,” and conflicts over these ideas drove it forward.

The proponents of “radical Enlightenment” were not only atheists, but also democrats, social egalitarians, feminists, advocates of complete religious toleration, and even, for the most adventurous among them, believers in sexual toleration . . . ideas in clandestine books and pamphlets, mostly printed in the Netherlands, that subsequently reached a wide European audience.

But history does not have the neatness, or the moral clarity, of conspiracy fiction. There was no Great Copt plotting out the events of the French Revolution and driving it forward. And, alas, there was no unified, coherent radical Enlightenment either — at least not as Jonathan Israel has imagined it.” ~


Interesting that the struggle of Enlightenment, the foundation of modern democracy and egalitarianism, is far from over . . . Ideas can be very powerful, but there was also the rising price of bread. Still, I appreciated the reminder that the Netherlands played an important role as the country of tolerance and printing presses that spread the forbidden Enlightenment ideas. It’s not the Revolution I admire (and especially not the Reign of Terror), but the visionary minds that dared imagine a different social order.

And the ideas lived on, though the opposition to them is still fierce (note our “culture wars”), even murderous (ISIS and other terrorists). As Sam Harris says, this is the great story of our time. Will the guiding idea, the idea of human rights, prevail once and for all? We still can't be sure, though there has been steady progress — with vehement authoritarian backlash always on its heels.

The Death of Marat, David, 1793

VIA NEGATIVA TO HAPPINESS (or at least contentment)

~ “British journalist Oliver Burkeman is the author of The Antidote: Happiness for People Who Can't Stand Positive Thinking. He believes that the negative path is the one, if not to happiness, then at least to fulfillment. His brilliant analysis of what’s wrong with the “happiness industry” shows the limitations of spending your mental energy on such staples of the self-help guides as positive imagery, getting yourself motivated(!!!), and ridding your mind of all thoughts that you could possibly fail at your life’s most cherished goals.

On the contrary, he advises thinking about some of the very worst outcomes you could possibly imagine, including your own demise. Instead of trying to rid your mind of all negative imagery, he advocates embracing it, watching the negative thoughts drift in and out of your consciousness without trying to drown them out.

Burkeman strips his message down to its roots in Stoic philosophy which, as he argues, forms the basis for modern cognitive behavioral therapy. By this he means that the Stoics of ancient Greece believed that our emotions are determined by our judgments—or, as Hamlet said, “There is nothing either good or bad but thinking makes it so.” An event, in and of itself, has no emotional meaning. It’s what we make of it that determines how we feel. Stoics could observe events without judging their inherent goodness or badness and, as a result, accept these experiences on their own terms. Things happen and it’s up to us to decide how to interpret what these things mean and how they ultimately will affect us.” ~

A little more on the via negativa to happiness:

~ "Here's the word that will change your life," Schuller tells his audience. After a dramatic pause he yells out, "Cut! … Cut the word 'impossible' from your life.... Cut it out forever!"

A few months later Schuller, the ringmaster of this failure-is-not-an-option lovefest, declares his Crystal Cathedral bankrupt.

Accept the idea that you will inevitably die. Learn to celebrate your failures. See the wisdom in your pessimistic thoughts. Burkeman writes that "the effort to try and feel happy is often precisely the thing that makes us miserable." He argues that "it is our constant efforts to eliminate the negative — insecurity, uncertainty, failure, or sadness — that is what causes us to feel so insecure, anxious, uncertain, or unhappy."

Using the example of the disasters that have befallen many who have tried to climb Mt. Everest — the ultimate type-A personality goal — Burkeman shows persuasively that "goal setting" as a path to success is a fallacy.

Countless books relate the triumphs of the adventurers and the corporate executives who set ambitious goals for themselves — and who take risks in the relentless pursuit of those goals. What those books don't tell us is that the leaders responsible for the world's most spectacular failures possess exactly the same qualities. It's a simple insight, but a powerful one.” ~

(Alas, the link has expired, but I'm pretty sure that the author is Susan Krauss Whitbourne, who frequently posts in Psychology Today and the Huffington Post.)


Yes, the Mt. Everest trail is by now strewn with the frozen corpses of extremely motivated people who refused to “think negative.”

It took me a long time to find out that, except for short-term goals, I change too fast and know too little to "visualize an ideal future self," as so many self-help books recommend. My vision of my future self at 22 was as a psychology professor! Good grief! (I was beginning to discover myself as a writer as an undergraduate, but got discouraged by a person I mistook for a mentor; I became a poet and writer only in my mid-thirties.)

At this point I am very aware that “the stage of life rules.” At the very least, it’s enormously important. The future that stretches ahead when we are in our twenties begins to shrink . . . until it’s no longer pleasant to contemplate any future self. Of course later life is no longer as dismal as presented by Shakespeare in the famous “Seven Ages of Man” speech in As You Like It. And yet . . . friends now tell me that the fastest way to get themselves depressed is to think ten years ahead. Past a certain age you don’t dare do that. “At my age, you don’t even buy green bananas,” Maggie Smith says in “The Best Exotic Marigold Hotel.”

But the paradox is that without those fantasies about the radiant future, life actually gets better. The whole world becomes enlivened as we pay attention to what IS, not what should be. By letting go, not trying to manipulate things but just letting them be, we discover how endlessly surprising and interesting reality actually is.


~ “In his most famous essay, “The Myth of Sisyphus,” Camus made the point that Sisyphus stands for all humanity, ceaselessly pushing our rock up a steep hill, only to have it roll back down again. Over and over, ceaselessly, remorselessly, always striving but never succeeding, if only because ultimately everyone dies and his or her personal boulder rolls back down. Gravity always wins.

Camus nonetheless concludes his essay with the stunning announcement that “One must imagine Sisyphus happy,” because he accepts this reality, defining himself—achieving meaning—within its constraints. Camus’ stance is that meaning is not conveyed by life itself but must be imposed upon it.

Camus is the existential thinker most associated with the “life is absurd” characterization of the human condition. Often misunderstood, he felt that this absurdity didn’t reside in life itself, but in something uniquely human, namely the peculiar relationship (which he called a “divorce”) between the human need for ultimate meaning and the “unreasonable silence” of the world. For Camus, neither human existence nor the universe is inherently absurd, but rather the relationship between the two, whereby people seek something of the universe that it fails to deliver.

"The greatest mystery," according to André Malraux, whose work Camus greatly admired, "is not that we have been flung at random among the profusion of the earth and the galaxy of the stars, but that in this prison we can fashion images of ourselves sufficiently powerful to deny our nothingness.” Denying our nothingness isn’t what Camus proposed; rather, he urged something closer to accepting our nothingness and pushing on nonetheless, achieving meaning via meaningful behavior, even though—or rather, especially because—in the long run any action is meaningless.

Probably the greatest such account of people achieving meaning through their deeds is found in Camus’ novel The Plague, which describes events in the Algerian city of Oran during a plague epidemic.” ~

Titian: Sisyphus

One of the nurse’s aides I had in the hospital said, “I was a lobotomist in Kentucky for 16 years.” ~ “You were no such thing,” I began to explain, but she interrupted, “I didn’t do anything disgusting, like remove blood from dead people.” 



Stewart Guthrie cites our tendency to see faces and face-like patterns everywhere as an example of interpreting sense data in ways that are relevant to survival. For early humans — as well as modern ones — the most important elements in the environment have tended to be other humans. Other humans are where we get our resources, knowledge, care, affection, vital information, and most other goods. They’re also the sources of most significant threats: physical aggression, social ostracism, bullying, and competition. So it makes sense for our brains to be finely tuned to over-perceive human agents in our environment.

For Guthrie, “perceptual uncertainty is chronic.” That is, it’s hard to always know for certain what we’re seeing or experiencing, and even harder to know what might be causing that experience; our senses are unreliable, and our ability to tell causal stories about the world even more so. This unreliability, combined with our human-oriented social brains, leads us to err on the side of perceiving events as having been caused by humans or human-like agents: the leaves didn’t move because of the wind, but because someone moved them.

Plenty of research over the past decades has suggested that humans are intuitive mind/body dualists, sensing at a gut level that our minds are somehow separate from, and independent of, our bodies. Guthrie’s not talking here about metaphysical, Cartesian-style dualism; instead, he’s referencing our general tendency to feel that emotions, inner states, dreams, and so forth belong to our “minds,” while physical sensations and actions belong to our bodies. This mind-body dualism allows us to perceive minds in places where there are no bodies: for example, in wind that blows our hats off, or in the gurgling of water in a stream.

Finally, our minds are constantly making the anthropomorphic equivalent of Pascal’s wager: “betting” that it’s most worthwhile to use models of human agency in interpreting perceptions. If we’re wrong, we don’t lose much: a moment of distraction. But if we’re wrong in the other direction, we stand to lose a lot: we could get ambushed, killed, or excluded from social relationships. So when choosing which models to apply to our perceptual experience, we tend to err on the side of choosing the model labeled “human mind.”

Guthrie’s model, when added up, presents a picture of humans as intelligent, socially aware animals whose evolutionary history has pressured us to be hyperalert to signals of agency and intelligence in our environment. Since we’re intuitive dualists, this intelligence doesn’t always have to be connected to a body, which means our minds are free to detect agency in the shapes of clouds, in meaningful coincidences, and in experiences we interpret as answered prayers. Together, these proclivities lay the cognitive foundation for the universal human tendency to believe in gods and spirits – the core of religion, according to most cognitive scientists of religion.

An interesting consequence of Guthrie’s theory — which in the years since 1993 has become almost universally accepted among CSR researchers — is that it may help explain why there’s such an overlap between the autism spectrum and irreligion. People with autism-spectrum disorders are generally less socially oriented than neurotypicals, and tend to be poor with social cues, body language, and imagining others’ mental states. Such people are also often less interested in imaginative play or storytelling as children than their peers. Together, these traits make it unsurprising that autistics tend to anthropomorphize less than neurotypicals.

On the other hand, many autistics are high systemizers, showing interest in impersonal systems with regular, predictable features. Interestingly, there’s evidence to suggest that the networks of the brain that underlie systemizing thought are distinct from, and may even inhibit, those that drive social cognition. Guthrie points out that the brain’s so-called “default mode network” is likely oriented toward social events and relationships. The fact that dozens of brain-imaging studies have found this network to light up when subjects had no tasks to attend to implies that, whenever humans aren’t actively engaged in a focused task, they tend to revert to daydreaming about what matters most: other humans. (Anecdotal corroboration: this is certainly true for me, for example when I win my recent arguments in the shower.)

So if some people tend to be higher systemizers, and to use social cognition less than most folks, then according to Guthrie’s theory you’d expect such people to be less likely to anthropomorphize, and therefore be less religious than average. And, in fact, this is exactly what studies have found: people on the autism spectrum tend to be less religious than neurotypicals.

The fact remains that individualism and low levels of interest in personal relationships are two of the best predictors of religious nonbelief. So Guthrie’s theory may not be all-encompassing, but it certainly sheds light on many of the basic features of the religious landscape. Religion may not be exclusively social. It may not be solely our brains’ tendency to anthropomorphize reality. But there is something deeply social and anthropomorphic about much of what we call “religion,” and Guthrie’s lifetime of work forces us to take that fact seriously.


The broader pattern here is that we have evolved to see patterns even where there are none, to connect the dots. For instance, the belief in cosmic justice (“the just universe”) is our default setting — it takes skeptical thinking to see randomness and coincidences.

It takes a cognitive effort to see that much depends on mere chance, though we can make the best of it. And we can still reject an immoral, outdated religion, and venture to find and/or create our own journey.

For me the article becomes more interesting in the second part, when it gets to autism and individualism being associated with less religiosity. The third trait is being male. Men tend to be less religious — this was found already in surveys going back to the 1930s. However, women who work outside the home are more similar to men in that they are less interested in religion.

At the same time, science has chipped away at the anthropomorphizing of nature by supplying natural explanations for various natural phenomena. Thus, we know that storms and earthquakes and volcanic eruptions are not caused by angry deities that need to be appeased by animal or human sacrifice and/or rituals and prayers. It’s not witches or demons that make us sick or sour the milk.

Finally, technology and medicine have made us feel less helpless. When we are sick, we are hoping for the most competent physician, not the most pious one.

The irony here is that the less we need religion, the better we understand why our ancestors did. In my childhood I was puzzled why the most devout churchgoers were elderly women. Now it seems pretty obvious.

But it still seems puzzling that a do-nothing god, a god that can’t even say Hi, is constantly being asked for complicated favors that would violate the laws of physics on behalf of a single "undeserving" (you need to appear "humble" while asking) individual.

Doré: Beatrice gives Dante a tour of heaven


~ “I drove around some PA farm country yesterday. Don Draper could have been born in any of the houses—“I’m a whore child, ain’t you heard?”—and people actually still talk like his family did. You hear them talking at restaurants etc. “Communists do have souls, but they can’t get into heaven.” Who can’t get into heaven is a big topic around here. Sometimes I get why dumb hipster kids want to wear pro-Soviet T-shirts—especially if they live in a place like this.” ~ RLB (I have a reason for using only the initials, but would rather not go into it).

Oriana: “Who can’t get into heaven is a big topic around here.” Some hold the view that cats can’t, but “good dogs” can. Alas, this is just the type of discussion that religion tends to generate — all tangled up in unreality, absurdity, and judgmentalism.

By the way, I have nothing against Pennsylvania per se; a scenic state, a lovely place to visit. Besides, as someone said to me, “As you get deeper into inland California, believe me, it’s not that different from rural Alabama.” I have indeed spent some time in inland California, as well as in Arkansas — and while Arkansas, with its tent revivals, still struck me as being ahead in fundamentalism, I know the basic truth of that statement. 

Pennsylvania, stone bridge over the Susquehanna; Kerry Shawn Keys

~ “Damage from extreme weather events during 2017 racked up the biggest-ever bills for the U.S. Most of these events involved conditions that align intuitively with global warming: heat records, drought, wildfires, coastal flooding, hurricane damage and heavy rainfall.

Paradoxical, though, are possible ties between climate change and the recent spate of frigid weeks in eastern North America. A very new and “hot topic” in climate change research is the notion that rapid warming and wholesale melting of the Arctic may be playing a role in causing persistent cold spells.

It doesn’t take a stretch of the imagination to suppose that losing half the Arctic sea-ice cover in only 30 years might be wreaking havoc with the weather, but exactly how is not yet clear. As a research atmospheric scientist, I study how warming in the Arctic is affecting temperate regions around the world. Can we say that changes to the Arctic driven by global warming have had a role in the freakish winter weather North America has experienced?

Weird and destructive weather was in the news almost constantly during 2017, and 2018 seems to be following the same script. Most U.S. Easterners shivered their way through the end of 2017 into the New Year, while Westerners longed for rain to dampen parched soils and extinguish wildfires. Blizzards have plagued the Eastern Seaboard — notably the “bomb cyclone” storm on Jan. 4, 2018 – while California’s Sierra Nevada stand nearly bare of snow.

This story is becoming a familiar one, as similar conditions have played out in four of the past five winters. A warm, dry western North America occurring in combination with a cold, snowy east is not unusual, but the prevalence and persistence of this pattern in recent years have piqued the interest of climate researchers.

The jet stream – a fast, upper-level river of wind that encircles the Northern Hemisphere – plays a critical role. When the jet stream swoops far north and south in a big wave, extreme conditions can result. During the past few weeks, a big swing northward, forming what’s called a “ridge” of high pressure, persisted off the West Coast, along with a deep southward dip, or “trough,” over the East.

New terms have been coined to describe these stubborn features: “The North American Winter Temperature Dipole,” the “Ridiculously Resilient Ridge” over the West, and the “Terribly Tenacious Trough” in the East.

Regardless what it’s called, this dipole pattern – abnormally high temperatures over much of the West along with chilly conditions in the East – has dominated North American weather in four of the past five winters. January 2017 was a stark exception, when a strong El Niño flipped the ridge-trough pattern, dumping record-breaking rain and snowpack on California while the east enjoyed a mild month.

Two other important features are conspicuous in the dipole temperature pattern: extremely warm temperatures in the Arctic near Alaska and warm ocean temperatures in the eastern Pacific. Several new studies point to these “ingredients” as key to the recent years with a persistent dipole.

The new twist in this story is that the Arctic has been warming at least twice as fast as the rest of the globe, meaning that the difference in temperature between the Arctic and areas farther south has been shrinking. This matters because the north/south temperature difference is one of the main drivers of the jet stream. The jet stream creates the high- and low-pressure systems that dictate our blue skies and storminess while also steering them. Anything that affects the jet stream will also affect our weather.

When ocean temperatures off the West Coast of North America are warmer than normal, as they have been most of the time since winter 2013, the jet stream tends to form a ridge of high pressure along the West Coast, causing storms to be diverted away from California and leaving much of the West high and dry.

If these warm ocean temperatures occur in combination with abnormally warm conditions near Alaska, the extra heat from the Arctic can intensify the ridge, causing it to reach farther northward, become more persistent, and pump even more heat into the region near Alaska. And in recent years, Alaska has experienced periods of record warm temperatures, owing in part to reduced sea ice.

My colleagues and I have called this combination of natural and climate change-related effects “It Takes Two to Tango,” a concept that may help explain the Ridiculously Resilient Ridge observed frequently since 2013. Several new studies support this human-caused boost of a natural pattern, though controversy still exists regarding the mechanisms linking rapid Arctic warming with weather patterns farther south in the mid-latitudes.

More extreme weather ahead?

In response to the strengthened western ridge of atmospheric pressure, the winds of the jet stream usually also form a deeper, stronger trough downstream. Deep troughs act like an open refrigerator door, allowing frigid Arctic air to plunge southward, bringing misery to areas ill-prepared to handle it. Snowstorms in Texas, ice storms in Georgia and chilly snowbirds in Florida can all be blamed on the Terribly Tenacious Trough of December 2017 and January 2018.

The icing on the cake is the tendency for so-called “nor’easters,” such as the “bomb cyclone” that struck on Jan. 4, to form along the East Coast when the trough’s southwest winds align along the Atlantic Seaboard. The resulting intense contrast in temperature between the cold land and the Gulf Stream-warmed ocean provides the fuel for these ferocious storms.

The big question is whether climate change will make dipole patterns — along with their attendant tendencies to produce extreme weather — more common in the future. The answer is yes and no.

It is widely expected that global warming will produce fewer low-temperature records, a tendency already observed. But it may also be true that cold spells will become more persistent as dipole patterns intensify, a tendency that also seems to be occurring.” ~

IS THERE A PERFECT DIET? (especially during the flu-and-cold season?)

~ "All healthy persons are alike; each unhealthy person is unhealthy in his own way."

If Tolstoy were a diet-and-health blogger, this might be how he would begin.

What differs among the animals is the composition of the digestive tract. Animals have evolved digestive tracts and livers to transform diverse food inputs into the uniform set of nutrients that all need. Herbivores have foregut organs such as rumens, or hindgut chambers, for fermenting carbohydrates, turning them into volatile fatty acids that can be used to manufacture fats. Carnivores have livers capable of turning protein into glucose and fat.

If diets differ because of digestive tract differences, we should expect the same pattern to recur in humans. All humans have the same nutrient needs, but our optimal food intake may vary if our digestive tracts differ.

In fact there is evidence for variations in digestive tract structure among human populations. The blogger Melissa McEwen has summarized evidence that some populations have slightly larger colons, suggesting a slightly more plant-focused diet, and others have slightly smaller colons, suggesting a more animal-focused diet.

Longer colons allow more fermentation of plant fiber, but they don't dramatically change macronutrient ratios of the diet. Across human populations, the optimal human diet probably doesn't vary in any macronutrient by more than 5% of energy or so.

So there is little support for a "blood type diet" or "metabolic type" with significantly different food needs. All healthy people can and should eat a similar diet - one that approximates our body's nutrient needs.

Each Unhealthy Person is Unhealthy in his Own Way

People who are malnourished will benefit from getting more of the things they are malnourished in, and perhaps less of others which balance those - as reducing zinc may help someone who is copper deficient, or reducing omega-6 fats may help someone who is omega-3 deficient. People exposed to toxins may benefit from an extra dose of toxin-metabolizing nutrients. People with infections may benefit from diets which starve pathogens of needed nutrients, or which support immune function. People with gut dysbioses may benefit from removing or reducing whole classes of foods - starches, fructose, FODMAPs, fiber, even protein.

Infections can make a big difference in the optimal diet. Ketogenic diets, which starve the brain of glucose but feed it with small molecules derived from fats, are highly effective against bacterial infections of the central nervous system, since bacteria depend on glucose metabolism. But hepatitis B and C viruses can utilize the process of gluconeogenesis—manufacture of glucose from protein—for their own benefit, so people with hepatitis benefit from higher carb diets.

Other pathologies disrupt the ability to handle certain nutrients. Diabetes is characterized by an impaired ability to secrete insulin or to respond to it, and diabetics usually benefit from low-carb diets. Migraines, like epilepsy, may be caused by genetic or other impairments to brain glucose metabolism, and can often be cured by ketogenic diets, as several of our readers have discovered.

With ill health, the optimal diet often changes. Sick people often have to tweak their diet, and the nature of the change varies with the nature of the pathology.

Ketogenic diets are therapeutic for bacterial and viral infections, but can feed protozoa, fungi, and worms (which have mitochondria and can metabolize ketones). Response to a ketogenic diet can help expose the nature of an infectious pathogen.

Because neurons are dependent on glucose or ketones for energy, any pathology which disrupts glucose utilization will cause neuronal starvation, and neurological and psychological distress, which can be relieved by provision of ketones. A well-designed, nourishing ketogenic diet may often ameliorate psychiatric and neurologic disorders.

Dietary tactics can help prevent as well as treat disease. For instance, fasting upregulates autophagy ("self-eating"), the cellular mechanism for recycling damaged or unnecessary components. But autophagy is a central part of the innate immune system; it is how cells destroy invading microbes. Intermittent fasting as a regular practice helps keep the body infection-free, and during intracellular infections refraining from food is often a helpful strategy.

There is no one diet that is perfect for everyone, but that is mainly because not everyone is healthy.” ~


I suspect that clearing out infections is only part of the story. Clearing out incipient cancer cells may be even more important. The improvement in my health when I severely reduced carbs was astonishing. (Warning: you can gain weight on an excessively high-protein diet because it’s child’s play for the liver to convert protein to glucose. Atkins didn’t seem sufficiently aware of it, though he did recommend mostly fats for those who were “weight-loss resistant”.)

Re: dysbioses (think: “irritable bowel” or “leaky gut”). From Dr. Steven Gundry I learned why whole grains used to make me so sick — lectins! Most of them can be destroyed by pressure-cooking, but not the most dangerous lectins, which are found in whole grains (have you ever wondered why Asians thrive on WHITE rice? Lectins are concentrated in the bran). NSAIDs such as ibuprofen also damage the intestines — you will not see a warning anywhere on the label.

Feeding your resistant-starch-loving microbiome is ultimately trickier than feeding yourself.

By the way, a ketogenic diet is an opportunity to experience the benefits of “good fats” — olive oil, avocado (a great source of potassium, by the way), fatty fish. Eliminate all sweet fruit — it can make a very nice difference in how fast your health will improve and how much weight you’ll lose (if your main goal is weight loss).

Finally, the older you get, the more your blood sugar and insulin tend to rise — even if you are not diabetic. A ketogenic diet simulates fasting and calorie restriction — two ancient practices that have been vindicated in modern times as effective for maintaining good health and a sharp mind well into old age.
avocado flower opening

ending on beauty:


No longer prefect, this isn’t home anymore.
I planted day lilies and cassia for nothing.

Cassia renowned for enticing us to stay on,
day-lilies never making sorrow forgotten:

they are a far cry from this riverside moon,
come lingering out farewell step after step.

~ Po Chu-i, tr David Hinton

Saturday, January 13, 2018


Hieronymus Bosch, John of Patmos, 1489. If not for the little black half-scorpion, it would be hard to believe it’s Bosch; note also the crustacean appendages of the angel’s wings. I love the rose-like folds of John’s pink robe.
Even though Bart Ehrman’s compelling view of Jesus as an apocalyptic preacher finished him off for me as an inspiring figure, I can see that the end of the world will happen for each of us. For me it’s a reminder to feast on the world while it exists.


But, Master, what shall I dedicate to you,
who taught all creatures to hear?
My memory of an evening in Russia,
in springtime — a horse . . .

From the village came the white horse alone,
on one front leg the hobble,
to be alone on the meadow at night;
how the mane beat against his neck

to the rhythm of his perfect joy,
in that rudely hindered gallop.
What leaping went on in his stallion-blood!

He felt the distances and he sang and he heard —
your cycle of myths was enclosed in him.
His image: I dedicate.

~ Rainer Maria Rilke, Sonnets to Orpheus, XX, Part I

This poem has a peculiar tension (poems, too, need dramatic tension). The horse, known to us from hundreds of poems, statues, and paintings as an image of strength and thundering speed, is hobbled, dragging an awkward wooden weight that prevents him from full gallop. In spite of that, the horse appears spirited and happy. 

And that, it seems to me, is an image of life: we are hobbled by various circumstances. To be so “rudely hindered,” and yet capable of joy — that’s the condition of those who refuse to be defeated by circumstances (even though, in the end, we must lose to aging and mortality; and yet it need not be called a defeat).

And this is the image that Rilke dedicates to Orpheus, the supreme poet and musician: not an image of a horse running freely, but of a horse that remains joyful in spite of being hobbled.

driftwood sculpture by Jeffro Uitto


~ “‘Well,’ I said, ‘Paris is old, is many centuries. You feel, in Paris, all the time gone by. That isn’t what you feel in New York — ’ He was smiling. I stopped.

‘What do you feel in New York?’ he asked.

‘Perhaps you feel,’ I told him, ‘all the time to come. There’s such power there, everything is in such movement. You can’t help wondering — I can’t help wondering — what it will all be like — many years from now.’” ~

~ James Baldwin, “Giovanni’s Room”

When I first saw Manhattan, I was prepared for the skyscrapers. I wasn’t prepared for the old buildings with water towers and fire escapes, though I must have seen photographs of those as well. “Nobody thinks of New York as an old city,” my hostess remarked. “But it is quite old.”

And the antique elevators in those buildings, and the rather antique and noisy plumbing. What a symphony of noise that old, futuristic city was!

Again a reminder of how complex things are, how in some ways “the empire never ended” (to steal from P.K. Dick), while in other ways, let’s not forget how the mighty are fallen.

Neither a city of the past nor a city of the future, but an astonishing flight of perspectives. 

New York in 1937

We all know that the world is going to hell. Given the rising risk of nuclear war with North Korea, the paralysis in Congress, warfare in Yemen and Syria, atrocities in Myanmar and a president who may be going cuckoo, you might think 2017 was the worst year ever.

But you’d be wrong. In fact, 2017 was probably the very best year in the long history of humanity.

A smaller share of the world’s people were hungry, impoverished or illiterate than at any time before. A smaller proportion of children died than ever before. The proportion disfigured by leprosy, blinded by diseases like trachoma or suffering from other ailments also fell.

We need some perspective as we watch the circus in Washington, hands over our mouths in horror. We journalists focus on bad news — we cover planes that crash, not those that take off — but the backdrop of global progress may be the most important development in our lifetime.

Every day, the number of people around the world living in extreme poverty (less than about $2 a day) goes down by 217,000, according to calculations by Max Roser, an Oxford University economist who runs a website called Our World in Data. Every day, 325,000 more people gain access to electricity. And 300,000 more gain access to clean drinking water.

Readers often assume that because I cover war, poverty and human rights abuses, I must be gloomy, an Eeyore with a pen. But I’m actually upbeat, because I’ve witnessed transformational change.

As recently as the 1960s, a majority of humans had always been illiterate and lived in extreme poverty. Now fewer than 15 percent are illiterate, and fewer than 10 percent live in extreme poverty. In another 15 years, illiteracy and extreme poverty will be mostly gone. After thousands of generations, they are pretty much disappearing on our watch.

Just since 1990, the lives of more than 100 million children have been saved by vaccinations, diarrhea treatment, breast-feeding promotion and other simple steps.

Steven Pinker, the Harvard psychology professor, explores the gains in a terrific book due out next month, “Enlightenment Now,” in which he recounts the progress across a broad array of metrics, from health to wars, the environment to happiness, equal rights to quality of life. “Intellectuals hate progress,” he writes, referring to the reluctance to acknowledge gains, and I know it feels uncomfortable to highlight progress at a time of global threats. But this pessimism is counterproductive and simply empowers the forces of backwardness.

President Trump rode this gloom to the White House. The idea “Make America Great Again” professes a nostalgia for a lost Eden. But really? If that was, say, the 1950s, the U.S. also had segregation, polio and bans on interracial marriage, gay sex and birth control. Most of the world lived under dictatorships, two-thirds of parents had a child die before age 5, and it was a time of nuclear standoffs, of pea soup smog, of frequent wars, of stifling limits on women and of the worst famine in history.

What moment in history would you prefer to live in?

F. Scott Fitzgerald said the test of a first-rate intelligence is the ability to hold two contradictory thoughts at the same time. I suggest these: The world is registering important progress, but it also faces mortal threats. The first belief should empower us to act on the second.

Granted, this column may feel weird to you. Those of us in the columny gig are always bemoaning this or that, and now I’m saying that life is great? That’s because most of the time, quite rightly, we focus on things going wrong. But it’s also important to step back periodically. Professor Roser notes that there was never a headline saying, “The Industrial Revolution Is Happening,” even though that was the most important news of the last 250 years.

I had a visit the other day from Sultana, a young Afghan woman from the Taliban heartland. She had been forced to drop out of elementary school. But her home had internet, so she taught herself English, then algebra and calculus with the help of the Khan Academy, Coursera and EdX websites. Without leaving her house, she moved on to physics and string theory, wrestled with Kant and read The New York Times on the side, and began emailing a distinguished American astrophysicist, Lawrence M. Krauss.

I wrote about Sultana in 2016, and with the help of Professor Krauss and my readers, she is now studying at Arizona State University, taking graduate classes. She’s a reminder of the aphorism that talent is universal, but opportunity is not. The meaning of global progress is that such talent increasingly can flourish.

So, sure, the world is a dangerous mess; I worry in particular about the risk of a war with North Korea. But I also believe in stepping back once a year or so to take note of genuine progress — just as, a year ago, I wrote that 2016 had been the best year in the history of the world, and a year from now I hope to offer similar good news about 2018. The most important thing happening right now is not a Trump tweet, but children’s lives saved and major gains in health, education and human welfare.

Every other day this year, I promise to tear my hair and weep and scream in outrage at all the things going wrong. But today, let’s not miss what’s going right.

New York Times, Opinion, 1-7-2018

from another source:

~ “Declining infectious disease is a major factor behind progress against premature death. The latest global data suggests life expectancy at birth has climbed by 10 years over the past four decades; it now stands at 72 years. The proportion of children who die before the age of five has halved since 1998.

Consider the issue from a slightly different perspective: In 1950, about one in five children died before the age of five. Since the average woman worldwide in 1950 had five children, the typical woman had about a two-thirds chance of losing at least one child. Today, the average woman has 2.5 children and the mortality risk is one in 25, meaning that the average woman now has only a 10 percent chance of experiencing the pain of losing a child.” ~

[the article goes on to list eight other reasons for cautious optimism]
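
The two probabilities quoted in that excerpt follow from simple arithmetic, assuming (as the article implicitly does) that each child's survival is an independent event. A minimal Python sketch of the calculation:

```python
def p_losing_a_child(per_child_mortality: float, children: float) -> float:
    """Probability that at least one child dies, given the per-child
    mortality rate and the (possibly fractional, for an average)
    number of children. Assumes independent survival per child."""
    survival_of_all = (1.0 - per_child_mortality) ** children
    return 1.0 - survival_of_all

# 1950: one-in-five mortality, five children
p_1950 = p_losing_a_child(1 / 5, 5)      # ~0.67, the quoted two-thirds

# Today: one-in-25 mortality, 2.5 children
p_today = p_losing_a_child(1 / 25, 2.5)  # ~0.10, the quoted 10 percent

print(f"1950: {p_1950:.2f}, today: {p_today:.2f}")
```

So the "two-thirds" and "10 percent" figures are consistent with the mortality and fertility numbers the excerpt gives, not independent statistics.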


Certainly there is enough danger, violence, grief and loss in our daily news to push us into despair, but the statistics presented in 2017 as “Best year ever” show enormous actual progress in the lives of all human beings, something that only rarely comes up in the news. I have great faith in the capacity of science and technology to address problems and find solutions that we can’t yet imagine. When I was a college student, computers took up a whole floor, and few had access to them. Now ordinary people everywhere, not just in the most developed countries, carry computers in their pockets and have access to worldwide connections. Totalitarian regimes and repressive states can attempt to curtail this kind of access, but not completely successfully, and not for long. We may be surrounded by hurricanes of lies, but I do not think we will be fooled for long, or that great majorities can be easily misled.

So these are both dangerous and exciting times, but in that lies enormous opportunity for real and positive change. I have great hopes.

 A little instance here—I brought my orchids with me when we moved, and it was in the middle of a real cold spell. Some died, some did not appear to have been hurt at all. For this past year I have kept up a faithful watering schedule, noting most had roots that would green up when watered. I lost a few more in the course of the year—but now I have 6 of them “in spike” for the first time since the trauma of the move. They are preparing to bloom again. Humanity is, I think, at least as stubborn and hardy as those orchids.


Steven Pinker, perhaps the most eminent bringer of “good news,” says he puts his hope in “numeracy.” Numbers, if presented clearly enough, repeatedly enough, are bound to have an effect. If the average woman used to have eight children, but now has only two or three, you can’t keep saying that there hasn’t been a change in family size — which has a huge effect on prosperity and practically everything else.

And yes, it’s become more difficult to lie. If Trump tweets that his approval rating among black people has doubled to 16% while sources show that it has fallen to 3%, his disconnect with reality becomes a tad more obvious. Sure, nothing has an effect on his deluded base, but those people become increasingly isolated in their twilight zone.


In art, just as in life, only the patient succeed. You cannot count on instant success. I don’t like to lose, but I’ve learned how to lose. It’s a greater skill than winning. ~ Zbigniew Herbert, tr. Oriana

But I like better the way Kafka put it:

“It was because of impatience that [humans] were expelled from Paradise; it is because of impatience that they do not return.” 

Expulsion from the Garden of Eden, Thomas Cole, 1828


~ “In the Thursday meeting in which President Trump complained about "having all these people from shithole countries come here" — and singled out Haiti, El Salvador and Africa as examples — he also added that, "we should have more people from Norway."

In fact there was a time when we did.

From 1870 to 1910 a quarter of Norway's working-age population emigrated, mostly to the United States. You read that right — one-fourth of its workers left the country.

Back then Norway was quite poor. Wages were less than a third of what they were in the United States. And the wave of emigration out of the country quickly benefited those who remained. That's because it reduced the supply of workers in Norway, so those left behind could demand higher wages. And this helped narrow Norway's wage gap with the U.S. by 25 percent over that same 40-year period, putting Norway on the path toward its status today as one of world's most prosperous nations.

It turns out that the immigrants that Norway sent to the U.S. during that great migration wave of the 1870s were its poorest and least educated citizens. Researchers were able to determine this thanks to newly digitized census data from Norway. (Other European countries have embarked on similar efforts but Norway, with only around 2 million residents in its early census data, finished the task first. That has made Norway the go-to nation for researchers of historical economics.)

Those who left Norway came from "some of the lowest skilled families. They are coming from either rural areas or they are [the product of fathers holding] lower skilled laborer positions in cities," she says.

And on arrival in the United States these Norwegian immigrants remained at the bottom of the socioeconomic ladder. Compared to immigrants from the 15 other European nations that contributed to this great wave of arrivals, "the Norwegians held the lowest paid occupations in the U.S.," says Boustan.

"They tended to be farm laborers. They were also fishermen. If they were in cities they were just sort of in the manual labor category — what today you would think of as a day laborer."

"So when you look at the people leaving Norway you do pick up quite a bit of evidence of the poor, huddled masses," she says, referring to the famous Emma Lazarus poem at the foot of the Statue of Liberty.

And although their descendants would eventually catch up to the rest of the U.S. population, it took a while. Twenty years after their arrival in the United States, the Norwegian immigrants were still making 14 percent less than native-born workers.

In other words, they shared a lot in common with many of today's immigrants from ... El Salvador, Haiti and Africa.” ~

Norwegian immigrants


There is some similarity here to the effects of the Black Death in the Middle Ages. Having severely reduced the European population, the plague was followed by an economic boom, since there were more resources for the survivors.


“Over 25% of Nigerian-Americans have a graduate degree, as compared with only about 11% of whites.” ~ New York Times


Today a lot of immigrants are more educated than the locals — but the old prejudices die hard.


~ “Birgit Schwarz: In my opinion, people have underestimated the notion that Hitler considered himself an artist, in fact, an artistic genius, and that much can be deduced from this self-image, this overheated artist's ego. However, this has hardly played a role in the research to date. That's the starting point, from my perspective, because it can help us gain a better understanding of Hitler as a person, as well as his system of power. Hitler's deluded view of himself as a genius is based on the confused system of thought emerging in the late 19th century, which centered on the idea that a genius — a strong personality who outshone everything else — was capable of anything and could do anything he pleased.

SPIEGEL: Hitler's relationship with art is well-documented. He earned money with his watercolors and wanted to become a painter. Later he became an insatiable collector, a passion which turned into the most brutal art theft of all time. All of this is well known. What, then, is supposedly incorrect about the current image of Hitler?

Schwarz: There is a widespread view that he was not truly fascinated by art, and that although he collected art and used it to cultivate his image, he then hid it away in basements and mines. Someone like Göring was constantly bragging about his collection, but many believe that Hitler wasn't actually that interested. But it was very deeply ingrained in his personality.

SPIEGEL: What makes you so certain?

Schwarz: The previously underestimated observations of his contemporaries, for one. For example, there was the Italian archeologist and art historian Ranuccio Bianchi Bandinelli, an accomplished expert who was not on Hitler's side. He became one of Italy's great intellectuals after the war. In 1938, Bianchi Bandinelli was asked to play the role of tour guide during one of Hitler's state visits, and Hitler spent hour after hour admiring paintings. According to Bianchi Bandinelli, it was evident in Hitler's body language that he was truly entranced by the art.

SPIEGEL: But Mussolini was simply annoyed by the time Hitler spent looking at art.

Schwarz: Yes, but sources like Bianchi Bandinelli's account show that there is something important missing from our picture of Hitler, something we still need to understand and that hasn't been taken into account until now. In fact, a very different image was built up over decades, namely of Hitler and his fight against so-called degenerate art.

 SPIEGEL: But that too is an important part of his relationship with art.

Schwarz: Of course, and it was probably fueled by real hatred. At the same time, art was very important to him throughout his entire life.

SPIEGEL: Doesn't the perception of Hitler as an artist make him seem less evil?

Schwarz: No. In fact, his love of art led directly into the heart of evil. But neither is it the root of everything else. His fanatical pursuit of his own cause, and his self-image as a genius, contributed to his powers of persuasion and, therefore, his success. Art was part of his life until his last hours, even playing a role in his private will, in which he mentions his collections. This was someone who issued the so-called Nero Decree (Ed's note: Hitler's Nero Decree, issued in March 1945, ordered the destruction of any infrastructure which could be of use to the Allies.) while at the same time making sure art treasures were rescued. But no one is willing to admit to his obsession with art.

This obsession with art was interpreted as nothing but a cultivation of his image and propaganda. When you look at his biography, you understand that art was vitally important to him much earlier, and that he needed it for self-affirmation.

SPIEGEL: Prominent historians, particularly the brilliant Ian Kershaw, see the young Hitler primarily as a failed painter. He wanted to study painting, but he was rejected by the Vienna Academy of Fine Arts twice, in 1907 and 1908. Why don't you accept this interpretation?

Schwarz: Of course, being turned down was a fundamental shock to him. But the Hitler research community believes that he accepted his failure, and that he gave up the artistic world. But in reality he always retained his self-image as an artist and as someone obsessed with art. The rebuff from the academy was probably what prompted him to consider himself a genius.

SPIEGEL: In your opinion, he saw himself as someone who had been underestimated. But where is the difference between "failed" and "underestimated," which is so critical to understanding Hitler?

Schwarz: If he had seen himself as failed, he would have had to abandon his idea of being an artist. That's what Ian Kershaw, for example, claims. And (German historian and Hitler biographer) Joachim Fest didn't take Hitler's self-image as a genius seriously enough. Many believe that Goebbels didn't start consistently referring to Hitler as a genius until later on.

SPIEGEL: And that was indeed the case.

Schwarz: But for Hitler it was more than a propaganda strategy. He seriously believed he was a genius, long before Goebbels referred to him as such. And it makes sense that Goebbels constantly described him as a genius. A genius shouldn't refer to himself as a genius. He needs a community of admirers. His conviction that he was a genius, in my interpretation, was at the center of his entire worldview.

SPIEGEL: For a time, Hitler survived by painting watercolor scenes of Vienna. He was apparently fired by an architecture firm where you believe he worked, because his performance wasn't good enough. He then moved to Munich, where he hung around in cafés. That doesn't sound like someone with the creative urges of a genius.

Schwarz: On the contrary. Let me give you an example. A competition for an imposing building project of the late Kaiser period was announced in Berlin. The opera house was going to be rebuilt. We don't know if Hitler attempted to officially enter the competition — in fact, it's unlikely — but it appears that he did draw some of his own designs. He believed that he could hold his own with the most famous architects.

SPIEGEL: Why didn't he seek public attention?

Schwarz: A genius can shine in secret, hoping that he will make a big splash one day.

SPIEGEL: Could Hitler seriously have considered himself a genius? His talent as a draftsman was moderate at best.

Hitler: White Orchid
Schwarz: He apparently felt differently, and it was important for his ego that he was self-taught. After the humiliation of being rejected by the academy, he developed an aversion to all professors, and to all academic study. He referred to himself once as a minor painter, but that was at a time when he believed he was a great architect. On the whole, he saw himself as a creative genius. You mustn't forget that the concept we have today of a genius is so much more harmless than it was back then.

SPIEGEL: In what sense?

Schwarz: We define a genius on the basis of his talent. At the time, talent was not the main focus. A genius had to have a strong personality. He was a larger-than-life talent who was permitted to do anything, including evil things. The genius has outstanding ideas, and they must be implemented, even if they are completely amoral. Hitler admired the work of dour philosophers like Arthur Schopenhauer and Friedrich Nietzsche. One important aspect is often overlooked, namely that the concept of genius had long been colored with racism. Houston Stewart Chamberlain, a Briton by birth who had married into the family of Richard Wagner, was a significant figure. He published his views in a book, which became a bestseller. Chamberlain, who promoted the great Aryan personality, was a key figure for Hitler.

SPIEGEL: Are you going so far as to draw a line between the concept of genius and the Holocaust?

Schwarz: Let me say it one more time: The genius was allowed to be above morality. The amorality of the Nazis represents taking this position to its unthinkable extreme. Goebbels wrote the brutal sentence: "Geniuses consume people." Part of Hitler's concept of a genius was the image of an enemy. In his case, it even needed to be a mortal enemy.

SPIEGEL: But his worldview was strongly influenced by World War I and his own drastic experiences at the front.

Schwarz: Naturally that was a turning point. However, he believed that the world war proved that it was possible to overcome all odds. But I don't see an absolute shift in his life. Even before World War I, he had the self-image of a genius, and he kept it up after that. That's continuity. In the early 1920s, he even declared that what was needed was "a dictator who is a genius." Of course, the population also yearned for a genius.

 SPIEGEL: But shouldn't the word "genius" be replaced with "Führer" ("leader")?

Schwarz: No. The Führer concept arose from the genius concept in the first place. Once again, too great a distinction has been drawn between Hitler the artist and Hitler the politician until now. The research describes Hitler as a man who was a failure during his first 30 years before suddenly, as if in a new life, managing to captivate the masses as a politician. It's a divided biography, in other words. But the question is: Where did he get his self-confidence, and the certainty that he was an exceptional figure?

SPIEGEL: Hitler himself described a split in his autobiography, "Mein Kampf," in which he famously wrote: "But I decided to become a politician."

Schwarz: It wasn't a split, but a development. His career as a politician doesn't contradict his self-image as a genius by any means. And that was what he considered himself to be, first an artist, and then a politician and strategist. But without the self-image as an artist, he would never have been able to see himself as a genius. That's why he constantly had to reaffirm his love for art.

Hitler: The Vienna Opera House
SPIEGEL: You describe which paintings Hitler hung, re-hung or removed in his private and official rooms, including works by the Swiss painter Arnold Böcklin and the German painter Carl Spitzweg. These two painters represent very different styles: overblown and aggressive versus detailed and contemplative, respectively. And then there were the neo-classical portraits of women by painters like Anselm Feuerbach. How does all this fit together?

Schwarz: It doesn't fit together at all. I have reconstructed his collection of paintings, including the ones in his private rooms. Hitler's taste cannot be pinned down. There is no aesthetic lowest common denominator. But what his favorite painters do have in common is that Hitler saw them as misunderstood geniuses.

Hitler Schloss-und-Kirche-Perchtoldsdorf

SPIEGEL: Does a genius need a muse? If so, was Hitler's muse Eva Braun — or perhaps his favorite architect, Albert Speer?

Schwarz: Perhaps an artist needs a muse, but a genius doesn't, because a genius's creative strength comes from within. And a genius, as Hitler explained to his secretary, could not have any children. However, he did have role models, including Frederick the Great, who became increasingly important to him. Hitler felt that he was an incarnation of this art-loving ruler, who was both a collector and a military strategist. He imitated everything about him, including his love for dogs and, later, his shuffling walk and stained uniform. It was even obvious to the terribly banal Eva Braun, who chided him for his excessive efforts to imitate Frederick. In the end, he insisted on having a portrait of the king nearby at all times, even in the bunker. Academics are familiar with this adoration and with how alarmingly deep it went, but it probably hasn't been adequately studied.

SPIEGEL: In the end, how much did he retain of his belief that he was a genius?

Schwarz: It was everything at the end. In fact, Hitler, in his delusions of being a genius, is best understood by studying the last months of his life. The period in the Führer's bunker is very illuminating. It was only a few steps from his quarters to the cellar of the New Reich Chancellery, where the model of his architectural plans for Linz was displayed. He had to reaffirm his status as a genius, and he could only do so through his close connection to art and architecture. These final attempts at creating a certain image for himself had a fatal effect. He made a strong impression on many of the people around him. Many believed that Hitler would succeed in the end, just as his role model and supposed fellow genius Frederick the Great managed to win certain battles, even emerging from wars as the victor despite having suffered military defeats.

SPIEGEL: So art never opened Hitler's eyes — he saw only what he wanted to see?

Schwarz: That was always his intention, right from the start.



I found this article to be quite eye-opening. By the way, when Hitler would describe someone as "completely inartistic," it was a big insult. While in America an artist is seen as something of a fool and, if male, effeminate or even likely to be gay, to Hitler an artistic genius was a Nietzschean Übermensch to the most exalted degree.

These days we regard someone as an artistic genius based on the quality of his or her work and the unique style that makes it like no one else’s. But the nineteenth and early twentieth centuries were strongly influenced by the idea of an “artistic personality” — akin to the Byronic hero. Being a passionate, stormy, obsessed, unconventional, often misunderstood person — which somehow automatically translated into “superior” — seemed to count most of all.

You may recall that Raskolnikov held the view that to a “great man” nothing is forbidden. He is above law and morality. 


Hitler’s self identification as a “genius,” an exceptional, great man, for whom all things are permissible, certainly has its roots in the romanticism of the 19th century, and immediately calls to mind Byron, the perception of Milton’s Satan as heroic, and the theories personified in Dostoyevsky’s  Raskolnikov — that for the great, nothing is forbidden, all is possible, they live and act beyond the dictates of morality. Raskolnikov’s genius allows him murder, Hitler’s allows mass murder, genocide.

Leaders of this type, and I am thinking of Putin, Hitler, and Trump, operate by calling on humanity’s most primitive fears, the fear of the Other, the Not Us, the Enemy, and by establishing a situation where those fears and hatreds can be openly expressed and acted on, eventually becoming part of the mechanism of the State. Pogroms, massacres, Holocausts, genocides, lynchings, bombings, gulags — all become legitimate, either ignored as crimes or sanctioned under the operations of the State itself.

And these mythologies always turn on the idealization of some golden remembered past, a lost eden, a time of “greatness” and security, unthreatened by any challenges from the identified Other, who are so marginalized and powerless they fairly cease to exist. History and science retold and reshaped to suit, all but the chosen invented narrative refused, government and law reworked to become an apotheosis of lies.

What is further disturbing is that we seem to have devolved from the idea of “genius” to that of “celebrity” — since Trump is unsatisfactory, let’s have Oprah! So we go down the endless slide from what may once have been the idea of the “philosopher king” — maybe the first colossal error in our thinking.


Plato came up with the notion of the “noble lie” (let’s skip here the scholarly dispute about the accuracy of the translation) that was supposed to justify class differences. “Alternative facts” may work for a while — for a whole generation, say. But ultimately nothing of lasting value can be built on lies. The corruption that begins with even a single well-intentioned lie starts eating the system from the inside. The Nazi Reich was supposed to last for a thousand years, but in fact lasted only twelve; the delusional basis of fascism (Jesus was presented as an Aryan, for example) doomed it from the start. Soviet-style communism collapsed as reality became more and more removed from the proclaimed ideal of the workers’ paradise; Rome fell for many reasons, but one of them, I think, was that the lie about the divinity of caesars wasn’t sustainable.

The common thread here is megalomania — call it genius, the great man, or, in Putin’s diminished version, simply a “strong man” — society needs not democracy, but a strong man as its leader. He may be vile, a killer, but — don’t touch the strong man on whom the country depends! Don’t even think of removing him from power!

And the devolution to “celebrity” is indeed pathetic — but perhaps also a sign that since we can't go lower than that, change has to happen. The change will also carry the seed of its own undoing, but that’s unavoidable. No Utopia is possible — or desirable. If humanity, after so many blood-stained millennia, learns at least this much, there is hope.


“Beware of the pursuit of the Superman: it leads to an indiscriminate contempt for the human.” ~ George Bernard Shaw, Don Juan in Hell, 1903 (toward the end of his life, in the nineteen twenties and thirties, GBS developed an unfortunate infatuation with Mussolini — like many intellectuals, he failed to grasp the true nature of fascism and Stalinism, mistaking both for progress)


~ “What makes the comparison between Hitler and Trump so poignant is not just the rhetorical marginalization [“OTHERING”] of groups, lifestyles or beliefs, but the fact that both men represented their personal character as the antidote to all social and political problems.

Neither Hitler nor Trump campaigned on specific policies, beyond a few slogans. Instead, both promised a new vision of leadership. They portrayed the existing political systems as fundamentally corrupt, incompetent, and, most importantly, unable to generate decisive action in the face of pressing problems.

In this scenario, democracy has less to do with representative institutions than with a leader who is intuitively 'in tune' with the sentiments of the people.” ~


 ~ “The Soviet class structure didn’t really collapse. People who have been privileged in the Soviet period by and large continued to be privileged. What collapsed were the divisions between classes that made people invisible to one another.

So members of the Central Committee used to have their own buildings, their own dachas behind tall fences, their own sanatoriums. Their children went to separate schools, and they got their food and clothes at distribution centers that were behind unmarked doors in the city. A lot of those walls disappeared or developed windows, and the distribution centers closed and people started buying food in supermarkets. So other people were walking by those supermarkets and seeing the food in the windows that they couldn’t afford.

All of that together made the ’90s for a lot of people a time of deep psychological misery, resentment, envy—a feeling of having been both clobbered and cheated.

~ What is your sense of how those psychological wounds played a role in the rise of Putin and how he either exploited that or was a product of it?

I think it was actually both. That’s a very insightful way to ask the question because I think Putin shared a lot of that resentment. He shared a lot of the nostalgia for an imaginary past in which he felt a kind of certainty and had a clear vision for his own future, in his case as a KGB agent. So he was able to effortlessly tap into that longing among so many Russians to return to an imaginary time of certainty and security.

~ When most people think of totalitarianism, they think of 1984, North Korea, or something like that. Why is totalitarianism the word that you chose to use in your title?

What I think happened in Russia is that Putin has built a mafia state. There are different terms that have been applied to his regime. Some people have used kleptocracy. Some people have called it an illiberal democracy.

The thing is that that mafia regime exists not in a vacuum, but on the ruins of a totalitarian society. Putin felt that his power was endangered after the popular protests of 2011–12, and he began his crackdown. The signals that that crackdown sent out were interpreted by Russian society in totalitarian ways, right? The mechanisms that kicked back in were mechanisms inherited from totalitarian societies.

A totalitarian regime is profoundly political. Everything becomes political. Private life disappears as such because even private action becomes political, and people are basically urged to be out in the squares demonstrating their support for the mission of their country.

That’s the kind of regime that Russia has turned into. It’s a highly mobilized country. It’s a country in which everything has become political. It’s expansive, and that’s why it’s fighting wars in Ukraine and Syria. What they’re pursuing is the sense of constant movement, the sense of expansion that is essential to totalitarianism.

But once they establish the structure that’s necessary for a totalitarian regime, they tend to flail. Even if you read about Stalin—it was a mess, as much as he tried to project the image of somebody who was in control of every single thing in the country. And certainly by the time Soviet society had aged and entered what we call the stagnation period, it was a pretty convoluted and just shockingly incompetent kind of state apparatus that was fragmented, had a lot of different people pursuing their own interests, had no clear direction, and had highly problematic command centers.

~ One concept that you talk a lot about in the book is Homo Sovieticus. What’s behind that idea?

The explicit project of the Bolshevik revolution — as is the case with every totalitarian society — is to create a new kind of man. This was going to be the perfect man, a man who lived in perfect harmony with his society. But in the late 1980s, a great Soviet sociologist named Yuri Levada had this hypothesis that the Soviets had indeed created a new kind of man, not necessarily perfect, but very much shaped by the experience of Stalinist terror. His hypothesis was that since it had been 30 years since terror ended, Homo Sovieticus — that person characterized by doublethink and his very strong identification with the state — had to be dying off and that Soviet institutions had to crumble once Homo Sovieticus died out.

He conducted the survey in 1989. It was this great big survey, the first real study of Soviet people. They concluded that Homo Sovieticus was on its way out and it seemed that they correctly predicted the collapse of the Soviet Union, which came two years later. The problem was they went back to the survey in 1994 and came back with pretty disturbing results that suggested that Homo Sovieticus hadn’t quite gone away and maybe wasn’t as generationally bound as they thought. When they did the survey again in 1999, they came to the conclusion that Homo Sovieticus was not only surviving but also reproducing. They actually predicted as early as 1999 that there was a possibility of totalitarian revanche.

You can’t expect a society that has been subjected to terror for decades to suddenly shake that experience off and develop an entirely new set of skills and a new kind of baseline trust and live as a happy democracy, right? And I think that’s what’s happened to Russia.

Because we’re reading history books, we think that autocracy is developed linearly in the pursuit of an autocratic project. I don’t think that’s true. I think that humanity has stumbled into awful moments of history, and this may well be one of those moments. Trump has an instinct for manipulating people and for making them feel powerless, and that’s an instinct that drives a lot of his actions. He also has an instinct for self-aggrandizing, which happens to dovetail with that instinct [for manipulation]. He has the habit of advancing his brand by making a lot of loud gestures and contradictory things. He doesn’t need to be pursuing a grand strategy in order to be consolidating a kind of psychological power.


from another source:

~ “Will Putin last forever? When and how can we expect things to change in Moscow?

Nothing lasts forever, so Putin won’t. But he has created a closed system that is virtually impervious to outside pressure. It will collapse from the inside, likely, though not necessarily, when Putin dies. After that, things will be in disarray. Putin plans to live forever, so there will be no succession plan. There will be a scramble (for some detail on what it might look like, I recommend Joshua Rubenstein’s terrific book Last Days of Stalin). I think that the borders of Russia will be redrawn—it’s a tense and complicated federation now, held together by pressure, fear, and habit. But I don’t hold out a lot of hope, because I think the damage that’s been done to that society is just unspeakable.

~ Could Putin’s rise have happened in any other country? Every other country?

Sure, and I think this is important: mediocre men become leaders of nations by accident. This happens—it’s not an exceptional event. And if these mediocre men have a talent for trafficking in fear and nostalgia, they get to hold on to power. Now, I think that the particulars of what has happened to Russia have a lot to do with what happened to Russia before Putin: I think he set out to build a mafia regime but ended up tapping into a reservoir of totalitarian customs and institutions. In a different country, he would have done a different kind of damage.” ~


I'm not sure if Masha Gessen has the same thing in mind when she speaks about the redrawing of Russia’s borders, but it seems to me that the tension between the Slavic population and the expanding Turkic and other ethnic populations may eventually become too great.


~ “Putin views Trump’s victory as the triumph of a particular world view: “a large number of Americans share our ideas of what the world should be like” and even of “right and wrong.” The phrase “traditional values” is crucial here: the instrumentalization of some vague idea of past greatness is something Putin and Trump share.

In the last few years, the Kremlin has framed the battle for global domination as a conflict between a “Western civilization” rooted in the idea of human rights and a “traditional values civilization.” Putin’s “traditional values” campaign has included a virulent antigay offensive, an insistent effort to raise the birth rate in order to save the 145-million Russian nation from extinction, and, most important, a systematic discrediting of any idea that is viewed as connected with contemporary Western culture. This is where Putin sees a kindred spirit in Trump, with his flailing against political correctness and his defense of Christmas against a fictitious threat. “Traditional values” becomes a catchall term for an imaginary past—which goes a long way toward explaining Trump’s seamless symbiosis with the American Christian Right.

Putin has declared victory in his war on modern culture, which gives him the right to call himself the most powerful man in the world. But, of course, that description has generally been part of the definition of a different job—the one to which Trump has in fact been elected.

But Putin is reveling in the idea that he is “the most powerful man in the world.” He is right: it doesn’t matter if Russian hacks were decisive in the election—what matters is that many people believe that they were. If Americans perceive Putin as the ultimate winner of the presidential race, then that is what he is.

Trump and Putin lack a concept of the future. In Putin’s version of the clash of civilizations, we have only a threatening Western present versus an imaginary Eurasian past. In Trump’s case, the threatening present is global while the alluring past is American. Both men traffic in appeals to the local and the familiar from the past against the frighteningly strange future. They are also both short-tempered, thin-skinned, not very bright, and disinclined to listen to advisers—all major risk factors for escalation. But it is their shared inability to look ahead that poses the greatest danger to the world.” ~


It's a good summary: Trump rejects modern Western civilization in favor of "traditional values," e.g. the sole role of women is to serve men and cater to their fantasies (hence they have to “dress like women”); Putin rejects democracy in favor of a "strong leader."


 ~ “Exercise may change the composition and activity of the trillions of microbes in our guts in ways that could improve our health and metabolisms over time, a new study finds.

The results provide novel insights into how exercise can affect even those portions of our bodies that seem uninvolved in workouts, perhaps providing another nudge to stick with our exercise resolutions this year.

This microbiome includes countless different species of microbes in varying proportions that interact, compete and busily release various substances that are implicated in weight control, inflammation, immune responses and many other aspects of health.

In broad terms, our microbiomes tend to be relatively stable, most studies show. But our microbiomes can change as our lifestyles do. Diet clearly affects the makeup of a person’s microbiome, as do illness, certain drugs, how much we weigh and other factors.

Exercise also has been associated with variations in the microbiome. Past studies have shown that endurance athletes tend to have a somewhat different collection of microbes within their intestines than sedentary people do, especially if the athletes are lean and the sedentary people are not.

The [new] study was designed as a follow-up to an earlier, interesting animal study by the same scientists. In that work, the researchers had allowed some lab mice to run and others to sit around for most of their adult lives. Gut material from the mice was then transplanted into animals that had been bred to be germ-free, so that their guts would easily incorporate these new tribes of bacteria. After the animals’ microbiomes were established, the scientists exposed the mice to a substance that can cause tissue irritation and inflammation in the colon.

The scientists found that the animals with gut bugs from the runners were better able to resist and heal tissue damage and tamp down inflammation than those whose microbes had come from sedentary mice.

Now the scientists wished to see if exercise would likewise affect the functioning of microbes in people.

They began by recruiting 32 men and women who did not exercise. About half were obese and the rest of normal weight.

The scientists took blood and fecal samples and tested everyone’s aerobic fitness. Then they had the men and women begin supervised workouts, during which their efforts increased over time from about 30 minutes of easy walking or cycling to about an hour of vigorous jogging or pedaling three times per week.

After six weeks, the scientists collected more samples and retested everyone, and then asked the volunteers to stop exercising altogether.

Six weeks later, the tests were once again repeated.

The subsequent analysis showed that the volunteers’ gut bugs had changed throughout the experiment, with some increasing in numbers and others declining. The researchers also found changes in the operations of many microbes’ genes. Some of those genes were working harder now, while others had grown silent.

Most of these changes were not shared from one person to the next. Everyone’s gut responded uniquely to exercise.

But there were some similarities, the researchers found. In particular, they noted widespread increases in certain microbes that can help to produce substances called short-chain fatty acids. These fatty acids are believed to aid in reducing inflammation in the gut and the rest of the body. They also work to fight insulin resistance, a precursor to diabetes, and otherwise bolster our metabolisms.

Most of the volunteers had larger concentrations of these short-chain fatty acids in their intestines after exercise, along with the microbes that produce them.

These increases were greatest, though, among the volunteers who had entered the experiment lean, compared with those who were obese, the scientists found.

And perhaps not surprisingly, almost all of the changes in people’s guts dissipated after six weeks of not exercising. By and large, their microbiomes reverted to what they had been at the study’s start.

Still, the study’s overall results suggest that even a few weeks of exercise can alter the makeup and function of people’s microbiomes.

In theory, Dr. Woods continues, these changes could contribute to some of the broader health benefits of exercise, such as its ability to reduce inflammation throughout the body.

He also hopes that future research can explain why the obese volunteers showed smaller gains in their fatty-acid producing microbes than the leaner men and women. Additional study could also help to determine whether and how people’s microbiomes might continue to change if they exercise for longer than six weeks.

ending on beauty


It is foolish
to let a young redwood
grow next to a house.

Even in this
one lifetime
you will have to choose.

That great calm being,
this clutter of soup pots and books

Already the first branch-tips brush at the window.
Softly, calmly, immensity taps at your life.

~ Jane Hirshfield