Saturday, December 30, 2017


Dali: Nativity, 1967


In the morning we found we’d been
in each other’s dream.
Our dreams would disappoint
both Jung and Freud:
you were with me at a party,

I with you in Salt Lake City.
The parallel fact
beat in us like a heart:
untouching, we touched,
silent, we talked,
unmoving, we walked together.

I invented instant metaphysics:
death as a dream
in which we still
go to parties, drive to Salt Lake City.
But I could be wrong:

in this dreamtime,
not long, let the tongues
taste and tell,
let the dreamers’ arms
repeat the gesture, gathering
the beloved body.

Tonight an orange half-moon
rises over the coyote hills
like one half of a wish.
Some eternal I do
is said without our speaking.

The moon’s mottled lamp
lights our good-nights.
Let us now turn it off, until
we meet in another dream,
beyond the white rain of the stars.

~ Oriana



Wonderful dream, wonderful poem. I like those metaphysics — “untouching, we touched” — as if no amount of space or time could separate these lovers, who will keep meeting like this, in parallel dreams!


One of the sweetest experiences of my life, unforgettable — even if the dreams themselves were forgettable except for the parallel presence of the partner in them. (And yes, we are still together.)

Funny, I suddenly remembered that my mother used to call me every time she happened to have a dream about me. I remember only one of those dreams . . . But more than anything, I remember that she had them, and cared enough to call me the morning after — yet another reason to love mothers and forgive them for being overprotective and always seeing us as children (of course!).

(A shameless, unromantic partial denial of “unforgettable” — or, as Samuel Beckett remarked, “What was that unforgettable line?”)

I can’t discount the fact that I wrote a poem about this experience. Would I spontaneously remember it if I never wrote the poem? I can’t say for sure. But we are in part created by our own creations.

O look at all the fire-folk sitting in the air!
The bright boroughs, the circle-citadels there!

~ Gerard Manley Hopkins



This morning, the rude ringing of the phone (Home Depot) exiled me from the earthly paradise I saw in a dream. A campus in pine woods (is there a tree more glorious than the Jeffrey pine?). Buildings with great bay windows, a campus practically of glass so there is but a membrane between the inside and the trees. I turn to the area that is just lawn — but what green! We don’t get such green in California, except along mountain streams — and I say to my guide (a dean?): “We need to plant more trees.”

Always a more northern, rain-rich landscape, but especially the woods. The dream of living in the woods, ever since in childhood I saw Jurata on Hel Peninsula, the villas among the pines.

“Want to make sure you haven’t wasted your life? Plant a tree.”


“Want to be happy? Think like an old person,” read a recent headline in the New York Times.

Fortunately, one doesn’t have to be really old to realize that it’s simply too late in life not just for depression, but also for all kinds of noxious nonsense.

Too late to whine and grumble.

Too late to hold back forgiveness.

Too late to waste time on debates with religious people: our premises are so different, talking leads nowhere. The wise refuse to talk with creationists, Holocaust deniers, or climate-change deniers. Creationists and deniers care nothing about the earthly paradise, only the pie in the sky and/or the purity of their ideas, no matter how at odds with reality.

Time only to share beauty, poetry, and whatever life wisdom I have learned through both pain and joy. Suffering isn’t the only teacher. Each relationship has been a profound learning experience. Even the bad ones — I simply can’t repent having loved. Even if spoken to a stranger, I want my last words to be “I love you.”


~ “The murder rate in most countries has fallen significantly in the past 15 years. That's the reality, but most people don't believe it — fewer than one in 10 thinks there are fewer murders.

Deaths from terrorist attacks around the world were lower in the past 15 years than in the previous 15 — but only a fifth of us think that's the case.

Even when it comes to other areas of public life, people's assessments can tend to be incorrect.

For example, people overestimate the number of teenage pregnancies by what the researchers call staggering amounts.

In some countries, they think about half of teenage girls get pregnant every year: in reality, the highest figure for any country is 6.7%, and the rate across all 38 countries is just 2%.

One reason for this tendency to assume the worst of the world, say the Ipsos-Mori people, is that we're genetically programmed to believe bad news more readily than good.

Our brains process negative information in a different way and store it more accessibly than positive stuff.

One neuroscientist demonstrated this by showing people pictures of things known to arouse positive feelings — which apparently include pizzas and Ferraris — and others known to arouse negative ones, such as a mutilated face or a dead cat.

As he did so, he measured the electrical activity in the brain. It turns out we react more strongly to the negative images.

The world is getting healthier and wealthier, which is good news, but headlines about that sort of thing just don't cut it when there's a terrorist attack or a war to report.

“If it bleeds, it leads,” is said to be the tabloid news editor's mantra. Whoever coined the phrase clearly had a profound insight into human nature.

Critics talk about "fear-based media". If we're fed such a relentlessly negative diet, they ask, is it any wonder we end up thinking the world is a terrible place?

Except, it turns out, we already thought that — or at least were predisposed to think it.

All those negative news stories are just reinforcement, feeding us what we're programmed to want — because it may save our lives.

This hypersensitivity to negative information — or bad news — apparently served an important function as human beings evolved.

Having the kind of brain that reacted more strongly to information about possible dangers meant, quite simply, that you were likely to live longer.

And those who didn't have that kind of brain? Well, as one scientist delicately put it, they “got edited out of the gene pool.” ~


A topic that threads through several subjects of the blog this week is the lack of congruence between reality — the facts of a situation — and our perceptions and beliefs about that reality. It is interesting that we react differently to good news and bad, and that we tend to pay more attention and give more weight to bad news. This bias may once have helped us survive — that readiness for, even expectation of, danger and threat — but it seems much less useful to us now, when the overwhelming emphasis on the negative can lead to hopelessness, depression, and despair, and, perhaps even worse, to misdirected energy and resources, squandered while real needs are ignored, overlooked, or given scant attention.


Not that I mean to blame the media — they need support at this point. But anyone with half a brain can understand that a mass shooting will be reported in a way that takes precedence over a thousand acts of kindness that took place during the same time. It’s just unavoidable, but it does distort our view of reality (we live in the safest time in history) and of human nature (not innately evil). This feeds the right-wing ideology.

By the way, every time there is a mass shooting, gun makers’ stock prices go up. A mass shooting stirs up the fear that gun control will finally be enacted, so gun enthusiasts rush to buy more guns. I don’t think there is the slightest bit of empathy for the victims, despite all the “thoughts and prayers” (a phrase that has, I think, become morally obscene). Nor is there any REALISTIC concern about one’s own safety. Studies show that buying a gun greatly increases the chance of accidental or intentional homicide (a bit of black humor here: “Without a gun, how could I defend myself against my family?”).

I call it “Ganesha Dancing”


But here is something fascinating:

“Gerontologists call this the paradox of old age: that as people’s minds and bodies decline, instead of feeling worse about their lives, they feel better. In memory tests, they recall positive images better than negative; under functional magnetic resonance imaging, their brains respond more mildly to stressful images than the brains of younger people.” ~ New York Times, 12-29-2017

Arik Brauer: My Father in Winter, 1983-84

~ “Glory to God in the highest heaven,
and on earth peace to those on whom his favor rests.”
~ Luke 2:14, New International Version

~ “You know that favorite Christmas verse that says ‘Peace on earth, good will toward men’? Well, that's not really what it says in the Greek. What it really says is ‘On earth, peace toward those whom he favors.’ You can look it up in almost any English version besides the KJV and see it translated more accurately.

It may look like a small difference, but it's not really small at all. The correct rendering tracks better with the rest of the Bible, which always presents God's blessings as discriminatory, not equally doled out to all in the same way. If any God created "all men equal," it most certainly wasn't the God of the Bible.

But that's not what people today read when they get to that verse. They read what they want to read there, and telling them what the original says doesn't change their minds one bit. Cognitive dissonance is a powerful thing, and the religious mind typically doesn't change directions, even if new information arises from an already approved source of authority. I find that fascinating.” ~

~ Neil Carter (Godless in Dixie)

Parmigianino: Self-Portrait in a Red Hat, 1540

Neil Carter knows the New Testament Greek and I trust his accuracy. It’s true that Yahweh plays favorites in the most blatant way. But in this case, I favor the inclusive mistranslation in the King James Version. Inaccurate translation is one way a religion can reform.

Equality is a very new idea, not yet a century old (I don’t mean “equal before the law,” which is older, but true “equal rights”), and not accepted by most of the world . . . and yet I sense it shall prevail (if humanity survives this century, that is — a century that began with 9/11, not a good omen; and yet, for now, ISIS is losing).


~ “It will soon be that time of year where many of us set ourselves up for failure. Make a resolution or don’t make a resolution; you will regret either. Or so the Danish philosopher Soren Kierkegaard might quip. One estimate suggests that almost half of Americans make New Year’s resolutions, and yet less than 10 percent successfully follow through. Maybe we forget about them long before our snow boots dry out. Maybe life takes us on a different path. Maybe we stop caring. Maybe we simply fail. It might be tempting to do away with this farce altogether, but before we commit to being noncommittal about the New Year, it’s worth thinking through some of the options.

The tradition of making New Year’s resolutions is at least four thousand years old. The ancient Babylonians celebrated their new year—the rebirth of the sun god Marduk—in spring, to coincide with barley-sowing season. Akitu was a twelve-day festival in which the king would promise to fulfill an extensive list of duties. To seal the king’s commitment, the high priest would slap him hard across the face. The slap had to be firm enough to draw tears: proof of the king’s dedication and a reminder to him to be humble. As part of the festival, other people also pledged their allegiance to the king and the gods and promised to repay their debts.

It may be tempting to overthrow this ancient tradition, to make no resolutions, and to go along with the flow of life like a carefree leaf on the surface of a happily bubbling stream. But Kierkegaard would argue that such a metaphor is deceptive: we would be akin to a stone hurled across the surface of the water, which “skips lightly for a time, but as soon as it stops skipping, instantly sinks down into the depths.” Without commitments, we risk disappearing into the existential abyss. A life that lacks purpose creates anxiety. A meaningful life, Kierkegaard suggests, is one in which we actively assert ourselves in order to live more fully.

It’s all well and good to make promises, but there’s still the challenge of keeping them. Friedrich Nietzsche suggests that what differentiates humans from other creatures is that we have “the right to make promises.” Making promises addresses a fundamental aspect of our humanity: that each of us is and is not the person we will become in the future. This is confusing, so let’s get concrete: Are you the same person you will be next year? Well, not exactly. Gray hair may sprout, wrinkles may emerge, your voice may deepen and thicken, your joints begin to ache. Your physical characteristics will objectively change, even if minutely. Your emotional and psychological identity may also shift; you might get a new job or a new partner, a new hobby or a new therapist.

A promise is a way of laying claim to an uncertain future. It is a way of projecting oneself into the coming months, protecting a commitment that may be impossible to keep. It is also a means of guarding or binding one’s identity—the I in “I promise.” Why does a nonhuman animal not make promises? Most don’t have a conception of themselves as individuals or a vested sense of identity. Yes, some animals may experience guilt, but guilt is not the same as the shame of breaking a longstanding promise. Nietzsche’s suggestion is that we ought to keep making resolutions—heartfelt, honest-to-God promises—lest we devolve into an animal-like state.

Nietzsche does not say, however, that we must keep our resolutions. Sometimes, many times, the cost is simply too high. To fulfill all promises unconditionally may be unwise, if not pig-headed and arrogant. For example, perhaps you committed to shedding a few pounds, but it turns out that your blood sugar plummets every time you go for more than two hours without a snack and you’re constantly on the verge of passing out. So that wasn’t a great resolution after all. Or you resolved not to go on any new dates and to focus on your career, but every morning you bump into the same lovely person at your favorite café. With new information, you just might need to leave some commitments behind. There’s no reason to feel guilty about that. The Romantic view of the self is that there’s no need to feel enslaved to an idea of ourselves that we wanted in the past. The self is forever in flux, changing, growing. The Romantic self is one that is ready to annihilate itself over and over again. As Nietzsche’s most famous protagonist Zarathustra says, “You must be ready to burn yourself in your own flame: how could you become new, if you had not first become ashes?”

For an existentialist an unwillingness to “burn yourself in your own flame,” to overcome or break a promise, can be a sign of “bad faith.” “Bad faith” is a situation in which you disavow the immediate free will that is always at your disposal. Bad faith is “bad” because it denies the hard, metaphysical core of being human—radical freedom. Radical freedom means we are radically responsible both for keeping and for transgressing promises. The fragility of our promises is what makes them meaningful.

So go ahead: make your resolutions. You have the right to make promises. And you have the right to break them. But you don’t have to make them during an evening of late-night drunkenness. That is what the rest of your sober life is for.” ~


I think most people attempt too much. Resolve to do something very small that would nevertheless improve your life. Go ahead, nothing is too tiny — it’s so sweet to succeed.



~ “The first line of Epictetus’ manual of ethical advice, the Enchiridion — “Some things are in our control and others not” — made me feel that a weight was being lifted off my chest. For Epictetus, the only thing we can totally control, and therefore the only thing we should ever worry about, is our own judgment about what is good. If we desire money, health, sex, or reputation, we will inevitably be unhappy. If we genuinely wish to avoid poverty, sickness, loneliness, and obscurity, we will live in constant anxiety and frustration. Of course, fear and desire are unavoidable. Everyone feels those flashes of dread or anticipation. Being a Stoic means interrogating those flashes: asking whether they apply to things outside your control and, if they do, being “ready with the reaction ‘Then it’s none of my concern.’ ”

Reading Epictetus, I realized that most of the pain in my life came not from any actual privations or insults but, rather, from the shame of thinking that they could have been avoided. Wasn’t it my fault that I lived in such isolation, that meaning continued to elude me, that my love life was a shambles? When I read that nobody should ever feel ashamed to be alone or to be in a crowd, I realized that I often felt ashamed of both of those things. Epictetus’ advice: when alone, “call it peace and liberty, and consider yourself the gods’ equal”; in a crowd, think of yourself as a guest at an enormous party, and celebrate the best you can.

Epictetus also won me over with his tone, which was that of an enraged athletics coach. If you want to become a Stoic, he said, “you will dislocate your wrist, sprain your ankle, swallow quantities of sand,” and you will still suffer losses and humiliations. And yet, for you, every setback is an advantage, an opportunity for learning and glory. When a difficulty comes your way, you should feel proud and excited, like “a wrestler whom God, like a trainer, has paired with a tough young buck.” In other words, think of every unreasonable asshole you have to deal with as part of God’s attempt to “turn you into Olympic-class material.” This is a very powerful trick.

Much of Epictetus’ advice is about not getting angry at slaves. At first, I thought I could skip those parts. But I soon realized that I had the same self-recriminatory and illogical thoughts in my interactions with small-business owners and service professionals. When a cabdriver lied about a route, or a shopkeeper shortchanged me, I felt that it was my fault, for speaking Turkish with an accent, or for being part of an élite. And, if I pretended not to notice these slights, wasn’t I proving that I really was a disengaged, privileged oppressor? Epictetus shook me from these thoughts with this simple exercise: “Starting with things of little value—a bit of spilled oil, a little stolen wine—repeat to yourself: ‘For such a small price, I buy tranquillity.’ ”

Born nearly two thousand years before Darwin and Freud, Epictetus seems to have anticipated a way out of their prisons. The sense of doom and delight that is programmed into the human body? It can be overridden by the mind. The eternal war between subconscious desires and the demands of civilization? It can be won. In the nineteen-fifties, the American psychotherapist Albert Ellis came up with an early form of cognitive-behavioral therapy, based largely on Epictetus’ claim that “it is not events that disturb people, it is their judgments concerning them.” If you practice Stoic philosophy long enough, Epictetus says, you stop being mistaken about what’s good even in your dreams.”


“For such a small price, I buy tranquillity” — just that statement has affected me deeply.

And also, for such a small price we can make others happy — a larger tip, sometimes simply a smile. And when we make someone else happy, we automatically feel good too. 


~ “Leisure … is not the privilege of those who can afford to take time; it is the virtue of those who give to everything they do the time it deserves to take. In fact, work ought to be done with leisure, if it is to be done well.

The heart is a leisurely muscle. It differs from all other muscles. How many push-ups can you make before the muscles in your arms and stomach get so tired that you have to stop? But your heart muscle goes on working for as long as you live. It does not get tired, because there is a phase of rest built into every single heartbeat. Our physical heart works leisurely. And when we speak of the heart in a wider sense, the idea that life-giving leisure lies at the very center is implied. Never to lose sight of that central place of leisure in our life would keep us youthful.” ~ David Steindl-Rast (a Benedictine monk)


“Work ought to be done with leisure, if it is to be done well.” We rarely hear about it: the ideal of giving to everything the time it deserves to take.

Scott Peck, author of The Road Less Traveled, said something similar: people fail at various tasks mainly because they are not willing to take the time it takes to do them well. He spoke in terms of self-discipline rather than leisure. Note the power of words: the term “leisure” makes it sound a lot more pleasant.



~ “It was true that the Germans had more planes than anyone else. But, as the historian Victor Davis Hanson explains, in “The Second World Wars: How the First Global Conflict Was Fought and Won,” the Luftwaffe had a number of weaknesses, some very fundamental. A lack of four-engine bombers, for example, made it hard for Germany to conduct truly devastating long-range strategic-bombing campaigns against enemies overseas. (The Nazis never succeeded in mass-producing an equivalent to America’s B-17 Flying Fortress, which was in the air before the war.) The German Navy had no aircraft carriers, which made air supremacy during naval battles impossible. (In total, the Axis fielded only sixteen carriers; the Allies, a hundred and fifty-five.) Germany had limited access to oil, and thus to aviation fuel, and this constrained the number of missions the Luftwaffe could fly. Unlike the Allies, who excelled at building tidy, concrete runways from scratch as the front shifted, the Germans relied on whatever slapdash rural runways they could find, resulting in more wear and tear on their planes.

The Nazis were slower than the Allies to replace downed aircraft (they had less experience with high-volume manufacturing); they were also slower to replace fallen pilots (their aircraft were harder to operate). Over time, this lower replacement rate eroded, then reversed, their initial numbers advantage. They also lagged behind in various other areas of aviation technology: “navigation aids, drop tanks, self-sealing tanks, chaff, air-to-surface radar.” Some of these factors emerged only during the war. But others were clear beforehand, and analysts could have noticed them. In truth, Hanson writes, Lindbergh and many others were “hypnotized by Nazi braggadocio and pageantry.” The Nazis were apparently hypnotized, too. As a land-based power with a small navy, they needed the Luftwaffe to perform miracles (for instance, bombing Britain into submission). They did not see the Luftwaffe realistically; they deluded themselves into believing it could do the impossible.

“The Second World Wars” takes an unusual approach to its subject. The book is not a chronological retelling of the conflict but a high-altitude, statistics-saturated overview of the dynamics and constraints that shaped it. Hanson is unusual, too: he is a classicist and a specialist in military history at Stanford’s Hoover Institution, where he edits Strategika, “an online journal that analyzes ongoing issues of national security in light of conflicts of the past”; he’s also an almond farmer and a conservative polemicist whose articles on race, immigration, and the decline of agrarian values appear regularly on National Review’s Web site and other places. I’ve long found his political commentary tiresome—but his deeply researched and detailed military analyses are fascinating. “The Second World Wars” confines itself to the latter subject, with spectacular results. Hanson starts with the idea that the Axis powers were more or less destined to lose, then works backward to understand the reasons for their defeat. The book revolves around a question highly relevant to our own brewing confrontation with North Korea: Why, and how, do weaker nations convince themselves, against all evidence to the contrary, that they are capable of defeating stronger ones?

Hanson begins by putting the Second World War in a “classical context.” Although it was a high-tech conflict with newly lethal weapons, he writes, it still followed patterns established over millennia: “British, American, Italian, and German soldiers often found themselves fortifying or destroying the Mediterranean stonework of the Romans, Byzantines, Franks, Venetians, and Ottomans.” In many instances, military planners on both sides ignored the lessons of the past. Some lessons were local: it’s always been hard to “campaign northward up the narrow backbone of the Italian peninsula,” for example, which is exactly what the Allies struggled to do. Others were universal. Small countries have difficulty defeating big ones, because—obviously—bigger countries have more people and resources at their disposal; Germany, Italy, and Japan, therefore, should have been more concerned about their relatively small size compared to their foes. History shows that the only way to win a total war is to occupy your enemy’s capital with infantrymen, with whom you can force regime change. Hitler should have paused to ask how, with such a weak navy, he planned to cross the oceans and sack London and, later, Washington. At a fundamental level, it was a mistake for him to attack countries whose capitals he had no way to reach.

In terms of management and logistics, the Axis powers were similarly, and sometimes quite conspicuously, disadvantaged. Before the war, the United States produced a little more than half of the world’s oil; Axis leaders should have known this would be a decisive factor in a mechanized conflict involving tanks, planes, and other vehicles. (The Nazis may have underestimated the importance of fuel because—even though they planned to quickly conquer vast amounts of territory through blitzkrieg—many of their supply lines remained dependent upon horses for the duration of the war.) In general, Allied management was more flexible—British planners quickly figured out the best way to place radar installations, for example—while the Axis powers, with their more hierarchical cultures, tended toward rigidity. Axis leaders believed that Fascism could make up the difference by producing more fanatical soldiers with more “élan.” For a brief time at the beginning of the war, Allied countries believed this, too. (There was widespread fear, especially, of Japanese soldiers.) They soon realized that defending one’s homeland against invaders turns pretty much everyone into a fanatic.

In any event, Hanson shows that the Second World War hinged to an unprecedented extent upon artillery (“At least half of the combat dead of World War II probably fell to artillery or mortar fire”): the Allies had bigger, faster factories and could produce more guns and shells. “The most significant statistic of the war is the ten-to-one advantage in aggregate artillery production (in total over a million large guns) enjoyed by the British Empire, the Soviet Union, and the United States over the three Axis powers.” Russia, meanwhile, excelled at manufacturing cheap, easily serviceable, and quickly manufactured tanks, which, by the end of the war, were better than the tanks the Nazis fielded. Many Allied factories remained beyond the reach of Axis forces. There were a few possible turning points in the war: had Hitler chosen not to invade Russia, or not to declare war on the United States, he might have kept his Continental gains. Similarly, Japan might have contented itself with a few local conquests. But temperance and Fascism do not mix, and the outsized ambitions of the Axis powers put them on a collision course with the massive geographical, managerial, and logistical advantages possessed by the Allies, which, Hanson suggests, they should have known would be insurmountable.

The Axis powers fell prey to their own mythmaking: they were adept at creating narratives that made exceedingly unlikely victories seem not just plausible but inevitable. When the Allies perceived just how far Fascist fantasy diverged from reality, they concluded that Axis leaders had brainwashed their citizens and themselves. They began to realize that “the destruction of populist ideologies, especially those fueled by claims of racial superiority,” would prove “a task far more arduous than the defeat of a sovereign people’s military”:

Sober Germans, Italians, and Japanese, in the Allied way of thinking, had to be freed from their own hypnotic adherence to evil, even if by suffering along with their soldiers. . . . Death was commonplace in World War II because fascist zealotry and the overwhelming force required to extinguish it would logically lead to Allied self-justifications of violence and collective punishment of civilians unthinkable in World War I.

Hanson explores the specific decision-making processes behind the most merciless Allied decisions—“the firebombing of the major German and Japanese cities, the dropping of two atomic bombs, the Allied-sanctioned ethnic cleansing of millions of German-speaking civilians from Eastern Europe, the absolute end of the idea of Prussia”—while, from a higher altitude, pointing out that the delusional ideological fervor that shaped the beginning of the war shaped its end, too.

One of the tragic elements of war, in Hanson’s view, is that it often uncovers a reality that might have been comprehended in advance and by other means. Unfortunately, in the years before the Second World War, confusion reigned. The Axis countries lived in a fantasy world—they believed their own propaganda, which argued that, for reasons of race and ideology, they were unbeatable. The Allies, meanwhile, underestimated their own economic might in the wake of the Great Depression. They allowed themselves to be intimidated by Fascist rhetoric; justifiably horrified by the First World War, they wanted to give pacifism a chance, and so refrained from the flag-waving displays of aggression that might have revealed their true strength, while hoping, despite his proclamations to the contrary, that Hitler might be satisfied with smaller, regional conquests. 

“Most wars since antiquity can be defined as the result of such flawed prewar assessments of relative military and economic strength as well as strategic objectives,” Hanson writes. “Prewar Nazi Germany had no accurate idea of how powerful were Great Britain, the United States, and the Soviet Union; and the latter had no inkling of the full scope of Hitler’s military ambitions. It took a world war to educate them all.”

Sadly, a detailed examination of exactly when and how deterrence averts conflict is beyond the scope of “The Second World Wars.” Instead, with an extraordinary array of facts and statistics, the book offers an account of the fatalism of war. Until it begins, war is a matter of choice. After that, it’s shaped by forces and realities which dwarf the individuals who participate.” ~

Berlin near Hitler's bunker, May 1945


The assertion that neither Germany nor Japan could ever have won WWII was startling. However, the limitations of resources for both countries clearly support this argument, and we can only conclude that the long, arduous, bitter struggle was due to factors other than strength of arms. I think the essence of this “other” strength is a fierce dedication to “the purity of their ideas, no matter how at odds with reality.” This is part of what allowed a prejudice to become an efficient engine of destruction, dedicated to the total obliteration of the scapegoated group.

The irrational, outrageous, unbalanced extreme, housed in the machinery created for it, became unquestionable, unstoppable, impervious to all but the most extreme acts against it. The same holds for the war against Japan — the fanaticism, the delusion, the refusal to countenance defeat led to the extremities of ending that war: in Europe, Dresden and saturation bombing; in the Pacific, Hiroshima and Nagasaki. As though defeat could be achieved only by horrific, massive destruction — a terrible, undeniable, inescapable reality check.

Taking these ideas into our current situation with, for instance, the alt-right or evangelicals suggests that argument, and even discussion, may be useless. Some minds may be changed only by some kind of scorched-earth campaign. Perhaps the best we can do is resist, and be persistent in the pursuit of social justice and scientific fact.

Of course I enjoyed all of the blog topics, but this was the big one for me this time — it invited a different way of thinking about WWII.


In the next blog, I’ll have an article on the same phenomenon regarding the American Civil War: there was no way the South could have won. But crazy ideologies, fueled by selective Bible quotations on both sides, made it a Holy War — and those are the worst and longest wars, defeating logic — though one could argue that the defeat of logic and realistic perception was there at the very inception.

All of Hitler’s generals begged him not to invade Russia — but they could not prevail against this one mediocre man’s delusions. Contrary to myth, Hitler was not an “evil genius” — the word “genius” doesn’t apply. He had an actor’s gift for giving dramatic speeches, and some talent in the visual arts (not great talent, but some). His intelligence, though, was mediocre at best, his education limited, his writing ability pathetic (try reading Mein Kampf) — and there is the distinct possibility that his brain was malfunctioning due to a mustard-gas injury during WWI, and later, due to the drug cocktail his feel-good doctor was administering. So, on top of all the other disadvantages Germany was dealing with, Hitler was a drug addict.

Also — and I'm surprised that the article doesn’t mention it — the extermination of the Jews was itself a costly project that diverted a lot of resources. It was sheer insanity. But that is just the point of the article: Germany could have gained a lot of territory in Europe had it gone about it in a rational way, but what we see instead is insanity. It may seem that insanity carries with it its own ultimate defeat — but only as long as nuclear weapons are not being used, which is what makes our own era more terrifying.



~ “Christianity was in chaos in its early days, with some sects declaring the others heretics. And then, in the early 300s, Emperor Constantine of Rome declared he had become a follower of Jesus, ended his empire’s persecution of Christians and set out to reconcile the disputes among the sects. Constantine was a brutal sociopath who murdered his eldest son, decapitated his brother-in-law and killed his wife by boiling her alive, and that was AFTER he proclaimed that he had converted from worshipping the sun god to being a Christian. Yet he also changed the course of Christian history, ultimately influencing which books made it into the New Testament.”

“Things that are today accepted without much thought were adopted or reinforced at Nicaea. For example, the Old Testament was clear in declaring that God rested on the seventh day, making it the Sabbath. The seventh day of the week is Saturday, the day of Jewish worship and rest. (Jesus himself invoked the holiness of the Jewish Sabbath.) The word Sunday does not appear in the Bible, either as the Sabbath or anything else. But four years before Nicaea, Constantine declared Sunday as a day of rest in honor of the sun god.
At Nicaea, rules were adopted regarding the proper positions for prayer on Sundays—standing, not kneeling; nothing was said of the Jewish Sabbath or Saturday. Many theologians and Christian historians believe that it was at this moment, to satisfy Constantine and his commitment to his empire’s many sun worshippers, that the Holy Sabbath was moved by one day, contradicting the clear words of what ultimately became the Bible.

And while the Bible mentioned nothing about the day of Jesus’s birth, the birth of the sun god was celebrated on December 25 in Rome; Christian historians of the 12th century wrote that it was the pagan holiday that led to the designation of that date for Christmas.”


Then there is what many fundamentalist Christians hold to be the most important of all elements of the Bible: the Second Coming of Christ and the end of the world. What modern evangelicals want to believe cannot be reconciled with the Bible. In the Gospel of Mark, Jesus says of the Apocalypse, “This generation shall not pass, till all these things be done”—in other words, the people alive in his time would see the end of the world. Paul in 1 Corinthians is even clearer; he states, “The time is short.” He then instructs other Christians, given that the end is coming, to live as if they had no wives, and, if they buy things, to treat them as if they were not their own. Some evangelicals counter these clear words by quoting 2 Peter as saying that, for God, one day is like 1,000 years.

Two problems: That does nothing to counter what either Jesus or Paul said. And even in ancient times, many Christian leaders proclaimed 2 Peter to be a forgery, an opinion almost universally shared by biblical scholars today.


“The Barna Group, a Christian polling firm, found in 2012 that evangelicals accepted the attitudes and beliefs of the Pharisees—religious leaders depicted throughout the New Testament as opposing Christ and his message—more than they accepted the teachings of Jesus.”

The Trinity—the belief that Jesus and God are the same and, with the Holy Spirit, are a single entity—is a fundamental, yet deeply confusing, tenet. So where does the clear declaration of God and Jesus as part of a triumvirate appear in the Greek manuscripts?

Nowhere. And in that deception lies a story of mass killings.

The Sociopath Emperor

Why would God, in conveying his message to the world, speak in whispers and riddles? It seems nonsensical, but the belief that he refused to convey a clear message has led to the slaughter of many thousands of Christians by Christians. In fact, Christians are believed to have massacred more followers of Jesus than any other group or nation.

Those who believed in the Trinity butchered Christians who didn’t. Groups who believed Jesus was two entities—God and man—killed those who thought Jesus was merely flesh and blood. Some felt certain God inspired Old Testament Scriptures, others were convinced they were the product of a different, evil God. Some believed the Crucifixion brought salvation to humankind, others insisted it didn’t, and still others believed Jesus wasn’t crucified.

Constantine convened a meeting in the lakeside town of Nicaea. Invitations were sent around the world to bishops and leaders of various sects, although not all of them. The group included the educated and the illiterate, zealots and hermits. Constantine arrived wearing jewels and gold on his scarlet robe and pearls on his crown, eager to discuss the true essence of a poor carpenter who had died 300 years before.


Constantine sided with those who believed Jesus was both God and man, so a statement of belief, called the Nicene Creed, was composed to proclaim that. Those who refused to sign the statement were banished. Others were slaughtered. After they had returned home and were far from Rome, some who signed the document later sent letters to Constantine saying they had only done so out of fear for their lives.

About 50 years later, in A.D. 381, the Romans held another meeting, this time in Constantinople. There, a new agreement was reached—Jesus wasn’t two, he was now three—Father, Son and Holy Ghost. The Nicene Creed was rewritten, and those who refused to sign the statement were banished, and another wholesale slaughter began, this time of those who rejected the Trinity, a concept that is nowhere in the original Greek manuscripts and is often contradicted by them.

To this day, congregants in Christian churches at Sunday services worldwide recite the Nicene Creed, which serves as affirmation of their belief in the Trinity. It is doubtful many of them know the words they utter are not from the Bible, and were the cause of so much bloodshed. (Some modern Christians attempt to use the Gospel of John to justify the Trinity—even though it doesn’t explicitly mention it—but they are relying on bad translations of the Greek and sentences inserted by scribes.)

To understand how what we call the Bible was made, you must see how the beliefs that became part of Christian orthodoxy were pushed into it by the Holy Roman Empire. By the fifth century, political and theological councils had voted on which of the many Gospels in circulation were to make up the New Testament. With the power of Rome behind them, the practitioners of this proclaimed orthodoxy wiped out other sects and tried to destroy every copy of their Gospels and other writings.

And recall that they were already working from a fundamentally flawed document. Errors and revisions by copyists had been written in by the fifth century, and several books of the New Testament, including some attributed to Paul, are now considered forgeries perpetrated by famous figures in Christianity to bolster their theological arguments. It is small wonder, then, that there are so many contradictions in the New Testament. Some of those contradictions are trivial, but some create huge problems for evangelicals insisting they are living by the word of God.” ~


I’ve been familiar with much of this stuff for some years now — forgeries, additions to the text by medieval scribes, problems with translation, contradictions, etc. What I find most interesting of all is that there is so much discussion of religion, a veritable explosion of it. I think it started not long after 9/11, which was a traumatic awakening on all kinds of levels. Now politics has moved ahead, but there is no going back to the old-time polite lies in the discussion of religion.

One important function of religion has been to support those in power, and I'm grateful to Newsweek for providing the scriptural reference: Romans 13:1-2, which in the International Standard Version says, “The existing authorities have been established by God, so that whoever resists the authorities opposes what God has established, and those who resist will bring judgment on themselves.”


“There are no last words. Any sacred text lasting millennia will have long since drifted far from its original meaning.” ~ Jeremy Sherman


Biologists have puzzled over the resilience of the germline for 130 years, but the phenomenon is still deeply mysterious.

Over time, a cell’s proteins become deformed and clump together. When cells divide, they pass that damage to their descendants. Over millions of years, the germline ought to become too devastated to produce healthy new life.

“You take humans — they age two, three or four decades, and then they have a baby that’s brand new,” said K. Adam Bohnert, a postdoctoral researcher at Calico Life Sciences in South San Francisco, Calif. “There’s some interesting biology there we just don’t understand.”

On Thursday in the journal Nature, Dr. Bohnert and Cynthia Kenyon, vice president for aging research at Calico, reported the discovery of one way in which the germline stays young.

Right before an egg is fertilized, it is swept clean of deformed proteins in a dramatic burst of housecleaning.

Clumping proteins are involved in many diseases of old age, such as Alzheimer’s. Dr. Kenyon and Dr. Bohnert set up an experiment using a special strain of C. elegans worms in which clumping proteins glowed.

It begins with a chemical signal released by the sperm, which triggers drastic changes in the egg. The protein clumps within the egg “start to dance around,” said Dr. Bohnert.

The clumps come into contact with little bubbles called lysosomes, which extend fingerlike projections that pull the clumps inside. The sperm signal causes the lysosomes to become acidic. That change switches on the enzymes inside the lysosomes, allowing them to swiftly shred the clumps.

“It’s a huge, coordinated shift,” said Dr. Bohnert.

The germline may not be the only place where cells restore themselves in this way.

Throughout our lives, we maintain a supply of stem cells that can rejuvenate our skin, guts and brains. It may be that stem cells also use lysosomes to eradicate damaged proteins.

“That would have huge implications,” Dr. Conboy said. It might be possible, for example, to treat diseases by giving aging tissues a signal to clean house.

Calico, founded by Google in 2013, is searching for drugs to counter aging. But Dr. Kenyon doesn’t see new medicine emerging from this research anytime soon.


~ "What happens when you first start drinking," Tabakoff explains, "is that a hormone that controls your water balance, an anti-diuretic hormone, is suppressed." And this leaves us heading for the ladies' or men's room — which can precipitate a pounding headache in the morning.

But Tabakoff says dehydration is not the only reason we get a headache.

"High levels of alcohol in the brain have fairly recently been shown to cause neuro-inflammation, basically, inflammation in the brain," he says.

This is why taking aspirin or other anti-inflammatory medicines, such as ibuprofen, can help us feel better.

Now, alcohol isn't the only headache-producing culprit in our drink glasses. Many alcoholic beverages, such as wines and beers, contain toxic byproducts of fermentation, such as aldehydes. And Tabakoff says if you drink too much, you can feel the effects.

"If these compounds accumulate in the body," explains Tabakoff, "they can release your stress hormones, like epinephrine and norepinephrine, and as such can alter function in a stresslike way" — paving the way for a hangover.

Tabakoff says distilled spirits contain fewer of these toxic compounds than other types of booze, which explains why some people report feeling fewer hangover effects if they stick with vodka or gin.

Obviously, the only sure way to avoid a hangover is to not drink alcohol. But if you are going to indulge, Tabakoff says the tried-and-true advice — eat something before you drink, and while you drink — makes good sense.

"Food is very good for the purpose of slowing the absorption of alcohol," he says.

Adding liquid calories to your cocktails — say, Coke, ginger ale or sugary punch as a mixer — is a good way to slow absorption, too. In fact, a study we reported on back in 2013 determined that a diet soda and rum will make you drunker than rum mixed with sugary Coke.

Cecile Marczinski, a cognitive psychologist who authored that study, found that the average breath alcohol concentration was .091 (at its peak) when subjects drank alcohol mixed with a diet drink. By comparison, BrAC was .077 when the same subjects consumed the same amount of alcohol but with a sugary soda.

And here's another self-evident tip when it comes to drinking: Pace yourself.

"We can get rid of most of the alcohol we drink if we [limit] drinking to one drink per hour," Tabakoff says. This way, "our blood alcohol levels don't start accumulating."

One drink per hour is a rule of thumb, but that can vary depending on height or body size. Bigger people tend to be able to handle a little more alcohol, and smaller people a little less.

A single drink is less than you might think. It's 5 ounces of wine, 12 ounces of beer, or a shot of liquor.
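For the curious, the rule of thumb above can be turned into rough arithmetic. This is an illustrative sketch using the classic Widmark formula, which is not mentioned in the article — the function name, the distribution ratios, and the 0.015 elimination rate are my assumptions from that standard formula, and none of this is medical advice:

```python
# Rough blood-alcohol estimate via the Widmark formula.
# Illustrative only -- NOT from the article, NOT medical advice.

def estimate_bac(standard_drinks, weight_kg, hours, male=True):
    """Estimate blood alcohol concentration (g/dL) after drinking.

    One US standard drink is about 14 g of pure alcohol
    (5 oz of wine, 12 oz of beer, or a shot of liquor,
    per the article). r is Widmark's body-water ratio.
    """
    grams_alcohol = standard_drinks * 14.0
    r = 0.68 if male else 0.55
    bac = grams_alcohol / (weight_kg * 1000 * r) * 100  # as g/dL
    bac -= 0.015 * hours  # average elimination rate per hour
    return max(bac, 0.0)

# Bigger people tend to handle a little more alcohol:
# the same three drinks over two hours leave a lighter
# person with a noticeably higher estimate.
print(round(estimate_bac(3, 90, 2), 3))
print(round(estimate_bac(3, 60, 2), 3))
```

Running the two calls shows why "one drink per hour" is only a rule of thumb: at that pace the elimination term roughly keeps up for a larger body, while a smaller one falls behind.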

ending on beauty:

Ring out the old, ring in the new,
Ring, happy bells, across the snow:
The year is going, let him go;
Ring out the false, ring in the true.

~ Tennyson 
