Sunday, October 30, 2016



Rivers rang solid as cobbled highways.
Wine froze in glasses.
Birds fell from the sky.
In deadliest winter of oblivious time,
Tzarina Anna commissioned an ice palace.

In the middle of the Neva,
artisans fashioned finest ice
into porticoes and columns,
ice statues set in niches
in the style of an Italian villa. Inside,

the frozen furniture was carved
down to an ice clock and a box of ice snuff.
The walls glimmered a faint blue.
Candelabras floated
as in a thousand mirrors.

In the ice orchard hung crystal rows
of ice oranges, painted orange.
Six ice cannons guarded the gate.
An ice elephant curled its trunk,
a man blowing a trumpet hidden inside.

This luminous fantasy for the sake
of a practical joke: a courtier and his bride
transported in an iron cage
on a live elephant, to spend
their wedding night on a bed of ice —

ice logs in the fireplace,
guards stationed to prevent escape.
The couple survived
by dancing all night,
blue-lipped, slipping and falling.

A winter like that always carried off
the sick and the poor.
To bury the dead you had to thaw
the ground with fire. Thirty thousand
roubles, a Tzarina’s whim. By June

the palace was melted to a few
dirty ice floes drifting in the river.
The architect was in prison,
awaiting execution. The Tzarina ruled
for one more year, amid beheadings

and ballet. Then, like a bad queen
in a fairy tale, she was forgotten.

~ Oriana © 2016

Valery Jacobi, The Ice House, St. Petersburg
The woman in the gold dress is Tzarina Anna. The “Ice House” was built in the winter of 1739-1740.

Europe’s “Little Ice Age” actually lasted several centuries, from the early 14th century through the early 19th century (the last frost fair on the frozen Thames was held in 1814). The High Middle Ages were a warm period (the so-called Medieval Warm Period); then the weather got much colder. The main theory is decreased solar activity combined with increased volcanic activity.

One interesting theory is that deforestation of Northern Europe resulted in large areas of land becoming agricultural fields; especially when covered with snow, these fields reflect a lot more sunlight, causing a cooling (even now, rural areas are significantly cooler than cities). Increased levels of carbon dioxide in the atmosphere eventually overcame this side effect of agriculture. 

Here is a summary of all the theories of what caused this long-lasting period of cold temperatures called the Little Ice Age:

A drawing of the Sun made by Galileo on 23 June 1613 showing the positions and sizes of a number of sunspots. Galileo was one of the first to observe and document sunspots. Image credit: The Galileo Project/M. Kornmesser


~ “If history is any guide the [recent] “natural” disasters may be just the beginning—at least, that’s the implication of a comprehensive new book by British historian Geoffrey Parker, “Global Crisis: War, Climate Change, and Catastrophe in the Seventeenth Century.” Behind a tumultuous and grueling series of revolutions, wars, and famines, which ultimately killed off a third of the human population, was a culprit, he writes: a period of global cooling known as the Little Ice Age. Extreme weather caused crop failures, which led to hunger, disease, and forced migrations, which in turn translated to political and social chaos. 

The idea for “Global Crisis” first occurred to Parker nearly 40 years ago, when he heard a radio interview with solar physicist John A. Eddy, who had discovered that the reign of Louis XIV, from 1643 to 1715, coincided with a unique solar phenomenon: a lack of sunspots. (Scientists believe the lack of sunspots contributed to the Little Ice Age.) Eddy mentioned that this was particularly interesting because during this period “there were some wars, and some revolutions,” Parker recalls. “And I thought, yeah, some! That’s an understatement.”

While the book is set squarely in the 17th century, it’s intended as a cautionary tale. And it also offers a lesson about what seems to have worked: Parker finds that survival often hinged on the willingness of central government to take action, coercively or not. “Here’s the evidence,” Parker says, “that climate really does matter, and we need to prepare.”

Parker, who is the Andreas Dorpalen professor of European History at Ohio State University, spoke to Ideas from his home in Ohio.

IDEAS: You had a sense that there was a connection between climate change and this global crisis. Were you surprised by the extent of what you found?

PARKER: Yes. I was always afraid of drawing bull’s-eyes around bullet holes. That’s to say, you find something in the climatic record and it’s roughly equivalent to the human archive and you say, “Oh, wow, A must have caused B,” but then on closer inspection they don’t line up. But in this case . . . the two sources aligned perfectly. Once you find that, you want to find more.

IDEAS: The book details how rulers in a few areas enacted policies that limited the damage. But some of those measures were brutal and unfair and perhaps only possible in an authoritarian state.

PARKER: I’m going to disagree slightly with the second part, which is that only they could make those decisions. These rulers, particularly the Mughal emperors in India and the Tokugawa shogun in Japan, do indeed take executive action which is brutal, stifles opposition, which requires people to give up their freedoms. But they do, in return, provide protection and food. That option was also open to rulers elsewhere. The Chinese emperors, most of the European monarchs also had those powers; they chose not to use them or they used them to wage war . . . In return for survival, it was necessary to sacrifice liberty.

IDEAS: Can you share an example of how the cooling climate interacted with political and social realities to cause devastation?

PARKER: My favorite example occurs in Ireland on the 23rd of October, 1641. There’s an uprising of Irish Catholics, particularly in Ulster, in Northern Ireland. The Catholics rise up against their Protestant neighbors. The Catholics on the whole are tenants, and the Protestants are landowners. And they rise up and they drive them out of their homes....

But the strange thing is, 1641 is the coldest year on record, and snow and ice falls almost immediately. Ireland never has snow in October. And if you read the aftermath of the rebellion, the Protestant survivors overwhelmingly say they’ve survived in spite of the cruel cold. Twice as many of them say those who died died of the cold rather than because a Catholic killed them. So you could say that the Little Ice Age doubled if not more than doubled the casualties.

Why does that matter? Because it’s the scale of the killing that inspires the Protestants for revenge. Eight years later, Oliver Cromwell will arrive in Ireland, he will depopulate the areas with Catholics, send them beyond the Shannon River. There is a huge changeover of land ownership in Ireland that is the root of the problem in Ireland today.

IDEAS: People might read the book and say, “That was then, these things could never happen now.” What do you say to that?

PARKER: Let’s start with last [May], shall we, and the millennial floods in central Europe, for which there was very little preparation, so hundreds of thousands of people lost their homes and a number of people were killed. We could go back to Sandy, to Katrina. Natural catastrophes happen, and some countries prepare for them and some don’t.

We need to protect ourselves, and pay now to avoid paying much more later. If we don’t prepare, we will be like the 17th century. The only difference is that we have the resources to do something and they did not.

IDEAS: It seems making the necessary changes can take a long time. You write about how barriers on the Thames were first proposed after devastating floods in the 1700s, but they weren’t actually funded until 1972.

PARKER: Gradually, those opposed — shipping interests, local governments who said they can’t afford it — were overruled by the central government. It’s the Tokugawa solution. You have to accept a greater view of central government action . . . All over the countries of the world there is a fear of central government. It’s not unjustified. But when it comes to preparing for climate change, only big government has the resources to act in advance. That’s the dilemma we face.

IDEAS: How can history help us in terms of trying to ease resistance to central government actions?

PARKER: History is the best argument for being willing to concede a certain degree of our own autonomy for the greater good. That’s the problem with civil society, isn’t it? Hobbes and Locke both wrote about it in the 17th century, both get enshrined in the Constitution. The reason we have the Bill of Rights, the amendments, is because the Constitution was thought to give too much power to central government. But the more you look at history, the more you realize we’re in a slightly different situation with regard to the climate. The dilemma is, do we pay now to prepare or do we pay a whole lot more later to repair? It’s an individual choice. The decision in the US lies with the states. They have to accept a greater degree of intervention by the federal government.


We certainly see that the Southern states, with their epic floods and tornadoes, and their equally gigantic hatred of the federal government and constant threats to secede (but they never secede when you want them to), also rely most on the federal government when it comes to disaster relief. But the rational solution would be preparation — and only the central government has the resources to initiate large-scale projects. I hate to think how severe the disasters will need to get before anything is done.

“Germany has declared war on Russia. Swimming in the afternoon.” ~ Franz Kafka in his diary entry for August 2, 1914

(one reason not to keep a diary — you may end up sounding like those French aristocrats who wrote on the day the Bastille fell: “Today, nothing.”)



~ “[At twenty-eight], he would cancel his engagement to Regine Olsen to begin a life of celibacy that would also mark the start of his philosophical career: “My engagement to her and the breaking of it,” he wrote, “is really my relationship to God.”

Kierkegaard is widely considered the most important religious thinker of the modern age. This is because he dramatized with special intensity the conflict between religion and secular reason, between private faith and the public world, and he went so far as to entertain the thought that a genuine reconciliation between them is impossible.

Society, for Kierkegaard, is a place of leveling conventions, and the ethical principles that bind us together ignore the genuine self. It is faith alone, uncontaminated by public understanding, that distinguishes the authentic individual, and faith is something wholly interior, a leap into paradox. For Kierkegaard, as for Barth, God remains “wholly other” and cannot be pressed into service for mundane causes.

At their limit such arguments suggest religious absolutism; they extol the believer even if his belief runs against all accepted codes of humanity. In reading Kierkegaard’s works one begins to fear that the individual whom he celebrated as “the knight of faith” too closely resembles that figure upon whom we have heaped so many of the anxieties of our own time: the religious fanatic.

Kierkegaard, a drawing by his cousin Niels Christian Kierkegaard, 1840

But Kierkegaard’s thinking is more subtle than this. A lover of irony, he signed many of his works with pseudonyms: Vigilius Haufniensis (“Watchman of Copenhagen”), Johannes de Silentio, Anti-Climacus, and, perhaps best of all, Hilarius Bookbinder. Everyone in Copenhagen knew that Søren was the author of his works, but this did not deter him from giving his pseudonymous personae a further twist of the pen, at times adding to the title page of a book the scholarly note that it had been “edited by Søren Kierkegaard.”

Daphne Hampson, the author of Kierkegaard: Exposition and Critique, permits herself the unusual gesture of thanking S.K. himself:

    My life would have been subtly different had I not encountered Kierkegaard. He has been a source of delight and edification with his insights and perspicacity. I am moved by his love of God, his sensitivity to others, and his sparkling wit.

But, she adds, Kierkegaard has also helped her to grasp “with greater clarity why I should not wish to be Christian.”

Kierkegaard’s own devotion to his faith was unqualified. Although he was born into financial comfort, his father had arrived as a nearly destitute youth in the Danish metropolis of Copenhagen, and he brought with him a lingering nostalgia for the Moravian pietism that flourished in the Jutland countryside. Søren himself was schooled in the evangelical Lutheran Church of Denmark, but he was racked by doubt about whether its respectable teachings were genuinely Christian. 

The Marxist theoretician Georg Lukács suggested that Kierkegaard’s entire philosophy could be found in his separation from Regine. Either/Or is in fact a meditation on the conflict between two modes of existence, the hedonistic or aesthetic (as described in the “Diary of a Seducer”) and the ethical (in which one awakens to remorse and “holiness”). To the individual who must choose either one life or the other, Kierkegaard says, reason offers no guidance. But at the book’s conclusion we learn that neither path is right, for in prayer alone we find truth: “We gladly confess that in relation to [God] we are always in the wrong.”


In the same year Kierkegaard also published what surely remains his most famous work, Fear and Trembling: A Dialectical Lyric. It is here, hiding behind the pseudonym of “Johannes de Silentio,” that S.K. insists on the stark opposition between public obligation and private faith. The book offers a meditation on the “Akedah,” the biblical episode in which Abraham appears ready to sacrifice his beloved son Isaac in obedience to God’s command and is only prevented from doing so by divine intervention. If this book arouses controversy today it is partly because its author does not flinch from considering the most extreme implications of the biblical tale.

Kierkegaard grants that any reasonable person who wishes to follow social norms must condemn Abraham as a would-be murderer. But we are asked to consider the alternative possibility, that Abraham’s obedience signifies the genuine “paradox” of individual faith without which religion is impossible. From the observer’s point of view this faith must appear preposterous, and we are right to conclude that Abraham’s love of God is “incommensurable with the whole of actuality.”

But for Kierkegaard this is precisely the point: faith entails a paradox by which the individual cannot make himself understood and yet “the particular is higher than the universal.” Abraham embraces the impossible proposition that through sacrifice he will receive his son alive: “Only he who draws the knife gets [back] Isaac,” Kierkegaard wrote. It is commitment, heroic yet absurd, that distinguishes Abraham as a “knight of faith.”

In the nearly endless commentaries on the Akedah some have argued that God never meant for Abraham to go through with the sacrifice. The medieval Jewish commentator Maimonides was especially troubled by the proposal that God “tested” Abraham since this implied that God did not know the outcome of the test—a violation of the principle of divine foreknowledge. For Christians the unrealized sacrifice of Isaac has an important part in Old Testament prophecy since it was thought to prefigure the actual sacrifice of Jesus Christ. For Muslims Abraham’s faith is praiseworthy but they hasten to explain that it was not Isaac who was readied for sacrifice but Ishmael, ancestor to all who worship Allah.

Modern readings of the episode are no less varied. Many have followed Kierkegaard in praising Abraham’s devotion, though few have matched him in analytical intensity. But Hampson is troubled (and rightly so, I think) by a religious disposition that violates our common understandings of humanity. Appealing to feminist criticism, she sees in the God of Kierkegaard a distinctly patriarchal ideal of stern (even violent) authority.

The charge is familiar but it reflects our own cultural preconceptions of male and female conduct. Indeed, the notion that God has gender at all involves an unwarranted lapse into anthropomorphism. The abstract deity of the Protestant West is customarily called “God the Father,” but as a matter of philosophical principle it makes no more sense to imagine God as a conventional man (the strong, silent type) than it does to picture God with the attributes we conventionally assign to women.

More worrisome, in my view, is not Kierkegaard’s lapse into anthropomorphism but rather the opposite: his readiness to amplify the doctrine of Protestant abstraction to a limit where the divine exceeds all understanding. Kierkegaard’s God lies at such a great remove from everyday categories as to contravene our most fundamental and enduring norms of morality. When a parent believes he hears voices that command him to take a knife to his own child our proper response should be not praise for his piety but horror at his self-evident lunacy. Kierkegaard is of course aware that this is our customary belief. But he sees in Abraham’s conduct a “teleological suspension of the ethical,” which is to say, he entertains the thought that religion imposes a higher purpose on us than what ethical reasoning demands.

Here, as Hampson reminds us, Kierkegaard belongs to the tradition called “divine command theory.” God does not command an action because it is good; rather, an action is good only because God commands it. One can admire Kierkegaard for the candor with which he pursues this principle to its perverse conclusion. But it smacks of authoritarianism. If a parent hears such an obscene command we should extol him not if he falls in line but only if he disobeys. Resistance to barbarism, even when it is commanded by the highest of all authorities, is the true mark of the blessed.

 Caravaggio: The Sacrifice of Isaac, 1604

There is a painting called The Sacrifice of Isaac by Caravaggio that hangs in the Uffizi, about which the philosopher J.M. Bernstein has recently written a telling commentary. The look in Isaac’s eyes as Abraham holds the knife to his throat bespeaks not just terror but protest. Though he wishes to be an obliging son, he cannot assent to what his father is determined to do. His resistance is visceral, the cry of a human animal whose desire to live calls into question any ennobling ideas of divine obligation. For Bernstein, the painting is an allegory for the birth of secular consciousness: it expresses a dawning awareness that if faith demands barbarism, then it is faith that must yield. But Kierkegaard’s argument runs in precisely the other direction: when faith and humanity conflict, faith supervenes. He is most blessed who persists in his piety even if he has made himself utterly unintelligible to those around him.

Such arguments run through many of Kierkegaard’s most celebrated works, which ruminate on the chasm between God and humanity, between the individual and the collective.

At stake, Hampson writes, was a basic question regarding the place of religion in modern society:

    Should there be a broad state church, a spiritual home for the Danish people, which could provide a focus for the nation in times of crisis or rejoicing, able also to offer guidance and comfort to individuals whether or not they normally attended Christian services, a place where they might negotiate life’s transitions. Or by contrast should there be…a “confessing church,” standing for defined and stringent Christian beliefs, based on a certain reading of the New Testament, its members apparently ready . . . for “martyrdom”?

The question is still with us today, and not only for Christianity but for all faiths that confront a choice between solitude and society, individual purity and the comforts of belonging.


But Kierkegaard was an unbending conservative, and the political consequences of his religious absolutism remain uncertain. His hatred of the mob, for instance, fosters a healthy skepticism toward political conformity but also a disabling contempt for the public good. A rather different line of influence connects him to illiberal critics of modern democracy such as Carl Schmitt, the Nazi legal theorist who cited the Dane as an authority when he claimed that the ultimate problems in politics require radical decision, not reasonable deliberation.
There is a deep irony in Kierkegaard’s rebellion. Although he imagined himself a critic of modern convention, his individualist bid to wrest himself free of social constraint was a highly modern ideal. As the philosopher Charles Taylor has explained, in the secular age even those who cleave to a conventional faith conceive of their religion as one option among many: it is something an individual must will to have and no longer something one is merely given as an artifact of collective history. The age of religious reform was but one stage in the historical process that Taylor calls the “disembedding” of the self from shared traditions of meaning.

Ironically, the desire to stand as an authentic individual beyond all such traditions is the greatest conceit of the bourgeois era and Kierkegaard was in this respect far more conformist than he cared to admit. And yet none of us wishes wholly to surrender this desire for authenticity since it is also the very sign of possibility itself, the hope that life might be otherwise than it is. To abandon this hope is to give up on possibility altogether. Against all the forces that counsel resignation Kierkegaard remains not just the knight of faith but something more: the eternal child.” ~


That line between being a devout believer and a religious fanatic is awfully thin . . . When a person is willing to sacrifice the well-being of his family (for instance) for the sake of an all-consuming relationship with an imaginary being, that's where a problem may arise. I’ve read of only one such case — it’s probably quite rare nowadays.

But Kierkegaard seemed to be advocating an uncompromising piety and blind obedience to divine command, a willingness to submit and sacrifice like Abraham. The danger of putting anything above ethics is self-evident. 

Tangentially, some suggest that while conventional (mainstream or “liberal”) Christianity is bound to disappear, those who are truly devout will embrace radically strict churches. We will see them as pious when their activities benefit others; if they hurt others or even themselves (e.g. sexual repression and the almost inevitable pathologies that follow), we will call them religious fanatics.

Kierkegaard would no doubt agree with the traditional idea that a saint is not necessarily a good person, but someone who has an intense personal relationship with god. That may have been good enough in the Middle Ages; if someone spent many hours every day praying, that was saintly. I think now we judge more by the “fruits” of their activity and by the humanistic yardstick of whether a person does something good for others.

Likewise, the idea that the opposite of sin is not virtue but faith appears to be dangerously close to fanaticism. Again, obedience to an imaginary being is primary, human beings coming second. Of course to a believer this imaginary being is real and allegedly exists “out there” and not just inside the believer’s mind. But to a non-believer, this is delusion. It can be a benign delusion that encourages acts of kindness, or a malignant delusion (oddly enough, there are Christian hate groups; Islam has no monopoly on hate).


~ It will be easy for us once we receive the ball of yarn from Ariadne (love) and then go through all the mazes of the labyrinth (life) and kill the monster. But how many are there who plunge into life (the labyrinth) without taking that precaution? ~

[Oriana: I suspect that by “monster” Kierkegaard means despair. For me personally, one’s vocation is the kind of love that can guide and protect. But the love in the sense of affection that we give and receive also guides and protects.

By the way, it’s not that we “choose” to plunge into the labyrinth of life with or without Ariadne's thread. We are forced to plunge whether or not we have received enough love — or have found our vocation/talents already in childhood, and had the ability to develop those talents and pursue that vocation.]

~ If I were to wish for anything, I should not wish for wealth and power, but for the passionate sense of the potential, for the eye which, ever young and ardent, sees the possible. Pleasure disappoints, possibility never. And what wine is so sparkling, what so fragrant, what so intoxicating as possibility! ~

~ The more one suffers, the more one has a sense for the comic. ~

[Oriana: This reminds me of “Tragedy is for the young, who haven’t yet experienced it for real; only they can afford it.” ~ A. S. Byatt, in a Paris Review interview. Once we’ve lived for a while, we see that life is a mixed-genre play: tragedy and comedy all mixed up. Black humor always seems to surface, and so does heavy irony.]

~ It belongs to the imperfection of everything human that man can only attain his desire by passing through its opposite. ~

~ The tyrant dies and his rule is over; the martyr dies and his rule begins. ~ [Oriana: This sounds wonderful, but we need only remember suicide bombers to be horrified.]

~ To have faith is to lose your mind and to win God. ~ [note Kierkegaard’s rejection of reason; this is the famous Kierkegaardian “leap of faith”]

About Kierkegaard:

    History has a way of reducing individuals to flat, two-dimensional portraits. It is the enemy of subjectivity, which is why Stephen Dedalus called it "a nightmare from which I am trying to awake". If we think of Kierkegaard, of Nietzsche, of Hölderlin, we see them standing alone, outside of history. They are spotlighted by their intensity, and the background is all darkness.

They intersect history, but are not a part of it. There is something anti-history about such men; they are not subject to time, accident and death, but their intensity is a protest against it. I have elsewhere called such men "Outsiders" because they attempt to stand outside history, which defines humanity in terms of limitation, not of possibility. ~ Colin Wilson in Rasputin and the Fall of the Romanovs, pp. 13-14 (1964)

Regine Olsen, Kierkegaard's one-time fiancée. He dedicated all his books to her. 


The 2016 election campaign has completely upstaged Halloween.


In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

Throughout his undergraduate and postgraduate work, Spribille became an expert on the organisms that had grabbed his attention during his time in the Montana forests—lichens.

In the 150 years since the Swiss botanist Simon Schwendener first proposed that lichens are a partnership between a fungus and an alga, biologists have tried in vain to grow lichens in laboratories. Whenever they artificially united the fungus and the alga, the two partners would never fully recreate their natural structures. It was as if something was missing—and Toby Spribille might have discovered it.

He has shown that the largest and most species-rich group of lichens are not alliances between two organisms, as every scientist since Schwendener has claimed. Instead, they’re alliances between three. All this time, a second type of fungus has been hiding in plain view.

“There’s been over 140 years of microscopy,” says Spribille. “The idea that there’s something so fundamental that people have been missing is stunning.” 

Down a microscope, a lichen looks like a loaf of ciabatta: it has a stiff, dense crust surrounding a spongy, loose interior. The alga is embedded in the thick crust. The familiar ascomycete fungus is there too, but it branches inwards, creating the spongy interior. And the second fungal partner, the basidiomycetes? They’re in the outermost part of the crust, surrounding the other two partners. “They’re everywhere in that outer layer,” says Spribille.

Despite their seemingly obvious location, it took around five years to find them. They’re embedded in a matrix of sugars, as if someone had plastered over them. To see them, Spribille bought laundry detergent from Wal-Mart and used it to very carefully strip that matrix away.

Unless you know what you’re looking for, there’s no reason why you’d think there are two fungi there, rather than one—which is why no one realized for 150 years. Spribille only worked out what was happening by labeling each of the three partners with different fluorescent molecules, which glowed red, green, and blue respectively. Only then did the trinity become clear.

“The findings overthrow the two-organism paradigm,” says Sarah Watkinson from the University of Oxford. “Textbook definitions of lichens may have to be revised.”

“It makes lichens all the more remarkable,” adds Nick Talbot from the University of Exeter. “We now see that they require two different kinds of fungi and an algal species. If the right combination meet together on a rock or twig, then a lichen will form, and this will result in the large and complex plant-like organisms that we see on trees and rocks very commonly. The mechanism by which this symbiotic association occurs is completely unknown and remains a real mystery.”


Migraine sufferers have a different mix of gut bacteria that could make them more sensitive to certain foods, scientists have found.

The research showed that migraine sufferers had higher levels of bacteria that are known to be involved in processing nitrates, which are typically found in processed meats, leafy vegetables and some wines.

The latest findings raise the possibility that migraines could be triggered when nitrates in food are broken down more efficiently, causing vessels in the brain and scalp to dilate.

When nitrates in food are broken down by bacteria in the mouth and gut they are eventually converted into nitric oxide in the blood stream, a chemical that dilates blood vessels and can aid cardiovascular health by boosting circulation.

However, around four in five cardiac patients who take nitrate-containing drugs for chest pain or heart failure report severe headaches as a side effect.

Dr Brendan Davies, a consultant neurologist at the University Hospitals of North Midlands and a trustee of the Migraine Trust, said the idea of gut bacteria playing a role in migraine was medically plausible. “There’s something called a hot dog headache, where nitrates are suspected to be involved,” he said. “This is interesting work, but would need to be confirmed.”

The study, published on Tuesday in the journal mSystems, sequenced bacteria found in 172 oral samples and 1,996 faecal samples from healthy participants, who had also reported whether they were affected by migraines.

In both oral and faecal samples, people with migraines had slightly higher levels of bacteria linked to breaking down nitrates.

The scientists are now planning a controlled diet study of migraine sufferers to see whether nitric oxide levels in the bloodstream are linked to migraine attacks.

ending on beauty

Yellow and heavy, a last ray of the sun
Congeals in a brilliant dahlia bouquet.
As in a dream I hear a viol
And the thin strains of a clavecin.

~ Anna Akhmatova, “Evening Room,” 1912

Viol and clavecin are historical instruments, invoking past elegance. Even that last ray of the sun “congeals” in a dahlia bouquet. Someone, perhaps a widow, still maintains the room, putting fresh flowers in the vase — but it’s a place of the past, of youth and love long gone. 


Ice Palace is a great poem. I like the last stanza the best.

ABRAHAM: A ROLE MODEL, OR A REVOLTING BARBARIAN? Your explanation has more wisdom than the Bible.

I’d rather befriend a person with an 800 credit rating than a person who sits and meditates and “works on himself” all day.

Favorite image: sunspots drawn by Galileo.


A reminder: My recent blogs contain a lot of excerpts from articles. Please credit the article, and not me, except when it comes to the commentary after the link.

The statement about Caravaggio’s painting is especially perceptive: “The painting is an allegory for the birth of secular consciousness: it expresses a dawning awareness that if faith demands barbarism, then it is faith that must yield.” But then it’s bizarre that Abraham failed to haggle with Yahweh, as he’d done on at least one other occasion: how many just men in Sodom would save the city. The willingness to sacrifice Isaac is a nasty instance of blind obedience. We can excuse it only on the grounds that it was the barbarous culture of the time. The same barbarous sacrifice is repeated in the Christian interpretation of the crucifixion — and note that by that point human sacrifice was no longer practiced in Judaism.

Apologists may say that what matters in the story of Isaac is the last-minute reprieve — a sign that the willingness to sacrifice Isaac was sufficient, but human sacrifice would no longer be required. But it’s precisely that willingness to kill that appalls us and must be condemned. Nor is the problem merely archaic — willingness to kill in the name of religion is still present in the world.

People who meditate for hours on end — I mean those who really put a huge amount of time into it — seem to me just like people who pray for a long time every day (it’s rare now, but I used to see some very devout women in my childhood). It’s an escape from life, from relationships — and even from activities like gardening, which certainly involves the risk of failing, the nastiness of pests and diseases. Life demands coping, meaning action rather than contemplation. I regret every minute spent praying — how lovely it would have been to have spent that time learning Italian!

(Of course some meditation is fine — but the wisdom of “nothing in excess” pertains to religious practices as much as anything else.)

Galileo’s drawing of sunspots makes me appreciate his genius all the more.

Sunday, October 23, 2016


Warsaw: Lazienki Castle, First Snow


Other grandmothers knitted.
Mine only crocheted.
And exclusively hairnets.
Ever since I was a toddler,

I remember her that way,
with a little silver hook,
spiraling around and around
the nothing at the top.

Endless hairnets! 
She kept her hair short.
Even after eighty,
it was only beginning to gray.

Her hairnets were brown
or black, the yarn so fine
the hairnet hardly showed.
It was not about need.

Only now I see
it was about that spiraling
around empty space,
the eye of wisdom that opens

when you come to know
how in one moment
you can lose all, except
your own soul.

Everything else
is a ball of yarn.
It’s about the flight
of the hook.

~ Oriana © 2016

Quickly: when I use the word “soul,” I don’t mean the detachable little ghost that’s supposed to leave the body at the moment of death, a brain-free consciousness or self that continues to live on for eternity. Rather, I mean the core values, the innermost essence. But once published, the poem belongs to the reader who may read his or her meaning into this undefinable word. And that’s fine with me.

“In one moment you can lose all” — I had in mind specifically the fact of being taken to Auschwitz, losing not only your possessions (how minor that really is) but your whole former life — your profession, your social identity, your human rights — everything but that very core of yourself that you could still preserve.

For me the moment of a similar overwhelming multiple loss was leaving Poland and coming to America. Of course at the moment of leaving I didn’t yet comprehend the loss. That came later, when I saw that indeed “in the morning I had a homeland; / in the evening I had two suitcases.” What came even later, after the loss of home and family, the language and the culture, was the perception that I still had my “homeland of the mind.” Eventually I also made a home in poetry, but my first refuge was simply my intellect.

Kraków; photo: Ania Maria


~ “Last night for the third time in as many months I found myself explaining to someone raised outside of a devoutly religious environment that religious people are not stupid simply because they believe nonsensical things.

Very often they flatly disagree and insist that anyone who believes in things like demons and angels and Young Earth Creationism must be morons. But then like last night they get a puzzled expression as they sit across from me and finally admit, “The thing is, you don’t seem stupid to me. So how on earth did you ever believe such things?”


The first thing you have to realize is that intelligence is compartmental. By that I mean that people who employ sharp wit and critical thinking about one area of life (or even multiple areas) can still remain almost juvenile about a number of others. One need only look at how adept many of history’s greatest thinkers were at parsing ideas related to their own field of expertise but were complete disasters in their personal lives because they could never wrap their heads around the intricacies of human social interaction.

To see what I mean by compartmental intelligence, look no further than Ben Carson, who distinguished himself as a pioneering brain surgeon but who displays the political acumen of a remedial third grader. Or consider another less-well-known medical example whom I’ve mentioned here before: The last Sunday School teacher I had before leaving the church is a world-class oncologist who chairs an international committee on research protocols in his medical field, but he also studies “creation science” as a hobby. He uses up-to-date, state-of-the-art treatments for fighting cancer but gets all of his geological theories from the Institute for Creation Research, which quit putting out new theories in the early 1970’s (or as some would argue, in the late Bronze Age).


Another thing you must realize is that very intelligent people will believe very nonsensical things if you get to them young enough. When you grow up in an environment which takes for granted that a system of belief is sacred, your knowledge base and your critical thinking skills grow up around that belief structure in such a way as to leave it undisturbed. In fact, an argument could be made that without the checks and balances of the scientific method, human reasoning only serves to rationalize and validate the emotional content already in place in our psyches from our earliest years. We think in order to rationalize what we already believe.


Another thing which is almost impossible to grasp if you were never devout is how deeply we were taught to distrust ourselves. The notion of sin and human brokenness is bedrock to the Christian message, and the church drove this home to us before we even learned to read and write. We learned at an early age that human reasoning cannot be trusted.
“For as the heavens are higher than the earth, so are my ways higher than your ways, and my thoughts than your thoughts.”

With a narrative like that, is it any wonder that Christians grow up suspicious of the life of the mind? We were taught to distrust our own intellects even within those subcultures which otherwise valued science, education, and exploration (I know that’s inconsistent but see point #1). We learned early on that when our powers of logic and reasoning conflict with the teachings of our faith, we should privilege “what God says” over what anyone else thinks makes sense. Who can disagree with God himself, amirite?


And finally, people who did not grow up thoroughly enveloped by a community of faith will find it difficult to appreciate how heavily the social pressure to remain faithful keeps us from freely embracing our own cognitive dissonance. I recall clearly how apprehensive I became each time I collided with my own inner skeptic, realizing how costly it would be for me if my pursuit of reality ever led me outside the Christian fold. I knew long before I finally became honest with myself that I could lose everything, and for the most part I was right. When your whole life is built around an idea, challenging that idea shakes you to the core of who you are, both psychologically and socially. For some of us, this threatens to demolish our entire world.

And that’s why we hold on to irrational beliefs long after our own critical thinking skills seem like they should have outgrown these inferior ideas. Those ideas were always privileged for us, and it’s not as easy as it sounds to shake them when they are the very house in which you live.” ~



I freak out when I remember what bizarre things I used to believe when still a Catholic. A devil perched on my left shoulder, whispering temptations to sin. Behind me, or slightly to the right, my Guardian Angel. The world full of angels and saints and demons, of course, a sky filled with ghosts (with vastly more ghosts right underfoot, in hell). And this vast world, with billions of people, was ruled by the Invisible Man in the Sky who could (and did) read every thought in everyone’s head.

I was told those things at the age of eight, and at the age of ten I still believed them. Serious doubt didn’t begin until the age of twelve or so. And only at fourteen did doubt finally win.

(Other religions were of course crazy, absurd. A believer, blind to the absurdities of her own creed, stands agape at the thought that anyone could believe in Zeus.)

The only thing I could never believe was the idea that god was good. Now that was just too absurd. God was blatantly evil. He was cruel. He out-Hitlered Hitler. There was no way I could love an evil god, so I knew I was doomed to eternal damnation. I did believe in that, and to believe it was a sin against the Holy Ghost, the one sin that would not be forgiven. There seemed to be no way out.

“Suffering is good for you”; “Human reason is very weak,” (i.e. “you’re too dumb to understand, so shut up”); “You are a sinner who deserves eternal damnation” — this and other harmful twaddle was the constant fare. The power of repetition. And, above all, a child’s trust that adults know better and are telling the truth. When I feel astonished that I truly believed this and more nonsense, I have to remind myself that I was indeed a child, even if an intelligent child.

And besides, what good was intelligence? It was held to be completely inadequate — “Of course this doesn’t make sense to you; it’s a divine mystery.” Any atrocious bunch of nonsense can be defended as “divine mystery.”

A child is easily intimidated by adult “authority.” Many thoughts were forbidden, the penalty being eternal hellfire. It was an Orwellian culture obsessed with sinning “in deed, in word, and in thought.” I was especially worried about sinning in thought — Orwell’s “thought crime.”

I was also told that god chooses who will believe in him and who won’t — “Faith is a gift.” It is a gift he gives to some and not to others (who are doomed to hell). Oddly, no group seemed as likely to possess the gift of faith as old, uneducated women. But now it strikes me that it wasn’t their belief in god that was deep and impervious to doubt. It was their belief in the devil.

Alfred Stieglitz: “Going to Prayer,” 1895


But then religion is so out of kilter with reality that it can be shed more easily than more subtle kinds of indoctrination and social pressures. We may not even be aware that we harbor certain views as absolute truth.

Coming to another culture showed me this — certain things I took absolutely for granted were regarded with horror in the US. I had no idea the US public was so conservative. In my teens, if someone had told me that America is a very conservative and religious country, I would have burst out laughing. I naively thought technological progress = progressive social ideas, so the more technologically advanced a society is, the more we can expect things like paid maternity leave and free medical care for everyone (remember, I grew up with those). What an eye-opener it was.

Some things are of course universal, like nationalism. And since the mystery we’re discussing here is how people can believe all kinds of nonsense, I remember how my mother used to remark that Hitler was the greatest buffoon in modern history, perhaps in all history. “How could people fall for this buffoon?” my mother would ask for the thousandth time, and again not even try to answer. She’d just shake her head in that special way she had of trying to recoil from terrible memories. Sometimes she’d vary the question a bit: How could INTELLIGENT people ever fall for this buffoon? and then just shake her head. Sometimes we simply don’t have a convincing answer.

Well, he was very skilled at whipping up a purely visceral nationalistic frenzy of wanting to make Germany great again. Watch his body language:

It’s still an undeniably buffoonish performance, so the mystery remains.



The essence of Buddha’s great wisdom was pointing out that much suffering comes from delusional thinking. Now, “delusional” is a strong term, and it may be difficult for some to accept. But it’s time we understood that thought disorders are extremely common — just as one need not be certifiably insane to experience hallucinations or false memories. All it takes is the right circumstances.

Let’s say that as a child you experienced some degree of emotional insecurity — and it’s hard to meet someone who had a mostly happy and secure childhood — a “good-enough” childhood (I truly hope such people exist, and it’s just my strange luck that I don’t meet them). A school where you were never teased or bullied (or practically never — remember, we are talking about the “good-enough” childhood). Teachers who’d never stoop to demeaning and shaming you and making you feel stupid. Clergy who praised you for being a good boy or girl rather than a sinner who deserved eternal damnation. I realize there has been enormous progress toward less child abuse, but it’s still awfully common to have grown up in the “I’m not OK, you’re not OK” mode.

Sooner or later something bad is bound to happen — “shit happens” is the most succinct translation of the First Noble Truth — and we are required to cope with adversity, aka finding ourselves deep in doo-doo. It’s rarely our own fault, pure and simple. There are circumstances. There is other people’s doo-doo. But cope we must.

One way, alas, is by falling into delusional thinking that builds on the early patterns of self-loathing and a sense of abandonment. Now, both “I am a total failure in life” and “I had to do it all by myself; no one ever helped me” are outrageously false beliefs easily contradicted if you only stop and think and remember — astounding, all it really takes is remembering — the gazillion times when you did succeed and the innumerable instances when you did receive help from someone or from numerous others — from the whole society, in fact — but oddly enough, those memories are blocked. Anger, hate, resentment, depression — it’s incredibly easy to start riding the automatic spiral, and not see the thought disorders and memory disorders underlying the suffering. Life-changing insight may come only when half or more of our life is over.

Or it may never come. But if it does, it should teach us patience with those who aren’t there yet — and also the humility of knowing that though we’re now more enlightened about X or Y, we still harbor all kinds of false beliefs, despite being intelligent and educated. It’s simply the human condition.

The Devil and a Woman, stained glass, before 1248, from Sainte Chapelle, now at Cluny



~ “A new paper by philosopher Neil Van Leeuwen [suggests] that factual belief isn't the same as religious belief.

Behind the common word "belief" is something like this:

Devon (factually) believes that humans evolved from earlier primates over 100,000 years ago.

Devon (religiously) believes that humans were created less than 10,000 years ago.

Factual beliefs seem to influence the way we act and think in pretty much all contexts, whereas religious beliefs have a more circumscribed scope. Even when engaged in pretend play, for example, children know that they shouldn't really bite the Play-Doh "cookie"; the factual belief that Play-Doh isn't food infiltrates imaginative play. And even when imagining an improbable scenario, like having a pet cat on Pluto, factual beliefs will typically guide the inferences one draws — for instance, that the Plutonian cat needs food to survive. These findings suggest that factual beliefs have a wide-ranging influence on cognition and behavior.

Not so when it comes to religious beliefs. One study, for example, found that members of the Vezo tribe in Madagascar endorsed some aspects of life after death in a ritual context but not in a naturalistic context. Another study found that even people who explicitly endorsed an omnipotent, omnipresent and omniscient God didn't think about God in these terms (for instance, as capable of being in more than one place at once) when engaged in imaginative storytelling. These findings suggest that religious beliefs govern how we think and act in appropriately "religious" contexts but not necessarily beyond.
A second reason to differentiate factual and religious belief comes from how these beliefs respond (or don't respond!) to evidence. Van Leeuwen provides a nice example: At the end of the last century, many people (factually) believed there was a "Y2K problem." Due to the way dates were handled by digital computers, people worried that computer systems would go wonky on and after Jan. 1, 2000. However, nothing much happened and, in the face of this evidence, people stopped believing there was a serious Y2K problem.

Now consider a superficially similar religious belief: A doomsday cult's prediction that the world will end on some particular date. Many such dates have come and gone, without an ensuing rejection of the beliefs that generated the prediction. These doomsday beliefs were held religiously, not factually; they were — as a result — relatively immune to evidence.
In these respects (and others that Van Leeuwen describes), religious beliefs are more like fictional imaginings than like factual beliefs. We can imagine that the Play-Doh is a cookie without having this imagining infiltrate our thoughts and actions in all contexts — and we can imagine that the Play-Doh is a cookie in the face of evidence to the contrary.

Like fiction or imaginative play, religious beliefs may persist alongside factual beliefs precisely because they operate within restricted contexts and aren't firmly tethered to evidence. An important difference, however, is in the contexts that fictions and religion typically govern.

"How can something so serious as religion," asks Van Leeuwen, "be rooted in the same capacity that yields something as frivolous as fiction?"

His answer, of course, is that fiction needn’t be frivolous: "Humans, in fact, take many fictions incredibly seriously." Still, it doesn't follow that it's rational to entertain any religious beliefs, even if human psychology provides a suite of mechanisms for doing so.

Van Leeuwen's paper can help us make sense of how people hold seemingly contradictory factual and religious beliefs — a very real phenomenon that's been of interest to psychologists.

"I think there are two main messages. The first is an encouragement in the direction of self-knowledge. What psychological state is actually going on in your mind when you say (for example) 'I believe in God, the Father almighty ... '? If it's credence as opposed to factual belief, as I think and as the word 'creed' suggests, then perhaps you have no business pushing it on someone else as if it were a factual belief — no matter how much it may do for you personally. So I think that self-knowledge can yield a certain amount of humility and restraint. This paper, I hope, can facilitate self-knowledge."

"Second, I think another important message is that people with different religions from your own (if you have a religion) may not be as crazy as you think. Having a credence that (say) the ancestors are alive and watching is very different from having a factual belief that the ancestors are alive and watching. It could be that the former isn't crazy, even if the latter would be. So I think that grasping this psychological distinction could foster a healthier level of understanding and curiosity toward others.”

The Maoris believed this was the entrance to the Underworld: Cape Reinga, New Zealand


This reminded me of the famous poem by Thomas Hardy about the belief that on Christmas Eve cows and sheep kneel at midnight (this belief is also expressed in one of the Polish Christmas carols) — and the poet’s refusal to actually go to the barn and check — because of “hoping it might be so” while knowing deep down that it isn’t. Van Leeuwen proposed that religious beliefs are not literal but rather “literary” — closer to fiction and imagination. Karen Armstrong also suggested that religion is not literal but mythological and metaphorical.


The problem, however, seems to be a lot of confusion as to which beliefs are factual and which are “merely” (if that’s the word) religious. Yes, many people are able to compartmentalize religion and hold their beliefs only loosely and chiefly for one hour on Sunday. But there are those who seem genuinely convinced that angels and devils exist and can help or hurt us, that miracles violating the laws of nature happen all the time, that the dead continue to exist in the sky, and so on. There are those who at least seem to believe all this as firmly as they believe that the earth is round.

But at least in the West we are past the point of burning alive those who doubt those various archaic beliefs, and it does appear that religious beliefs are increasingly more loosely held and more confined to ritual occasions.

Perhaps the most interesting part of the article is the point about “credence” not being bound to evidence — thus doomsday dates come and go, but those who were preparing to be “raptured” just shrug off the non-fulfillment of prophecy and stand ready for next time.

I suspect we need to study in more detail how the brain functions in terms of imagination, fiction, false memories, and acceptance of various degrees of “reality.” There is no denying that children only pretend to eat Play-Doh cookies. But Catholics are supposed to believe that the wafer (or a piece of cracker) becomes literally the body of Jesus. In past centuries, people killed and died for that belief. It was “factual” then — is it merely “religious” now?

The human brain seeks survival, not truth, so it's easy to see the hand of evolution here. Myths can serve survival, especially the collective survival. And then there is wishful thinking, so hard to resist! Sometimes I wonder how science ever emerged, given our bias to believe whatever makes us happy.

“My mother was watching me from heaven!” someone who just narrowly escaped an accident may exclaim. But later the same person may claim to have left religion a long time ago, and is in fact not a churchgoer. But are you going to needle him, “So, does your mother really watch over you from heaven?” That would be unkind. We understand that he adopts the belief about his mother in heaven in times of emotional need.

Even more interesting is to look at "religious professionals": how much do ministers, rabbis, priests, monks and nuns REALLY believe? Already in my early teens I strongly suspected that some priests were non-believers. Not that they were jolly about it; they looked tortured, depressed. There was an occasional jolly fat priest, but most priests looked seriously unhappy. In part it may have been celibacy. I remember a sad monk in a TV documentary; he said that every day he thinks what it would have been like if he'd gotten married and had a family life. "I hope god is pleased with my sacrifice," he finished. I felt so sorry for him: he sacrificed sexual and emotional/family fulfillment to worship a fictional character.

And those doubt-filled letters of Mother Teresa, what an eye-opener! Apparently as a young nun she really did expect Jesus to come to her cell as a bridegroom . . . and later was forever bitter “because I don't have Him, so it all means nothing.” How revealing that it wasn’t quite to help people that she did her good work, but to have a special relationship with the imaginary Beloved . . . She (now officially a saint in spite of those letters) admitted that she never sensed the presence of god.

And then there is the fact that occasional hallucinations are a perfectly natural phenomenon among people who are not mentally ill. It just takes special circumstances — prolonged fasting, for instance, or extreme danger. It seems that Mother Teresa heard a voice telling her to “serve the poorest of the poor” during an illness when she was running a high fever. Apparently she craved more such “mystical experiences” — but that opens up another huge chapter.

Can we make a general claim that people understand the difference between religious beliefs and factual beliefs? Not with any clarity. But it’s probably a step in the right direction to suggest that beliefs fall into those two (or several) categories.



~ “We must here make a clear distinction between belief and faith, because, in general practice, belief has come to mean a state of mind which is almost the opposite of faith. Belief, as I use the word here, is the insistence that the truth is what one would “lief” or wish it to be. The believer will open his mind to the truth on the condition that it fits in with his preconceived ideas and wishes. Faith, on the other hand, is an unreserved opening of the mind to the truth, whatever it may turn out to be. Faith has no preconceptions; it is a plunge into the unknown. Belief clings, but faith lets go. In this sense of the word, faith is the essential virtue of science, and likewise of any religion that is not self-deception.” ~ Alan Watts

I’ve just rediscovered this thought-provoking statement. It’s interesting that Watts sees faith and belief as almost opposites. Belief is akin to having a closed mind. Faith, according to Watts, is open-mindedness.

In common usage there is no such opposition between faith and belief. In fact there isn’t even a “clear distinction” between the two words. Yet obviously there are different shades of meaning, and those differences can be significant. When a person says “I believe in kindness” it’s not a factual belief like “the earth is round,” nor a religious belief like “Jesus died for our sins.” (I find it fascinating that Watts traced the etymology of “belief” to “lief,” related to wishing or desiring; to him a belief [I think he means mainly religious beliefs] is a type of wishful thinking.)

Faith seems to be a broader term, and is closer to “trust.” I was raised in large part by an Auschwitz survivor (my grandmother), and yet, like Anne Frank, I have faith that most people are good at heart. I have been mocked for it, called naive, overly optimistic, and “rather silly.” But in spite of having experienced my share of cruelty and deception, and in spite of having, through my grandmother’s eyes, stared into an abyss of enormous evil, I still find that *most* people are good and even altruistic, glad to help others if they can. Likewise I have faith in some other conceptions about reality that I have reached over the years, though I realize that they are not absolute and keep on evolving.

My special challenge has been developing the faith (trust) that no matter what happens, I will be able to cope with it somehow. It has taken me a long time and many life experiences to come to trust in my ability to cope. Still trembling a bit, I think that I have enough intelligence, emotional strength, accumulated wisdom (“This too shall pass” is priceless), and other resources to be able to cope rather than fall apart under stress, come what may.

This kind of “faith in oneself” may sound pretty obvious, even trivial, to someone who’s always had high self-esteem. But many women know what it’s like to have been put down and devalued, to have been made to feel incompetent and inadequate; those women (and some men, but women in particular) will understand that gaining faith in your ability to cope can be an achievement.

I believe in hard work; I believe in studying things in depth; I believe that “you get what you pay for” in more ways than one. I believe in treating others as I myself would like to be treated. I believe in forgiving rather than trying to take vengeance. I believe in moving on rather than holding grudges.

I also have faith in “negative capability.” I believe in waiting for clarity to arrive “in its own sweet time” (i.e. “ripeness is all”) rather than rushing for an answer; I believe the cognitive unconscious has the capacity to produce amazing solutions. Perhaps “I have faith in” would be a more accurate expression. My long experience with the creative process has taught me to trust my unconscious.

I also have faith in the collective human genius and the collective human goodness, a dominant tendency to cooperate rather than take pleasure in inflicting harm. When Bernie Sanders defined his spirituality as acting from the knowledge that “we’re all in this together,” that was an example of this faith (trust) in human solidarity — also called humanism. Once we fully grasp the fact that “we’re all in this together,” we see the need to work together, to help one another.

But perhaps we’re getting too caught up in words here. What matters is not how precisely we define the difference between “belief” and “faith,” or even what we believe and/or have faith in, but how we act.

William Blake: Behemoth


~ “Strange clouds forming above the Bermuda Triangle could explain why dozens of ships and planes have mysteriously vanished in the notorious patch of sea.

Using radar satellite imagery, [meteorologists] discovered bizarre “hexagonal”-shaped clouds between 20 and 50 miles wide forming over the dodgy patch of water.

The blasts of air are so powerful, they can reach 170 mph — a hurricane-like force easily capable of sinking ships and downing planes.” ~

Will it convince the conspiracy nuts that something supernatural isn’t at play here? I doubt it. They’re impervious to evidence. As the saying goes, you can’t reason someone out of something they were never reasoned into in the first place.


~ “The habit of always saying “please” and “thank you” first began to take hold during the commercial revolution of the sixteenth and seventeenth centuries — among those very middle classes who were largely responsible for it. It is the language of bureaus, shops, and offices, and over the course of the last five hundred years it has spread across the world along with them. It is also merely one token of a much larger philosophy, a set of assumptions of what humans are and what they owe one another, that have by now become so deeply ingrained that we cannot see them.

The English “please” is short for “if you please,” “if it pleases you to do this” — it is the same in most European languages (French s’il vous plait, Spanish por favor). Its literal meaning is “you are under no obligation to do this.” “Hand me the salt. Not that I am saying that you have to!” This is not true; there is a social obligation, and it would be almost impossible not to comply. But etiquette largely consists of the exchange of polite fictions (to use less polite language, lies). When you ask someone to pass the salt, you are also giving them an order; by attaching the word “please,” you are saying that it is not an order. But, in fact, it is.

In English, “thank you” derives from “think”; it originally meant, “I will remember what you did for me” — which is usually not true either — but in other languages (the Portuguese obrigado is a good example) the standard term follows the form of the English “much obliged” — it actually does mean “I am in your debt.” The French merci is even more graphic: it derives from “mercy,” as in begging for mercy; by saying it you are symbolically placing yourself in your benefactor’s power — since a debtor is, after all, a criminal. Saying “you’re welcome,” or “it’s nothing” (French de rien, Spanish de nada) — the latter has at least the advantage of often being literally true — is a way of reassuring the one to whom one has passed the salt that you are not actually inscribing a debit in your imaginary moral account book. So is saying “my pleasure” — you are saying, “No, actually, it’s a credit, not a debit — you did me a favor because in asking me to pass the salt, you gave me the opportunity to do something I found rewarding in itself!” ~

Debbie Millman: Please, 1993


Fascinating. Saying “please” and “thank you” is something we take for granted, unaware that such “good manners” didn’t exist until relatively recently in human history. You didn't need to thank a slave. Relatively speaking, we live in an era of emphasis on human dignity.

Someone pointed out to me that Southerners cultivated exquisite manners toward their white peers. And Hitler was known for “beautiful manners” toward women — his secretaries, for instance, who were notoriously in love with him. This almost makes me want to say, “Beware of people with beautiful manners — they may be compensating for being complete bastards toward SOME human beings.” Beautiful manners and rank prejudice — not uncommon. And in my unfortunate experience I’ve found that great charm can go together with utter cruelty. Of course in most cases this need not be true. Ideally, we should have beautiful manners when dealing with anyone.

ending on beauty:

“She smelled the way the Taj Mahal looks by moonlight.” ~ Raymond Chandler, The Little Sister