Sunday, January 13, 2019


“Everyone comes out from nothingness moment after moment. Moment after moment we have true joy of life. So we say shin ku myo yu, ‘from true emptiness, the wondrous being appears.’” ~ Shunryu Suzuki, Zen Mind, Beginner’s Mind



When a friend lay dying of leukemia,
I was walking the shore of Lake Tahoe.
A pleasure boat passed by; the passengers
burst into a wild chorus of La Bamba.

‘Yo no soy marinero! Yo soy capitan!’
they sang with such triumph
you’d think they were singing
I’m alive, I’m alive.

I was hardly in the presence
of Art, and this was not
the Mormon Tabernacle Choir.
A phone call could come in any time —

and suddenly I knew this was
my own memorial service.
To honor me, the mourners
sang La Bamba to the sun.

~ Oriana


~ “It’s become a truism that if you read The Catcher in the Rye at 14, you love Holden Caulfield, and if you read it at 20, you hate him. But when I reread it recently as an adult — 100 years after J.D. Salinger’s birth — I just felt sorry for him. That boy spends so much energy judging people that his life must be exhausting.

Holden Caulfield is the eternal voice of the adolescent male: riddled with nostalgic protectiveness for children and fury toward adult phonies, irreverent and very aware of how cool he thinks that irreverence makes him, while at the same time obsessed with certain “rules.”

He’s convinced he has mastered the complex unspoken social codes of the world, and it makes him furious when other people don’t abide by them the way he thinks they should. He hates phonies, of course, but he’s also repulsed by what he sees as ignorance. He can’t stand the schoolmate who doesn’t understand that when you pick up a knee supporter from someone’s chiffonier [dresser], you shouldn’t put it back on their bed. He is filled with righteous indignation when girls don’t offer to pay for their own drinks — he wouldn’t have let them, but for form’s sake, he insists, they really should have tried.

   ~ “It may be the kind where, at the age of thirty, you sit in some bar hating everybody who comes in looking as if he might have played football in college. Then again, you may pick up just enough education to hate people who say, ‘It’s a secret between he and I.’ Or you may end up in some business office, throwing paper clips at the nearest stenographer. I just don’t know. But do you know what I’m driving at, at all?”

    “Yes. Sure,” I said. I did, too. “But you’re wrong about that hating business. I mean about hating football players and all. You really are. I don’t hate too many guys. What I may do, I may hate them for a little while, like this guy Stradlater I knew at Pencey, and this other boy, Robert Ackley. I hated them once in a while — I admit it — but it doesn’t last too long, is what I mean. After a while, if I didn’t see them, if they didn’t come in the room, or if I didn’t see them in the dining room for a couple of meals, I sort of missed them. I mean I sort of missed them.” ~

It’s one of the few times in the book where Holden gestures toward forgiving the people who don’t live up to his exacting standards, toward recognizing that people have value and worth even when they break his rules. It’s almost enough to give you hope for the future — but then the English teacher tries to molest Holden while he sleeps, so that kills that idea.

Still, the passage exemplifies Holden’s slow crawl toward self-awareness and self-reflection. It develops the tension between his adolescent desire for black-and-white morality and his emerging adult recognition that people can do things that annoy you and still be basically good people. And that tension is what gives The Catcher in the Rye its forward drive.

If you read the book in celebration of Salinger’s 100th birthday, you can have a little celebration for yourself too. I’ll be having one in recognition of the fact that I am no longer a teenager and no longer have to waste Holden Caulfield levels of energy on judging everyone around me. You could not pay me enough to go back.” ~


I think Holden is right not to tolerate being molested, even if the teacher assumed the boy wasn’t conscious of it. But aside from that, I find myself nodding my head. Adolescence is a time of intense emotions, and hatred at that age can be especially white-hot — even over relatively minor stuff. We don’t yet grasp how complex life is: rarely black and white, more a very wide spectrum of grays. And people — well, anyone who’s made it past thirty usually has some idea that everyone is a mixed bag of qualities we like and qualities we don’t, some of which we may even abhor. But as long as a person is decent — which I define as honest, hard-working, and willing to help others — that alone is worthy of respect.

And some respect needs to be granted simply because a person is human. There are reasons why people are the way they are, and those reasons are generally not under their control. Perhaps they really did have a terrible childhood; the effects of it tend to be lifelong. Perhaps one parent was an alcoholic, or both parents — it happens. Perhaps the mother died and no one else stepped in to provide the critically needed love and nurturing. Perhaps there’s a genetic problem — it’s a lottery.

The old saying that to understand all is to forgive all is in the spirit of Spinoza, who strove to understand human actions rather than mock or lament them. It’s a shame that an individualistic culture tends to blame the victim and not give due weight to the power of circumstances.

The longer I live, the more clearly I see the power of circumstances. At the same time, I do find certain people just too difficult to be around. Life is too short to waste time on those who truly get on our nerves. And yet I also find that even people we dislike at first can become “part of the family” if we spend enough time with them (at work, for instance) and get to know some personal details.

(A shameless grammar-Nazi confession: I hate it when people say "between you and I" or say a "feasible excuse" instead of "plausible" or don't know the difference between ambiguous and ambivalent. I detest those who pronounce the "t" in "often." I let myself experience being flooded by that wave of loathing, knowing it will pass — usually in a matter of five to ten minutes.)

Rembrandt: Self-Portrait Frowning, 1630 (aged 24)
What Neruda said can also serve as a comment on how a person should mature and become more connected:

There is no insurmountable solitude.
All paths lead to the same goal:
to convey to others what we are.
And we must pass through solitude
and difficulty, isolation and silence
in order to reach forth to the enchanted place
where we can dance our clumsy dance and sing
our sorrowful song.

~ Pablo Neruda

In a complex system, we control almost nothing, but we influence almost everything. “Of all the fantasies human beings entertain, the idea that we can go it alone is the most absurd and perhaps the most dangerous. We stand together or we fall apart.” ~ George Monbiot


~ “In the future, the ebbing of romantic and sexual connections will continue. People will have sex less frequently than they did in the pre-internet era, which will be remembered as a more carnal time. They will have fewer lifetime sexual partners, and they will be more likely to be abstinent.

Only a minority of teenagers will have sex of any sort. Masturbation and other varieties of solo sex will continue to be more prevalent than they were before; porn aficionados will enjoy VR sex and sex robots.

Like many other aspects of our world in the decades to come, the gap between the haves and have-nots will continue to grow. Those who have many advantages already will be disproportionately likely to find romantic and sexual partners if they desire them and to have fulfilling sex lives. There will be good parts of this: Nonconsensual sex will be far less common than it is today. There will be little to no social stigma attached to being unattached. Those who approach singledom with psychological and financial advantages will flourish. It will be the best time in human history to be single. But there will be less unambiguously positive developments as well: For better and for worse, the birth rate will continue to fall, and those who are less suited to solo life will suffer from profound loneliness.

If you compare Americans’ sex lives today to the sex lives of people of the same age in the early ’90s, people who are now in their 20s are on track to have fewer lifetime sexual partners. They’re having sex less frequently. They’re about two and a half times as likely to be abstinent, and they have launched their sex lives later. It would seem that something is getting in the way of people’s ability or desire to connect to each other physically.

In The Atlantic, I called this a sex recession. Its biggest cause is that more people than not who are under 35 are living without any sort of partner, which is a change from decades past. The most common living arrangement for adults who are under 35 is to be living with a parent, which, I think it is safe to say, is for many of us probably not a great recipe for a superactive and fulfilled sex life.

Other factors include media, broadly — not just social media and not just porn. I would put any kind of digital occupation that makes it less desirable to go out and connect with somebody in person in the same category. It could be Netflix, streaming TV — all of these things coincide with a measured increase in the percentage of people who say that they’ve masturbated in the past week. Among men, that’s doubled since the early ’90s. Among women, it’s tripled.

Another set of causes when we’re looking at teens specifically has to do with the way adolescence has changed: Teenagers are having sex later, and the teen birth rate is a third of what it was in the early ’90s. People are more likely to say their first sexual experience was consensual. People seem to be coming into their 20s with less romantic experience than past cohorts. That can be a really difficult thing to reverse. I’ve never held hands with somebody, I’ve never kissed somebody; how do I do that when I’m 23, 24, 25? A third big category has to do with dating apps, which have become a normal way to meet people in a lot of circles. And yet for some people, they are clearly functioning really poorly and maybe sort of paradoxically actually making it harder to match up with people. That will continue.” ~ Kate Julian


Who knew? I thought that hormones were invincible, but by now we’ve had multiple studies all confirming that the young are not having sex the way they used to in previous generations. What’s actually more frightening is that they aren’t having relationships, they aren’t connecting with others — and not just erotically. It’s sad, because so much depends on having rich human connections.


“How wonderful it is to be able to be silent with another person.” ~ Kurt Tucholsky


A Soviet-era monument in Vilnius. That patch of snow on the sheaf of wheat makes it surreal — symbolic, even, if you are inclined to read it as a statement on the Soviet regime.

More poignant: January 13 marks an anniversary of the 1991 “Bloody Sunday” in Vilnius. Thousands of unarmed civilians tried to stop the Soviet tanks, sometimes with their bare hands.


~ “Joseph Brodsky said something very good when he was asked: ‘What is the difference between great literature and the merely average?’ Brodsky replied: ‘In the taste for the metaphysical.’ And how are we to understand ‘the metaphysical’? It is when someone sees… more deeply. Her worlds, her space, the enigmas of the world are involved in all this. She is enlightened in another way. That’s where the difference lies.” ~ Svetlana Alexievich


Thinking about how Russian literature differs, I'd say that it is indeed richer in metaphysical concerns. Strangers in a train compartment may at any point start discussing Life's Persistent Questions.

(By the way, let’s remember that it’s fiction. I remember the tales of a woman who studied in Russia during the late fifties. She loved to remember traveling by train: “People would spontaneously start singing. Soon everyone was singing; the whole train would be singing.”)



St. Nick, Gingerbread. Alas, I don't know the time period (I suppose gingerbread can be sprayed with a fixative) — but note the horse rather than reindeer, and the religious trappings: this is definitely meant to be the original St. Nicholas, the Bishop of Myra, 270-342 (back then the town was Greek; now it's Turkish and it's called Demre).


~ “A few years ago a student walked into the office of Cesar A. Hidalgo, director of the Collective Learning group at the MIT Media Lab. Hidalgo was listening to music and asked the student if she recognized the song. She wasn’t sure. “Is it Coldplay?” she asked. It was “Imagine” by John Lennon. Hidalgo took it in stride that his student didn’t recognize the song. As he explains in our interview below, he realized the song wasn’t from her generation. What struck Hidalgo, though, was the incident echoed a question that had long intrigued him, which was how music and movies and all the other things that once shone in popular culture faded like evening from public memory.

Hidalgo is among the premier data miners of the world’s collective history. With his MIT colleagues, he developed Pantheon, a dataset that ranks historical figures by popularity from 4000 B.C. to 2010. Aristotle and Plato snag the top spots. Jesus is third. It’s a highly addictive platform that allows you to search people, places, and occupations with a variety of parameters. Most famous tennis player of all time? That’s right, Frenchman Rene Lacoste, born in 1904. (Roger Federer places 20th.) Rankings are drawn from, essentially, Wikipedia biographies, notably ones in more than 25 different languages, and Wikipedia page views.

Last month Hidalgo and colleagues published a Nature paper that put his crafty data-mining talents to work on another question: How do people and products drift out of the cultural picture? They traced the fade-out of songs, movies, sports stars, patents, and scientific publications. They drew on data from sources such as Billboard, Spotify, IMDB, Wikipedia, the U.S. Patent and Trademark Office, and the American Physical Society, which has gathered information on physics articles from 1896 to 2016. Hidalgo’s team then designed mathematical models to calculate the rate of decline of the songs, people, and scientific papers.

The report, “The universal decay of collective memory and attention,” concludes that people and things are kept alive through “oral communication” from about five to 30 years. They then pass into written and online records, where they experience a slower, longer decline. The paper argues that people and things that make the rounds at the water cooler have a higher probability of settling into physical records. “Changes in communication technologies, such as the rise of the printing press, radio and television,” it says, affect our degree of attention, and all of our cultural products, from songs to scientific papers, “follow a universal decay function.”

Why does collective memory decay matter?

If you think about it, culture and memory are the only things we have. We treasure cultural memory because we use that knowledge to build and produce everything we have around us. That knowledge is going to help us build the future and solve the problems we have yet to solve. If aliens come here and wave a magic wand and make everyone forget everything—our cars, buildings, bridges, airplanes, our power systems, and so forth, we would collapse as a society immediately.

In your mind, what is a classic example of collective memory decay?

I thought everybody knew “Imagine” by John Lennon. I’m almost 40 and my student was probably 20. But I realized “Imagine” is not as popular in her generation as it was in mine, and it was probably less popular in my generation than in the generation before. People have a finite capacity to remember things. There’s great competition for the content out there, and the number of people who know or remember something decays over time. 

There’s another example, of Elvis Presley memorabilia. People had bought Elvis memorabilia for years and it was collecting huge prices. Then all of a sudden the prices started to collapse. What happened is the people who collected Elvis memorabilia started to die. Their families were stuck with all of this Elvis stuff and trying to sell it. But all of the people who were buyers were also dying.

You write collective memory also reflects changes in communication technologies, such as the rise of the printing press, radio, and TV. How so?

Take print. Changing the world from an oral tradition to a written tradition provided a much better medium for data. A lot of people have linked the revolution in the sciences and astronomy to the rise of printing because astronomical tables, for instance, could be copied in a reliable way. Before printing, astronomical tables were hand-copied, which introduced errors that diminished the quality of the data. With printing, people had more reliable forms of data. We see very clearly from our data that with the rise of printing you get the rise of astronomers, mathematicians, and scientists. You also see a rise in composers because printing helps the transmission of sheet music. So when you look at people we remember most from the time when print first arose, you see ones from the arts and sciences.

What did the mediums that came next mean for science?

The new mediums of radio and TV were much more adaptive for entertainment than science, that’s for sure. The people who belong to the sciences, as a fraction of the people who became famous, diminished enormously during the 20th century. The new mediums were not good for the nuances that science demands. For good reason, scientists need to qualify their statements narrowly and be careful when they talk about causality. They need to be specific about the methods they use and the data they collect. All of those extensive nuances are hard to communicate in mediums that are good for entertainment and good for performance. So the relative power of scientists, or their position in society, has diminished as we exited the printing era and went into this more performance-based era.

What does your analysis tell us we didn’t know before about the decay of collective memory?

We began by looking at how popular something is today based on how long ago it became popular in the first place. The expectation is collective memory decays over time in a smooth pattern, that the more time goes by, the more things become forgotten. But what we found when we looked at cultural products—movies, songs, sports figures, patents, and science papers—was that decay is not smooth, but has two defined regimes. There’s the first regime in which the attention starts very high and the decay is really fast. Then there’s the second regime in which it has a much longer tail, when the decay is smoother, and the attention is less.

When we started to think about decay, we realized we could take two concepts from anthropology—“communicative memory” and “cultural memory.” Communicative memory arises from talking about things. Donald Trump is very much in our communicative memory now. You walk down the street and find people talking about Trump—Trump and tariffs, Trump and the trade war. But there’s going to be a point, 20 years in the future, in which he’s not going to be talked about everyday. He’s going to exit from communicative memory and be part of cultural memory. And that’s the memory we sustain through records. Although the average amount of years that something remains in communicative memory varies—athletes last longer than songs, movies, and science papers, sometimes for a couple decades—we found this same overall decay pattern in multiple cultural domains.

In your forthcoming paper, “How the medium shapes the message,” you refer to the late cultural critic Neil Postman, who argued that the popular rise of TV led to a new reign of entertainment, which dumbed us down, because entertainment was best suited for TV. Is that what you found?

We found evidence in favor of that, yes. The fraction of people who belong to the sciences, as a fraction of all of the people who become famous, diminishes enormously during the 20th century. That completely agrees with his observation.

Did you come away from your study with insights into what may or may not cause something to stick in collective memory?

I read a very good book recently called The Formula by Albert-László Barabási. He says you can equate quality and popularity in situations in which performance is clearly measurable. But in cases in which performance is not clearly measurable, you cannot equate popularity with quality. If you look at tennis players, you find tennis players who win tournaments and difficult games are more popular. So quality and fame are closely correlated in a field in which performance is measured as tightly as it is in professional tennis. As you move to things that are less quantifiable in terms of performance, like modern art, your networks are going to be more important in determining popularity.

How should we think about quality in media content?

Well, I would say that collective memory decay is an important way to measure and think about quality. If you publish some clickbait that is popular in the beginning, that gets a lot of views in the first couple of days, but a year later nobody looks at it, you have a good metric. The same is true if you publish a more thoughtful piece that might not be as popular in the beginning because it didn’t work as clickbait — it required more engagement from the readers — but keeps on building readers over time. So the differences in longevity are important metrics for quality.

That goes back to a paper I did when I was an undergrad about the decay functions of attendance of movies. There were some movies that had a lot of box office revenue in the first week but then decayed really fast. And there were other movies that decayed more slowly. We created a model in which people would talk to each other and communicate information about the quality of the movie. And that model only had one parameter, which was how good the movie was. So the quality of the movie would increase or decrease the probability that people would go watch it. We could then look at the curves and infer how good the movie was, based not on the total area it was shown, or on the total revenue, but on the shape of the curve. That was interesting because there were movies that were really bad, like Tomb Raider, which at first was a box office success. But if you put it in our model, you would see that it was just hype: people watched it, hated the movie, and the curve decayed really fast.” ~
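Hidalgo’s one-parameter word-of-mouth model isn’t spelled out in the interview, but the idea is easy to sketch: if each week’s audience is generated by recommendations from the previous week’s viewers, a single quality parameter determines whether the attendance curve builds or collapses. The function below is a toy illustration of that idea, not the paper’s actual model; the parameter names and values are my own assumptions.

```python
def box_office_curve(quality, weeks=10, initial_audience=1.0, reach=1.5):
    """Toy word-of-mouth model of weekly movie attendance.

    Each viewer reaches `reach` potential viewers, and `quality`
    (0 to 1) is the chance a viewer actually recommends the film.
    The curve's shape, not its total area, reveals the quality.
    """
    curve = []
    audience = initial_audience
    for _ in range(weeks):
        curve.append(audience)
        audience *= reach * quality  # geometric growth or decay
    return curve

# A hyped but bad movie opens big and collapses;
# a good word-of-mouth movie opens small and builds.
hyped = box_office_curve(quality=0.4, initial_audience=10.0)
sleeper = box_office_curve(quality=0.9, initial_audience=1.0)
```

On these illustrative numbers, the hyped movie’s audience shrinks every week while the sleeper’s grows, even though the hyped movie’s opening is ten times larger: exactly the “just hype” signature Hidalgo describes.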


I think it's a very important article. It touches on the subject that poets often comment on: a poet can be very popular while alive, but goes into near-oblivion shortly after his death. Then some poets "come back" and maintain a steady readership, while others are truly forgotten. And we've proven to be poor judges of “who'll survive.”

I have anthologies going way back, and the farther back you go, the less you can understand why 90% (or more) of those poets were chosen, presumably carefully, in tight competition. Academia used to rule — which poets were studied in college. But some poets have gained fame in spite of academic disdain — Mary Oliver is an example, I think. Whether she'll have a literary afterlife is another question.

Bukowski seems to have endured. So perhaps it’s a question of meeting a certain need in readers, and also of timing. Dickinson is now a stellar figure, but had no chance in her lifetime.

The main point here is this: “The decay [of collective memory] is not smooth, but has two defined regimes. There’s the first regime in which the attention starts very high and the decay is really fast. Then there’s the second regime in which it has a much longer tail, when the decay is smoother, and the attention is less.”
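The two regimes can be captured by a simple biexponential curve: a fast-decaying “communicative” term plus a slow-decaying “cultural” tail. This is only a sketch of the functional form the paper describes; the parameter values below are illustrative assumptions, not figures from the study.

```python
import math

def attention(t, n0=100.0, fast=0.2, slow=0.02, cultural_share=0.1):
    """Toy biexponential decay of collective attention.

    A fast-decaying communicative-memory term dominates the first
    years; a slow-decaying cultural-memory tail dominates later.
    """
    communicative = (1 - cultural_share) * n0 * math.exp(-fast * t)
    cultural = cultural_share * n0 * math.exp(-slow * t)
    return communicative + cultural

# Attention drops steeply at first, then settles into a long tail.
early_drop = attention(0) - attention(1)
late_drop = attention(30) - attention(31)
```

With these made-up numbers, the first year erases roughly a hundred times more attention than the year from 30 to 31 does, which is the two-regime shape in miniature: a poet’s first decade of fame says little about the length of the tail.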

(By the way, “mediums”? That plural is correct only for a group of psychics.)


~ “Meritocracy loomed large over Victorian capitalism. However, in the course of the 19th century, the free market largely failed to deliver the developmental goods, proving itself to be more adept at generating than distributing wealth. And so, in the early decades of the 20th century, stirred by both political and intellectual developments – the growing appeal of communism to a working class that had tasted comparatively few of the market’s fruits, and the consequent rise of economic schools that aimed to renew capitalism, such as Keynesianism and the German social-market – the state gradually took on a much more active role in both society and the economy.

Thus a tremendously successful initiative that would come to be known as the development industry was born. By 1948, Western economies had emerged from crisis, beginning a decades-long period of rising growth and prosperity. Rather than pack up and go home, the development industry now turned its attention to a new frontier. With Europe’s overseas empires breaking up, dozens of new nation-states were coming into being, each of them eager to ‘catch up’ with its erstwhile colonial master. Amid this exciting atmosphere, the development industry could use its expertise to play a clear and prominent role, one captured in the subtitle to the then-Bible of development, Walt Rostow’s Stages of Economic Growth (1960) – ‘a non-communist manifesto’.

But when growth slowed in the 1970s, governments began to turn away from state-led approaches and to free up the market. Leaders such as Ronald Reagan in the US and Margaret Thatcher in the UK, early proponents of this new libertarian approach, harkened back to the unbridled individualism of the Victorian age. Reagan told a 1981 development summit in Cancun that Third World countries ought to follow the model set by the US, whose economy, in this telling, had been built by self-sufficient, independent farmers. Reagan elided the roles of slavery and industrialisation behind post-Civil War tariff-walls, major chapters in US history impossible to square with the libertarian ideal. But with a bit of editing of the historical record, the ‘neoliberals’ took a decidedly dim view of the government. They tended to think that those who can, do, while those who can’t, administer, looking for ways to frustrate society’s makers.

The ‘less government, more growth’ approach became orthodoxy, but it brought back – with a vengeance – the challenges of distribution. By the end of the 20th century, social indicators in developing countries were going backwards, just as the tide was turning against conservative politics in the West. Unlike what happened after the Second World War, though, the pendulum this time did not swing back toward the more social roadmap to development. Instead, the development industry asserted its autonomy from government, and assumed a new role. Rather than have the state build the bridge between accumulation and distribution, we now learned that philanthrocapitalism, a radically new approach to development, would offer a whole new way of doing things.

Reagan and Thatcher and other exponents of free-market economics had been social conservatives. By the 1990s, a new generation had come along, represented by leaders such as Bill Clinton in the US and Tony Blair in the UK, who mixed conservative economics with social liberalism. As much as possible, they preferred a progressive politics that channelled private initiative, and the logic of philanthrocapitalism was pleasingly straightforward. Since the rich were getting richer, they had more money to throw around. The lure of yet more lucre could now be used to steer them into sinking some of this new wealth into the poorest communities, something touted by Clinton late in his presidency when he went on a four-day ‘new markets’ tour of deprived American neighborhoods. Urging the super-rich to do some good with a portion of their rapidly growing prosperity, Clinton told them that a better world would make them richer yet. ‘Every time we hire a young person off the street in Watts and give him or her a better future,’ he said, ‘we are helping people who live in the ritziest suburb in America to continue to enjoy a rising stock market.’

In fact, in the two decades after Clinton took office, the number of charitable foundations doubled. A new problem arose, though. Due to the worsening inequality produced by free-market policies, this growing number of foundations and NGOs found themselves relying on a diminishing pool of wealthy donors. Inevitably, that forced them to cultivate the plutocrats, and reflect their views. However, even this supposed vice could be turned into a virtue. If the free market had in fact sorted the best from the rest, and enabled them to use their ingenuity to enrich themselves, it followed that this same ingenuity could subsequently be applied to the solution of social problems. As the state withdrew behind the curtains, the development industry thus moved beyond its traditional supporting role in tackling social problems to take centre stage. If Rostow’s Stages of Economic Growth had articulated the ideals of a liberal age, Matthew Bishop and Michael Green’s Philanthrocapitalism (2008) did it for a neoliberal one. As Rostow’s subtitle had done, theirs offered a pithy summation of the doctrine: ‘How the Rich Can Save the World’.

The story that philanthrocapitalists told was a great one: history marching forward, heroes and villains, and a Hollywood ending. History has a way of surprising us, however, and most of the script ended up on the cutting-room floor, the actors left to ad-lib parts they weren’t expecting to play. Russia’s shock therapy didn’t beget a flourishing capitalist democracy. Chinese autocracy didn’t collapse under the weight of its contradictions – in fact, scholars today wonder if China gives the lie to the long-cherished rule that economic dynamism demands a lean state. As for the rich people who were meant to save the world, almost to a man, they chucked the script in the bin: for every billionaire funding a progressive cause, there would be dozens who used their wealth to support conservative campaigns to further roll back the state’s social provisions.

 A wave of popular anger against disconnected ‘elites’ has resulted, which authoritarian populists have skilfully exploited to launch crackdowns on the development industry. Whether it be the Orthodox chauvinism of Russia’s Vladimir Putin, the Hindu nationalism of India’s Narendra Modi, or the Muslim fundamentalism of Turkey’s Recep Tayyip Erdoğan, they are tapping into disillusionment with the old model to repress ‘globalist elites’. As they decry NGOs for their lack of patriotism, these governments are pushing them to fall into line behind government, or to leave the field altogether (as Hungary, for instance, has done with its notorious ‘Stop Soros’ law).

And as disturbing as their conspiracy-theories of ‘globalist’ or ‘metropolitan’ elites can sound, the populists might be on to something. In 2011, the Swiss Federal Institute of Technology conducted a network analysis of the global corporate elite. What they found was a small web, made up of a few hundred tightly networked and extraordinarily wealthy individuals, dominated by bankers, and commanding vast pools of capital. If this was Davos man, then meritocracy was arguably its governing ethos – its mission, to replace the narrow, limiting confines of the old nation-state. Through compulsion and cajoling, much of the development industry got drawn into an alliance with this new global elite.

On the face of it, it seems puzzling that philanthrocapitalism ever got much of a hearing, because history had surely shown it would never work. If the rapid but unequal economic growth of the Victorian age failed to produce commensurate social development, what made anyone think that the rapid but unequal growth of the contemporary period would do any different? Moreover, the idea that the rich should be left to use their wealth to solve the world’s problems because they have proved their merit in the market ignores the science behind success. Does anyone really believe that, if Steve Jobs had been born into a Bengali peasant family, he would have still created Apple? In fact, economists who’ve actually worked out scientifically what contribution our own initiative plays in our success have found it to occupy an infinitesimally small share: the vast majority of what makes us rich or not comes down to pure dumb luck, and in particular, being born in the right place and at the right time.

At heart, philanthrocapitalism offered not a new science of development, but an old-fashioned moral tale – one in which a hero, who would reveal himself by some magnificent achievement, would come along to save us from some peril. There is no shame in weaving moral tales. Economics has always given us moral narratives by which to live our lives – in fact, that’s arguably its primary reason for being. But if it is to enter our canon, a story needs an audience that finds it rings sufficiently true to then retell it. Philanthrocapitalism failed that test. It will probably end up in history’s remainder bin as a result, while storytellers devote themselves to crafting more compelling narratives.” ~


Actually, I was astounded to discover a few genuine philanthropists. But when I heard that Bill Gates chose which disease he’d try to wipe out, and where, I felt deeply uneasy. Should one super-rich person, with no expertise in the area, have that kind of power? Shouldn’t reforestation, or clean water, or solar energy be priorities? Or girls’ education, which we know has a cascade of benefits?

And besides, when decisions are made from the top down, without understanding how ordinary people actually live, things tend to go wrong — sometimes catastrophically so.

It seems that “one size doesn’t fit all.” Different countries have different paths, and no path remains the right one forever.


~ “The revolutionary possibilities of the sandwich have always been well hidden by its sheer obviousness. The best history, written by Woody Allen in 1966, imagines the conceptual journey taken by the fourth Earl of Sandwich 200 years earlier. “1745: After four years of frenzied labour, he is convinced he is on the threshold of success. He exhibits before his peers two slices of turkey with a slice of bread in the middle. His work is rejected by all but David Hume, who senses the imminence of something great and encourages him.” ~

The first definite sandwich sighting occurs in the diaries of Edward Gibbon, who dined at the Cocoa Tree club, on the corner of St James Street and Pall Mall in London on the evening of 24 November 1762. “That respectable body affords every evening a sight truly English,” he wrote. “Twenty or thirty of the first men in the kingdom … supping at little tables … upon a bit of cold meat, or a Sandwich.” A few years later, a French travel writer, Pierre-Jean Grosley, supplied the myth – beloved by marketing people ever since – that the Earl demanded “a bit of beef, between two slices of toasted bread,” to keep him going through a 24-hour gambling binge. This virtuoso piece of snacking secured his fame.

The evidence for this, though, is weak. In his definitive biography, The Insatiable Earl, published in 1994, NAM Rodger concludes that Sandwich was hard-up, and never wagered much for a man of his rank. A large, shambling figure, prone to breaking china, the Earl ran the Admiralty, by most accounts badly, for a total of 11 years. He lived alone after his wife went mad in 1755. Visitors to his house remarked on the poor quality of the food. “Some of his made dishes are either meagre or become absolutely obsolete,” said his friend, Lord Denbigh. The likely truth is that the entire future of the sandwich – its symbiotic relationship with work, its disregard for a slower, more sociable way of eating – was present at its inception. In 18th-century English high society, the main meal of the day was served at around 4pm, which clashed with the Earl’s duties at the Admiralty. He probably came up with the beef sandwich as a way of eating at his desk.” ~

 John Montagu, 4th Earl of Sandwich


~ “When Europeans arrived in North America, they carried with them pathogens against which the continent’s native people had no immunity. And the effects could be devastating. Never was this more true than when smallpox wiped out 5–8 million Aztecs shortly after the Spanish arrived in Mexico around 1519. Even worse was a disease the locals called “huey cocoliztli” (or “great pestilence” in Nahuatl) that killed somewhere from 5 to 15 million people between 1545 and 1550. For 500 years, the cause of this epidemic has puzzled scientists. Now an exhaustive genetic study published in Nature Ecology and Evolution has identified the likely culprit: a lethal form of salmonella, Salmonella enterica, subspecies enterica serovar Paratyphi C. (The remaining Aztecs succumbed to a second smallpox outbreak beginning in 1576.)

Cocoliztli was therefore probably enteric fever, a horrible disease characterized by high fever, headaches, and bleeding from the nose, eyes, and mouth, with death following in a matter of days once the symptoms appeared. Typhoid is one example of an enteric fever. “The cause of this epidemic has been debated for over a century by historians, and now we are able to provide direct evidence through the use of ancient DNA to contribute to a longstanding historical question,” co-author Åshild Vågene of the Max Planck Institute in Germany tells AFP. (S. enterica no longer poses a serious health problem to the local population.)

The study is based on DNA analysis of teeth extracted from the remains of 24 Aztecs interred in a recently discovered cemetery in the Mixteca Alta region of Oaxaca, Mexico. The epidemic grave was found in the Grand Plaza of the Teposcolula-Yucundaa site.

Researchers suspect the Spanish brought the disease in tainted food or livestock, because the teeth from five people who died prior to the Europeans’ arrival show no trace of it—this is not a huge sample, of course, so it’s difficult to be certain. Another team member, Kirsten Bos, says, “We cannot say with certainty that S. enterica was the cause of the cocoliztli epidemic,” adding, “We do believe that it should be considered a strong candidate.”

A chilling consideration is that the same strain of bacteria has been identified in a Norwegian woman who died in 1200, some 300 years before it appeared in the Aztec community. Clearly, Europeans weren’t as defenseless against it as those in the Western Hemisphere.” ~


~ "Robert Lustig is a paediatric endocrinologist at the University of California who specializes in the treatment of childhood obesity. A 90-minute talk he gave in 2009, titled Sugar: The Bitter Truth, has now been viewed more than six million times on YouTube. In it, Lustig argues forcefully that fructose, a form of sugar ubiquitous in modern diets, is a “poison” culpable for America’s obesity epidemic.

A year or so before the video was posted, Lustig gave a similar talk to a conference of biochemists in Adelaide, Australia. Afterwards, a scientist in the audience approached him. Surely, the man said, you’ve read Yudkin. Lustig shook his head. John Yudkin, said the scientist, was a British professor of nutrition who had sounded the alarm on sugar back in 1972, in a book called Pure, White, and Deadly.

“If only a small fraction of what we know about the effects of sugar were to be revealed in relation to any other material used as a food additive,” wrote Yudkin, “that material would promptly be banned.” The book did well, but Yudkin paid a high price for it. Prominent nutritionists combined with the food industry to destroy his reputation, and his career never recovered. He died, in 1995, a disappointed, largely forgotten man.

When Yudkin looked at the data on heart disease, he was struck by its correlation with the consumption of sugar, not fat. He carried out a series of laboratory experiments on animals and humans, and observed, as others had before him, that sugar is processed in the liver, where it turns to fat, before entering the bloodstream.

He noted, too, that while humans have always been carnivorous, carbohydrates only became a major component of their diet 10,000 years ago, with the advent of mass agriculture. Sugar – a pure carbohydrate, with all fiber and nutrition stripped out – has been part of western diets for just 300 years; in evolutionary terms, it is as if we have, just this second, taken our first dose of it. Saturated fats, by contrast, are so intimately bound up with our evolution that they are abundantly present in breast milk. To Yudkin’s thinking, it seemed more likely to be the recent innovation, rather than the prehistoric staple, making us sick.

The British Sugar Bureau dismissed Yudkin’s claims about sugar as “emotional assertions”; the World Sugar Research Organization called his book “science fiction”.


In 2008, researchers from Oxford University undertook a Europe-wide study of the causes of heart disease. Its data shows an inverse correlation between saturated fat and heart disease, across the continent. France, the country with the highest intake of saturated fat, has the lowest rate of heart disease; Ukraine, the country with the lowest intake of saturated fat, has the highest. When the British obesity researcher Zoë Harcombe performed an analysis of the data on cholesterol levels for 192 countries around the world, she found that lower cholesterol correlated with higher rates of death from heart disease.

In the last 10 years, a theory that had somehow held up unsupported for nearly half a century has been rejected by several comprehensive evidence reviews, even as it staggers on, zombie-like, in our dietary guidelines and medical advice.

The UN’s Food and Agriculture Organization, in a 2008 analysis of all studies of the low-fat diet, found “no probable or convincing evidence” that a high level of dietary fat causes heart disease or cancer. Another landmark review, published in 2010 in the American Journal of Clinical Nutrition, and authored by, among others, Ronald Krauss, a highly respected researcher and physician at the University of California, stated “there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD [coronary heart disease and cardiovascular disease]”.


If Yudkin was ridiculed, Atkins was a hate figure. Only in the last few years has it become acceptable to study the effects of Atkins-type diets. In 2014, in a trial funded by the US National Institutes of Health, 150 men and women were assigned a diet for one year which limited either the amount of fat or carbs they could eat, but not the calories. By the end of the year, the people on the low-carbohydrate, high-fat diet had lost about 8lb more on average than the low-fat group. They were also more likely to lose weight from fat tissue; the low-fat group lost some weight too, but it came from the muscles. The NIH study is the latest of more than 50 similar studies, which together suggest that low-carbohydrate diets are better than low-fat diets for achieving weight loss and controlling type 2 diabetes. As a body of evidence, it is far from conclusive, but it is as consistent as any in the literature.

Professor John Yudkin retired from his post at Queen Elizabeth College in 1971, to write Pure, White and Deadly. The college reneged on a promise to allow him to continue to use its research facilities. It had hired a fully committed supporter of the fat hypothesis to replace him, and it was no longer deemed politic to have a prominent opponent of it on the premises. The man who had built the college’s nutrition department from scratch was forced to ask a solicitor to intervene. Eventually, a small room in a separate building was found for Yudkin.

When I asked Lustig why he was the first researcher in years to focus on the dangers of sugar, he answered: “John Yudkin. They took him down so severely – so severely – that nobody wanted to attempt it on their own.”

Today, as nutritionists struggle to comprehend a health disaster they did not predict and may have precipitated, the field is undergoing a painful period of re-evaluation. It is edging away from prohibitions on cholesterol and fat, and hardening its warnings on sugar, without going so far as to perform a reverse turn. But its senior members still retain a collective instinct to malign those who challenge its tattered conventional wisdom too loudly.” ~

ending on beauty:

Don't bring the ocean if I feel thirsty,
nor heaven if I ask for a light;
but bring a hint, some dew, a particle,
as birds carry only drops away from water,
and the wind a grain of salt.

~ Olav Hauge

Saturday, January 5, 2019


Pieter Bruegel the Elder: Winter Landscape with Skaters and a Bird Trap, detail, 1565


Even dreams have grown smaller
where are the dream pageants
of our grandmothers and grandfathers
when colorful as birds carefree as birds they ascended
the imperial staircase lit with a thousand chandeliers
and grandfather already tamed to the walking stick
pressed to his side
a silver sword and unloved grandmother
who out of courtesy
put on for him the face of first love

Isaiah spoke to them
from clouds like swirls of tobacco smoke
they saw Saint Teresa pale as a wafer
carrying an authentic basket of firewood

their terror was immense as a Tatar horde
their happiness like golden rain

my dream – the doorbell rings
I am shaving in the bathroom I open the door
the bill collector hands me my gas and electricity bill
I have no money I return to the bathroom brooding
over the figure 63.50
I raise my eyes and see in the mirror
my face so life-like that I wake up screaming

if at least once a hangman’s red tunic
appeared in my dream
or a queen’s necklace
I would be grateful to dreams

~ Zbigniew Herbert, tr. Oriana Ivy


“This is just delightful” is the only comment I wish to make. 



~ “I think hell’s a fable,” the famous professor proclaimed—a surprising declaration not only because it was made in the late sixteenth century, when very few people would have dared to say such a thing, but also because he was at that moment in conversation with a devil to whom he was offering to sell his soul. The professor in question was Doctor Faustus in Christopher Marlowe’s great Elizabethan tragedy. Bored with his mastery of philosophy, medicine, and law, Faustus longs for forbidden knowledge. “Where are you damned?” he asks Mephastophilis, the devil whom he has conjured up. “In hell,” comes the prompt reply, but Faustus remains skeptical: “How comes it then that thou art out of hell?” The devil’s answer is quietly devastating: “Why this is hell, nor am I out of it.”

Did Marlowe, a notorious freethinker who declared (according to a police report) that “the first beginning of Religioun was only to keep men in awe [terror],” actually believe in the literal existence of hell? Did he imagine that humans would pay for their misdeeds (or be rewarded for their virtues) in the afterlife? Did he think that there was a vast underground realm to which the souls of sinners were hauled off to suffer eternal punishments meted out by fiends? It is difficult to say, but it is clear that hell was good for the theater business in his time, as exorcism has been good for the film industry in our own. In his diary, the Elizabethan entrepreneur Philip Henslowe inventoried the props that were in storage in the Rose Theater. They included one rock, one cage, one tomb, and one hellmouth, the latter perfect for receiving a sinner like Faustus at the end of act 5.

There is evidence that Marlowe’s play produced a powerful effect on his contemporaries. During a performance at the Theatre—London’s first freestanding wooden playhouse—a cracking sound caused a panic in the audience; in the town of Exeter the players bolted when they thought that there was one devil too many on stage; and multiple rumors circulated of “the visible apparition of the Devill” unexpectedly surging up during the conjuring scene. In Doctor Faustus, hell may have been a form of theatrical entertainment; audiences paid their pennies to enter a fictional world. But when the performance was disrupted by a surprise noise, the crowd was prepared instantly to jettison the idea of fiction and grant that it was all too true. This is a familiar story. We humans have a way of turning our wildest imaginations into unquestionable beliefs, the foundations on which we construct some of our most elaborate and enduring institutions. 

The Penguin Book of Hell, edited by the Fordham history professor Scott Bruce, is an anthology of sadistic fantasies that for millions of people over many centuries laid a claim to sober truth. Not all people in all cultures have embraced such fantasies. Though the ancient Egyptians were obsessively focused on the afterlife, it was not suffering in the Kingdom of the Dead that most frightened them but rather ceasing altogether to exist. At the other extreme, in ancient Greece the Epicureans positively welcomed the idea that when it was over it was over: after death, the atoms that make up body and soul simply come apart, and there is nothing further either to fear or to crave. Epicurus was not alone in thinking that ethical behavior should not have to depend on threats and promises: Aristotle’s great Nicomachean Ethics investigates the sources of moral virtue, happiness, and justice without for a moment invoking the support of postmortem punishments or rewards.

The Hebrews wrote their entire Bible without mentioning hell. They had a realm they called sheol, but it was merely the place of darkness and silence where all the dead—the just as well as the wicked—wound up. For the ancient rabbis, heaven was a place where you could study the Torah all the time. Its opposite was not a place of torture; it was more like a state of depression so deep that you could not even open a book.

. . . Something the anthology lightly skims over: Jesus’s striking insistence on Gehenna, the sinister valley in Jerusalem where in archaic times the followers of Moloch were said to have sacrificed their children. “If you say, ‘You fool,’ you will be liable to the hell [Gehenna] of fire,” he declared in the Sermon on the Mount (Matt. 5:22), and the synoptic gospels attribute this warning to the Savior at least ten more times: “It is better for you to lose one of your members, than for your whole body to be thrown into hell [Gehenna]” (Matt. 5:29); “If your eye causes you to stumble, tear it out and throw it away; it is better for you to enter life with one eye than to have two eyes and to be thrown into the hell [Gehenna] of fire” (Matt. 18:9); “If your hand causes you to stumble, cut it off; it is better for you to enter life maimed than to have two hands and to go to hell [Gehenna], to the unquenchable fire” (Mark 9:43); “But I will warn you whom to fear: fear him who, after he has killed, has the authority to cast into hell [Gehenna]” (Luke 12:5); etc., etc. The gospels’ good news is closely conjoined, on the authority of God’s own son, with repeated dire warnings about a place where the worm dies not, and the fire is not quenched, and there shall be weeping and gnashing of teeth.

Whether it derived from the Pharisees or the Essenes or some entirely personal vision, Jesus’s emphasis on a fiery place of torment for sinners seems to have licensed the outpouring of texts, many of them translated here by the editor, that constitute most of a volume that would, given the absence of Buddhist and other traditions, have been more accurately titled The Penguin Book of Christian Hell.

In the sixteenth century, Catholics eagerly prayed for the day when Martin Luther would join [the heretics in hell], along with other Reformers who were rebelling against the Holy Mother Church. For their part, Protestants consigned the pope and his bishops to the flames. But there was nothing particularly new in doing that: ecclesiastics had long featured prominently in medieval depictions of hell. In the Inferno, Dante sees Pope Nicholas III wriggling upside down in a fiery hole. The pope, roasting in the flames, was guilty of simony—the selling of church offices—an accusation frequently brought against high-ranking churchmen, along with pride, gluttony, and hypocrisy.

Still more often, the charges against the clergy were sexual in nature: for well more than a thousand years, the rule of strict and perfect celibacy, promulgated in the Roman Catholic Church and still officially mandated, has proved to be almost impossible to sustain in practice. Violations were sometimes treated, as in Boccaccio or Chaucer, with a certain wry humor, but they very often provoked disgust and outrage. Hence the visitor to hell in the influential twelfth-century Vision of Tundale stares at a large group of souls who are undergoing a particularly horrific torture: “The genitals of the men and the women were like serpents, which eagerly mangled the lower parts of their stomachs and pulled out their guts.” The angelic guide tells the appalled visitor that these are all monks, nuns, and other clerics who have been guilty of fornication.

. . . Writing in the mid-nineteenth century, Father Furniss may have been afraid that the spirit of Voltaire had eroded robust belief in the horrors to come. “Perhaps at this moment, seven o’clock in the evening,” he told his young readers, “a child is just going into Hell. Tomorrow evening at seven o’clock, go and knock at the gates of Hell and ask what the child is doing. The devils will go and look. Then they will come back again and say, the child is burning!” But notwithstanding the hell-monger’s intentions, the burning child leads us away from theology and toward Freud: the words “Father, don’t you see I’m burning?” lie at the center of one of his most famous dream interpretations. (The Interpretation of Dreams, Chapter 7. The dream lies at the center of a remarkable recent film by Joseph Koerner, The Burning Child, 2017)

Freud argued that the words, terrible though they are, allowed the dreamer to continue to sleep. We can perhaps suggest something similar about the texts collected in The Penguin Book of Hell. One of the prime motives of these texts is rage, rage against people occupying positions of exceptional trust and power who lie and cheat and trample on the most basic values and yet who escape the punishment they so manifestly deserve. History is an unending chronicle of such knaves, and it is a chronicle too of frustration and impotence, certainly among the mass of ordinary people but even among those who feel that they are stakeholders in the system. Hell is the last recourse of political impotence. You console yourself—you manage to stay asleep, as Freud might say—by imagining that the loathsome characters you detest will meet their comeuppance in the afterlife.

But Voltaire and the Enlightenment carried a different message: wake up. Throw out the whole hopelessly impotent fantasy; it is, in any case, the tool not only of the victims but also of the victimizers. We must fight the criminals here and now, in the only world where we can hope to see justice.” ~

Vermeer: Woman Holding a Balance, 1664. Note the painting of the Last Judgment on the wall. 



The dream intrigued me, so I investigated it further.

~ “A father had been watching beside his child’s sick-bed for days and nights on end. After the child had died, he went into the next room to lie down, but left the door open so that he could see from his bedroom into the room in which his child’s body was laid out, with tall candles standing round it. An old man had been engaged to keep watch over it, and sat beside the body murmuring prayers. After a few hours’ sleep, the father had a dream that his child was standing beside his bed, caught him by the arm and whispered to him reproachfully: ‘Father, don’t you see I’m burning? [‘Vater, siehst du denn nicht, daß ich verbrenne?]’ He woke up, noticed a bright glare of light from the next room, hurried into it and found that the old watchman had dropped off to sleep and that the wrappings and one of the arms of his beloved child’s dead body had been burned by a lighted candle that had fallen on them. (Freud, Interpretation of Dreams, Chapter 7) 

Freud argues that, in line with his theory that dreams are wish fulfillment, the purpose of the dream was to prolong the sleep of the father for a few moments more because in it his dead child was still alive:

    The dead child behaved in the dream like a living one: he himself warned his father, came to his bed, and caught him by the arm, just as he had probably done on the occasion from the memory of which the first part of the child’s words in the dream were derived. For the sake of the fulfillment of this wish the father prolonged his sleep by one moment. The dream was preferred to a waking reflection because it was able to show the child as once more alive. If the father had woken up first and then made the inference that led him to go into the next room, he would, as it were, have shortened his child’s life by that moment of time.

Lacan, however, notes that the dream itself contains another, more terrifying Real, which is what wakes the father. And this is not simply that the dream is ‘telling’ the father to wake up because of the events in the adjoining room. Rather, it’s the reproach of the son to his father.

In Freud’s account of the dream there is an implication that the child had died of fever. So perhaps the child is reproaching his father for not having done something sooner to prevent his death. And there is also the possible reproach that the father had entrusted the task of looking over his son’s body to someone who was not up to the job.

As Žižek points out:

    ~ The subject does not awake himself when the external irritation becomes too strong; the logic of his awakening is quite different. First he constructs a dream, a story which enables him to prolong his sleep, to avoid awakening into reality. But the thing he encounters in the dream, the reality of his desire, the Lacanian Real — in our case, the reality of the child’s reproach to his father, ‘Can’t you see I am burning?’, implying the father’s fundamental guilt — is more terrifying than so-called external reality itself, and that is why he awakes: to escape the Real of his desire, which announces itself in the terrifying dream. ~

Perhaps this is yet another example of the many paradoxes and contradictions that lie at the heart of Freud’s work, which in turn reflect the paradoxical and contradictory nature of psychical reality itself. For Freud, dreams were essentially wish fulfillment, and yet at their centre sits a trauma, a ‘black hole’ around which orbits the subject’s desire, and which Freud recognized as the dream’s navel [which he also called a “thought-tangle”]. But this ‘black hole’ is not ‘empty’; rather it is the raw stuff of the (semiotic) universe itself. Relating this back to the dream of the burning child, we could say the dream is a Symbolic construction (the desire for the child to live just a bit longer…) that revolves around a Real core or point of singularity, which is not a lack (the child gone, lost) but a Real presence: a dead child who reproaches his father.” ~ Leslie Chapman, 2016


To me, the dream seems the very opposite of wish fulfillment. It is a nightmare. The simplest interpretation is the physical reality of what’s happening in the room next door. If we dream of being in a sauna and wake up drenched in sweat, only to notice that we’ve piled on too many blankets and overheated, no “deep” interpretation is needed. But here another, more disturbing element inserts itself: the way the child speaks suggests a reproach. It’s possible that the father feels guilty about not having tried harder to prevent the child’s death.

The guilt may be irrational: even if the most expensive physician in town had been summoned, given both child mortality and the state of medicine back then, the child was likely past saving. It’s one of those “If only” reproaches that people may experience after someone near them commits suicide: “If only I had said such-and-such”; “If only I hadn’t sounded critical the last time we spoke.” But there is absolutely no guarantee that it would have made any difference. Yet for the rest of his or her life, a person may be haunted by at least a twinge of this irrational guilt.

In our historical era, we may also have an association with the Holocaust. Whether it’s the children’s bodies burning in sacrifice to some imaginary ancient god, or the child victims of the Nazis, killed because of a racist-nationalist delusion, we can hardly escape the background of a huge historical nightmare.

Not that the father’s dream can be said to prefigure the nightmare of history — it’s only that we are the children born after the nightmare officially ended, though its aftermath is still with us: the Hitlerian spirit coming alive in hate groups that have no shame in saying things like, “Hitler had the right idea.”


Yes, I see the reproach so clearly there, the accusation of the parent who has not saved or protected the child. Maybe it is a case where guilt is irrational, but it is a phrase that can speak to us all, who Know, who See, and yet do not protect and save the children: the children who burned in the Holocaust, or drowned in sinking boats of refugees, or starved in war zones, everywhere, everywhere.

Thank you for putting it so clearly. Yes, this dream is unforgettable because the child clearly reproaches the father’s blindness and inaction: Don’t you see I’m burning? Freud goes off on a crazy wish-fulfillment angle here, while the father’s guilt over his failure to save the child, however irrational, is almost as obvious as the physical origin of the dream: a fire had indeed broken out, and the sleeping father’s brain had to find a way to wake him. And yes, we the parents (in the broad sense of the word) can definitely see that this is all of us, failing to protect the children again and again.

(Nevertheless, I admire Stephen Greenblatt’s insight about the concept of hell allowing the poor and others who felt politically impotent and frustrated by the corruption of those in power to “stay asleep a little longer” — thanks to their belief that the wicked will “fry in hell.” And I also agree that the point is not to stay asleep by cultivating such fantasies, but to wake up and investigate the possibilities of action.)


I agree 100% with your interpretation of the father’s dream. Of course the father would have felt tremendous love, and therefore guilt, over the child’s death. How could Freud be so blind? Guilt is usually irrational.


While it may indeed seem paradoxical to say that the idea of hell may be a comfort, I think it certainly can be, to those who are powerless victims without resources, who see no way to “get out from under” the onus of suffering they experience in their lives. Where all is “unfair,” unjust and unbearable, hell offers relief — there one’s persecutors will finally be punished, there justice will be meted out. This can be a very satisfying conviction, and unfortunately it can also reinforce the situation of the powerless, perpetuating it: there is no need to struggle for justice here and now; all you have to do is believe, and wait for that final after-death reckoning, when all scores will be divinely and spectacularly settled.

This belief in a spiritual accounting also prevents the realization of the truth Marlowe’s devil so chillingly declares: “Why this is hell, nor am I out of it.” We needn’t wait for it: all the hell you can imagine is here already, in the world and in the mind, and the only justice is what we can struggle to create — without any supernatural agent.


I also think Freud’s dream theory is too narrow. Not all dreams are wish fulfillment, and dreams do not function to protect sleep. Lacan gets much closer to what is happening in the dream of the burning child. I would call that dream a nightmare, and nightmares frequently kick us out of sleep — we wake suddenly just to escape the horror of the dream. What else these dreams are doing we can only guess . . . reliving trauma, for instance, may be a way to reintegrate a damaged and fractured psyche, or may be only the echo of unbearable pain.

And as referenced in the opening poem by Herbert, dreams can be rich with wonder, magically inventive in configuring images and stories both intense and unforgettable. I have dreamed Apocalypse, with the sun and moon falling out of the sky, floors cracking and splitting open into a dark abyss beneath, the world burning to a cinder. I have dreamed of walled gardens that contain infinity inside the gates, and houses that contain undiscovered rooms full of beautiful things. Dreams that are poems, indelible.


Napoleon famously said: “Religion is excellent stuff for keeping common people quiet. It keeps the poor from murdering the rich.”

And that has certainly been one of the top three or so functions of religion: to legitimize those who are in power and keep them safe by convincing believers that this is the divine order, to be meekly submitted to: there are the rich and the poor because the invisible Heavenly King ordained it that way — masters and slaves each in their place, with slaves forbidden to rebel.

Luther was appalled that peasants were inspired by his courage in opposing the Catholic church and rose up against the landlords. The leaders of the 1525 German peasant uprising hoped Luther would support them and their cause, but he denounced them instead. No doubt he invoked hellfire — after all, he was deeply religious.

But one unexpected aspect was that hell was a democracy. It was for ALL sinners, and the poor could enjoy a revenge fantasy — indeed another way to keep them submissive. Give them the fantasy of posthumous justice rather than the reality of it in the world.

Nor is this fantasy dead in modern times. One man I know through Facebook, educated, artistic, socialist-leaning, wrote that he hopes god exists — because this way the bad guys would finally get what they deserve in the afterlife! I'm also disgusted by the bad guys, but I fail to feel any pleasure at the thought of the posthumous payback. No, there is no cosmic justice — there is only the imperfect kind that we humans create ourselves.

Yahweh hiding his face — it has never ceased to astonish me that people would make up a god who hides, who deliberately withdraws. Of course it can be seen as a clever ruse to defend the existence of a supernatural agent in spite of lack of evidence — but the contrast with the early “active god” who walked and talked is rather painful.


As for dreams, they apparently have an important biological function, but at this point we are mainly speculating. We know they are important for memory (including forgetting of useless details) and learning. But all we really have is theories. Some dreams can be interpreted as wish fulfillment, but certainly not all. Nightmares and anxiety dreams remain a mystery — except for the observation that people under stress will have more frequent and more vivid nightmares than usual. The apocalyptic dreams you and I had in our youth — no surprise that we were going through difficult times.

I didn’t think those dreams helped me in any way — until they started changing, and I repeatedly walked out of concentration camps or away from the execution — but again, I was getting emotionally stronger in my waking life too. Then those dreams provided comfort. But at the beginning, the dreams were actually worse than my personal situation, and they seemed to amplify my anxiety. But they were still super-interesting in a kind of literary fashion — projecting my personal stuff on a huge canvas, e.g. nuclear missiles were on the way, or else I was in a post-apocalyptic world, with only women and children left, and some old men — and all that destruction and poverty. Nightmares, yes, but fascinating and “large” — beyond the personal.

Dreams used to be one of my main inspirations for poems, back when my dream recall was vivid. I miss that very much. It was like having a wilder, more radical poet inside me.

~ “And what of the value of dreams in regard to our knowledge of the future? That, of course, is quite out of the question. One would like to substitute the words: ‘in regard to our knowledge of the past.’ For in every sense a dream has its origin in the past. The ancient belief that dreams reveal the future is not indeed entirely devoid of the truth. By representing a wish as fulfilled, the dream certainly leads us into the future; but this future, which the dreamer accepts as his present, has been shaped in the likeness of the past by the indestructible wish.” ~ Sigmund Freud, The Interpretation of Dreams.


This seems plausible when it comes to dreams that readily yield to being interpreted as wish fulfillment. As I see it, however, only a small portion of dreams are based on wish fulfillment. Most dreams, alas, are either anxiety dreams or reflect our mundane concerns, e.g. trying to find parking in an endless, labyrinthine, crowded parking lot, pondering our gas and electric bill as in Herbert's poem, and similar oppressive trivia.

Bartolomeo Veneto, early 1500s


“If I cannot bend the heavens, then I shall move the powers of hell.” ~ Virgil, The Aeneid

[Alternate translation: “If you cannot move the upper regions, dare to move the underground.”]

~ “Sigmund Freud famously placed Virgil's quote on the title page of his masterwork, The Interpretation of Dreams. It is the motto for any radical change. It points to the need for disturbing and interrupting the unexpressed, underground structure of our daily life. Of all forms of violence, the one with the most catastrophic consequences is not personal or interpersonal but "systemic": the kind of violence imposed by the fluid, seemingly natural functioning of our economic, political and religious systems.

Real change only erupts when the unwritten laws of a system are disturbed. It was Freud who, through his clinical work on the unconscious, recognized that what bonds and binds individuals to a system are its secret, half-spoken, shadowy rules. What really cements group loyalty and submission is not the open agreement on which laws to keep but the "somehow always already known" ones that everyone secretly agrees to break.” ~



“My definition of a tragedy is a clash between right and right.” ~ Amos Oz


Yes, the choice between what is obviously right and what is obviously wrong is pretty easy for most of us — it's not even a choice. If we need cash, we go to the bank or an ATM, and it doesn't even occur to us to hold up a 7-11. But oh, when there is much to be said for each option, and choosing one means sacrificing something of considerable value . . . that's where agony comes in.


We all come to America
For the same reason —
To find a manger
For our baby.

~ John Guzlowski

Nativity by Geertgen tot Sint Jans,1490


“Our baby” can be real, or symbolic. I wanted access to all books, not just those approved of by Poland's illegitimate government. Thus, Kafka and Nietzsche — though there was no official word that they were censored, their works were not being published and thus hard to find (used book stores might have pre-war copies). I imagined America — and the West in general — as a paradise of books.



You could call it the face that launched a thousand Christmas letters. Appearing on January 3, 1863, in the illustrated magazine Harper’s Weekly, two images cemented the nation’s obsession with a jolly old elf. The first drawing shows Santa distributing presents in a Union Army camp. Lest any reader question Santa’s allegiance in the Civil War, he wears a jacket patterned with stars and pants colored in stripes. In his hands, he holds a puppet toy with a rope around its neck, its features like those of Confederate president Jefferson Davis.

A second illustration features Santa in his sleigh, then going down a chimney, all in the periphery. At the center, divided into separate circles, are a woman praying on her knees and a soldier leaning against a tree. “In these two drawings, Christmas became a Union holiday and Santa a Union local deity,” writes Adam Gopnik in a 1997 issue of the New Yorker. “It gave Christmas to the North—gave to the Union cause an aura of domestic sentiment, and even sentimentality.”

The artist responsible for this coup? A Bavarian immigrant named Thomas Nast, political cartoonist extraordinaire and the person who “did as much as any one man to preserve the Union and bring the war to an end,” according to General Ulysses Grant. But like so many inventors, Nast benefitted from the work of his fellow visionaries in creating the rotund, resplendent figure of Santa Claus. He was a man with the right talents in the right place at the perfect time.

Prior to the early 1800s, Christmas was a religious holiday, plain and simple. Several forces in conjunction transformed it into the commercial fête that we celebrate today. The wealth generated by the Industrial Revolution created a middle class that could afford to buy presents, and factories meant mass-produced goods. Examples of the holiday began to appear in popular literature, from Clement Clarke Moore’s 1823 poem “A Visit from St. Nicholas” (more commonly known by its first verse, “Twas the night before Christmas”) to Charles Dickens’ book A Christmas Carol, published in 1843. By the mid-1800s, Christmas began to look much more as it does today. “From a season of misrule characterized by drink, of the inversion of social roles in which working men taunted their social superiors, and of a powerful sense of God’s judgment, the holiday had been transformed into a private moment devoted to the heart and home, and particularly to children,” writes Fiona Halloran in Thomas Nast: The Father of Modern Political Cartoons.

In addition to repurposing the imagery of the Moore poem—reindeer pulling a sleigh, sack full of presents—Nast also found inspiration in his surroundings. He based Santa’s bearded visage and round belly partially on himself and used his wife and children for other characters, says Ryan Hyman, a curator at the Macculloch Hall Historical Museum. Located in Nast’s hometown of Morristown, New Jersey, the museum holds a large collection of his work. “The outside pictures that show rooftops and church spires were all here in Morristown,” Hyman adds.
Even though people may know that Nast gave us the donkey for the Democrats and the elephant for Republicans, and that he took on corrupt New York City politicians, few may realize the role he played in creating Christmas. Hyman and his colleagues hope they can change that, in part through their annual Christmas showcase of Nast’s work. “He created the modern image of Santa Claus,” Hyman says—though we don’t tend to think about Civil War propaganda when we’re opening presents.” ~


~ “A decade ago, 80 percent of Americans believed that a free market economy was the best economic system. Today, that number is 60 percent. Another recent poll shows that only 42 percent of millennials support capitalism.

So what happened? Why have so many people, both in the US and abroad, lost faith in capitalism?

Steven Pearlstein, a columnist for the Washington Post and public affairs professor at George Mason University, has a few answers. The primary reason is that the system has become too unstable: Wages are largely stagnant, and the income gap is so wide that the rich and the poor effectively live in different worlds. No surprise, then, that people are unhappy with the status quo.

Pearlstein’s new book, Can American Capitalism Survive?, chronicles the excesses of capitalism and shows how its ethical foundations have been shattered by a radical free market ideology — often referred to as “neoliberalism.” Capitalism isn’t dead, Pearlstein argues, but it has to be saved from itself before it’s too late. 

Why have so many people lost faith in capitalism?

Steven Pearlstein: The most obvious answer is that capitalism has left a lot of people behind in the last 30 years. Everyone can see that the top 1 percent, the top 10 percent, the top 20 percent, have captured most of the benefits of economic growth over the last 30 years, and the rest of the population has been marginalized.

Now, we all know this, but I wrote the book because I think there is a feeling even among those of us who didn’t get left behind that this system has become too unfair, too ruthless, and rewards too many of the things we think of as bad. The system offends the moral sensibilities even of people who are benefiting from it.

I’m not so sure that the people at the top are starting to see it that way, but we’ll come back to that. First, tell me what went wrong in the 1970s and ’80s, when you say capitalism really started to go sideways.

Two things happened during the ’70s and ’80s. First, the American industrial economy lost its competitiveness. Neoliberal policies of global free trade and unregulated markets were embraced, and the US was suddenly facing competition from all over the globe.

So American companies, which had been so dominant in our own market and in foreign markets, started to lose their dominance, and they had to get leaner and meaner. They started behaving in different ways. They started sharing less of their profits with their employees and with shareholders and customers.

Eventually, that produced a revolt from shareholders, and in the mid-’80s we had the first of what were called “hostile takeovers,” in which people would come in and buy up large chunks of companies and threaten to take them over or oust the executives if they didn’t put shareholders above all else.

The result of all this was that companies changed how they did business and completely embraced the idea that companies should be run to maximize shareholder value and nothing else. Obviously, that meant more money for executives and shareholders and less money for employees and customers.

This is the mentality that led us to the place we’re in now.

I want to push you on what I think is an excessively sanguine view of capitalism. In the book, you imply that capitalism has gone off the rails, but I disagree. I’d argue that capitalism has evolved in precisely the way we should have expected it to evolve. The culture of norms and values that were supposed to check the excesses of capitalism has (predictably) been eroded by capitalism itself, and now it’s propelled entirely by greed.

You seem to think that capitalism can be saved from itself. What do you say to people who think it’s not salvageable, not morally legitimate, and in any case not worth salvaging?
The question is, is all of that endemic to capitalism? I don’t think so, because we see different kinds of capitalism in countries in, say, Northern Europe and in Germany. Some of that has to do with the rules and laws under which they operate, but a lot of it has to do with the norms of behavior. So capitalism doesn’t have to reach the point of ruthlessness like it has here and other places.

And one of the good things about capitalism is that it has self-correcting mechanisms, just as democracy has self-correcting mechanisms. The truth is that the outcome we have now, all of this tremendous inequality, is bad morally and economically. This is not a sustainable system, and if it keeps getting worse, we run the risk of a revolution.

So I don’t think capitalism is an inherently moral system or an inherently self-defeating system, but we have to ensure that it adapts when it veers too far into corruption and inequality. And that’s basically what I’m calling for in this book.

Well, yes, capitalist systems are extremely adaptable (that’s definitely one thing Karl Marx got really, really wrong), but the problem is that our system isn’t adapting, or not adapting fast enough. And we live in a media culture in which nearly half the population is fed propaganda that convinces them that immigrants and regulations are what are holding them back, not greedy corporations.

How do we course-correct in the face of all this confusion?

We do it by changing norms, and by talking about it and discussing it. That’s how a democracy goes about it. Now one of the questions you might ask is, how do norms change? And the answer is, I don’t know.

But in the #MeToo movement, we see a very good example of how norms can change very quickly. What was acceptable five years ago is really not acceptable anymore. And it’s because enough people got morally outraged and things changed. That’s how norms shift and the culture evolves.

I’ll circle back to the #MeToo comparison because I think it’s a bad one, but there are also legal and structural impediments here. We have a political system fueled by private money, which means that wealth translates to political influence, which in turn means the laws are increasingly rigged to benefit the people on top.

You make a very good point, and in the book I say the No. 1 thing we have to do is get money out of politics — and that will probably require a constitutional amendment. But you’re right: We can’t reform our economic system if we don’t reform our political financing system.

As it is now, we’re stuck in a vicious cycle in which concentration of wealth leads to concentration of political power, which leads to yet more concentration of wealth. And we know how this plays out in the long run — it leads to revolution. But we don’t have to get anywhere near that if we can make the changes we need to make now.

The Democratic Party will have to lead the way, and if they really want to do that, they need to put this at the top of their agenda and run on it. People out there are angry, and this will help them win. It’s a slam-dunk issue, really. People are as disgusted by what they’re seeing as you and I are.

I want to quote something interesting from your book: “Liberal critics never miss an opportunity to complain about the level of inequality, but they’ve rarely been willing to say what level, or what kinds, of inequality would be morally acceptable.” I have my own answer to this, but I’m curious what you think the acceptable level of inequality is.

In the book, you catalogue all of these solutions to the problem — more income redistribution, better tax reform, something like a universal basic income, a new social contract between business and society, more access to higher education, etc. — and I agree with most of it. But I’m not confident we have the political will to get these things done.

If I’m right about that, what do you think is going to happen in the short to medium term?

First, let me just say that it will be easier to do these sorts of things than it will be to go full socialist. If we lack the political will to fix the kind of capitalism we have, then there’s surely a higher political barrier to the full socialist model of national health insurance, free college for everybody, and guaranteed income for every individual, whether they work or not.

So if you’re saying that things have to get worse before they get better, you may be right. However, if you look at public opinion polls, if you look at the recent election, I think the will may be already there. Again, I see the success of the #MeToo movement as a great example of what’s possible.

The #MeToo movement is a misguided comparison. We’re talking about broad changes in our political and economic system, changes that directly threaten the most entrenched financial interests in this country. I think you’re right about public sentiment, but I’m not at all convinced that the financial class is prepared to relinquish anything.

In fact, we’ve seen the big banks essentially go right back to the sorts of behaviors that produced the financial crash in 2008, and we just saw Republicans pass an egregious tax cut that will deepen the very inequalities we’re talking about here.

Well, it’s worth remembering that social norms change before policy changes, not the other way around. But yes, I agree that the GOP tax cut was enormously irresponsible and unfair. These are the sorts of things that can cause the public to say, “Enough is enough.”

My view is that we’re at a tipping point now and things are about to change. You and I may disagree about what, exactly, we need to do, or how far we need to go, but I think there are enough positive signs in public opinion that suggest we’re at a tipping point.

We’ll just have to see what happens next.


Practically all my friends are something-ists. They don’t believe in god, but say, “There is SOMETHING out there.” I alone don’t hedge my bets. I see the universe as entirely natural, without deities, demons or angels — and without the SOMETHING OUT THERE that’s supposed to account for weird coincidences. Our consciousness, dreams, thoughts, hallucinations, cultural influences, memorable fictional characters, etc. — these are natural phenomena.

We can’t explain what consciousness is or how it works, but the new science of complexity has given us a useful term: emergence. It’s bottom-up self-organization that’s evident in bird migration, for instance. A single neuron firing is meaningless — like a stranded ant, separated from its colony. But a million neurons firing together add up to a pattern.

There is no free-floating, brain-less consciousness, even though a lot of people speak of “cosmic consciousness” (and they don’t mean the laws of physics; they mean a mysterious, all-knowing intelligence that’s friendly specifically to them). If evidence for it can be produced, I will change my views.

Meanwhile, the interesting news is that in the Netherlands atheists now outnumber traditional believers — but not Something-ists.

~ “For the first time the Netherlands has more atheists than believers, according to a recent survey conducted by Ipsos. Slightly more than 25 percent of the people are atheists while 17 percent believes in the existence of God.

    The majority, 60 percent, is between believing and disbelieving in God. … The majority categorize themselves as either agnostics or ‘something-ists’. Agnostics say they cannot know if there is a higher power, and somethingists, or ietsists, believe that there must be some sort of higher power beyond the material.

The number of believers is higher among the young than it is among the elderly.

Despite the relatively small percentage of believers, 53 percent of the population believe in some form of life after death, and over 40 percent define themselves as ‘spiritual persons.’” ~

Something-ism is a benign kind of belief. No one has been killed in the name of Something-Out-There. No one has died as a martyr for Something-Out-There.

“A pilot photographing a rainbow as he flies through it. That's what heaven would look like, if it existed.” ~ M. Iossel, who thanks Carolyn Forché and Ruben Santos Claveria


~ “Animal research, published in the journal Nature, showed breast tumors struggled without the dietary nutrient asparagine.

It is found in the foodies' favorite asparagus, as well as poultry, seafood and many other foods.

In the future, scientists hope to take advantage of cancer's "culinary addictions" to improve treatment.

Asparagine is an amino acid — a building block of protein — and takes its name from asparagus.

The study, conducted at the Cancer Research UK Cambridge Institute, took place on mice with an aggressive form of breast cancer.

Normally they would die in a couple of weeks as the tumor spread throughout the body.

But when the mice were given a low-asparagine diet or drugs to block asparagine then the tumor struggled to spread.

"It was a really huge change, [the cancers] were very difficult to find," said Prof Greg Hannon.

Last year, the University of Glasgow showed cutting out the amino acids serine and glycine slowed the development of lymphoma and intestinal cancers.

Prof Hannon told the BBC: "We're seeing increasing evidence that specific cancers are addicted to specific components of our diet.

"In the future, by modifying a patient's diet or by using drugs that change the way that tumor cells can access these nutrients we hope to improve outcomes in therapy.”

An initial tumor is rarely deadly. It is when the cancer spreads throughout the body - or metastasizes - that it can become fatal.

A cancerous cell must go through huge changes in order to spread - it must learn to break off the main tumor, survive in the bloodstream and thrive elsewhere in the body.

It is this process for which researchers think asparagine is necessary.

But fear not, asparagus lovers: these findings still need to be confirmed in people, and asparagine is hard to avoid in the diet anyway.

In the long run, scientists think patients would be put on special drinks that are nutritionally balanced, but lack asparagine.

Prof Charles Swanton, Cancer Research UK's chief clinician, said: "Interestingly, the drug L-asparaginase is used to treat acute lymphoblastic leukemia, which is dependent on asparagine.

"It's possible that in future, this drug could be repurposed to help treat breast cancer patients."

Further trials are still necessary.” ~

ending on beauty:

My street lamp is so glacially alone in the night.
The small paving stones lay their heads down all around
where it holds its light-umbrella over them
so that the wicked dark will not come near.

~ Rolf Jacobsen, Light Pole, tr. Robert Bly

Snowfall in Leningrad