Sunday, January 13, 2019

FORGETTING JOHN LENNON: HOW COLLECTIVE CULTURAL MEMORY DECAYS; CAN THE RICH SAVE THE WORLD? THE YOUNG WILL CONTINUE TO HAVE LESS SEX; WHAT KILLED 15 MILLION AZTECS


"Everyone comes out from nothingness moment after moment. Moment after moment we have true joy of life. So we say shin ku myo yu, “from true emptiness, the wondrous being appears.” ~ Shunryu Suzuki, Zen Mind, Beginner’s Mind

*

LA BAMBA

When a friend lay dying of leukemia,
I was walking the shore of Lake Tahoe.
A pleasure boat passed by; the passengers
burst into a wild chorus of La Bamba.

‘Yo no soy marinero! Yo soy capitan!’
they sang with such triumph
you’d think they were singing
I’m alive, I’m alive.

I was hardly in the presence
of Art, and this was not
the Mormon Tabernacle Choir.
A phone call could come in any time —

and suddenly I knew this was
my own memorial service.
To honor me, the mourners
sang La Bamba to the sun.

~ Oriana



ONE MORE LOOK AT HOLDEN CAULFIELD

 
~ “It’s become a truism that if you read The Catcher in the Rye at 14, you love Holden Caulfield, and if you read it at 20, you hate him. But when I reread it recently as an adult — 100 years after J.D. Salinger’s birth — I just felt sorry for him. That boy spends so much energy judging people that his life must be exhausting.

Holden Caulfield is the eternal voice of the adolescent male: riddled with nostalgic protectiveness for children and fury toward adult phonies, irreverent and very aware of how cool he thinks that irreverence makes him, while at the same time obsessed with certain “rules.”

He’s convinced he has mastered the complex unspoken social codes of the world, and it makes him furious when other people don’t abide by them the way he thinks they should. He hates phonies, of course, but he’s also repulsed by what he sees as ignorance. He can’t stand the schoolmate who doesn’t understand that when you pick up a knee supporter from someone’s chiffonier [dresser], you shouldn’t put it back on their bed. He is filled with righteous indignation when girls don’t offer to pay for their own drinks — he wouldn’t have let them, but for form’s sake, he insists, they really should have tried.

   ~ “It may be the kind where, at the age of thirty, you sit in some bar hating everybody who comes in looking as if he might have played football in college. Then again, you may pick up just enough education to hate people who say, ‘It’s a secret between he and I.’ Or you may end up in some business office, throwing paper clips at the nearest stenographer. I just don’t know. But do you know what I’m driving at, at all?”

    “Yes. Sure,” I said. I did, too. “But you’re wrong about that hating business. I mean about hating football players and all. You really are. I don’t hate too many guys. What I may do, I may hate them for a little while, like this guy Stradlater I knew at Pencey, and this other boy, Robert Ackley. I hated them once in a while — I admit it — but it doesn’t last too long, is what I mean. After a while, if I didn’t see them, if they didn’t come in the room, or if I didn’t see them in the dining room for a couple of meals, I sort of missed them. I mean I sort of missed them.” ~

It’s one of the few times in the book where Holden gestures toward forgiving the people who don’t live up to his exacting standards, toward recognizing that people have value and worth even when they break his rules. It’s almost enough to give you hope for the future — but then the English teacher tries to molest Holden while he sleeps, so that kills that idea.

Still, the passage exemplifies Holden’s slow crawl toward self-awareness and self-reflection. It develops the tension between his adolescent desire for black-and-white morality and his emerging adult recognition that people can do things that annoy you and still be basically good people. And that tension is what gives The Catcher in the Rye its forward drive.

If you read the book in celebration of Salinger’s 100th birthday, you can have a little celebration for yourself too. I’ll be having one in recognition of the fact that I am no longer a teenager and no longer have to waste Holden Caulfield levels of energy on judging everyone around me. You could not pay me enough to go back.” ~

https://www.vox.com/2016/7/16/12198828/catcher-in-the-rye-65th-anniversary-j-d-salinger


Oriana:

I think Holden is right not to tolerate being molested, even if the teacher assumed the boy wasn’t conscious of it. But aside from that, I find myself nodding my head. Adolescence is a time of intense emotions, and hatred at that age can be especially white-hot — even over relatively minor stuff. We don’t yet grasp how complex life is: rarely black and white, but rather a very wide spectrum of shades of gray. And people — well, anyone who’s made it past the age of thirty usually has some idea that everyone is a mixed bag of qualities we like and those we don’t; we may even abhor some of them, but as long as the person is decent — which I define as honest, hard-working, and willing to help others — that is worthy of respect right there.

And some respect needs to be granted simply because a person is human. There are reasons why people are the way they are, and those reasons are generally not under their control. Perhaps they really did have a terrible childhood; the effects of it tend to be lifelong. Perhaps one parent was an alcoholic, or both parents — it happens. Perhaps the mother died and no one else stepped in to provide the critically needed love and nurturing. Perhaps there’s a genetic problem — it’s a lottery.

Spinoza said that to understand all is to forgive all. It’s a shame that an individualistic culture tends to blame the victim and not give the right weight to the power of circumstances.

The longer I live, the more clearly I see the power of circumstances. At the same time, I do find certain people just too difficult to be around. Life is too short to waste on those who truly get on our nerves. And yet even those whom we dislike at first can become “part of the family” if we spend enough time with them (at work, for instance) and get to know some personal details.


(A shameless grammar-Nazi confession: I hate it when people say "between you and I" or say a "feasible excuse" instead of "plausible" or don't know the difference between ambiguous and ambivalent. I detest those who pronounce the "t" in "often." I let myself experience being flooded by that wave of loathing, knowing it will pass — usually in a matter of five to ten minutes.)

Rembrandt: Self-Portrait Frowning, 1630 (aged 24)
 
What Neruda said can also serve as a comment on how a person should mature and become more connected:

There is no insurmountable solitude.
All paths lead to the same goal:
to convey to others what we are.
And we must pass through solitude
and difficulty, isolation and silence
in order to reach forth to the enchanted place
where we can dance our clumsy dance and sing
our sorrowful song.


~ Pablo Neruda


YOU CAN’T STEP INTO THE SAME BOOK TWICE
Mary:

Just as returning to The Catcher in the Rye as an adult is a very different experience from first reading it as a teenager, so it was for me when I reread some favorite novels I had first encountered in my mid-teens. When I first read Hugo's Toilers of the Sea I was overwhelmed by the Romantic notion of love presented there, found Gilliatt's enormous struggles heroic, and his renunciation of any claim to the beloved, who loved someone else, noble. When he made it possible for the lovers to escape together, then sat on the rocks waiting for the tide to rise and drown him, I felt it was tragic, and my teenage heart broke for him.

Reading the book 50 years later, I found the whole romantic fantasy, and the agonizing struggle of the hero against all the forces of nature in service to that fantasy, to be nothing more than silly and delusional. His suicide no longer seemed noble or heartbreaking, but pitiful and childish.

Then there is, of course, the delight of returning to a book maturity only finds richer and better than you realized, perhaps because you have managed to grow up to its measure. That was my experience with Tolstoy's Anna Karenina, which stunned me with its perfection. Anna's choices and her situation, at once in a world vastly different from ours and yet ruled by some of the same constraints, taste like truth, like reality. By the end of the story there is not a shred of romantic fantasy left to comfort Anna; there is no place left for her in that world, and her suicide seems the inevitable conclusion. Here is a tragedy for grown-ups, one that a younger mind might not recognize as a determined and overdetermined end for a woman breaking the rules society has dictated for her.

Oriana:

That’s just it: A great novel can dazzle us on re-reading. The style, the depth, the marvelous details! Not so with inferior writing, whose triteness becomes insufferable.

And then there are those reasonably good novels that speak primarily to the concerns and priorities of the young, and just don’t have enough in them for the more mature readers. Or, when we read them as adults, we pay attention to different scenes and characters, and notice so much more, a whole world we missed on first reading.

You can never step into the same book twice and have the same experience — ten years later, you the reader are already a different person, which also makes the book different.

**

In a complex system, we control almost nothing, but we influence almost everything. “Of all the fantasies human beings entertain, the idea that we can go it alone is the most absurd and perhaps the most dangerous. We stand together or we fall apart.” ~ George Monbiot


LESS SEX PREDICTED FOR THE FUTURE

 
~ “In the future, the ebbing of romantic and sexual connections will continue. People will have sex less frequently than they did in the pre-internet era, which will be remembered as a more carnal time. They will have fewer lifetime sexual partners, and they will be more likely to be abstinent.

Only a minority of teenagers will have sex of any sort. Masturbation and other varieties of solo sex will continue to be more prevalent than they were before; porn aficionados will enjoy VR sex and sex robots.

Like many other aspects of our world in the decades to come, the gap between the haves and have-nots will continue to grow. Those who have many advantages already will be disproportionately likely to find romantic and sexual partners if they desire them and to have fulfilling sex lives. There will be good parts of this: Nonconsensual sex will be far less common than it is today. There will be little to no social stigma attached to being unattached. Those who approach singledom with psychological and financial advantages will flourish. It will be the best time in human history to be single. But there will be less unambiguously positive developments as well: For better and for worse, the birth rate will continue to fall, and those who are less suited to solo life will suffer from profound loneliness.

If you compare Americans’ sex lives today to the sex lives of people of the same age in the early ’90s, people who are now in their 20s are on track to have fewer lifetime sexual partners. They’re having sex less frequently. They’re about two and a half times as likely to be abstinent, and they have launched their sex lives later. It would seem that something is getting in the way of people’s ability or desire to connect to each other physically.

In The Atlantic, I called this a sex recession. Its biggest cause is that more people than not who are under 35 are living without any sort of partner, which is a change from decades past. The most common living arrangement for adults who are under 35 is to be living with a parent, which, I think it is safe to say, is for many of us probably not a great recipe for a superactive and fulfilled sex life.

Other factors include media, broadly — not just social media and not just porn. I would put any kind of digital occupation that makes it less desirable to go out and connect with somebody in person in the same category. It could be Netflix, streaming TV — all of these things coincide with a measured increase in the percentage of people who say that they’ve masturbated in the past week. Among men, that’s doubled since the early ’90s. Among women, it’s tripled.

Another set of causes when we’re looking at teens specifically has to do with the way adolescence has changed: Teenagers are having sex later, and the teen birth rate is a third of what it was in the early ’90s. People are more likely to say their first sexual experience was consensual. People seem to be coming into their 20s with less romantic experience than past cohorts. That can be a really difficult thing to reverse. I’ve never held hands with somebody, I’ve never kissed somebody; how do I do that when I’m 23, 24, 25? A third big category has to do with dating apps, which have become a normal way to meet people in a lot of circles. And yet for some people, they are clearly functioning really poorly and maybe sort of paradoxically actually making it harder to match up with people. That will continue.” ~ Kate Julian

http://nymag.com/intelligencer/2019/01/2038-podcast-predictions.html?fbclid=IwAR12rXSbj7r-o3CnZU9HHEc0__a2eBaOior-nb6p3NW2Jyl142GEHzSMSG4


Oriana:

Who knew? I thought that hormones were invincible, but by now we’ve had multiple studies all confirming that the young are not having sex the way they used to in previous generations. What’s actually more frightening is that they aren’t having relationships, they aren’t connecting with others — and not just erotically. It’s sad, because so much depends on having rich human connections. 


Mary:

On the trend of young people leading more solitary lives, living more through the 'virtual' worlds our technology now makes so pervasively available — I can't help but find this troubling. If we have already lost our bearings by forgetting history, how much more will be lost if we also become untethered from the web of human relations? What kind of lives, in what kind of world, will all this disconnection create? Maybe something we cannot as yet imagine.


Oriana:

In Japan, I read, there are male teenagers who won’t even leave their room. Their lives are totally digital. And I know of some young people who work at home and have basically turned into recluses, with no social life other than on social media. As an introvert, I know socializing in person is a lot more demanding and usually leaves me drained and exhausted. And yet, and yet . . . there is also a richness that can’t be duplicated on-screen.

Also, people are more likely to be polite in a face-to-face interaction, more supportive. If you keep meeting them, getting to know them a little more each time, they become dear to you. You’re willing to overlook the little ways in which a person may annoy you, knowing you too probably have eccentricities that get on someone else’s nerves — a lesson in tolerance. Perhaps it’s a coincidence, but didn’t we have less hate language and polarization before the internet?

*


“How wonderful it is to be able to be silent with another person.” ~ Kurt Tucholsky


*

A Soviet-era monument in Vilnius. That patch of snow on the sheaf of wheat makes it surreal —  in a symbolic way if you are inclined to read it as a statement on the Soviet regime.

More poignant: January 13 marks an anniversary of the 1991 “Bloody Sunday” in Vilnius. Thousands of unarmed civilians tried to stop the Soviet tanks, sometimes with their bare hands.

*


~ “Joseph Brodsky said something very good when he was asked: ‘What is the difference between great literature and the merely average?’ Brodsky replied: ‘In the taste for the metaphysical.’ And how are we to understand ‘the metaphysical’? It is when someone sees… more deeply. Her worlds, her space, the enigmas of the world are involved in all this. She is enlightened in another way. That’s where the difference lies.” ~ Svetlana Alexievich

Oriana:

Thinking about how Russian literature differs, I'd say that it is indeed richer in metaphysical concerns. Strangers in a train compartment may at any point start discussing Life's Persistent Questions.

(By the way, let’s remember that it’s fiction. I remember the tales of a woman who studied in Russia during the late fifties. She loved to remember traveling by train: “People would spontaneously start singing. Soon everyone was singing; the whole train would be singing.”)




*

ONE MORE LOOK AT ST. NICK OF YORE

St. Nick, Gingerbread. Alas, I don't know the time period (I suppose gingerbread can be sprayed with a fixative) -- but note the horse rather than reindeer, and the religious trappings: this is definitely meant to be the original St. Nicholas, the Bishop of Myra, 270-342 (back then the town was Greek; now it's Turkish and it's called Demre).
 
*

FORGETTING JOHN LENNON: HOW COLLECTIVE CULTURAL MEMORY DECAYS

 
~ “A few years ago a student walked into the office of Cesar A. Hidalgo, director of the Collective Learning group at the MIT Media Lab. Hidalgo was listening to music and asked the student if she recognized the song. She wasn’t sure. “Is it Coldplay?” she asked. It was “Imagine” by John Lennon. Hidalgo took it in stride that his student didn’t recognize the song. As he explains in our interview below, he realized the song wasn’t from her generation. What struck Hidalgo, though, was that the incident echoed a question that had long intrigued him, which was how music and movies and all the other things that once shone in popular culture faded like evening from public memory.

Hidalgo is among the premier data miners of the world’s collective history. With his MIT colleagues, he developed Pantheon, a dataset that ranks historical figures by popularity from 4000 B.C. to 2010. Aristotle and Plato snag the top spots. Jesus is third. It’s a highly addictive platform that allows you to search people, places, and occupations with a variety of parameters. Most famous tennis player of all time? That’s right, Frenchman Rene Lacoste, born in 1904. (Roger Federer places 20th.) Rankings are drawn from, essentially, Wikipedia biographies, notably ones in more than 25 different languages, and Wikipedia page views.

Last month Hidalgo and colleagues published a Nature paper that put his crafty data-mining talents to work on another question: How do people and products drift out of the cultural picture? They traced the fade-out of songs, movies, sports stars, patents, and scientific publications. They drew on data from sources such as Billboard, Spotify, IMDB, Wikipedia, the U.S. Patent and Trademark Office, and the American Physical Society, which has gathered information on physics articles from 1896 to 2016. Hidalgo’s team then designed mathematical models to calculate the rate of decline of the songs, people, and scientific papers.

The report, “The universal decay of collective memory and attention,” concludes that people and things are kept alive through “oral communication” from about five to 30 years. They then pass into written and online records, where they experience a slower, longer decline. The paper argues that people and things that make the rounds at the water cooler have a higher probability of settling into physical records. “Changes in communication technologies, such as the rise of the printing press, radio and television,” it says, affect our degree of attention, and all of our cultural products, from songs to scientific papers, “follow a universal decay function.”

Why does collective memory decay matter?

 
If you think about it, culture and memory are the only things we have. We treasure cultural memory because we use that knowledge to build and produce everything we have around us. That knowledge is going to help us build the future and solve the problems we have yet to solve. If aliens come here and wave a magic wand and make everyone forget everything—our cars, buildings, bridges, airplanes, our power systems, and so forth, we would collapse as a society immediately.

In your mind, what is a classic example of collective memory decay?

 
I thought everybody knew “Imagine” by John Lennon. I’m almost 40 and my student was probably 20. But I realized “Imagine” is not as popular in her generation as it was in mine, and it was probably less popular in my generation than in the generation before. People have a finite capacity to remember things. There’s great competition for the content out there, and the number of people who know or remember something decays over time. 


There’s another example, of Elvis Presley memorabilia. People had bought Elvis memorabilia for years and it was collecting huge prices. Then all of a sudden the prices started to collapse. What happened is the people who collected Elvis memorabilia started to die. Their families were stuck with all of this Elvis stuff and trying to sell it. But all of the people who were buyers were also dying.

You write collective memory also reflects changes in communication technologies, such as the rise of the printing press, radio, and TV. How so?

 
Take print. Changing the world from an oral tradition to a written tradition provided a much better medium for data. A lot of people have linked the revolution in the sciences and astronomy to the rise of printing because astronomical tables, for instance, could be copied in a reliable way. Before printing, astronomical tables were hand-copied, which introduced errors that diminished the quality of the data. With printing, people had more reliable forms of data. We see very clearly from our data that with the rise of printing you get the rise of astronomers, mathematicians, and scientists. You also see a rise in composers because printing helps the transmission of sheet music. So when you look at people we remember most from the time when print first arose, you see ones from the arts and sciences.

What did the mediums that came next mean for science?

 
The new mediums of radio and TV were much more adaptive for entertainment than science, that’s for sure. The people who belong to the sciences, as a fraction of the people who became famous, diminished enormously during the 20th century. The new mediums were not good for the nuances that science demands. For good reason, scientists need to qualify their statements narrowly and be careful when they talk about causality. They need to be specific about the methods they use and the data they collect. All of those extensive nuances are hard to communicate in mediums that are good for entertainment and good for performance. So the relative power of scientists, or their position in society, has diminished as we exited the printing era and went into this more performance-based era.

What does your analysis tell us we didn’t know before about the decay of collective memory?

 
We began by looking at how popular something is today based on how long ago it became popular in the first place. The expectation is collective memory decays over time in a smooth pattern, that the more time goes by, the more things become forgotten. But what we found when we looked at cultural products—movies, songs, sports figures, patents, and science papers—was that decay is not smooth, but has two defined regimes. There’s the first regime in which the attention starts very high and the decay is really fast. Then there’s the second regime in which it has a much longer tail, when the decay is smoother, and the attention is less.

When we started to think about decay, we realized we could take two concepts from anthropology—“communicative memory” and “cultural memory.” Communicative memory arises from talking about things. Donald Trump is very much in our communicative memory now. You walk down the street and find people talking about Trump—Trump and tariffs, Trump and the trade war. But there’s going to be a point, 20 years in the future, in which he’s not going to be talked about everyday. He’s going to exit from communicative memory and be part of cultural memory. And that’s the memory we sustain through records. Although the average amount of years that something remains in communicative memory varies—athletes last longer than songs, movies, and science papers, sometimes for a couple decades—we found this same overall decay pattern in multiple cultural domains.

In your forthcoming paper, “How the medium shapes the message,” you refer to the late cultural critic Neil Postman who argued that the popular rise of TV led to a new reign of entertainment, which dumbed us down, because entertainment was best suited for TV. Is that what you found?

We found evidence in that favor, yes. Because the fraction of people who belong to the sciences, as a fraction of all of the people that become famous, diminishes enormously during the 20th century. It would completely agree with that observation.

Did you come away from your study with insights into what may or may not cause something to stick in collective memory?

I read a very good book recently called The Formula by Albert-László Barabási. He says you can equate quality and popularity in situations in which performance is clearly measurable. But in cases in which performance is not clearly measurable, you cannot equate popularity with quality. If you look at tennis players, you find that tennis players who win tournaments and difficult games are more popular. So quality and fame are closely correlated in a field in which performance is measured as tightly as professional tennis players. As you move to things that are less quantifiable in terms of performance, like modern art, your networks are going to be more important in determining popularity.

How should we think about quality in media content?

Well, I would say that collective memory decay is an important way to measure and think about quality. If you publish some clickbait that is popular in the beginning, that gets a lot of views in the first couple of days, but a year later, nobody looks at it, you have a good metric. The same is true if you publish a more thoughtful piece that might not be as popular in the beginning because it didn’t work as clickbait—it required more engagement from the readers—but keeps on building readers over time. So the differences in longevity are important metrics for quality.

That goes back to a paper I did when I was an undergrad about the decay functions of movie attendance. There were some movies that had a lot of box office revenue in the first week but then decayed really fast. And there were other movies that decayed more slowly. We created a model in which people would talk to each other and communicate information about the quality of the movie. And that model only had one parameter, which was how good the movie was. So the quality of the movie would increase or decrease the probability that people would go watch it. We could then look at the curves and infer how good the movie was, based not on the total area it was shown in, or on the total revenue, but on the shape of the curve. That was interesting because there were movies that were really bad, like Tomb Raider, which at first was a box office success. But if you put it in our model, you would see that it was just hype: people watched it, hated the movie, and the curve decayed really fast.” ~

http://nautil.us/issue/68/context/how-well-forget-john-lennon
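
A side note on the one-parameter movie model Hidalgo describes at the end of the interview: he doesn't spell the model out here, so the sketch below is only a guess at what a word-of-mouth model with a single quality parameter might look like. Every name and number in it (the quality parameter q, the opening audience, the contact rate) is an illustrative assumption, not something taken from his paper.

# A toy word-of-mouth attendance model with one quality parameter q (0 = awful, 1 = great).
# Illustrative assumption only; not Hidalgo's actual model.

def weekly_attendance(q, opening_audience=1_000_000, weeks=8, contacts_per_viewer=1.5):
    """Each week, viewers mention the film to contacts_per_viewer people;
    a contact buys a ticket with probability q. High-q films decay slowly,
    low-q films (pure hype) collapse after the opening week."""
    curve = [opening_audience]
    for _ in range(weeks - 1):
        curve.append(curve[-1] * contacts_per_viewer * q)
    return curve

# Two hypothetical films with the same opening weekend:
for label, q in [("hyped but weak", 0.3), ("quietly good", 0.6)]:
    print(label, [round(x) for x in weekly_attendance(q)])

Read backward, as in the anecdote about Tomb Raider, you would fit q to an observed attendance curve: a curve that collapses right after the opening week implies a low q, no matter how big the opening was.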


 
Oriana:

I think it's a very important article. It touches on a subject that poets often comment on: a poet can be very popular while alive, but go into near-oblivion shortly after his death. Then some poets "come back" and maintain a steady readership, while others are truly forgotten. And we've proven to be poor judges of “who'll survive.”

I have anthologies going way back, and the farther back you go, the less you can understand why 90% (or more) of those poets were chosen, presumably carefully, in tight competition. Academia used to rule — which poets were studied in college. But some poets have gained fame in spite of academic disdain — Mary Oliver is an example, I think. Whether she'll have a literary afterlife is another question.

Bukowski seems to have endured. So perhaps it’s a question of meeting a certain need in readers, and also of timing. Dickinson is now a stellar figure, but had no chance in her lifetime.

The main point here is this: “The decay [of collective memory] is not smooth, but has two defined regimes. There’s the first regime in which the attention starts very high and the decay is really fast. Then there’s the second regime in which it has a much longer tail, when the decay is smoother, and the attention is less.”
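
One simple way to picture a curve with those two regimes is as the sum of a fast-decaying exponential (communicative memory) and a slow-decaying one (cultural memory). The Python sketch below is just that picture, with made-up parameters; it is not the fitted model from the Nature paper.

import math

def attention(t, a_comm=1.0, tau_comm=5.0, a_cult=0.05, tau_cult=80.0):
    """Toy two-regime decay of collective attention at age t (in years).
    The fast term stands in for communicative memory (talk), the slow term
    for cultural memory (written and online records). All parameter values
    are illustrative assumptions, not fitted to the paper's data."""
    return a_comm * math.exp(-t / tau_comm) + a_cult * math.exp(-t / tau_cult)

for years in (0, 5, 15, 30, 60, 100):
    print(f"{years:>3} years on: attention {attention(years):.4f}")

Early on the fast term dominates and attention falls steeply; after a few decades only the slow cultural tail remains, which matches the long, gentle decline the study describes.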

(By the way, “mediums”? That plural is correct only for a group of psychics.)


ECONOMICS AND MORALITY: CAN THE RICH SAVE THE WORLD? 

 
~ “Meritocracy loomed large over Victorian capitalism. However, in the course of the 19th century, the free market largely failed to deliver the developmental goods, proving itself to be more adept at generating than distributing wealth. And so, in the early decades of the 20th century, stirred by both political and intellectual developments – the growing appeal of communism to a working class that had tasted comparatively few of the market’s fruits, and the consequent rise of economic schools that aimed to renew capitalism, such as Keynesianism and the German social-market – the state gradually took on a much more active role in both society and the economy.

Thus a tremendously successful initiative that would come to be known as the development industry was born. By 1948, Western economies had emerged from crisis, beginning a decades-long period of rising growth and prosperity. Rather than pack up and go home, the development industry now turned its attention to a new frontier. With Europe’s overseas empires breaking up, dozens of new nation-states were coming into being, each of them eager to ‘catch up’ with its erstwhile colonial master. Amid this exciting atmosphere, the development industry could use its expertise to play a clear and prominent role, one captured in the subtitle to the then-Bible of development, Walt Rostow’s Stages of Economic Growth (1960) – ‘a non-communist manifesto’.

But when growth slowed in the 1970s, governments began to turn away from state-led approaches and to free up the market. Leaders such as Ronald Reagan in the US and Margaret Thatcher in the UK, early proponents of this new libertarian approach, harkened back to the unbridled individualism of the Victorian age. Reagan told a 1981 development summit in Cancun that Third World countries ought to follow the model set by the US, whose economy, in this telling, had been built by self-sufficient, independent farmers. Reagan elided the roles of slavery and industrialisation behind post-Civil War tariff-walls, major chapters in US history impossible to square with the libertarian ideal. But with a bit of editing of the historical record, the ‘neoliberals’ took a decidedly dim view of the government. They tended to think that those who can, do, while those who can’t, administer, looking for ways to frustrate society’s makers.

The ‘less government, more growth’ approach became orthodoxy, but it brought back – with a vengeance – the challenges of distribution. By the end of the 20th century, social indicators in developing countries were going backwards, just as the tide was turning against conservative politics in the West. Unlike what happened after the Second World War, though, the pendulum this time did not swing back toward the more social roadmap to development. Instead, the development industry asserted its autonomy from government, and assumed a new role. Rather than have the state build the bridge between accumulation and distribution, we now learned that philanthrocapitalism, a radically new approach to development, would offer a whole new way of doing things.

Reagan and Thatcher and other exponents of free-market economics had been social conservatives. By the 1990s, a new generation had come along, represented by leaders such as Bill Clinton in the US and Tony Blair in the UK, who mixed conservative economics with social liberalism. As much as possible, they preferred a progressive politics that channelled private initiative, and the logic of philanthrocapitalism was pleasingly straightforward. Since the rich were getting richer, they had more money to throw around. The lure of yet more lucre could now be used to steer them into sinking some of this new wealth into the poorest communities, something touted by Clinton late in his presidency when he went on a four-day ‘new markets’ tour of deprived American neighborhoods. Urging the super-rich to do some good with a portion of their rapidly growing prosperity, Clinton told them that a better world would make them richer yet. ‘Every time we hire a young person off the street in Watts and give him or her a better future,’ he said, ‘we are helping people who live in the ritziest suburb in America to continue to enjoy a rising stock market.’

In fact, in the two decades after Clinton took office, the number of charitable foundations doubled. A new problem arose, though. Due to the worsening inequality produced by free-market policies, this growing number of foundations and NGOs found themselves relying on a diminishing pool of wealthy donors. Inevitably, that forced them to cultivate the plutocrats, and reflect their views. However, even this supposed vice could be turned into a virtue. If the free market had in fact sorted the best from the rest, and enabled them to use their ingenuity to enrich themselves, it followed that this same ingenuity could subsequently be applied to the solution of social problems. As the state withdrew behind the curtains, the development industry thus moved beyond its traditional supporting role in tackling social problems to take centre stage. If Rostow’s Stages of Economic Growth had articulated the ideals of a liberal age, Matthew Bishop and Michael Green’s Philanthrocapitalism (2008) did it for a neoliberal one. As Rostow’s subtitle had done, theirs offered a pithy summation of the doctrine: ‘How the Rich Can Save the World’.

The story that philanthrocapitalists told was a great one: history marching forward, heroes and villains, and a Hollywood ending. History has a way of surprising us, however, and most of the script ended up on the cutting-room floor, the actors left to ad-lib parts they weren’t expecting to play. Russia’s shock therapy didn’t beget a flourishing capitalist democracy. Chinese autocracy didn’t collapse under the weight of its contradictions – in fact, scholars today wonder if China gives the lie to the long-cherished rule that economic dynamism demands a lean state. As for the rich people who were meant to save the world, almost to a man, they chucked the script in the bin: for every billionaire funding a progressive cause, there would be dozens who used their wealth to support conservative campaigns to further roll back the state’s social provisions.

 A wave of popular anger against disconnected ‘elites’ has resulted, which authoritarian populists have skilfully exploited to launch crackdowns on the development industry. Whether it be the Orthodox chauvinism of Russia’s Vladimir Putin, the Hindu nationalism of India’s Narendra Modi, or the Muslim fundamentalism of Turkey’s Recep Tayyip Erdoğan, they are tapping into disillusionment with the old model to repress ‘globalist elites’. As they decry NGOs for their lack of patriotism, these governments are pushing them to fall into line behind government, or to leave the field altogether (as Hungary, for instance, has done with its notorious ‘Stop Soros’ law).

And as disturbing as their conspiracy-theories of ‘globalist’ or ‘metropolitan’ elites can sound, the populists might be on to something. In 2011, the Swiss Federal Institute of Technology conducted a network analysis of the global corporate elite. What they found was a small web, made up of a few hundred tightly networked and extraordinarily wealthy individuals, dominated by bankers, and commanding vast pools of capital. If this was Davos man, then meritocracy was arguably its governing ethos – its mission, to replace the narrow, limiting confines of the old nation-state. Through compulsion and cajoling, much of the development industry got drawn into an alliance with this new global elite.

On the face of it, it seems puzzling that philanthrocapitalism ever got much of a hearing, because history had surely shown it would never work. If the rapid but unequal economic growth of the Victorian age failed to produce commensurate social development, what made anyone think that the rapid but unequal growth of the contemporary period would do any different? Moreover, the idea that the rich should be left to use their wealth to solve the world’s problems because they have proved their merit in the market ignores the science behind success. Does anyone really believe that, if Steve Jobs had been born into a Bengali peasant family, he would have still created Apple? In fact, economists who’ve actually worked out scientifically what contribution our own initiative plays in our success have found it to occupy an infinitesimally small share: the vast majority of what makes us rich or not comes down to pure dumb luck, and in particular, being born in the right place and at the right time.

At heart, philanthrocapitalism offered not a new science of development, but an old-fashioned moral tale – one in which a hero, who would reveal himself by some magnificent achievement, would come along to save us from some peril. There is no shame in weaving moral tales. Economics has always given us moral narratives by which to live our lives – in fact, that’s arguably its primary reason for being. But if it is to enter our canon, a story needs an audience that finds it rings sufficiently true to then retell it. Philanthrocapitalism failed that test. It will probably end up in history’s remainder bin as a result, while storytellers devote themselves to crafting more compelling narratives.” ~

https://aeon.co/essays/development-as-a-chapter-in-the-moral-tale-of-economics


Oriana:

Actually I was astounded to discover a few genuine philanthropists. But when I heard that Bill Gates chose which disease he’d try to wipe out, and where, I felt totally uneasy. Should one super-rich person, with no expertise in the area, have this kind of power? Shouldn't reforestation, or clean water, or solar energy be priorities? Or girls' education, which we know has a cascade of benefits?


And besides, when decisions are made from the top down, without understanding how ordinary people actually live, things tend to go wrong — sometimes catastrophically so.

It seems that “one size doesn’t fit all.” Different countries have different paths, and no path remains the right one forever.



DEBUNKING THE MYTH OF THE EARL OF SANDWICH

 
~ “The revolutionary possibilities of the sandwich have always been well hidden by its sheer obviousness. The best history, written by Woody Allen in 1966, imagines the conceptual journey taken by the fourth Earl of Sandwich 200 years earlier. “1745: After four years of frenzied labour, he is convinced he is on the threshold of success. He exhibits before his peers two slices of turkey with a slice of bread in the middle. His work is rejected by all but David Hume, who senses the imminence of something great and encourages him.”

The first definite sandwich sighting occurs in the diaries of Edward Gibbon, who dined at the Cocoa Tree club, on the corner of St James Street and Pall Mall in London on the evening of 24 November 1762. “That respectable body affords every evening a sight truly English,” he wrote. “Twenty or thirty of the first men in the kingdom … supping at little tables … upon a bit of cold meat, or a Sandwich.” A few years later, a French travel writer, Pierre-Jean Grosley, supplied the myth – beloved by marketing people ever since – that the Earl demanded “a bit of beef, between two slices of toasted bread,” to keep him going through a 24-hour gambling binge. This virtuoso piece of snacking secured his fame.

The evidence for this, though, is weak. In his definitive biography, The Insatiable Earl, published in 1994, NAM Rodger concludes that Sandwich was hard-up, and never wagered much for a man of his rank. A large, shambling figure, prone to breaking china, the Earl ran the Admiralty, by most accounts badly, for a total of 11 years. He lived alone after his wife went mad in 1755. Visitors to his house remarked on the poor quality of the food. “Some of his made dishes are either meagre or become absolutely obsolete,” said his friend, Lord Denbigh. The likely truth is that the entire future of the sandwich – its symbiotic relationship with work, its disregard for a slower, more sociable way of eating – was present at its inception. In 18th-century English high society, the main meal of the day was served at around 4pm, which clashed with the Earl’s duties at the Admiralty. He probably came up with the beef sandwich as a way of eating at his desk.” ~

https://getpocket.com/explore/item/how-the-sandwich-consumed-britain

 John Montagu, 4th Earl of Sandwich

*


WHAT KILLED 15 MILLION AZTECS?
 
~ “When Europeans arrived in North America, they carried with them pathogens against which the continent's native people had no immunity. And the effects could be devastating. Never was this more true than when smallpox wiped out 5-8 million Aztecs shortly after the Spanish arrived in Mexico around 1519. Even worse was a disease the locals called “huey cocoliztli" (or “great pestilence" in Aztec) that killed somewhere from 5 to 15 million people between 1545 and 1550. For 500 years, the cause of this epidemic has puzzled scientists. Now an exhaustive genetic study published in Nature Ecology and Evolution has identified the likely culprit: a lethal form of salmonella, Salmonella enterica, subspecies enterica serovar Paratyphi C. (The remaining Aztecs succumbed to a second smallpox outbreak beginning in 1576.)

Cocoliztli was therefore probably enteric fever, a horrible disease characterized by high fever, headaches, and bleeding from the nose, eyes, and mouth, and death in a matter of days once the symptoms appeared. Typhoid is an example of one enteric fever. “The cause of this epidemic has been debated for over a century by historians, and now we are able to provide direct evidence through the use of ancient DNA to contribute to a longstanding historical question," co-author Åshild Vågene of the Max Planck Institute in Germany tells AFP. (S. enterica no longer poses a serious health problem to the local population.)

The study is based on DNA analysis of teeth extracted from the remains of 24 Aztecs interred in a recently discovered cemetery in the Mixteca Alta region of Oaxaca, Mexico. The epidemic grave was found in the Grand Plaza of the Teposcolula-Yucundaa site.

Researchers suspect the Spanish brought the disease in tainted food or livestock because the teeth from five people who died prior to the Europeans' arrival show no trace of it—this is not a huge sample, of course, so it's difficult to be certain. Another team member, Kirsten Bos says, “We cannot say with certainty that S. enterica was the cause of the cocoliztli epidemic," adding, “We do believe that it should be considered a strong candidate.”

A chilling consideration is that the same strain of bacteria has been identified in a Norwegian female who died in 1200, 300 years before it appeared in the Aztec community. Clearly, Europeans weren't as defenseless against it as those in the Western Hemisphere.” ~

https://bigthink.com/robby-berman/dna-analysis-may-have-finally-revealed-what-killed-15-million-aztecs?utm_medium=Social&facebook=1&utm_source=Facebook&fbclid=IwAR0YbxavoQERcjLsq6o0es2bJ6I-iLdrNqmODQ4BCWMd2tVUqtV1THu3L7Q#



ALARM OVER SUGAR WAS FIRST SOUNDED IN 1972 — AND IGNORED

 
~ "Robert Lustig is a paediatric endocrinologist at the University of California who specializes in the treatment of childhood obesity. A 90-minute talk he gave in 2009, titled Sugar: The Bitter Truth, has now been viewed more than six million times on YouTube. In it, Lustig argues forcefully that fructose, a form of sugar ubiquitous in modern diets, is a “poison” culpable for America’s obesity epidemic.

A year or so before the video was posted, Lustig gave a similar talk to a conference of biochemists in Adelaide, Australia. Afterwards, a scientist in the audience approached him. Surely, the man said, you’ve read Yudkin. Lustig shook his head. John Yudkin, said the scientist, was a British professor of nutrition who had sounded the alarm on sugar back in 1972, in a book called Pure, White, and Deadly.

 
“If only a small fraction of what we know about the effects of sugar were to be revealed in relation to any other material used as a food additive,” wrote Yudkin, “that material would promptly be banned.” The book did well, but Yudkin paid a high price for it. Prominent nutritionists combined with the food industry to destroy his reputation, and his career never recovered. He died, in 1995, a disappointed, largely forgotten man.

When Yudkin looked at the data on heart disease, he was struck by its correlation with the consumption of sugar, not fat. He carried out a series of laboratory experiments on animals and humans, and observed, as others had before him, that sugar is processed in the liver, where it turns to fat, before entering the bloodstream.

He noted, too, that while humans have always been carnivorous, carbohydrates only became a major component of their diet 10,000 years ago, with the advent of mass agriculture. Sugar – a pure carbohydrate, with all fiber and nutrition stripped out – has been part of western diets for just 300 years; in evolutionary terms, it is as if we have, just this second, taken our first dose of it. Saturated fats, by contrast, are so intimately bound up with our evolution that they are abundantly present in breast milk. To Yudkin’s thinking, it seemed more likely to be the recent innovation, rather than the prehistoric staple, making us sick.

The British Sugar Bureau dismissed Yudkin’s claims about sugar as “emotional assertions”; the World Sugar Research Organization called his book “science fiction”.

*

In 2008, researchers from Oxford University undertook a Europe-wide study of the causes of heart disease. Its data shows an inverse correlation between saturated fat and heart disease, across the continent. France, the country with the highest intake of saturated fat, has the lowest rate of heart disease; Ukraine, the country with the lowest intake of saturated fat, has the highest. When the British obesity researcher Zoë Harcombe performed an analysis of the data on cholesterol levels for 192 countries around the world, she found that lower cholesterol correlated with higher rates of death from heart disease.

 
In the last 10 years, a theory that had somehow held up unsupported for nearly half a century has been rejected by several comprehensive evidence reviews, even as it staggers on, zombie-like, in our dietary guidelines and medical advice.

 
The UN’s Food and Agriculture Organization, in a 2008 analysis of all studies of the low-fat diet, found “no probable or convincing evidence” that a high level of dietary fat causes heart disease or cancer. Another landmark review, published in 2010, in the American Society for Nutrition, and authored by, among others, Ronald Krauss, a highly respected researcher and physician at the University of California, stated “there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD [coronary heart disease and cardiovascular disease]”.

*

If Yudkin was ridiculed, Atkins was a hate figure. Only in the last few years has it become acceptable to study the effects of Atkins-type diets. In 2014, in a trial funded by the US National Institutes of Health, 150 men and women were assigned a diet for one year which limited either the amount of fat or carbs they could eat, but not the calories. By the end of the year, the people on the low carbohydrate, high fat diet had lost about 8lb more on average than the low-fat group. They were also more likely to lose weight from fat tissue; the low-fat group lost some weight too, but it came from the muscles. The NIH study is the latest of more than 50 similar studies, which together suggest that low-carbohydrate diets are better than low-fat diets for achieving weight loss and controlling type 2 diabetes. As a body of evidence, it is far from conclusive, but it is as consistent as any in the literature.

Professor John Yudkin retired from his post at Queen Elizabeth College in 1971, to write Pure, White and Deadly. The college reneged on a promise to allow him to continue to use its research facilities. It had hired a fully committed supporter of the fat hypothesis to replace him, and it was no longer deemed politic to have a prominent opponent of it on the premises. The man who had built the college’s nutrition department from scratch was forced to ask a solicitor to intervene. Eventually, a small room in a separate building was found for Yudkin.

When I asked Lustig why he was the first researcher in years to focus on the dangers of sugar, he answered: “John Yudkin. They took him down so severely – so severely – that nobody wanted to attempt it on their own.”

Today, as nutritionists struggle to comprehend a health disaster they did not predict and may have precipitated, the field is undergoing a painful period of re-evaluation. It is edging away from prohibitions on cholesterol and fat, and hardening its warnings on sugar, without going so far as to perform a reverse turn. But its senior members still retain a collective instinct to malign those who challenge its tattered conventional wisdom too loudly.” ~

 
https://getpocket.com/explore/item/the-sugar-conspiracy



ending on beauty:

 
Don't bring the ocean if I feel thirsty,
nor heaven if I ask for a light;
but bring a hint, some dew, a particle,
as birds carry only drops away from water,
and the wind a grain of salt.

~ Olav Hauge


