*
DEATH
It's the sparrow
you see in your dreams.
If I try to explain this to you,
I can’t.
If you try to explain it to me,
I won’t listen.
Wisdom is the river
we all drown in.
~ John Guzlowski
*
The bible says that to god you're worth more than a hundred sparrows. Talk about species bigotry! But never mind the political incorrectness. I think we should test this saying on our Significant Other as a sign of appreciation: "You are worth more to me than a hundred sparrows!" Will he be flattered? Will he fall to his knees weeping, overcome with gratitude for such love? Or will he demand crucifixion (or at least a good dinner) to believe it?
(A shameless digression: Where have all the sparrows gone? There used to be so many, raising a cheerful racket. Now there are mostly crows. But I seem to have a resident bat.)
*
OLIVER SACKS ON THE JOY OF BEING EIGHTY: AN ENLARGEMENT OF PERSPECTIVE, BEING ABLE TO TAKE THE LONG VIEW
~ “Eighty! I can hardly believe it. I often feel that life is about to begin, only to realize it is almost over . . .
My father, who lived to 94, often said that the 80s had been one of the most enjoyable decades of his life. He felt, as I begin to feel, not a shrinking but an enlargement of mental life and perspective. One has had a long experience of life, not only one’s own life, but others’, too. One has seen triumphs and tragedies, booms and busts, revolutions and wars, great achievements and deep ambiguities, too. One has seen grand theories rise, only to be toppled by stubborn facts. One is more conscious of transience and, perhaps, of beauty.
At 80, one can take a long view and have a vivid, lived sense of history not possible at an earlier age. I can imagine, feel in my bones, what a century is like, which I could not do when I was 40 or 60. I do not think of old age as an ever grimmer time that one must somehow endure and make the best of, but as a time of leisure and freedom, freed from the factitious urgencies of earlier days, free to explore whatever I wish, and to bind the thoughts and feelings of a lifetime together.”
But what about dying, you may ask. Sacks seems unperturbed. He hopes to die “in harness,” working to the end. “I have no belief in (or desire for) any post-mortem existence, other than in the memories of friends and the hope that some of my books may still ‘speak’ to people after my death.” ~
(Alas, I seem to have lost the link, and know only that it goes back to 2013. Sacks died two years later from metastatic ocular melanoma. When the metastases were found, Sacks wrote a February 2015 New York Times op-ed piece and estimated his remaining time in “months.” He expressed his intent to “live in the richest, deepest, most productive way I can.” He added: “I want and hope in the time that remains to deepen my friendships, to say farewell to those I love, to write more, to travel if I have the strength, to achieve new levels of understanding and insight.” [wiki])
THE CONFEDERACY WAS A CON JOB ON WHITES. AND STILL IS
~ I’ve lived 55 years in the South, and I grew up liking the Confederate flag. I haven’t flown one for many decades, but for a reason that might surprise you.
I know the South well. We lived wherever the Marine Corps stationed my father: Georgia, Virginia, the Carolinas. As a child, my favorite uncle wasn’t in the military, but he did pack a .45 caliber Thompson submachine gun in his trunk. He was a leader in the Ku Klux Klan.
Despite my role models, as a kid I was an inept racist. I got in trouble once in the first grade for calling a classmate the N-word. But he was Hispanic.
As I grew up and acquired the strange sensation called empathy (strange for boys anyway), I learned that for black folks the flutter of that flag felt like a poke in the eye with a sharp stick. And for the most prideful flag wavers, clearly that response was the point. I mean, come on. It’s a battle flag.
What the flag symbolizes for blacks is enough reason to take it down. But there’s another reason that white southerners shouldn’t fly it. Or sport it on our state-issued license plates as some do here in North Carolina. The Confederacy — and the slavery that spawned it — was also one big con job on the Southern white working class. A con job funded by some of the ante-bellum one-per-centers, that continues today in a similar form.
You don’t have to be an economist to see that forcing blacks — a third of the South’s laborers — to work without pay drove down wages for everyone else. And not just in agriculture. A quarter of enslaved blacks worked in the construction, manufacturing and lumbering trades, cutting wages even for skilled white workers.
Thanks to the profitability of this no-wage/low-wage combination, a majority of American one-per-centers were southerners. Slavery made southern states the richest in the country. The South was richer than any country except England. But that vast wealth was invisible outside the plantation ballrooms. With low wages and few schools, southern whites suffered a much lower land ownership rate and a far lower literacy rate than northern whites.
My ancestor Canna Hyman and his two sons did own land and fought under that flag. A note from our family history says: “Someone came for them while they were plowing one day. They put their horses up and all three went away to the War and only one son, William, came back.”
Like Canna, most Southerners didn’t own slaves. But they were persuaded to risk their lives and limbs for the right of a few to get rich as Croesus from slavery. For their sacrifices and their votes, they earned two things before and after the Civil War. First, a very skinny slice of the immense Southern pie. And second, the thing that made those slim rations palatable then and now: the shallow satisfaction of knowing that blacks had no slice at all.
Cotton pickers; Earle Richardson, 1934
How did the plantation owners mislead so many Southern whites?
They managed this con job partly with a propaganda technique that will be familiar to modern Americans, but hasn’t received the coverage it deserves in our sesquicentennial celebrations. Starting in the 1840s wealthy Southerners supported more than 30 regional pro-slavery magazines, many pamphlets, newspapers and novels that falsely touted slave ownership as having benefits that would – in today’s lingo – trickle down to benefit non-slave owning whites and even blacks. The flip side of the coin of this old-is-new trickle-down propaganda is the mistaken notion that any gain by blacks in wages, schools or health care comes at the expense of the white working class.
Today’s version of this con job no longer supports slavery, but still works in the South and thrives in pro trickle-down think tanks, magazines, newspapers, talk radio and TV news shows such as the Cato Institute, Reason magazine, [talk radio hosts,] and Fox News. These sources are underwritten by pro trickle-down one-per-centers like the Koch brothers and Rupert Murdoch.
For example, a map of states that didn’t expand Medicaid – which would actually be a boon mostly to poor whites – resembles a map of the old Confederacy with a few other poor, rural states thrown in. Another indication that this divisive propaganda works on Southern whites came in 2012. Romney and Obama evenly split the white working class in the West, Midwest and Northeast. But in the South we went 2-1 for Romney.
Lowering the flag because of the harm done to blacks is the right thing to do. We also need to lower it because it symbolizes material harm the ideology of the Confederacy did to Southern whites that lasts even to this day.
One can love the South without flying the battle flag. But it won’t help to get rid of an old symbol if we can’t also rid ourselves of the self-destructive beliefs that go with it. Only by shedding those too, will Southern whites finally catch up to the rest of the country in wages, health and education. ~
Frank Hyman lives in Durham, where he has held two local elected offices. He’s a carpenter and stonemason and policy analyst for Blue Collar Comeback.
Below: Confederate artillery at Charleston Harbor, 1863
MARY: THE MYTH OF AMERICA AS A CLASSLESS SOCIETY
Oh yes, the confederacy was a con job, or as I think of it, a false narrative, very like a religious mythology, that those southern devotees of the confederate flag, the white right wing radicals, and the "Christian" Fundamentalists are all still devoted to, even against their own true interests.
How can such a false narrative perpetuate itself? And inspire such a degree of hysterical fanaticism that it spawns insurrectionist mobs and cults like QAnon? I think the linchpin here is racism, and the way it intersects with the class structure in the US, helped along by the myth that we don't have classes here... an idea patently false but fiercely held, particularly by those in the working class, the rural and urban working poor. The myth of social mobility, the myth that anyone can become rich and powerful if they work hard or are lucky, is a cherished basic element of our creed — even though it's untrue.
In fact, this idea gets its strength in large part because the basic assumption is that it's true... but only if you're white. The effect is to make the poor white working class staunch supporters of the system, because it promises that they will always come out ahead of any black person — that they will always have that advantage and that privilege, if no other. Taking that advantage, embracing it, insisting on it, means they buy into the whole structure it supports. To give up their racism is to give up their only advantage, their only privilege. And they are ready to fight and die for that.
I think this is a tragedy, and particularly because it is a con. Supporting and defending the racist capitalist status quo is not going to win these folks anything... and yet they will continue to remain opposed to those who should be their best allies, continue to act against their own best interest, chasing lies and promises meant only to keep them exactly where they are. In their place. And blind to the irony of that position.
At work here is also the idea of scarcity: that there is only so much, and if others get more, I must get less; any advance another wins is a setback for me. This ignores the inherently unequal distribution that has been with us forever: more and more, most of everything goes to the one percent, and then there's the huge, ever-growing, impossible gap between that one percent and everyone else. And if that one percent can keep everyone else divided and distrustful of each other, all the better for them.
The concentration of wealth, power and choice in the one percent has gradually negated all the hard-won strategies of the working poor...destroying unions, refusing to allow actual living wages...so workers work harder, longer, and in multiple jobs just to approximate an actual living wage...while the elite one percent become more and more obscenely rich.
So much of our national character feeds into this. Americans love the rich and famous... believing that's where they belong, and still might find themselves someday. They persist in the ideas of the self-made man and pulling oneself up by the bootstraps, even though very few of the wealthy came by their riches through labor of any sort. Most wealth here is inherited, not earned. Even the most vulgar displays of greed and consumption are admired and envied...like Trumpian displays of all sorts.
The fact that this is all false and based on mythologies actually makes it harder to combat. Fanatics are not amenable to reason. People who feel that a challenge to their beliefs will wipe out the small bit of power and privilege they can claim, will resist with desperate strength. So they wave their flags and threaten all the basics of democracy.
Oriana:
I think it was during my second month in the US, in Milwaukee, that I was instructed “there are no social classes in America” (people in Milwaukee loved to instruct me about America). Somehow I knew better than to point out the obvious — the lovely rich neighborhoods and the ghost-like downtown, the slums of the Inner City or South Side Chicago, with liquor stores, pawnshops, bail bond lenders, waste lots, “affordable bankruptcy” signs, and other marks of despair. But as long as the worst areas were black, other myths could be maintained.
Now, it’s true all over the world that you can tell the rich from the poor at a glance: the clothes alone say a lot, the smiles and self-confidence as opposed to the grim and tense faces of those whose life is a merciless struggle. And whenever you looked closer at the “self-made,” you discovered that they were raised by caring parents, got their first job thanks to their uncle, and so on. Usually it’s the whole immediate social milieu — but psychologists tell us that all it takes is just one caring person to lift someone to greater success. But the myth of individualism won’t give credit even to that one person. Or to the parents . . . because that smacks of unearned advantage, and even (horror!) social class.
But this pandemic has been yet another big eye-opener, and propagandists will have to work harder now . . .
*
ABOLISH THE MONARCHY
~ A recent interview you may have heard about revealed that the British monarchy is a toxic den of backbiting and racism. And who would doubt it? There is nothing easier to believe than that an institution created to be the physical embodiment of classism is awash in inhumanity. Where the public response to this humdrum revelation has gone astray is in the widespread conviction that we should make the monarchy better. Not at all. You cannot turn a bottle of poison into a refreshing drink, no matter how much sugar you pour into it.
A just and proper response to what we have learned would be for the entire United Kingdom to come together, join hands in a great circle around the institution of the monarchy and burn it to the ground, while singing “Sweet Caroline,” to maintain a positive spirit. Then the members of the royal family can sweep up the ashes and deposit them neatly in the bin, a ceremonial beginning to a new life of working for a living.
The existence of a monarchy is an admission that a government can’t, or doesn’t care to, solve people’s problems. Instead, it offers spectacle. It has always been easier to elevate one family to a fairy-tale life of luxury than to do the dreary work of elevating every single family to a decent standard of living. The common people fund the lifestyle of a tiny, exalted and thoroughly unworthy elite, rather than the other way around. Any nation that still has a monarchy in 2021 is proving itself to have a mortifying lack of revolutionary gumption.
America is guilty of many crimes against humanity, but this is one thing we got right. Our presidents may be national embarrassments, but at least Americans are not required to scrape and bow before some utterly random rich wastrel whose claim to legitimacy is being the child of the child of the child of someone who was, centuries ago, the nation’s biggest gangster. Yes, we have our own hypnotic capitalist addiction to celebrity, but monarchy is something altogether more twisted — as if the Bush family, the Kardashians and the Falwells were all rolled into one bejeweled quasi-religious fame cult, topped off with a bracing dose of imperialism.
What is a monarchy if not the highest veneration of inequality? Based not on moral worth but on accidents of heredity, a small group of people are lavished with millions of dollars skimmed from the public till and are worshiped as sentimental nationalist gods, in exchange only for performing the duty of “being pleasant in public,” which they do with mixed success.
More than 60 million citizens, many of them living in poverty, are instructed to celebrate rather than to loathe this tableau of excess. They are told to be happy that someone has a dream life, even if it is not them, and to live vicariously through this soap opera cast of royals, rather than demanding equality for everyone else. The crown would greatly appreciate if you tune in to this show rather than spending your time reading Karl Marx.
And that plan appears to be working: More than four in five British adults have a positive view of the queen. The appeal of fancy hats is hard to overcome.
The stars of this insipid show will change with time. New princes and princesses will be born, opulent weddings will be had, different coddled butts will get their turn to sit on the cushioned throne. These machinations, each of them designed to occupy the public’s attention for a while, are just the scrambling of termites atop the enormous nest that is the monarchy itself. It feeds on the vigor of the working people and regurgitates it into a giant home for itself.
Abolishing the monarchy shouldn’t be too tricky. First you take away their homes. Then you take away their wealth. Then you take away their titles. All of those things properly belong to the public, and those squatters have held them for far too long.
The good news for the royal family is that the economy seems to be on the rebound. It shouldn’t be too hard for them to find jobs, even considering their lack of practical experience. They could get honorable jobs at a Tesco market. What a wonderful opportunity for them to earn an honest living, for the first time in their lives. As our social betters often tell the rest of us, hard work is good for self-esteem. I expect that they will soon be happier than ever. ~ Hamilton Nolan
A reader’s comment:
At least the monarchy provides an element of common identity and pride. To have something like that in the US right now would be a steal at twice the price.
Another reader:
To me it is the same for the Pope, what is his job? A fairy tale and people give so much that never gets returned to the people. Both need to go!
"For monarchy in every instance is the Popery of government."
~ Thomas Paine, Common Sense, 1776
Yet another:
Being English I find being a “subject” extremely embarrassing. On the other hand many of us look at the pseudo-democracy that is the USA and are thankful to have a hereditary Head of State with limited powers rather than run the risk of a British Trump getting elected.
And one more:
Have any of you checked the amount of tourism pounds the royal family creates? And the fact that they are personally very wealthy and really don’t need to do this? And the number of people who would be out of a job if they all resigned?
Oriana: IF IT WERE GONE, WOULD ANYONE MISS IT?
I have no strong opinion on this. The British appear to love the queen. It’s the notion of a king that appears rather absurd in modern times, but there is something to be said for the queen as a mother figure. Queens only, please!
As for the word “subject,” it should definitely be retired in favor of “citizen.”
But if the monarchy got abolished, would anyone miss it? This reminds me of a conversation I had with an ex-Jesuit priest. I argued that it’s time to allow women priests to say the mass. He replied, “But should the mass be said at all? Let’s stop having masses for three years, and see if anyone misses anything.” And that was the end of it, the question basically rhetorical.
*
*
~ I’ve been thinking about a poignant memorial at the Reichstag in Berlin dedicated to the 96 members of the German parliament who spoke out against Hitler, but failed to stop him as he thundered into power in 1933. These were the last people who could have stopped the fascist dictator…so they became his first victims. Each slate slab remembers one politician, with his name, political party, and the date and location of his death (generally in a concentration camp). They are honored in front of the building — Germany’s capitol — where they worked to defend democracy in their country. ~ Rick Steves
HOW DICTATORS SELF-DESTRUCT
~ With authoritarian rulers ascendant in many parts of the world, one wonders what must happen for their countries to liberalize. The likes of Vladimir Putin in Russia, Recep Tayyip Erdogan in Turkey or Xi Jinping in China are entrenched, experienced and not unpopular — so should their opponents simply resign themselves to an open-ended period of illiberal rule?
According to Daniel Treisman, a UCLA political scientist, that's not necessarily the case. For a recent paper, he analyzed 218 episodes of democratization between 1800 and 2015 and found they were, with some exceptions (such as Danish King Frederick VII's voluntary acceptance of a constitution in 1848), the result of authoritarian rulers' mistakes in seeking to hold on to power. The list of these errors is both a useful handbook for authoritarians and a useful reminder that even the most capable of them are fallible.
According to Treisman, deliberate liberalization — whether to forestall a revolution, motivate people to fight a foreign invader, defeat competing elite groups or make a pact with them — only occurred in up to a third of the cases. In the rest, democratization was an accident: As they set off a chain of events, rulers didn't intend to relinquish power. Some of them — such as Mikhail Gorbachev, the last Soviet president — have admitted as much.
Treisman's list of mistakes is worth citing in full. There are five basic ones:
HUBRIS. An authoritarian ruler underestimates the opposition's strength and fails to compromise or suppress it before it's too late. King Louis Philippe of France was deposed in 1848 after, as Treisman puts it, turning "a series of reform banquets into revolution by refusing even mild concessions." Romanian Communist dictator Nicolae Ceausescu was making a routine speech when he realized he was being overthrown. Indonesian President Muhammad Suharto believed he could get the country under control right up to the moment of his resignation.
NEEDLESS RISK. A ruler calls a vote which he "fails to manipulate sufficiently" (like Chilean dictator Augusto Pinochet in 1988, when he lost a plebiscite on whether he should be allowed to stay in power) or starts a war he cannot win (like Leopoldo Galtieri in Argentina with the Falklands conflict of 1982).
SLIPPERY SLOPE. That's Gorbachev's case: a ruler starts reforms to prop up the regime but ends up undermining it.
TRUSTING A TRAITOR. This is not always a mistake made by the dictator himself, although it was in the case of Francisco Franco in Spain, who chose King Juan Carlos, the dismantler of fascism, as his successor. In Gorbachev's case, it was the Politburo — the regime's elite — that picked the wrong man to preserve its power.
COUNTERPRODUCTIVE VIOLENCE. Not suppressing the opposition when necessary can be a sign of hubris in a dictator, but overreacting is also a grave mistake. The example Treisman gives is Bangladeshi President Hussain Muhammad Ershad, who was forced to resign by an uprising that started after police shot an opposition activist at a rally. But the error was also made by Ukrainian President Viktor Yanukovych in 2013, when his riot police descended on a few hundred peacefully protesting students and brutally beat them, setting off the much bigger protests that resulted in Yanukovych's ouster.
These are all very human errors of judgment. Dictators are people, too, and sometimes they'll act on imperfect information or erroneous gut feeling. But Treisman makes the point that they may be prone to such errors precisely because they are dictators. They'll be fooled by polls which people don't answer sincerely, taken in by their own propaganda (like Malawi ruler Hastings Banda, who called and lost a referendum in 1993 because he'd been impressed by the high turnout at rallies in his support even though people had been forced to attend them). And sometimes they'll rule for so long that their mental faculties will be less sharp than at the outset.
I have a particular interest in watching Putin for any of the errors on Treisman's list. So far, it's as if he'd read the paper before Treisman wrote it. His suppression has been timely and cleverly measured, his election manipulation always sufficient, his temporary successor, Dmitri Medvedev, avoided the liberal slippery slope, and he's only started wars against much weaker rivals. He helps his regime's propaganda by treating it as truth, but he doesn't buy it to the point of losing vigilance. In the 2018 election, he kept his main opponent, Alexei Navalny, out of the race, mindful that modern technology allows a rival to loosen media restrictions — something Treisman notes can lead a hubristic dictator to an electoral loss.
But even Putin, after 17 years in power, is in danger of making a miscalculation one day, perhaps finally misreading the mood of the increasingly cynical Russian public that keeps registering support for him in largely worthless polls. It's easy to imagine the choleric Erdogan getting into an armed conflict Turkey cannot sustain or using disproportionate violence as Turks' patience with his reprisals wears thin. It's a possibility, although a remote one, that, after Xi's power consolidation, the Chinese Communist Party will opt for a more liberal successor and he won't be able to hold the reins as tightly.
Treisman notes that in 85 percent of the episodes he studied, democratization was preceded by mass unrest. Sooner or later, people tend to get tired of regimes in which they have little say. Then, it only takes a misstep from the one person at the center of such a regime. Dictators often overestimate the external danger to their power, the plots of foreign or exiled enemies. In the final analysis, they are the biggest threat to themselves. ~
https://getpocket.com/explore/item/most-dictators-self-destruct-why?utm_source=pocket-newtab
*
Mikhail Iossel shared: 40% of all Americans alive today were born after the invention of the World Wide Web. 40%!
Slava Malamud to Mikhail: "You and I lie on the opposite sides of the great Toilet Paper Is Available For Sale In the Soviet Union Divide.
Which, incidentally, is 1969, eight goddamn years after we sent a man into space."
Oriana:
The Soviet Union placed consumer goods far behind priorities such as the space race and the arms race. And as long as the newspaper Pravda ("Truth") remained extremely cheap, why worry about toilet paper?
*
GLOBAL WARMING AND THE AIR-CONDITIONING DILEMMA
~ On a regular day, New York City demands around 10,000MW every second; during a heatwave, that figure can exceed 13,000MW. “Do the math, whatever that gap is, is the AC,” Michael Clendenin, a company spokesman, told me. The combination of high demand and extreme temperature can cause parts of the system to overheat and fail, leading to blackouts. In 2006, equipment failure left 175,000 people in Queens without power for a week, during a heatwave that killed 40 people.
By the evening of Sunday 21 July, 2019, with temperatures above 36C (97F) and demand at more than 12,000MW every second, Con Edison cut power to 50,000 customers in Brooklyn and Queens for 24 hours, afraid that parts of the nearby grid were close to collapse, which could have left hundreds of thousands of people without power for days. The state had to send in police to help residents, and Con Edison crews dispensed dry ice for people to cool their homes.
As the world gets hotter, scenes like these will become increasingly common. Buying an air conditioner is perhaps the most popular individual response to climate change, and air conditioners are almost uniquely power-hungry appliances: a small unit cooling a single room, on average, consumes more power than running four fridges, while a central unit cooling an average house uses more power than 15. “Last year in Beijing, during a heatwave, 50 percent of the power capacity was going to air conditioning,” says John Dulac, an analyst at the International Energy Agency (IEA). “These are ‘oh shit’ moments.”
There are just over 1bn single-room air conditioning units in the world right now – about one for every seven people on earth. Numerous reports have projected that by 2050 there are likely to be more than 4.5bn, making them as ubiquitous as the mobile phone is today. The US already uses as much electricity for air conditioning each year as the UK uses in total. The IEA projects that as the rest of the world reaches similar levels, air conditioning will use about 13 percent of all electricity worldwide, and produce 2bn tonnes of CO2 a year – about the same amount as India, the world’s third-largest emitter, produces today.
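Since the Con Edison spokesman quoted earlier says "do the math," here is a minimal back-of-envelope sketch using only the figures cited in this article; the derived percentages are my own arithmetic, not additional data:

```python
# Rough arithmetic on the figures quoted in the article.

normal_demand_mw = 10_000      # New York City demand on a regular day
heatwave_demand_mw = 13_000    # New York City demand during a heatwave

# "Whatever that gap is, is the AC" — the heatwave surplus attributed to air conditioning
ac_gap_mw = heatwave_demand_mw - normal_demand_mw
ac_share = ac_gap_mw / heatwave_demand_mw

units_now = 1.0e9              # ~1bn single-room AC units today
units_2050 = 4.5e9             # projected units by 2050
growth_factor = units_2050 / units_now

print(f"AC's share of NYC heatwave demand: ~{ac_share:.0%}")       # ~23%
print(f"Projected growth in AC units by 2050: {growth_factor:.1f}x")  # 4.5x
```

On the article's own numbers, then, air conditioning accounts for roughly a quarter of the city's peak heatwave load, and the global stock of units is projected to more than quadruple by mid-century.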
All of these reports note the awful irony of this feedback loop: warmer temperatures lead to more air conditioning; more air conditioning leads to warmer temperatures. The problem posed by air conditioning resembles, in miniature, the problem we face in tackling the climate crisis. The solutions that we reach for most easily only bind us closer to the original problem.
The global dominance of air conditioning was not inevitable. As recently as 1990, there were only about 400m air conditioning units in the world, mostly in the US. Originally built for industrial use, air conditioning eventually came to be seen as essential, a symbol of modernity and comfort. Then air conditioning went global. Today, as with other drivers of the climate crisis, we race to find solutions – and puzzle over how we ended up so closely tied to a technology that turns out to be drowning us.
Like the aqueduct or the automobile, air conditioning is a technology that transformed the world. Lee Kuan Yew, the first prime minister of independent Singapore, called it “one of the signal inventions of history” that allowed the rapid modernization of his tropical country. In 1998, the American academic Richard Nathan told the New York Times that, along with the “civil rights revolution,” air conditioning had been the biggest factor in changing American demography and politics over the previous three decades, enabling extensive residential development in the very hot, and very conservative, American south.
A century ago, few would have predicted this. For the first 50 years of its existence, air conditioning was mainly restricted to factories and a handful of public spaces. The initial invention is credited to Willis Carrier, an American engineer at a heating and ventilation company, who was tasked in 1902 with reducing humidity in a Brooklyn printing factory. Today we assume that the purpose of air conditioning is to reduce heat, but engineers at the time weren’t solely concerned with temperature. They wanted to create the most stable possible conditions for industrial production – and in a print factory, humidity curled sheets of paper and smudged ink.
***
It wasn’t until the late 1940s, when it began to enter people’s homes, that the air conditioner really conquered the US. Before then, according to the historian Gail Cooper, the industry had struggled to convince the public that air conditioning was a necessity, rather than a luxury. In her definitive account of the early days of the industry, Air-Conditioning America, Cooper notes that magazines described air conditioning as a flop with consumers. Fortune called it “a prime public disappointment of the 1930s.” By 1938 only one out of every 400 American homes had an air conditioner; today it is closer to nine out of 10.
***
What fueled the rise of air conditioning was not a sudden explosion in consumer demand, but the influence of the industries behind the great postwar housing boom. Between 1946 and 1965, 31m new homes were constructed in the US, and for the people building those houses, air conditioning was a godsend. Architects and construction companies no longer had to worry much about differences in climate – they could sell the same style of home just as easily in New Mexico as in Delaware. The prevailing mentality was that just about any problems caused by hot climates, cheap building materials, shoddy design or poor city planning could be overcome, as the American Institute of Architects wrote in 1973, “by the brute application of more air conditioning.” As Cooper writes, “Architects, builders and bankers accepted air conditioning first, and consumers were faced with a fait accompli that they merely had to ratify.”
Equally essential to the rise of the air conditioner were electric utilities – the companies that operate power plants and sell electricity to consumers. Electric utilities benefit from every new house hooked up to their grid, but throughout the early 20th century they were also looking for ways to get these new customers to use even more electricity in their homes. This process was known as “load building,” after the industry term (load) for the amount of electricity used at any one time. “The cost of electricity was low, which was fine by the utilities. They simply increased demand, and encouraged customers to use more electricity so they could keep expanding and building new power plants,” says Richard Hirsh, a historian of technology at Virginia Tech.
The utilities quickly recognized that air conditioning was a serious load builder. As early as 1935, Commonwealth Edison, the precursor to the modern Con Edison, noted in its end-of-year report that the power demand from air conditioners was growing at 50 percent a year, and “offered substantial potential for the future.” That same year, Electric Light & Power, an industry trade magazine, reported that utilities in big cities “are now pushing air conditioning. For their own good, all power companies should be very active in this field.”
By the 1950s, that future had arrived. Electric utilities ran print, radio and film adverts promoting air conditioning, as well as offering financing and discount rates to construction companies that installed it. In 1957, Commonwealth Edison reported that for the first time, peak electricity usage had occurred not in the winter, when households were turning up their heating, but during summer, when people were turning on their air-conditioning units. By 1970, 35 percent of American houses had air conditioning, more than 200 times the number just three decades earlier.
At the same time, air-conditioning-hungry commercial buildings were springing up across the US. The all-glass skyscraper, a building style that, because of its poor reflective properties and lack of ventilation, often requires that more than half its electricity consumption be devoted to air conditioning, became an American mainstay. Between 1950 and 1970 the average electricity used per square foot in commercial buildings more than doubled. New York’s World Trade Center, completed in 1974, had what was then the world’s largest AC unit, with nine enormous engines and more than 270km of piping for cooling and heating. Commentators at the time noted that it used the same amount of electricity each day as the nearby city of Schenectady, population 80,000.
The air-conditioning industry, construction companies and electric utilities were all riding the great wave of postwar American capitalism. In their pursuit of profit, they ensured that the air conditioner became an essential element of American life. “Our children are raised in an air-conditioned culture,” an AC company executive told Time magazine in 1968. “You can’t really expect them to live in a home that isn’t air conditioned.” Over time, the public found they liked air conditioning, and its use continued to climb, reaching 87 percent of US households by 2009.
The postwar building spree was underpinned by the idea that all of these new buildings would consume incredible amounts of power, and that this would not present any serious problems in the future. In 1992, the journal Energy and Buildings published an article by the British conservative academic Gwyn Prins, arguing that the American addiction to air conditioning was a symbol of its profound decadence. Prins summarized America’s guiding credo as: “We shall be cool, our plates shall overflow and gas shall be $1 a gallon, Amen.”
During the time that air conditioning was reshaping America’s cities, it had little effect elsewhere. (With some exceptions – Japan, Australia and Singapore were early adopters.) Now, however, air conditioning is finally sweeping across the rest of the world. If the march of air conditioning across the US tracked its postwar building and consumption boom, its more recent expansion has followed the course of globalization. As the rest of the world adopts more Americanized ways of building and living, air conditioning follows.
In the 1990s, many countries across Asia opened up to foreign investment and embarked on an unprecedented urban building spree. Over the past three decades, about 200 million people in India have moved to cities; in China, the number is more than 500 million. From New Delhi to Shanghai, heavily air-conditioned office buildings, hotels and malls began to spring up. These buildings were not only indistinguishable from those in New York or London, but were often constructed by the same builders and architects. “When you had this money coming in from the rest of the world for high-end buildings, it often came with an American or European designer or consultancy attached,” says Ashok Lall, an Indian architect who focuses on housing and low-energy design. “And so it comes as a package with AC. They thought that meant progress.”
As the rate and scale of building intensified, traditional architectural methods for mitigating hot temperatures were jettisoned. Leena Thomas, an Indian professor of architecture at the University of Technology in Sydney, told me that in Delhi in the early 1990s older forms of building design – which had dealt with heat through window screens, or facades and brise-soleils – were slowly displaced by American or European styles. “I would say that this international style has a lot to answer for,” she said. Just like the US in the 20th century, but on an even greater scale, homes and offices were increasingly being built in such a way that made air conditioning indispensable. “Developers were building without thinking,” says Rajan Rawal, a professor of architecture and city planning at Cept University in Ahmedabad. “The speed of construction that was required created pressure. So they simply built and relied on technology to fix it later.”
Lall says that even with affordable housing it is possible to reduce the need for air conditioning by designing carefully. “You balance the sizes of opening, the area of the wall, the thermal properties, and shading, the orientation,” he says. But he argues that, in general, developers are not interested. “Even little things like adequate shading and insulation in the rooftop are resisted. The builders don’t appear to see any value in this. They want 10- to 20-story blocks close to one another. That’s just how business works now, that’s what the cities are forcing us to do. It’s all driven by speculation and land value.”
This reliance on air conditioning is a symptom of what the Chinese art critic Hou Hanru has called the epoch of post-planning. Today, planning as we traditionally think of it – centralized, methodical, preceding development – is vanishingly rare. Markets dictate and allocate development at incredible speed, and for the actual inhabitants, the conditions they require to live are sourced later, in a piecemeal fashion. “You see these immense towers go up, and they’re already locking the need for air conditioning into the building,” says Marlyne Sahakian, a sociologist who studies the use of air conditioning in the Philippines.
To its proponents, air conditioning is often presented as a simple choice that consumers make to improve their lives as they climb the economic ladder. “It’s no longer a luxury product but a necessity,” an executive at the Indian branch of the Japanese air-conditioner manufacturing giant Daikin told the Associated Press last year. “Everyone deserves AC.” This refrain is as familiar in Rajasthan now as it was in the US 70 years ago. Once air conditioning is embedded in people’s lives, they tend to want to keep it. But that fact obscures the ways that consumers’ choices are shaped by forces beyond their control. In her 1967 book Vietnam, Mary McCarthy reflected on this subtle restriction of choice in American life. “In American hotel rooms,” she wrote, “you can decide whether or not to turn on the air conditioning (that is your business), but you cannot open the window.”
One step towards solving the problem presented by air conditioning – and one that doesn’t require a complete overhaul of the modern city – would be to build a better air conditioner. There is plenty of room for improvement. The invention of air conditioning predates both the first airplane and the first public radio broadcast, and the underlying technology has not changed much since 1902. “Everything is still based on the vapor compression cycle; same as a refrigerator. It’s effectively the same process as a century ago,” says Colin Goodwin, the technical director of the Building Services Research and Information Association. “What has happened is we’ve expanded the affordability of the air conditioner, but as far as efficiency, they’ve improved but they haven’t leaped.”
But, as with other technological responses to climate change, it is far from certain that the arrival of a more efficient air conditioner will significantly reduce global emissions. According to the RMI, in order to keep total global emissions from new air conditioners from rising, their prize-winning efficient air conditioner would need to go on sale no later than 2022, and capture 80 percent of the market by 2030. In other words, the new product would have to almost totally replace its rivals in less than a decade. Benjamin Sovacool, professor of energy policy at Sussex University and a lead author on the next Intergovernmental Panel on Climate Change (IPCC) report, describes this ambition as not impossible, but pretty unlikely.
New air-conditioner technology would be welcome, but it is perhaps “the fourth, or maybe fifth thing on the list we should do” to reduce the emissions from air conditioning, says Diana Ürge-Vorsatz, a professor of climate change and energy policy at Central European University, and a lead author on the forthcoming IPCC report. Among the higher priorities that she mentions are planting trees, retrofitting old buildings with proper ventilation, and no longer building “concrete and glass cages that can’t withstand a heatwave.” She adds: “All of these things would be cheaper too, in the long run.”
But while these things are technically cheaper, they require changes in behavior and major policy shifts – and the open secret of the climate crisis is that nobody really knows how to make these kinds of changes on the systemic, global scale that the severity of the crisis demands.
If we are not about to be rescued by technology, and worldwide policy changes look like a distant hope, there remains a very simple way of reducing the environmental damage done by air conditioning: use less of it. But, as the ecological economist and IPCC author Julia Steinberger has written, any serious proposals to change our lifestyles – cutting down on driving, flying or imported avocados – are considered “beyond the pale, heretic, almost insane.” This is especially true of air conditioning, where calls to use it less are frequently treated as suggestions that people should die in heatwaves, or evidence of a malicious desire to deny other people the same comforts that citizens in wealthy countries already enjoy.
This summer, the publication of a New York Times article asking “Do Americans need air conditioning?” touched off a thousand furious social media posts, uniting figures from the feminist writer and critic Roxane Gay (“You wouldn’t last a summer week in Florida without it. Get a grip”) to the conservative professor and pundit Tom Nichols (“Air conditioning is why we left the caves … You will get my AC from me when you pry it from my frozen, frosty hands”).
But not everyone has accepted the notion that there is such a thing as the objectively “right” temperature. Studies have suggested that men have different ideal temperatures from women. In offices around the world, “Men toil in their dream temperatures, while women are left to shiver,” argued a 2015 article in the Telegraph, one of many suggesting that the scientific research had simply confirmed something millions of women already knew.
Researchers have also shown that people who live in hotter areas, even for a very short time, are comfortable at higher indoor temperatures. They contend that, whether it is a state of mind or a biological adjustment, human comfort is adaptive, not objective. This is something that seems obvious to many people who live with these temperatures. At a recent conference on air conditioning that I attended in London, an Indian delegate chided the crowd: “If I can work and function at 30C, you could too – believe you me.”
Adding to the weight of evidence against the idea of the “ideal” temperature, Frederick Rohles, a psychologist and member of the American Society of Heating, Refrigerating and Air-Conditioning Engineers, has conducted studies showing that subjects who were shown a false thermometer displaying a high temperature felt warm, even if the room was cool. “These are the sorts of things that drive my engineering colleagues crazy,” he wrote in 2007. “Comfort is a state of mind!”
Ashok Lall points out that once people are open to the idea that the temperature in a building can change, you can build houses that use air conditioning as a last resort, not a first step. “But there is no broad culture or regulation underpinning this,” he says. At the moment, it is the deterministic camp that has control of the levers of power – and their view continues to be reflected in building codes and standards around the world.
***
How, then, can we get ourselves out of the air-conditioning trap? On the continuum of habits and technologies that we need to reduce or abandon if we are to avoid the worst effects of the climate crisis, the air conditioner probably falls somewhere in the middle: harder to reduce than our habit of eating meat five times a week; easier than eliminating the fossil-fuel automobile.
According to Nick Mabey, a former senior civil servant who runs the UK-based climate politics consultancy E3G, air conditioning has – like many consumer products that are deeply embedded in society and, in aggregate, drive global warming – escaped the notice of most governments. There is little precedent for top-down regulation. “There is no department that handles this, there’s no guy you can just go talk to who controls air conditioning,” he says.
The key, Mabey says, is to find the places it can be controlled, and begin the push there. He is supporting a UN program that aims to improve the efficiency – and thus reduce the emissions – of all air conditioners sold worldwide. It falls under the unglamorous label of consumer standards. Currently, the average air conditioner on the market is about half as efficient as the best available unit. Closing that gap even a little bit would take a big chunk out of future emissions.
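The arithmetic behind that claim is straightforward, since the electricity an air conditioner consumes to deliver a fixed amount of cooling scales inversely with its efficiency (its coefficient of performance, or COP). A minimal back-of-envelope sketch, with illustrative numbers of my own choosing rather than figures from the article:

```python
# Back-of-envelope: electricity needed for a fixed amount of cooling scales as 1/COP.
# All numbers below are hypothetical, chosen only to illustrate the proportion.

def annual_kwh(cooling_kwh_thermal: float, cop: float) -> float:
    """Electricity (kWh) needed to deliver a given amount of cooling at a given COP."""
    return cooling_kwh_thermal / cop

cooling_demand = 3000.0  # hypothetical kWh of cooling per household per year

average_unit = annual_kwh(cooling_demand, cop=3.0)  # a typical unit on the market
best_unit = annual_kwh(cooling_demand, cop=6.0)     # a unit twice as efficient

print(f"average unit: {average_unit:.0f} kWh/year")  # 1000 kWh/year
print(f"best unit:    {best_unit:.0f} kWh/year")     # 500 kWh/year
# Closing the efficiency gap halves the electricity (and emissions) per unit of cooling.
```

If the average unit really is half as efficient as the best available, closing even part of that gap compounds across the hundreds of millions of units sold, which is why consumer standards are a plausible lever.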
At the local level, some progress is being made. The New York City council recently passed far-reaching legislation requiring all large buildings in the city to reduce their overall emissions by 40 percent by 2030, with a goal of 80 percent by 2050, backed with hefty fines for offenders. Costa Constantinides, the city council member spearheading the legislation, says it is “the largest carbon-emissions reduction ever mandated by any city, anywhere.” The Los Angeles mayor’s office is working on similar plans, to make all buildings net-zero carbon by 2050.
Other cities are taking even more direct action. In the mid-1980s, the local government of Geneva, which has a warmer climate than much of the US, banned the installation of air conditioning except by special permission. This approach is relatively common across Switzerland and, as a result, air conditioning accounts for less than 2 percent of all electricity used there. The Swiss don’t appear to miss air conditioning much – its absence is rarely discussed, and they have largely learned to do without.
In countries where air conditioning is still relatively new, an immense opportunity exists to find alternatives before it becomes a way of life. The aim, in the words of Thomas, should be to avoid “the worst of the west.” Recently, the Indian government adopted recommendations by Thomas, Rawal and others into its countrywide national residential building code (“an immensely powerful document” says Rawal). It allows higher indoor temperatures based on Indian field studies – Indian levels of comfort – and notes the “growing prevalence” of buildings that use air conditioning as a technology of last resort.
Cutting down on air conditioning doesn’t mean leaving modernity behind, but it does require facing up to some of its consequences. “It’s not a matter of going back to the past. But before, people knew how to work with the climate,” says Ken Yeang. “Air conditioning became a way to control it, and it was no longer a concern. No one saw the consequences. People see them now.”
Oriana:
At this point, AC is here to stay. The question is how to power it. The answer, I think, lies in solar and wind technology.
Eventually human population will shrink, liberating more land for planting trees, or, should conditions really get extreme, "cool air shelters," possibly underground. And yes, we will have to learn how to live with climate all over again: thick walls, overhangs, shutters, insulation, shade trees. This is old technology. If we can land vehicles on Mars, surely we can design heat-resistant buildings. But I'm afraid that many more people will die of excess heat before serious progress is accomplished.
Mary:
The whole story of air conditioning is a perfect example of one of the core problems of capitalism: it wants and depends on continuous and unlimited growth. In the biology of living cells this is called cancer, and as an economic principle it creates a cancer in society. The AC creators saw how to turn manufactured comfort from an occasional treat into a necessity, expected and then demanded everywhere. Following the singular aim of the marketplace, everything is discarded except the need to grow the market, enlarge the customer base, and create the expectation that AC is not a luxury but a basic necessity, everywhere and for everyone.
All else gets subsumed under this primary goal, so we have big glass and concrete towers everywhere, without any windows that open, creating more heat by their very presence and by the huge power consumption required to fuel their AC. So the world keeps getting even hotter, and we need more AC. This makes for a great market, but also a feedback loop that is destructive and unsustainable.
*
TROPICAL REGIONS ARE APPROACHING UNLIVABILITY
~ The climate crisis is pushing the planet’s tropical regions towards the limits of human livability, with rising heat and humidity threatening to plunge much of the world’s population into potentially lethal conditions, new research has found.
Should governments fail to curb global heating to 1.5C above the pre-industrial era, areas in the tropical band that stretches either side of the equator risk changing into a new environment that will hit “the limit of human adaptation”, the study warns.
Humans’ ability to regulate their body heat is dependent upon the temperature and humidity of the surrounding air. We have a core body temperature that stays relatively stable at 37C (98.6F), while our skin is cooler to allow heat to flow away from the inner body. But should the wet-bulb temperature – a measure of air temperature and humidity – pass 35C, high skin temperature means the body is unable to cool itself, with potentially deadly consequences.
“If it is too humid our bodies can’t cool off by evaporating sweat – this is why humidity is important when we consider livability in a hot place,” said Yi Zhang, a Princeton University researcher who led the new study, published in Nature Geoscience. “High body core temperatures are dangerous or even lethal.”
The research team looked at various historical data and simulations to determine how wet-bulb temperature extremes will change as the planet continues to heat up, discovering that these extremes in the tropics increase at around the same rate as the tropical mean temperature.
This means that the world’s temperature increase will need to be limited to 1.5C to avoid risking areas of the tropics exceeding 35C in wet-bulb temperature, which is so-called because it is measured by a thermometer that has its bulb wrapped in a wet cloth, helping mimic the ability of humans to cool their skin by evaporating sweat.
Dangerous conditions in the tropics will unfold even before the 1.5C threshold, however, with the paper warning that 1C of extreme wet-bulb temperature increase “could have adverse health impact equivalent to that of several degrees of temperature increase”. The world has already warmed by around 1.1C on average due to human activity and although governments vowed in the Paris climate agreement to hold temperatures to 1.5C, scientists have warned this limit could be breached within a decade.
This has potentially dire implications for a huge swathe of humanity. Around 40% of the world’s population currently lives in tropical countries, with this proportion set to expand to half of the global population by 2050 due to the large proportion of young people in the region. The Princeton research was centered on latitudes found between 20 degrees north, a line that cuts through Mexico, Libya and India, to 20 degrees south, which goes through Brazil, Madagascar and the northern reaches of Australia.
Mojtaba Sadegh, an expert in climate risks at Boise State University, said the study does “a great job” of analyzing how rising temperatures “can render portions of the tropics uninhabitable in the absence of considerable infrastructure investments.”
“If this limit is breached, infrastructure like cool-air shelters are absolutely necessary for human survival,” said Sadegh, who was not involved in the research. “Given that much of the impacted area consists of low-income countries, providing the required infrastructure will be challenging.”
“Theoretically no human can tolerate a wet bulb temperature of above 35C, no matter how much water they have to drink,” he added.
The study is just the latest scientific warning over severe dangers posed by heat. Extreme heatwaves could push parts of the Middle East beyond human endurance, scientists have found, with rising temperatures also posing enormous risks for parts of China and India.
The global number of potentially fatal humidity and heat events doubled between 1979 and 2017, research has determined, with the coming decades set to see as many as 3 billion people pushed beyond the historical range of temperature that humans have survived and prospered in over the past 6,000 years. ~
https://www.theguardian.com/science/2021/mar/08/global-heating-tropical-regions-human-livability
Oriana:
We keep hearing of miracle technologies like taking CO2 out of the atmosphere, but — where is the nearest plant doing it? Apparently there exists a small “pilot plant” in Canada.
The irony is that all the predictions for the brave new world of the 21st century, including the famous movie, 2001: A Space Odyssey, imagined our era mainly in terms space exploration. Famine, war, disease — all that would be history, and schoolchildren would have trouble understanding such archaic horrors.
Climate catastrophe? Pandemics? Acidified oceans polluted with microplastics? Rapid extinction of many animal species? The melting of polar icecaps and the predicted rise in sea levels? In 1968, the year when “2001” was released, the dominant visions of the future were utterly rosy. Yes, there was apocalyptic science fiction as well, but it centered on nuclear war, even as the prospect of World War 3 was quickly fading. Most viewers of Kubrick’s hit movie assumed that everything was going to get better and better — even though there was already plenty of writing on the wall.
*
ABRAHAM AND ISAAC REVISITED: WHAT IF GOD COMMANDED YOU TO KILL YOUR SON?
~ My catechism teacher and friend, Father Martin, wasn’t a priest during WW II. At sixteen, he joined the Marines and fought in the Philippines. He didn’t like the story of Abraham and Isaac. He believed God would not want a man to demonstrate his loyalty by killing. He did not believe holding a knife at a boy’s throat would instill the child with faith or prove the father’s loyalty. To sidestep the brutality of this story, Father Martin used the number three as a holy number. Then, he equated the story with two other stories.
He saw Abraham and Isaac as the first sacrifice of a son by the father. The last plague on Egypt was the second sacrifice of this nature: God killed all the Egyptian firstborn sons. The third such sacrifice of a firstborn son was Christ’s Crucifixion by the Romans. Father Martin skipped over the fact that God told Abraham to sacrifice his son to demonstrate his loyalty. He highlighted God sparing Isaac because Abraham’s faith was real. A faithless Pharaoh refused God’s demand, and all the Egyptians’ firstborn sons died.
Father Martin only mentioned the Crucifixion to validate his theory of three as a holy number. He didn’t believe in torturing children or in using faith as an excuse to slay your child. He saw enough brutality firsthand in the Filipino villages. He said the story about Abraham and Isaac is the only story in the Bible in which God is a friend who comes to dinner. He tried to focus on God not as an angry emperor but as a friend.
He liked to discuss Sarah laughing when God revealed to Abraham that she would give birth. Sarah laughed because she knew Abraham no longer functioned, and she was past childbearing age. Yet, God moved heaven and earth for Sarah to have a child. He realigned His laws of nature and aging. When Sarah conceived Isaac, God was committed to Abraham, Sarah, and Isaac. Christian leaders teach that God knew the outcome when He ordered Abraham to sacrifice Isaac, and His knowledge made it okay.
Father Martin disagreed with giving God a pass. He thought God knew that ordering Isaac’s sacrifice would torture Abraham by making him think about killing his son for days. He couldn’t believe God would torture his friend. Father wanted us to think about Abraham’s thoughts. Could he think of anything but killing his son as he built the altar and gathered firewood? Did he try to convince himself that murder was God’s will?
After the war, this Biblical story confused Father Martin. The brutality of the story was too close to his combat experience. He wanted the Bible to be a path to sanity in a world of senseless killing. He wanted the catechism class not to experience war. It seemed Father Martin doubted a vengeful and fear-inspiring God would stick by you in battle. He knew a friend would because his boyhood friend died next to him in the Philippines. He saw an entire village bayoneted and left rotting in the streets.
After the war, he was stationed in Japan and saw the effect of the atomic bomb. When I try to interpret biblical stories, I often think of him. He wanted these holy men to write about the Good God. Maybe, he also thought the scribes translated the Holy Book to support the brutal quest for power by kings, princes, politicians, and ministers. If that is true, it gives the Bible a dark legacy.
In this story, sacrifice is the keyword, and the Hebrew word for it is Korban. It means to come near, or to become involved in an intimate relationship. No single word in English has those meanings. The closest we come to it is sacrifice. Did this word choice influence the translation to be harsher on Isaac than God intended? After all, He chose the boy to be the father and the spiritual leader of His ‘Chosen People.’
Maybe when Isaac reached the appropriate age, God told Abraham to take Isaac into the wilderness and teach him to pray and form an intimate relationship with Him. Perhaps God told Abraham He’ll provide the animal. (?? Oriana: no evidence for this in the text)
Nevertheless, Abraham’s focus was on sacrifice and his son. He didn’t tell Isaac that he believed God wanted him to sacrifice the boy. Abraham tried to convince himself that he would rather have his son die hating him than hating God. As he worked, he instructed Isaac how to build the altar, light the fire, and offer up his labor as part of the prayer. Subconsciously, the thought of killing his son tortured Abraham. God did nothing to ease the father’s torture. Then God realized that Abraham only saw his son, the altar, and the knife. He sent an angel to point out the ram trapped by its horns in the brush. Abraham taught Isaac how to slaughter an animal for a sacrifice.
Father Martin told us that it is impossible to know what God wants us specifically to do. God delivers general orders. To believe God gives us specific instructions is wrong and will result in the death of the innocent. He wants us to act with compassion and empathy because he made us to love our neighbors as ourselves. This is what Father Martin told the boys in my catechism class — all of whom were destined for the military and Vietnam. ~ Joseph Milosch
Caravaggio: The Sacrifice of Isaac, 1603
Oriana:
I must say that of all the attempts to “justify the ways of God to man” in regard to this very disturbing story, the one above is the easiest to swallow — including the brilliant irony of the last sentence.
This isn’t the only disturbing story in the bible, but because it involves a child’s trust in his father, it emotionally hits us harder than the more anonymous general destruction, e.g. Noah’s Flood. There exists a mountain of commentaries by Hebrew, Christian, and Muslim theologians — not to mention the famous (or infamous, depending on the reader’s point of view) conclusion drawn by Kierkegaard in his “Fear and Trembling”: this was the foundational test of faith, meant to demonstrate that the duty to a deity is more important than a father’s duty to his son. Yahweh is satisfied that Abraham does indeed fear him.
William Blake: Abraham and Isaac. There is a tenderness here. We don’t doubt that Abraham loves Isaac. Note that Yahweh does not insist that Abraham love him more than he loves his son. Yahweh does not demand that he be loved; he wants only to be feared.
My own catechism nun seemed to have no problems with the story. Abraham’s blind obedience earns him a last-minute reprieve, which also means the end of the practice of child sacrifice. This skips the much-later sacrifice of Jephthah’s virgin daughter, or indeed (dangerous ground) the validity of Jesus’ crucifixion as a human sacrifice, leading to the barbarous (to us moderns) conclusion that “without blood, there is no forgiveness.”
But if “without blood, there is no forgiveness” (Hebrews 9:22), why have we stopped animal sacrifice? For instance, why doesn’t the Pope sacrifice a lamb on major holidays? True, the easy answer is that the sacrifice of Jesus as The Holy Lamb made further animal sacrifice unnecessary, and besides, the wafer and the wine become the actual flesh and blood of Jesus. (By the way, the Catholic mass was modeled after the temple sacrifice ritual; the Latin word “hostia” [the communion wafer] means “victim.”)
High Priest sacrificing a goat
But why has animal sacrifice ceased in Judaism as well? Here we are usually told that the destruction of the Jerusalem Temple meant the destruction of the sacrificial altar — as if one couldn’t get around that, since there already existed the tradition of offering sacrifices on "high places"; any mountain would do. (And besides, for Christians at least, isn't St. Peter's the equivalent of the Jerusalem Temple? And, by extension, every consecrated church?)
I don't buy the "lack of temple" explanation. I think that both Christianity and Judaism have evolved to the point that killing an innocent living being as an act of worship became morally unacceptable, even revolting.
*
Wikipedia offers a quick summary of various interpretations of the Abraham and Isaac story. Here is the one that spoke to me most: Francesca Stavrakopoulou has speculated that the story may contain “traces of a tradition in which Abraham does sacrifice Isaac.” R.E. Friedman argued that in the original [Elohist] story, Abraham may have carried out the sacrifice of Isaac, but that later repugnance at the idea of a human sacrifice led the redactor of [the Yahwist account] to add the lines in which a ram is substituted for Isaac. Likewise, Terence Fretheim wrote that the text bears no specific mark of being a polemic against child sacrifice.
This is also interesting: “It has been suggested that Genesis 22 contains an intrusion of the liturgy of a rite of passage, including mock sacrifice, as commonly found in early and preliterate societies, marking the passage from youth to adulthood” (Wikipedia, Binding of Isaac).
But before we get into the liturgy of preliterate tribes, let us consider the written text: 15 “The angel of the Lord called to Abraham from heaven a second time and said, ‘I swear by myself, declares the Lord, that because you have done this and have not withheld your son, your only son, I will surely bless you and make your descendants as numerous as the stars in the sky and as the sand on the seashore. Your descendants will take possession of the cities of their enemies, and through your offspring all nations on earth will be blessed, because you have obeyed me.’”
19 Then Abraham returned to his servants, and they set off together for Beersheba. And Abraham stayed in Beersheba. (New International Version)
“Then Abraham returned to his servants etc.” could be interpreted to mean that Abraham returned from the mountain alone, Isaac having been reduced to a heap of ashes (holocaust means total burning). It’s possible that we have here a fusion of different versions, which leaves room for contradictions.
What the text mentions immediately after suggests an explosion of fertility: a lot of male children are born. Was Isaac a payment for this favor? It’s certainly a possible interpretation, if we follow the notions of cultures whose fertility cult was based on human sacrifice.
But here we are in the twenty-first century, trying to make sense out of this disturbing story and preserve the idea of divine goodness. (Remember: the god of Abraham wanted to know if he was FEARED; he wasn't asking to be loved.) If we adopt god’s goodness as first and foremost, then Milosch’s ending is brilliant in presenting both a clever solution and the cruel irony of the last words, which presage the fact that many sons are going to be sacrificed, this time in Vietnam:
"Father Martin told us that it is impossible to know what God wants us specifically to do. God delivers general orders. To believe God gives us specific instructions is wrong and will result in the death of the innocent. He wants us to act with compassion and empathy because he made us to love our neighbors as ourselves. This is what Father Martin told the boys in my catechism class — all of whom were destined for the military and Vietnam."
God is an awkward human invention that seems to create more problems than it solves. But pre-scientific man wanted both answers about the world and the granting of his wishes. All early deities were cruel and demanded sacrifice, usually a blood sacrifice. The altars flowed with blood, and the choking smoke of incense was used to cover up the stench of slaughter. That's how it was in the childhood of humanity, and that is the knowledge we have to live with.
Mary:
I don't buy the God and Abraham argument, that God only gives general orders and leaves the details up to us, with the understanding we must act with compassion. The old testament God does not seem compassionate, nor does he seem to regard it as important. He is much more interested in power and vengeance, and his "shock and awe" tactics are anything but compassionate. Think the Flood, Sodom, the plagues of Egypt, the bullying of Job. He throws tantrums to impose his will with force and the threat of terrible consequences for noncompliance.
As for specific orders — Noah got very specific orders for that ark, down to the cubit.
I think the solution given for the problem of the Abraham story is really one that looks back at that old testament god through the lens of Christianity, and tries to re-interpret it according to Christian values. The stories of Jesus do encourage compassion, and his only tantrum was throwing the money changers out of the temple.
Oriana: THE SACRIFICE OF ISAAC CONTINUES
You forget that Jesus also cursed a fig tree and it withered. He seemed enraged because there was no fruit on the tree — since “its season was not yet.” And thus a perfectly innocent and beautiful living thing was killed for no excusable reason.
Of course some have tried very hard to find some excuse, but nothing works. It was a temper tantrum — perhaps minor compared to throwing out the money changers, but then you could argue that the money changers were at fault (though within the law — for temple business, Roman coins had to be exchanged for shekels), desecrating the temple.
The cursing of the fig tree remains an embarrassment. I think the church gave up trying to defend it. The last thing I read about it was that a young priest asked his spiritual director about it, and the mentor’s answer was that we should read “only those stories that inspire us.”
Milosch’s Father Martin would probably agree, except that Abraham and Isaac is a foundational story that cannot be skipped. It hangs like Abraham’s knife over us. The blood of all the sacrificed sons cries out — something that Milosch’s subtle ending of his essay brings up. It’s not just about Isaac.
I think the ending is brilliant and exposes the weakness of Father Martin’s argument, one which tries to preserve the notion of Yahweh’s goodness at all cost. But Yahweh doesn’t care to be seen as benevolent — on the contrary, he wants to be feared. He praises Abraham’s obedience: “Now I know that you fear Yahweh.”
Near-Eastern scholars have pointed out that millennia ago the sacrifice of one’s child, especially the first-born son (and Isaac is treated as Abraham’s “only” son, Ishmael having been left out here, probably for ethnic-political reasons) was seen as an act of supreme piety, to be admired rather than condemned. And it’s indeed only much later that human culture evolved to the point of rejecting human sacrifice, calling it barbarous.
Yet it’s precisely this barbarous sacrifice, a “bloody ransom,” that is the very foundation of Christianity, allegedly the religion of forgiveness and compassion. Without the barbarous crucifixion, there is no forgiveness and no resurrection. The whole crazy logic of Christian theology collapses no matter how careful we are to read only those stories that inspire us.
Actually as a child I wasn’t particularly disturbed by the story of Abraham and Isaac. To me it seemed perfectly in character with god’s basic indifference to human suffering. I was raised by an Auschwitz survivor who told me the heavy stench from the smoke rising from the chimneys of the crematoria filled the air for miles — those still on the train could smell the holocaust of multiple Isaacs well before they arrived.
*
(I agree with you that Father Martin’s argument of non-specificity is awfully weak and easily refuted. Details of observance are in fact more important than its spirit. “Spirit of the law” is a later development. Father Martin is desperate to prop up Christianity and its inclusion of the Old Testament as part of the Holy Scripture, though the fit is poor and basically forced. Harold Bloom observed that the term “Judeo-Christian” is an oxymoron just as “Judeo-Islamic” would be.)
PS. Recently, as I was waiting in the check-out line at Costco, I saw a young woman, perhaps not yet twenty (or maybe in her early twenties), in a wheelchair. Her left leg had been amputated below the knee. And I remembered that quite a few people insist that everything that happens is part of the Divine Plan, and that the Plan includes suffering.
The young amputee looked sad. She probably goes to support-group meetings and, one hopes, receives adequate care, including counseling — but what is that next to her shattered dreams? Before the accident, she must have dreamed of fun-filled dates and, of course, eventually The Prince and two adorable children. Perhaps she liked sports. I hope she’ll qualify for a prosthesis, which will make it possible for her to walk, however awkwardly, but cannot keep her from looking at her stump and remembering the days when she could slide down a steep cliff and run into the ocean with her surfboard, along with other daring, carefree teens. Now she’s lucky to have her mother to push her wheelchair. Divine Plan? To cure her of being young and happy? Was there no other way to teach life lessons?
If we don’t think in terms of the Divine Plan, there is no problem. Accidents happen. Pain and suffering happen. We have other people to help us. We have pets, we have books and movies. There is no point getting enraged at a fictional figure in the sky who failed to protect us.
But if the accident was military, then there is the war machine to blame. The military does not hesitate to sacrifice Isaacs, these days including young women. The story of Abraham and Isaac is profoundly disturbing because it continues.
Rembrandt: Abraham and Isaac, 1630
*
GERTY CORI: A PIONEER RESEARCHER OF CARBOHYDRATE METABOLISM
~ Gerty was born to a Jewish family in Prague in 1896, and she was eager to pursue a higher education at a time when women had few opportunities to study. Encouraged by a supportive family and hoping to go to medical school but lacking the required prerequisites, Gerty completed the equivalent of eight years of Latin, five years of science, and five years of math in a single year. She met her future husband in medical school and they married soon after graduating in 1920.
Due to the difficult conditions in Europe following WWI and, given Gerty's Jewish heritage, the rising anti-Semitism in the region, the couple decided to immigrate to the United States in 1922. Carl was offered a research position at the State Institute for the Study of Malignant Diseases in Buffalo, New York; Gerty was delayed due to the difficulty of finding a position but eventually joined him six months later after she was offered an assistant role. Despite the director's threats to dismiss Gerty if she continued to collaborate on research with her husband, the pair were tremendously productive as a research team. They became naturalized U.S. citizens and published fifty papers together while at the State Institute, with primary authorship going to whoever had done the most research for a given paper.
The pair announced their most famous discovery in 1929: the cycle by which glucose, pyruvate, and lactic acid are interconverted between the body's muscles and liver. In this metabolic cycle, the lactic acid produced by working muscles — which we feel as muscle cramping — is processed by the liver and turned back into glucose from which the body can draw energy. Their explanation of what is now known as the Cori Cycle would win the couple the Nobel Prize in Physiology or Medicine in 1947. When they received the award, Gerty became the first American woman, and third woman ever (after Marie Curie and Irène Joliot-Curie), to receive a Nobel Prize in the sciences.
Despite their groundbreaking work, the Institute still discouraged the Coris from collaborating, and the pair decided to search for other opportunities. However, they ran into the same problem over and over: universities would offer a job to Carl, but refuse to make a similar offer to Gerty, with one school even claiming it was "un-American" for a married couple to work together.
They finally moved to Washington University in St. Louis, Missouri in 1931 after the university's chancellor waived the institution's nepotism rules so they could work together. However, Gerty was only offered a position as a research associate, at a salary one tenth of that received by her husband, although their qualifications were identical. It took thirteen years for her to attain the same rank as her husband: she was finally promoted to full professor just months before she won the Nobel Prize in 1947.
Even given these challenges, the Coris continued their pioneering research while at Washington University. There, the couple discovered the compound that allows glycogen to be broken down into glucose that the body can use; it's now called the Cori ester. On her own, Gerty also studied glycogen storage disease and was the first person to show that an enzyme could cause a human genetic disease.
Sadly, just before the Nobel Prize was announced, Gerty was diagnosed with myelosclerosis, a fatal bone marrow disease. For ten years, she fought the disease while continuing her research. She died in her home in 1957 at the age of 61, only a few months after she stopped going in to the lab. However, her spirit of curiosity inspired people for years after her death. "For a research worker the unforgotten moments of his life are those rare ones which come after years of plodding work," she once wrote, "when the veil over nature's secret seems suddenly to lift & when what was dark & chaotic appears in a clear & beautiful light & pattern.” ~
https://www.amightygirl.com/blog?p=25497
*
NEANDERTHAL GENES AND THE HUMAN IMMUNE SYSTEM
~ DNA acquired from breeding with Neanderthals may explain why people of European descent respond differently to infection than those of African descent, two studies suggest. The findings might also offer insight into why people of African descent are more prone to autoimmune diseases caused by an overactive immune system.
In a paper published on 20 October in Cell, geneticist Luis Barreiro of the University of Montreal in Canada and his colleagues collected blood samples from 80 African Americans and 95 people of European descent. From each sample, they isolated a type of immune cell called macrophages, which engulf and destroy bacteria, and grew these cells in a dish. Next, they infected each culture with two types of bacteria and measured how the cells responded. Macrophages from African Americans, they found, killed the bacteria three times faster than those of European Americans.
The researchers then measured how gene expression changed in response to the infection. About 30% of the approximately 12,000 genes that they tested were expressed differently between the two groups, even before infection. And many of the genes whose activity changed the most during the immune reaction had sequences that were very similar between Europeans and Neanderthals, but not Africans.
Barreiro suspects that when modern humans first left Africa — some time between 100,000 and 60,000 years ago — they had to adapt to a different set of pathogens on the European continent. Breeding with Neanderthals, and obtaining their different immune response, probably helped them to better fight off the new kinds of infections that they encountered there.
In the second study, population geneticist Lluis Quintana-Murci and his colleagues at the Pasteur Institute in Paris collected samples from 200 people living in Belgium, half of whom were of African descent and the other half of European descent. The researchers grew a different type of immune cells called monocytes in a dish and infected them with bacteria and viruses. Once again, the two groups showed differences in the activity of numerous genes, and Neanderthal-like gene variants in the European group played a major role in altering their immune response. The differences were especially stark in the way that the two groups responded to viral infection.
Immune systems tend to evolve rapidly because infections produce immediate evolutionary pressure, says computational biologist Janet Kelso of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. So it makes sense that European ancestors would have held onto any advantage they could get from the Neanderthals. “There’s an appreciation now that contributions are coming from many sources, and archaic humans are one,” she says.
Kelso says that the studies cannot reveal exactly what drove the evolution — such as a particular viral outbreak in Europe, for instance. For some diseases, such as tuberculosis, a lower immune response tends to help with survival, and modern humans in Europe adopted the Neanderthal traits that helped with this. “Maybe the most important thing is to live in peace with the microbes,” Quintana-Murci says.
Overactive immune systems could help to explain why African American women, for instance, are up to three times more prone to the autoimmune disease lupus than white Americans, Barreiro says. The differences seem to persist irrespective of socioeconomic status and other environmental factors such as smoking and diet, although these probably have a role.
Determining how much of the difference is due to genetics could help researchers to tease out the role of environmental factors, and therefore could guide public-health efforts. ~
https://www.nature.com/news/neanderthal-dna-affects-ethnic-differences-in-immune-response-1.20854
DEATH IS A CONTINUUM
~ The living have always worried about the dead coming back to life. It’s the plot of the New Testament, the reason 19th-century families installed bells in their loved ones’ coffins, and a source of tension in end-of-life care today.
While doctors work to reassure families holding vigil in intensive care units and hospice facilities that the end has indeed come, death remains something of a mystery — even among medical researchers.
These unresolved questions around things like brain death, cardiac death, and more have led to the proliferation of “myths and misinformation,” said Sonny Dhanani, chief of pediatric intensive care at the Children’s Hospital of Eastern Ontario.
“We felt [stories about the dead coming back to life] might have been impacting people’s motivation to consent for their loved one to be a donor, and for the medical community to offer donations,” he said. “We wanted to provide scientific evidence to inform the medical understanding of dying.”
In a new study, published in the New England Journal of Medicine, Dhanani and his team report the results of the largest international study into the physiology of dying to date. It suggests the living can rest easy, kind of.
Between 2014 and 2018, the researchers observed the heart function of 631 patients in 20 adult intensive care units in Canada, the Czech Republic, and the Netherlands after they were taken off life support. The scientists found that 14 percent of the dead showed some flicker of cardiac activity — measured by the electrical activity of the heart and blood pressure — after a period of pulselessness.
But the doctors at the patients’ bedsides never got a determination of death wrong. “No one lived. Everyone died. No one actually came back to life,” Dhanani said.
The sputtering was short-lived — the latest resumption of cardiac activity came just 4 minutes and 20 seconds after the heart initially stopped beating — and it was not strong enough to support other organs, like the brain.
The data “help us understand how to medically define death, which is more of a continuum than the flipping of a switch,” according to Joanna Lee Hart, a pulmonary and critical care physician and assistant professor at the University of Pennsylvania’s Perelman School of Medicine.
“Our bodies are physiologically designed to stay alive… As our bodies try to keep us alive, they will pump out natural chemicals to sustain life as long as possible,” Hart wrote in an email to Motherboard. But, she added, “Once the dying process starts, it is very hard to return a person’s body back to a condition where the person can survive.”
This should be comforting to families and medical providers. Among other things, the research affirms that current practices, which typically tell doctors to wait 5 minutes after the pulse stops to name a time of death, are working. At that point, things like organ retrieval are safe to start.
While there are still plenty of questions about death, dying, and the afterlife, this study — which is unlikely ever to be repeated, given its scope — is something close to the definitive word on the question of post-mortem cardiac activity.
“Determining death is so emotional to everyone,” Dhanani said. “We hope that rigorously studying death and dying, not being afraid of that conversation, will help.” ~
*
THE BENEFITS OF HUMBLE CABBAGE
Vitamin C
You might find it a bit surprising, but cabbage actually contains more Vitamin C than oranges, which are traditionally assumed to be the best natural source. Vitamin C is essential if you want to age gracefully or maintain a powerful immune system, and it may even reduce the effects of Alzheimer’s disease and other degenerative neural diseases.
Sulfur
Sulfur is a highly useful nutrient because it helps fight infections; a deficiency can increase the time wounds take to heal and put you at a much higher risk of contracting a microbial infection. Luckily, all you need to do is add a little cabbage to your daily diet to prevent this from ever happening.
Fiber
Cabbage is an excellent source of fiber, which helps the body retain water, and also maintains the bulkiness of food as it's being digested. As a result, cabbage is the perfect natural remedy for many digestion-related conditions, such as constipation.
It contains anti-cancer compounds
Cabbage contains a good amount of anti-cancer compounds, such as sinigrin, sulforaphane, and lupeol, that are known to inhibit the growth of dangerous tumors. In one Chinese study of breast cancer patients, cruciferous vegetables like cabbage were associated with significantly improved outcomes.
It’s an anti-inflammatory
Cabbage tends to accumulate cadmium-binding complexes within its leaves, and one of their primary components is glutamine, a powerful anti-inflammatory agent. Consequently, consuming cabbage can reduce the effects of many kinds of inflammation: fever, joint pain, allergies, skin irritation, and quite a few other skin disorders.
Useful for weight loss
Cabbage is the perfect ingredient for someone who wants to lose weight healthily. It's packed with many essential nutrients, vitamins, and minerals, and its high levels of fiber make it quite filling. The best part is that there are only 33 calories in a single cup of cooked cabbage!
Good for brain health
The presence of anthocyanins and vitamin K within cabbage can provide a strong boost to concentration levels and overall mental functioning. Red cabbage contains even higher levels of these important brain-fueling components, and that is why it's recommended for sufferers of dementia and neural degeneration.
Helps lower blood pressure
The potassium found within cabbage can help you to lower your blood pressure, thereby decreasing the risk of strokes and heart attacks. This is because potassium acts as a vasodilator, which means that it opens up the blood vessels and eases the blood flow.
Promotes Eye Health
As a rich source of beta-carotene, cabbage is able to delay the formation of cataracts, prevent macular degeneration, and generally promote good eye health. What's more, beta-carotene may also reduce your chances of developing prostate cancer.
Helps prevent age spots
The many antioxidants that cabbage contains play a huge role in your body's response to the aging process, since they help to eliminate free radicals. These free radicals have been found to be the underlying cause of skin discoloration, age spots, and wrinkles.
Strengthens the bones
Cruciferous vegetables, such as cabbage, are great sources of potassium, magnesium, and calcium. Providing your body with these 3 essential minerals will help keep your bones stronger, and can help prevent the onset of osteoporosis.
Helps detoxify your body
By getting rid of free radicals and uric acid, cabbage can help to prevent gout, rheumatism, renal calculi [kidney stones], arthritis, skin diseases, and eczema. This detoxifying effect is due to the high amount of sulfur and Vitamin C that cabbage contains.
https://www.ba-bamail.com/content.aspx?emailid=26615
ending on beauty:
Have you ever seen
anything
in your life
more wonderful
than the way the sun,
every evening,
relaxed and easy,
floats toward the horizon
and into the clouds or the hills,
or the rumpled sea,
and is gone—
~ Mary Oliver, The Sun
Oriana:
I love the path that the setting sun's reflection makes on the water. The full moon does it too, and Venus, if bright enough.
Mary:
There was a story in my family that when I was very young, a toddler I guess, I had a major tantrum because the sun set. Wow, was that a warning of things to come!