I, MAY I REST IN PEACE
I, may I rest in peace — I, who am still living, say
May I have peace in the rest of my life.
I want peace right now while I’m still alive.
I don’t want to wait like that pious man who wished for one leg
of the golden chair of Paradise, I want a four-legged chair
right here, a plain wooden chair. I want the rest of my peace now.
I have lived out my life in wars of every kind: battles without
and within, close combat, face-to-face, the faces always
my own, my lover-face, my enemy-face.
Wars with the old weapons — sticks and stones, blunt axe, words,
dull ripping knife, love and hate,
and wars with new-fangled weapons — machine gun, missile,
words, land mines exploding, love and hate,
I don’t want to fulfill my parents’ prophecy that life is war.
I want peace with all my body and all my soul.
Rest me in peace.
~ Yehuda Amichai, tr. Chana Bloch and Chana Kronfeld
"My head is bloody but unbowed": a Ukrainian woman after an airstrike on an apartment complex.
*
“.... Mitya was calm, and even looked more cheerful, but only for a moment. He felt more and more oppressed by a strange physical weakness. His eyes were closing with fatigue. The examination of the witnesses was, at last, over. They proceeded to a revision of the protocol. Mitya got up, moved from his chair to the corner by the curtain, lay down on a large chest covered with a rug, and instantly fell asleep.
He had a strange dream, utterly out of keeping with the place and the time.
He was driving somewhere in the steppes, where he had been stationed long ago, and a peasant was driving him in a cart with a pair of horses, through snow and sleet. He was cold, it was early in November, and the snow was falling in big wet flakes, melting as soon as it touched the earth. And the peasant drove him smartly, he had a fair, long beard. He was not an old man, somewhere about fifty, and he had on a grey peasant's smock. Not far off was a village, he could see the black huts, and half the huts were burnt down, there were only the charred beams sticking up.
And as they drove in, there were peasant women drawn up along the road, a lot of women, a whole row, all thin and wan, with their faces a sort of brownish color, especially one at the edge, a tall, bony woman, who looked forty, but might have been only twenty, with a long thin face. And in her arms was a little baby crying. And her breasts seemed so dried up that there was not a drop of milk in them. And the child cried and cried, and held out its little bare arms, with its little fists blue from cold.
"Why are they crying? Why are they crying?" Mitya asked, as they dashed gaily by.
"It's the babe," answered the driver, "the babe weeping.”
And Mitya was struck by his saying, in his peasant way, "the babe," and he liked the peasant's calling it a "babe." There seemed more pity in it.
"But why is it weeping?" Mitya persisted stupidly, "why are its little arms bare? Why don't they wrap it up?”
"The babe's cold, its little clothes are frozen and don't warm it.”
"But why is it? Why?" foolish Mitya still persisted.
"Why, they're poor people, burnt out. They've no bread. They're begging because they've been burnt out.”
"No, no," Mitya, as it were, still did not understand. "Tell me why it is those poor mothers stand there? Why are people poor? Why is the babe poor? Why is the steppe barren? Why don't they hug each other and kiss? Why don't they sing songs of joy? Why are they so dark from black misery? Why don't they feed the babe?”
And he felt that, though his questions were unreasonable and senseless, yet he wanted to ask just that, and he had to ask it just in that way. And he felt that a passion of pity, such as he had never known before, was rising in his heart, that he wanted to cry, that he wanted to do something for them all, so that the babe should weep no more, so that the dark-faced, dried-up mother should not weep, that no one should shed tears again from that moment, and he wanted to do it at once, at once, regardless of all obstacles, with all the recklessness of the Karamazovs.”
~ Dostoyevsky, The Brothers Karamazov, Book IX, chapter 8
Olga Pilyuhina
LUCK THROUGH THE AGES: CAN YOU BECOME LUCKIER?
~ Throughout history, people have clung to totems for good luck. The ancient Egyptians wore scarabs for good fortune and placed amulets into tombs to speed the journey to the next life.
Who among us has not wished upon a falling star, searched for a four-leaf clover, knocked on wood, or tossed coins into a fountain, hoping to usher in a good fate? Haven’t we all made a wish for the future while blowing out birthday candles? My family would serve pork on New Year’s Day as an omen of fortune.
And remember the lucky rabbit’s foot that once hung on keychains and car mirrors? Countless cultures believed that the left foot of the rabbit was imbued with magical powers. The Aztecs worshiped the rabbit and his foot in particular. The Chinese associated rabbits with the blessing of fertility. One can’t help but think of this long-held tradition and feel sympathy for the poor rabbits who apparently had no luck at all.
Republican Presidential candidate John McCain was known for his sacred rituals, for carrying a lucky compass, a lucky feather, and a lucky penny. Democratic consultant James Carville was well known for courting political fortune by wearing the same underwear for many days at a time. Fierce competitors both—could luck simultaneously rain down on both McCain and Carville? How would that work?
Good luck and bad luck appear to be fiercely held beliefs. A Gallup poll revealed that 72% of the American public admitted to possessing at least one good luck charm.
*
Does luck exist? Where is the evidence?
By definition, luck is a whim of chance: intangible, outside of personal control.
Yet social scientists view it differently. They see luck as the result of personal actions, an alchemy of openness to new experiences and a penchant for chance-taking.
Stephen Mark, a British academic, found that those who view themselves as lucky tend to behave very differently from those who see their lives as plagued by bad breaks. Mark observed that lucky people have this in common: they regularly change up their routines, vary their environments, and mix with a broad swath of people. He determined that this positive, go-getter attitude produces new experiences and the enthusiasm to take advantage of them.
In short, he found the more varied one’s routine, the more chance encounters one will have. The greater the opportunities one is exposed to, the greater potential for good outcomes, which many view as luck.
Steve Jobs, the iconic co-founder of Apple, once said, “The harder I work, the luckier I get.” He attributed his good luck to trusting his gut—to intuition. But in reality, was his success dependent on luck, hard work, or something more? Jobs took enormous personal gambles, underscoring Mark’s theory that lucky people take risks. He dramatically veered off the traditional career path by dropping out of school and taking up calligraphy, not the common roadmap to success.
Best known for his longitudinal studies of good and bad luck, academic and author Richard Wiseman—who incidentally began his career as a professional magician—agrees that luck, or the absence of it, is primarily determined by measurable habits. He hypothesized that lucky people tend to be extroverted and invest themselves in a multitude of endeavors, and therefore have greater potential for positive outcomes.
*
What do we know about unlucky people? They tend to be more tense and anxious than lucky people. As a result, their anxiety can make them less able to see opportunities and, thus, less able to take advantage of them. Unlucky people may unwittingly be manifesting negative experiences by viewing the world through a negative prism, not unlike the cartoon character Ziggy who was best known for his “poor me” attitude, and for being perennially disappointed.
What happens when lucky people encounter bad luck or poor outcomes? Social psychologists observed that lucky people tend to be optimists who address adversity by relying on well-honed coping mechanisms to navigate craters and bumps in the road. They tend to view the glass of life as half-full—not half-empty—and they welcome failure as something to be mined for greater understanding and growth.
Wiseman reported on a mental exercise in which study participants were told they were shot during a bank robbery. Optimists—the glass-half-full, lucky people—looked on the bright side and considered themselves lucky not to have been killed. Conversely, the pessimists—the glass-half-empty, unlucky folks—considered themselves victims and unlucky to have been shot.
THE HABITS OF THE LUCKY
Wiseman studied the principles of luck in a 10-year research project that included 400 men and women, ages 18 to 84, who hailed from all walks of life. His study confirmed that lucky people have a powerful hand in outcomes, in their own good fortune. Here are a few habits of lucky people:
They are awake to possibility. Lucky people are skilled at creating and noticing chance opportunities.
They listen to their gut. Lucky people trust their intuition and act on it.
They map out their desires. Lucky people manifest a positive future by creating self-fulfilling prophecies and actively embracing positive expectations.
They make lemonade out of lemons. Lucky people adopt a resilient attitude that transforms bad luck into good.
Perhaps not surprisingly, these are traits shared by those viewed as resilient. What resilient and lucky people have in common is that both accept adversity. When confronting a crisis or disappointment, they learn from the experience and recalibrate their thinking in order to choose a new hopeful path. They benefit from their misfortune.
As the Dalai Lama XIV once said, “Remember that sometimes not getting what you want is a wonderful stroke of luck.”
STEPS TO IMPROVE YOUR LUCK
1. Say “why not me?” When misfortune comes your way, don’t say “why me?” but rather “why not me?” Embrace failure as a life lesson. We know lucky people tend to employ counterfactual thinking that softens a bad experience by acknowledging that whatever happened could have been worse.
2. Broaden your social circle. Take the initiative and reach out to people you don’t know. Make new friends. Join a meet-up group. Volunteer on a campaign. Start up a conversation with someone at the grocery store. Introduce new people into your life to expose yourself to alternative experiences and points of view.
3. Say yes more often. Accept invitations outside of your comfort level. Welcome change. Vacation in a different location. Try a new sport.
4. Smile. You may be surprised at the doors and windows that open with nothing more than a smile. Smiling is known to set off a chemical cascade of endorphins that heightens happiness, and it can welcome new people into your life and boost joy and optimism.
As Tennessee Williams said, and as social scientists have shown, it all comes down to this: “Luck is believing you’re lucky.” ~
https://www.psychologytoday.com/us/blog/buoyant-life/202202/how-improve-your-luck
Oriana:
I largely agree with this article. More exposure to people means more opportunities. To this day I have trouble forgiving myself for having missed an important poetry conference. I was put off by the cost and the lack of my preferred housing. How minor the cost and inconvenience now seem, considering the likely benefits! It was my only opportunity to meet a poet whose work I greatly admired. At the time I had neither the wisdom nor the right attitude toward money. (Yes, I did have enough in my savings account, but my attitude was “save at any cost” rather than “spend wisely in order to improve your life.” I count it as the folly of my “advanced youth.” Still, I have to forgive myself, because I was genuinely poor, and poverty generates stress and interferes with clear thinking.)
Mary:
As for luck, much may be a matter of perspective. I never thought of myself as particularly lucky, yet some saw me as someone who had a great deal of luck, someone with opportunities just "falling into her lap." Someone to be envied. I think the perception of luck depends on what you want and what you do to get there. You must be, as was said, alert and aware of opportunities, open to change, and above all, flexible. Key is the ability to learn from failures and setbacks instead of collapsing under them. To see yourself as a victim becomes a self-fulfilling prophecy, a bad habit, a chain of self-designed defeats.
Oriana:
I used to regard myself as very unlucky. I was even explicitly told that I was born under an unlucky star. Depression shows in the face, I think — that sadness in the eyes, even when you smile for a photo; that ever-ready despair leading to crying fits when I had enough privacy to cry.
Of course as depression continues, your thinking becomes more and more distorted, and my thinking became downright delusional. At one point, for instance, I blamed my mother for my coming to America, which opened a period of great suffering in my life. As if I hadn’t been 100% eager to come!
Only when, miraculously, rationality won and I realized that I didn’t have enough life left to squander it on being miserable, did I come to understand what I’d call the middle position: I was by no means totally unlucky — any more than I could ever see myself as totally lucky. In some ways I had fantastic luck: I was a child of two highly intelligent and educated parents who spared no expense to give me the best education available; I was acquainted with masterpieces of music, art, and literature; I’ve had the adventure of being bilingual and bicultural; and I’ve had my golden period as a poet. In terms of bad luck, I would have to go into my lengthy medical history, and I’m not about to do that. Let me merely point out that I am officially “disabled” because I can walk only a limited distance before pain stops me — and that’s even when I use my walker. And the pain can strike even when I’m not walking.
But look, I’m not in a wheelchair!
Yes, feeling lucky or unlucky is a very complex matter, and perception plays an enormous part.
Flexibility is also enormously important. When luck fails, it's best to decide to make the best of the situation. To light one candle rather than curse the darkness.
*
WE ARE WIRED FOR SOCIAL CONNECTION
~ “To the extent that we can characterize evolution as designing our modern brains, this is what our brains were wired for: reaching out to and interacting with others,” writes neuroscientist Matthew Lieberman in his book Social: Why Our Brains Are Wired to Connect.
We are each equipped with biological mechanisms that underlie our ability to empathize, cooperate, give, and love. These neural circuits underpin all of our relationships, beginning at birth—and maybe even before.
This is exemplified in a study by anthropologist James Rilling and his colleagues. They used functional magnetic resonance imaging (fMRI) to scan the brains of 36 women while they each played a game based on the prisoner’s dilemma with one other woman. In this game, a player behaving selfishly could win $60 and their partner would win nothing. If both players cooperated, they both would win $40.
While participants stood to gain more through making selfish choices, mutual cooperation was the most popular outcome. When partners had mutually cooperative interactions, brain regions involved in reward processing were activated. The researchers propose that this pattern of brain activation is “involved in sustaining cooperative social relationships, perhaps by labeling cooperative social interactions as rewarding, and/or by inhibiting the selfish impulse to accept but not reciprocate an act of altruism.”
The reward system is also activated when people make anonymous charitable donations, according to another study. This suggests that human brains are wired to be able to extend altruism beyond people we know into a more abstract sense of care toward a group of strangers or a moral cause—and feel good doing it.
In fact, researcher Martha Welch’s “calming cycle theory” hypothesizes that the earliest relationship—between mother and infant—actually begins before birth via the co-conditioning of mother’s and baby’s autonomic nervous systems.
According to this theory, through dynamic changes in hormone release and heart rate, mother and infant can influence each other’s physiology while the baby is in utero. After the baby is born, sensory information from the mother such as scent, touch, voice, or eye contact can initiate this autonomic response, calming both baby and mother. This is important because infants cannot regulate their emotions on their own and rely on their mothers and other caregivers to help them through periods of distress. The early formation of autonomic co-regulation between mother and baby may help lay the groundwork for a mother’s ability to help calm her infant after periods of separation.
The areas of the reward system activated by love also contain receptors for oxytocin, a naturally occurring hormone that plays an important role in attachment. Research suggests that its release decreases stress and anxiety, increases well-being and trust, and may be a biological mechanism that underlies bonding between parent and child, between friends, and between romantic partners.
The studies mentioned thus far, along with a rich body of other work, support the social baseline theory (SBT) by psychologists James Coan and David Sbarra. This theory suggests that the human brain operates under the assumption that our interactions with others are a vital resource that helps us stay safe and meet our goals.
“At its simplest, SBT suggests that proximity to social resources decreases the cost of climbing both the literal and figurative hills we face, because the brain construes social resources as bioenergetic resources, much like oxygen or glucose,” write Coan and Sbarra.
This suggests that when we don’t have access to social connections, we shift our cognitive and biological resources to focus more on ourselves, leading to distress, ill health, and limited achievement. When our lives are rich in social connection, however, we can move mountains—and, indeed, we are happier, healthier, and more successful. ~
https://greatergood.berkeley.edu/article/item/how_biology_prepares_us_for_love_and_connection?utm_source=pocket-newtab
Oriana:
Nevertheless, interaction with others takes energy, and those of us who are introverts need some time to recuperate after a “social” day. Still, humanity would be nowhere without large-scale cooperation. And apparently our social training starts as early as in utero. Amazing!
But there is a dark side to this readiness to form social connections. We (this includes even pre-verbal infants) prefer people who look like us and seem "familiar." This makes us ready to organize ourselves into "us" versus "them" groups.
Mary:
I find myself much heartened by that State of the Union Address. It could hardly have been better, and it was good to see unity and support for the Ukrainian community, as well as for the world's united position against the autocratic invasion. It gave me hope.
And in terms of how damaging the isolation and distancing forced on us by the pandemic have been, it was almost dizzyingly joyful to see people unmasked, standing close, embracing, shaking hands. We are indeed wired for connection, and isolation from community has serious effects on both physical and emotional health. I think that has become undeniably obvious in the dysregulation of emotion and erratic behaviors we have been seeing: so much crazy anger, depression, anxiety, and perhaps much unexplored pain for children whose lives and routines have been disrupted for so long. Hopefully we have the resilience to heal, once the worst is over, even with the deep losses so many have suffered; as Biden noted, all those empty places at so many tables.
Oriana:
It's beautiful watching the world unify on behalf of Ukraine. I too found myself heartened. Old historical grievances buried at last, both Poland and Germany are helping generously, along with many other countries.
*
Philip Glass, who just turned 85, writing about his taxi-driving days in his memoir, Words Without Music.
“DIMINISHING EXPECTATIONS”: WHY WE BECOME LESS AMBITIOUS WITH AGE
~ Cultural mores notwithstanding, there is a big difference in what advancement looks like at various stages of our lives. There’s also a conventional wisdom that as we age, we become less driven and goal-oriented, and instead find meaning in a wider set of experiences.
You could argue that this makes us better or worse at leadership, depending on your viewpoint. But what’s the truth? Do we really care less about work as we move through our lives, and how does that affect leadership? Or happiness?
A recent study by the Families and Work Institute found that workers begin losing their ambition to get promoted or seek out more responsibilities around age 35. Researchers attributed this decline in motivation to the demands of having children.
There is further evidence that this loss of ambition has more to do with our natural life patterns than with anything having to do with an aversion to work itself. Research consistently shows that people peak in happiness at ages 18 and 82, and hit a nadir of unhappiness at age 46 (or what is commonly known as the mid-life crisis).
This life pattern is called the U-bend of life. When you start out in life, you are fresh and excited about the future with few responsibilities, so you tend to be happier. And for different reasons, you will also tend to be happier at an older age: you are wiser, have a sense of accomplishment, and care less about pleasing people. You’re also out of the demanding years of childrearing.
What’s also fascinating is that no matter what culture you’re from or how much money you have, the U-bend rings true. It also holds up whether or not you have children.
In another study, Joseph Schwartz and Joan Broderick found that as you progress through your twenties, you worry less. Then, when you hit age 30, worry increases up until mid-life, and then it falls again.
The happiness factor can have real implications for our success and promotability. Harvard professor Teresa Amabile and researcher Steven Kramer, the authors of “The Progress Principle,” found that “Employees are far more likely to have new ideas on days when they feel happier.” Further, they explain, “Conventional wisdom suggests that pressure enhances performance. Our real-time data, however, shows that workers perform better when they are happily engaged in what they do.”
So if we buy the sociological trends that our ambition and happiness wane in our middle age—key years for promotion to senior levels—then are younger and much older leaders best?
Not really. A 2011 study found that while there is a correlation between wisdom and effective leadership, there is no correlation between wisdom and age—or between age and effective leadership.
Where does this leave us?
Clearly, trends can’t apply to every individual, so there’s a variety of sentiments and approaches here. But in my own experience, I can’t help seeing these truths as well.
What makes us happy does change over time. And part of finding our own happiness is letting go of others’ expectations of us – an outside view of ambition – and figuring it out for ourselves. It’s completely acceptable to let go of some dreams that don’t make sense.
Staying open to our own development and being flexible enough to set our own path might be what ambition looks like as we age. Ironically, this enhanced perspective may be one of the qualities that allow us to be better at what we do.
This is exactly what happened to my friend, who, by doing work she’s great at, just got promoted without even trying. ~
https://www.forbes.com/sites/work-in-progress/2014/07/16/why-youre-losing-ambition-as-you-age/?sh=50d447081f73
Oriana: THE PRIORITY OF WORK OVER OUTCOME
There is also such a thing as getting realistic. When we are young, the options seem almost endless. Then we realize that it’s best to focus on what we do best, which is usually what we also love doing. Recognition, however, can be tricky and beyond our control. But we can let go of the draining pursuit of recognition and concentrate on the work itself. Our recognition may be limited, but at least we can be happy.
Mary:
On the U curve of life, and the shrinking of ambition...maybe it is the lessening of ambition that brings about some of that late life happiness. It's not that striving stops, but it is no longer spurred by any need to satisfy more than what suits you. You do what you love, but on your own terms, without pressing demands for more, faster, better production, or the threat of failure to meet expectations.
I think one solid reason for satisfaction is simply the fact of being older in and of itself — you have survived what all the years have brought you. And there is the freedom that comes with no longer having to do and be what's expected, becoming more concerned with what you want than with what society says you should want. Devoting yourself to work you love, you may actually increase your work time, because you are now living outside the frame of your "job" and can live by a rhythm more natural to your own comfort, interest, and energy.
*
The violent exaggeration of the colors and the thick texture of the paint made The Night Café “one of the ugliest pictures I have done,” Van Gogh wrote at one point.
‘In my painting of The Night Café I’ve tried to express the idea that the café is a place where you can ruin yourself, go mad, commit crimes. … Anyway, I tried with contrasts of delicate pink and blood-red and wine-red. Soft Louis XV and Veronese green contrasting with yellow greens and hard blue greens. All of that in an ambience of a hellish furnace, in pale sulfur. To express something of the power of the dark corners of a grog-shop.’
*
IS IT TIME TO GIVE UP THE NOTION OF CONSCIOUSNESS AS “THE GHOST IN THE MACHINE”?
As individuals, we feel that we know what consciousness is because we experience it daily. It’s that intimate sense of personal awareness we carry around with us, and the accompanying feeling of ownership and control over our thoughts, emotions and memories.
But science has not yet reached a consensus on the nature of consciousness – which has important implications for our belief in free will and our approach to the study of the human mind.
Beliefs about consciousness can be roughly divided into two camps. There are those who believe consciousness is like a ghost in the machinery of our brains, meriting special attention and study in its own right. And there are those, like us, who challenge this, pointing out that what we call consciousness is just another output generated backstage by our efficient neural machinery.
Over the past 30 years, neuroscientific research has been gradually moving away from the first camp. Using research from cognitive neuropsychology and hypnosis, our recent paper argues in favor of the latter position, even though this seems to undermine the compelling sense of authorship we have over our consciousness.
And we argue this isn’t a topic of mere academic interest. Giving up on the ghost of consciousness, and focusing scientific endeavor on the machinery of our brains instead, could be an essential step toward a better understanding of the human mind.
Is consciousness special?
Our experience of consciousness places us firmly in the driver’s seat, with a sense that we’re in control of our psychological world. But seen from an objective perspective, it’s not at all clear that this is how consciousness functions, and there’s still much debate about the fundamental nature of consciousness itself.
One reason for this is that many of us, including scientists, have adopted a dualist position on the nature of consciousness. Dualism is a philosophical view that draws a distinction between the mind and the body. Even though consciousness is generated by the brain – a part of the body – dualism claims that the mind is distinct from our physical features, and that consciousness cannot be understood through the study of the physical brain alone.
It’s easy to see why we believe this to be the case. While every other process in the human body ticks and pulses away without our oversight, there is something uniquely transcendental about our experience of consciousness. It’s no surprise that we’ve treated consciousness as something special, distinct from the automatic systems that keep us breathing and digesting.
But a growing body of evidence from the field of cognitive neuroscience – which studies the biological processes underpinning cognition – challenges this view. Such studies draw attention to the fact that many psychological functions are generated and carried out entirely outside of our subjective awareness, by a range of fast, efficient non-conscious brain systems.
Consider, for example, how effortlessly we regain consciousness each morning after losing it the night before, or how, with no deliberate effort, we instantly recognize and understand shapes, colors, patterns and faces we encounter.
Consider that we don’t actually experience how our perceptions are created, how our thoughts and sentences are produced, how we recall our memories or how we control our muscles to walk and our tongues to talk. Simply put, we don’t generate or control our thoughts, feelings or actions – we just seem to become aware of them.
Becoming aware
The way we simply become aware of thoughts, feelings and the world around us suggests that our consciousness is generated and controlled backstage, by brain systems that we remain unaware of.
Our recent paper argues that consciousness involves no separate independent psychological process distinct from the brain itself, just as there’s no additional function to digestion that exists separately from the physical workings of the gut.
While it’s clear that both the experience and the content of consciousness are real, we argue that, from the standpoint of scientific explanation, they are epiphenomenal: secondary phenomena based on the machinations of the physical brain itself. In other words, our subjective experience of consciousness is real, but the functions of control and ownership we attribute to that experience are not.
To better align psychology with the rest of the natural sciences, and to be consistent with how we understand and study processes like digestion and respiration, we favor a perspective change. We should redirect our efforts to studying the non-conscious brain, and not the functions previously attributed to consciousness.
This doesn’t of course exclude psychological investigation into the nature, origins and distribution of the belief in consciousness. But it does mean refocusing academic efforts on what happens beneath our awareness – where we argue the real neuropsychological processes take place.
Our proposal feels personally and emotionally unsatisfying, but we believe it provides a future framework for the investigation of the human mind – one that looks at the brain’s physical machinery rather than the ghost that we’ve traditionally called consciousness. ~
https://neurosciencenews.com/consciousness-ghost-in-machine-18566/amp/?fbclid=IwAR1TmoNkiEtd6aVs31HSxT6Ax1Io8kOKmmSZJ3fhFjm1gP9EsKL8DOoIdAM
Mary:
We experience consciousness subjectively as embodied; all awareness is seated in the body itself. I can’t imagine a disembodied consciousness. How could such a "ghost" of consciousness exist? What would it be conscious of, without senses, without any input from physical existence as a body in space and time? All "ideas" are rooted in the body and physical experience, as language is rooted in sound and contains within itself the phenomenon of time, time as the experience of one syllable, one breath, followed by another. The body is both the anchor and the measure of thought; without the electrochemistry of the brain's synapses there is no thought. The self cannot exist separately from the body, because the body is its foundation. Without nerves and synapses functioning, there is no thought, no sensation, no self.
Perhaps that "unconscious" we talk about is the linkage and functioning of those same physical structures and processes, going on below our awareness, or at least below our attention. The neurological activity of the brain, those synapses, doesn't need our attention to keep firing. We have all had the experience of a problem sitting around without a solution, like an undercurrent, barely perceptible, for quite a while, then suddenly rising into full consciousness, solved, resolved, like a sudden unexpected gift. That is the working of those neurons: firing, connecting and disconnecting, chewing on the problem, mulling it over, until the resolution comes floating up to the surface — not magical or miraculous, but part of the action of those physical cells, which have been shuffling it around the web of neurons for some time.
I think a disembodied soul is an oxymoron.
*
PLATO: “THE INCURABLE WICKEDNESS OF MAN”
~ In Plato’s Laws the soul must be duly honored as the most divine element of man’s nature.
Mendicant priests who offered for any kind of fee to intercede with the gods to win the favor of Heaven, and to bring up the dead from Hades, would be imprisoned during life. Never again would they hold intercourse with their fellows, and when they died their bodies would be cast beyond the borders without burial.
Such were the conclusions in the mature writings of this pagan Greek philosopher who provided a philosophical rationale for Western religions that did not have one before, and have not found a better one since, and whose hatred of mankind was a match for his own.
“It is the incurable wickedness of man that makes the work of the legislator a sad necessity,” declared Plato.
For the incurable wickedness of the legislator he gave no efficacious remedy.
~ Joseph Heller "Picture This"
Plato by Raphael
*
When war breaks out people say: 'It won't last, it's too stupid.' And war is certainly too stupid, but that doesn't prevent it from lasting. ~ Albert Camus
*
RADIATIVE COOLING: HOW TO USE LESS AC
~ They look like mirrors: 32 rectangles neatly arranged in eight rows on the rooftop of a supermarket called Grocery Outlet in Stockton, California. Shimmering beneath a bright sky, at first glance they could be solar panels, but the job of this rig is quite different. It keeps the store from overheating.
Tilted toward the sun, the panels absorb almost none of the warmth beating down on them; they even launch some into space, improving the performance of the systems that keep things inside cold. The feat relies on a phenomenon called radiative cooling: Everything on Earth emits heat in the form of invisible infrared rays that rise skyward. At night, in the absence of mercury-raising daylight, this can chill something enough to produce ice. When your car’s windshield frosts over, even if the thermometer hasn’t dipped below freezing? That’s radiative cooling in action.
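[A standard back-of-the-envelope, ours rather than the article’s: by the Stefan–Boltzmann law, an ideal surface at about 300 K (a warm roof) radiates σT⁴ ≈ 460 watts per square meter, where σ = 5.67 × 10⁻⁸ W/m²K⁴. A clear sky radiates back at a lower effective temperature, so the net loss, once real emissivities and the atmosphere are accounted for, typically comes to tens of watts per square meter. That net outflow is the “cooling” such a panel harvests, provided it also reflects sunlight instead of absorbing it.]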
To Aaswath Raman, who was a key mind behind Grocery Outlet’s shiny tiles, that effect seemed like an opportunity. “Your skin, your roof, the ground, all of them are cooling by sending their heat up to the sky,” he says.
Raman, a materials science and engineering professor at the University of California at Los Angeles, is the co-founder of SkyCool Systems, a startup trying to flip the script on the technology we depend on to create chill. As the world warms, demand for air conditioning and refrigeration is going up. But these systems themselves expel a tremendous amount of heat, and the chemical compounds they use can escape skyward, where they act as planet-warming greenhouse gases. According to the Birmingham Energy Institute in the UK, these substances and the power involved accounted for at least 11 percent of global greenhouse gas emissions in 2018.
By 2050, more than 4.5 billion air conditioners and 1.6 billion refrigerators are projected to consume nearly 40 percent of all electricity. If it goes mainstream, SkyCool’s tech—and similar approaches in the works from competitors and other researchers—could slow the cycle by naturally lowering building temperatures and easing the energy burden on conventional methods.
After Grocery Outlet put the panels on the roof of the 25,000-square-foot building in late 2019, energy use by the store’s refrigeration system dropped by 15 percent. That amounts to almost $6,000 in savings per year.
It’s hard to say if the installation has grabbed the infrastructural upgrade brass ring and paid for itself. Lime Energy, a national retrofitter specializing in upgrades to boost efficiency, financed the supermarket’s setup costs, which made the panels affordable. To work on a massive scale, though, radiative cooling needs to be cheap to manufacture and install. Make that happen, and it could be one way to conserve power and reduce emissions. “I was somewhat skeptical that you could gain this significant amount of cooling even under direct sun,” says Chris Atkinson, a former program director of the Advanced Research Projects Agency–Energy (ARPA-E), a division of the US Department of Energy (DOE) that funded Raman’s early research. “But once it was explained to me, it sounded plausible—and the results are remarkably compelling.”
BECAUSE OF THE PROMISE of radiative cooling, other startups have rushed into the field. Engineers from the University of Colorado, Boulder and the University of Wyoming teamed up to create their own film-like material in 2017. Engineers at the University at Buffalo published research in February 2021 on their own version: two mirrors composed of 10 thin layers of silver and silicon dioxide. They’re now trying to bring it to market through their company, Sunny Clean Water.
The big question is how likely people are to implement a brand-new product. “The technology makes sense,” says Jeremy Munday, a University of California, Davis professor who studies clean-energy innovations. “It really comes down to things like the market, the cost, and then just having the motivation to adopt it.”
Raman and Goldstein, his co-founder at SkyCool, aren’t disclosing their pricing, but they admit that SkyCool’s future challenges will be on the manufacturing—not the scientific—side of things. A 2015 study by the Pacific Northwest National Laboratory, part of the DOE, estimated that if rooftop materials like SkyCool’s could be built and installed for less than $6.25 a square meter, the costs would be covered by energy savings over five years.
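[A quick check of that threshold, our arithmetic rather than the study’s: recovering an installed cost of $6.25 per square meter over five years requires about $6.25 / 5 = $1.25 per square meter per year in energy savings. If we assume commercial electricity at roughly 10 cents per kilowatt-hour (our assumption, not a figure from the article), that corresponds to avoiding about 12.5 kilowatt-hours of cooling energy per square meter per year, a modest target in a sunny climate.]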
The pair think they can hit a worthwhile price inside three years, in part because they’ve further refined the film they originally tested at Stanford. These days, the precise makeup is proprietary, although it still contains a mix of polymers and inorganic materials. “We’ve figured out ways to do it that are lower cost and better suited to manufacturing,” Raman says.
With the help of a $3.5 million federal energy grant, SkyCool soon hopes to have the sort of connections that could make its film cost-effective. The startup is collaborating with the 3M Company to devise an affordable means of making hundreds of thousands of its films. The goal is to drive down the price enough by 2023 that customers with persistent cooling needs can recoup installation costs in three to five years.
On top of those challenges, other researchers say they can get the same result with paint. The white version tried decades ago didn’t reflect enough rays to create a cooling effect. In 2020, though, Purdue University engineers created an ultrawhite variety that works like SkyCool’s mirrorlike material. According to Xiulin Ruan, a professor of mechanical engineering involved in its development, the product reflects 98.1 percent of sunlight and radiates infrared at the right wavelength to escape into space—cooling buildings midday to 8°F below the ambient temperature.
For now, SkyCool is trying to win over more businesses. Soon it plans to deploy its panels in office buildings to augment commercial AC. In March, a big-box retailer in Southern California became the latest customer. On the roof, five full rows of the creaseless, mirrorlike films sit between two columns of solar panels—a fitting juxtaposition, considering Raman’s prior interests. Now he wants to cut energy, not produce it.
“All you have to do,” he says, “is put the material outside, and it stays cool.” ~
https://www.popsci.com/science/modern-air-conditioning-obsolete/?utm_source=pocket-newtab
SkyCool mirror panels on roof of a big store
From another source:
~ Using light colors to reflect heat and keep thermal mass cool in hot climates is nothing new. Villas in Spain and Greece are perfect examples of that, where white paint has been the main strategy for natural cooling of homes and buildings for centuries.
What is new about this [heat-reflective paint] is that it is a double-layered paint: it has heat-reflective qualities, but it is not limited to white.
This paint consists of a top layer that offers options in color and an underlayer that reflects near-to-short infrared wavelengths, reducing the surface temperature of the wall. The intended use is not limited to homes and commercial buildings such as data centers, which have enormous cooling loads; it can also be used for electric vehicles, and there the implications are significant. By reducing the need for cabin cooling, the paint can extend the range of EVs, making them an even stronger alternative to gas-powered cars.
While highly reflective surfaces can have a notable effect on reducing energy consumption, the downside is that they can cause eye damage. It was this reality that led the team at Columbia University to develop a double-layered paint with a thickness of half a millimeter. In addition to reducing the glare, this can offer the same energy savings from heat reflection without being limited to white.
This ultra-thin layer contains interconnected micro-pores and nano-pores which reflect infrared light, a type of electromagnetic wave that transfers heat.
The top layer of the paint contains colorants and TiO2 (titanium dioxide), which is already being used to add opacity to paint. When compared to traditional single-layer paints, the results showed higher reflectance values for each color.
“The top layer absorbs appropriate visible wavelengths to show specific colors, while the underlayer maximizes the reflection of near-to-short wavelength infrared light to reduce solar heating,” said Dr Chen.
“Consequently, the bilayer attains higher reflectance compared with commercial paint monolayers of the same color and stays cooler by as much as 3 to 15.6 degrees Celsius (up to 28°F) under strong sunlight.” ~
https://www.ecohome.net/news/1501/reflective-paint-can-reduce-energy-consumption-lower-cooling-bills/
*
HOW THE GOVERNMENT SUBSIDIZES “GOOD NEIGHBORHOODS”
~ The housing boom of the postwar period always pretended to be an expression of the pioneering spirit of hardworking individualists. Yet, it was principally the consequence of the most significant welfare program in American history, one that was targeted with unerring precision at the hearts and wallets of the white middle class.
In the postwar Dream period, federally funded financial and tax incentives (including changes in mortgage regulation, G.I. Bill provisions, allowances for the deduction of mortgage interest and property taxes, and omission of imputed rents) accounted for 40% to 45% of the increase in the homeownership rate. That comes before considering the contribution of the highway expansion program, in which the federal government covered 90% of the cost of making it possible for white people to live with other white people in otherwise remote locations.
Beginning around 1980, however, the great real estate welfare program shifted down and then quietly, without really drawing much attention to the fact, slipped into reverse gear. In its present form, the real estate system in America today transfers wealth from poor to rich, from young to old, from black to white, and from the future to the past.
If one mentions the words “government” and “housing” in polite conversation today (generally not a good idea), the words “taxes” and “poor people” are likely to make an appearance in short order. Yes, we want to help, the response will come, but aren’t we doing enough already? The funny thing is that government works much harder to help rich people make money off their homes than to help poor people find shelter.
According to the Center on Budget and Policy Priorities, the federal government spent $190 billion per year on housing assistance of various sorts, as of 2015. But 60% of this corresponds to the mortgage interest deduction, which benefits only the 7 million or so households with more than $100,000 in income. The bigger the house, the more they get. A further $32 billion per year goes to covering the exclusion of capital gains taxes on inherited homes. An uncounted additional amount pays for the capital gains exclusion on home sales. All but a few rounding errors land in the pockets of the 9.9%—every year.
Another piece of the affordable housing budget goes into housing vouchers. But these vouchers are often all but impossible to redeem outside the same, underserved neighborhoods where recipients already live, and so they, too, have the effect of concentrating disadvantage.
Homeownership, it has long been said, has the marvelous effect of making people put down roots in their communities. The evidence shows that homeowners’ first priority, as they joyfully plant themselves at the center of a community of fully realized fellow citizens, is to pour as much concrete as possible over those roots. The way to keep values up is to prevent other people from moving in.
The charms of the not-in-my-backyard movement, or NIMBYism, are sometimes thought to be universal to the human condition. In fact, they grow in power and impact with rising inequality. They grow with inequality both because there is more money at stake and, more importantly, because local power is a function of local money. According to Brookings, the deployment of zoning and land-use regulations to curb growth has risen in tandem with inequality. In a study of 95 metropolitan areas, researchers found that areas with high concentrations of wealth have more restrictive land-use regulations.
Why exactly does the sun appear to shine brighter in some neighborhoods than others? For starters, they’re usually located near a giant cash machine: the local tech monopoly or financial oligopoly. More optimistically, we could say that in the modern economy, high economic productivity happens in clusters where a certain density of know-how, networks, and human interaction yields high levels of economic activity and innovation. Thus, the wealthy neighborhoods tend to fall on transportation corridors or within easy reach of the leading urban centers.
Then they work to keep density low and squeeze every undesirable person out to some other location. That’s why the not-so-good-neighborhoods, almost by definition, are the ones that force their residents into long commutes, which are associated with increased stress levels, health problems, and likelihood of divorce. According to Harvard economist Nathaniel Hendren, commuting time is a better predictor of social mobility than education quality, family structure, and local crime rates.
The other big thing that the good neighborhoods have going for them is their schools. The dismantling and re-privatization of America’s system of public education has been happening one privileged neighborhood at a time. Ten of the top thirteen ranked public elementary schools in California on niche.com are located in the Palo Alto Unified School District, as are two of the top four public high schools in the state. They are free and open to the public. All the public has to do is buy a home in a neighborhood where the median home value was $2.8 million in 2020.
The effects of living in underserved areas are so well known, and so dismal, that they hardly need to be stated. Research consistently shows that bad neighborhoods really are bad for children, above all. In one particularly telling study, children from a randomly selected group of families who moved to wealthier neighborhoods were more likely to go to college, get married, have higher incomes, and live in higher-income neighborhoods themselves than the peers they left behind.
The geographical concentration of wealth brings many additional advantages for the lucky few beyond better schools and commutes. It supplies residents with social capital, in the form of networks that can help deliver valuable internships for the kids and open up business opportunities with prospective clients and employers. It delivers better security, nicer parks, and other public amenities.
The outcome of the process is visible in skylines and landscapes across the country. In leading urban centers like Manhattan, luxury apartments up in the stratosphere sit empty even as the population migrates away in search of more affordable housing. Ryan Avent, a columnist for The Economist, aptly describes the process as a “flight to stagnation.”
As economist Enrico Moretti and others have pointed out, this is bad news for the economy. Workers are fleeing the areas where they can be most productive and moving into the land of permanently lowered wages, all because the rent is too damn high. In the land of the 9.9%, we like to pretend that every neighborhood has a chance to become a good neighborhood. The reality is that our neighborhoods are so good precisely because the other neighborhoods are not. ~
https://www.fastcompany.com/90687984/the-dark-truth-behind-what-makes-a-good-neighborhood?utm_source=pocket-newtab
*
A NEW LOOK AT THE QUR’AN (to be released May 3, 2022 by Bombardier Books; 560 pages)
~ [We now have] Robert Spencer's new book, "The Critical Qur'an: Explained from Key Islamic Commentaries and Contemporary Historical Research." "The Critical Qur'an" is an essential book that every thinking person would benefit from reading. About one in four humans is a Muslim. Given child marriage, polygyny, and women's low status, Muslims have high fertility rates, and the percentage of the world's population that is Muslim is predicted to increase until Islam becomes the world's largest religion around 2075.
While it is true that the Qur'an is often not read or understood by most Muslims, Muslims do revere the Qur'an. Muslims may have little idea what the book contains, but they are ready to kill over it. When, in 2005, Newsweek circulated false rumors that Americans were flushing Qur'ans down toilets – which is of course impossible – at least seventeen people were killed in ensuing violence and "a council of more than 300 mullahs … threatened to declare holy war."
Islam teaches that the Qur'an was never written by anyone. It is uncreated. Like God himself, the Qur'an has always existed and will always exist. There are numerous rules for handling the Qur'an. Kufar – non-Muslims – should never touch the Qur'an in Arabic, but may touch "interpretations" in other languages. One must say "interpretation" because the Qur'an exists only in Arabic, the language of Allah. Muslims must perform ablutions before reading the Qur'an. The Qur'an must be stored in a specially designated place, and never be put on the floor or taken into a bathroom.
To say that the Qur'an was created, as opposed to eternally existing, is a death-penalty offense. Even Western scholars have hesitated to explore the Qur'an's origins; the scholar Christoph Luxenberg, for example, must hide behind a pseudonym to protect his life. The Qur'an "leaves no room for dispute" (see also Qur'an 33:36). Indeed, the Qur'an suggests that even a second of doubt will lead to an eternity in hell (e.g., 49:15). Thus, rather than debating or discussing the meaning of the Qur'an, Islam places emphasis on memorization. A Muslim once said to Robert Spencer that he had memorized the entire Qur'an, and one day he was going to find out what it says. The hafiz, or Qur'an memorizer, did not speak Arabic and had no idea of the meaning of the sounds he had memorized.
Mohammed Hijab, an Islamic apologist, demonstrated Muslim beliefs about the magic powers of the Qur'an in a November 10, 2021, YouTube discussion with Dr. Jordan Peterson. Hijab began to recite in Arabic, in the voice prescribed for reading the Qur'an. That prescribed voice is a singsong, nasal drone, with drawn out vowels. Peterson asked what Hijab's point was. Why recite Arabic to me, a non-Arabic speaker? Hijab said, "We believe that the Qur'an has divine qualities itself. We believe it is a physical cure.” Ibn Kathir, an important exegete, claimed that recitation of Sura 2 causes Satan to fart. It can be argued that Islam treats the Qur'an as if it were a "divine, conscious agent."
Muslim history claims that Islam was founded by an orphaned, illiterate, seventh-century Meccan caravan driver named Muhammad who was visited by the angel Jibril (from the Biblical Gabriel) who ordered him to recite. Muhammad's followers wrote down his recitations and compiled them into the Qur'an. Textual criticism suggests that the Qur'an is a compilation of heavily edited, pre-existing material.
Muslims express [highest] praise for the Qur'an. For example, Ibn Kathir said, "The Arabic language is the most eloquent, plain, deep and expressive of the meanings that might arise in one's mind. Therefore, the most honorable Book, was revealed in the most honorable language, to the most honorable Prophet and Messenger, delivered by the most honorable angel, in the most honorable land on earth, and its revelation started during the most honorable month of the year, Ramadan. Therefore, the Qur'an is perfect in every respect."
Scholar Gerd Puin estimates that twenty percent of the Qur'an is unclear to anyone. This lack of clarity is due in part to words, often of non-Arabic derivation, like "jibt," "sijill," "ghislin," "abb," "as-sakhkhah," "sijjin," "illiyyin," "tasnim," "saqar," and many others, whose meanings are uncertain. The Qur'an acknowledges its own lack of clarity in 3:7, in which Allah states that he alone knows the meaning of some verses. Which verses? He never says. Readers can only guess which verses they are understanding correctly and which verses have meanings beyond their grasp.
Jihad is one of the main themes of the Qur'an. The Qur'an makes abundantly clear that jihad is warfare for the sake of expanding Islam's worldly power, not an interior struggle to, say, remain on a diet, a message promoted by a 2013 CAIR public relations campaign. The Qur'an says, multiple times, that believers should strike the necks of kufar, kill them wherever the Muslims find them, etc.
Spencer's new "Critical Qur'an" doesn't offer only an accurate and accessible translation. It offers commentary by canonical Islamic experts, including Ibn Kathir, a fourteenth-century exegete, and Syed Abul Ala Maududi, a twentieth-century author. Thus, the reader knows not just what the Qur'an says, but how influential Muslims understand it. Spencer's footnotes also draw the reader's attention to variations in the Qur'an. These variations are of the utmost importance, as it is a tenet of Islam that the Qur'an is a perfect, eternal, unchanging and unchanged document that exists in Heaven. Variations in the text [contradict] this tenet.
Spencer's footnotes describe Islamic traditions designed to justify changes in the Qur'an, a book that Islam teaches is perfect, unchanging, and unchangeable. Again, one current theory is that the Qur'an was not written as one document, the product of one man, Muhammad. Rather, many scholars now think that the Qur'an was pieced together from pre-existing materials, materials that were then heavily edited to meet the needs of Arab conquerors. These changes occurred over time. Some early Muslims might have witnessed, and questioned, such changes. Traditions were invented to explain away the changes. For example, Muhammad's child bride Aisha is made to say that sheep ate some Qur'an verses that previously existed but then went missing. ~
Oriana:
What you read here is far from the full, lengthy text of the original review. These are brief excerpts from an extensive post by Danusha Goska, author of God through Binoculars: A Hitchhiker at a Monastery, and other books. Her review was published on Facebook and in Front Page Magazine. Robert Spencer’s The Critical Qur’an is scheduled to be released on May 3, 2022.
*
I’ve tried to find something that would balance this — or at least be a different and interesting perspective on the Qur’an. Harold Bloom, who included the Qur’an in his book titled “Genius,” says this:
“Sometimes I reflect that the baffling arrangement (or lack of it) of the Koran actually enhances Mohammad’s eloquence; the eradication of context, narrative, the formal unity forces the reader to concentrate upon the immediate, overwhelming authority of the voice, which, however molded by the Messenger’s lips, has a massive, persuasive authority to it, recalling but expanding upon the direct speeches of God in the Bible.” (Harold Bloom, Genius, p. 146)
“In my own experience as a reader of literature, the Koran rarely makes a biblical impression upon me, particularly of an esthetic sort. Sometimes, as I immerse myself in reading the Koran, I am reminded of William Blake or of Walt Whitman; at other moments, I think of Dante, who would have found the association blasphemous. Partly the analogues are suggested by the personal authority of the seer’s voice, which is what we hear incessantly in the Koran.” (Harold Bloom, Genius, p. 153)
By the way, the Koran (now mainly referred to as the Qur’an) is written in classical Arabic. Responders on Quora differ on whether the majority of Muslims can read and understand classical Arabic.
I once had an Arab classmate in a class on psycholinguistics. As part of his presentation on the special features of the Arabic language, he said he’d read a sura from the Koran. What followed was a chant. This particular man did not have a good voice, but we could imagine how beautiful the chant might sound if performed by a trained singer.
(For those seriously interested in the question of the Qur’an as literature, please click on https://podcasts.ox.ac.uk/quran-literature )
*
THE HIGH COST OF RELIGION TO HUMANITY: KEN DANIELS “I REGRET”
~ I regret having used up the best years of my youth pursuing religious goals. In retrospect I would have preferred a career seeking a vaccine for malaria, which kills a person every 30 seconds in sub-Saharan Africa alone.
I am concerned by the enormous diversion of time, energy and financial resources used to maintain and propagate religion. These activities too often take priority over believers’ concrete charitable contributions to society.
I am concerned by the lack of care for the future of our planet on the part of many of the millions of believers who expect Jesus’ imminent return.
It grieves me to witness bright, promising young men and women distracted by the study of fundamentalist theology, or by the prospect of traveling the world to convert people from one empirically unverifiable form of supernaturalism to another. ~
also by Ken Daniels:
~ While it is unrealistic to expect a large percentage of Muslims to abandon their faith, most of us can agree that the world would be a better place if Muslim fundamentalists moderated their rigid commitment to every precept of the Qur’an as the divine word of Allah, especially those that call for the destruction of infidels and apostates. Likewise, the world would be a better place if fundamentalist Christians could frankly acknowledge the good, the bad, and the ugly in their own scriptural tradition, whether or not they end up abandoning their faith outright. ~
Oriana:
And this doesn’t even mention religious wars. I find this blind spot common in the US: probably owing to ignorance of history, people seem unfamiliar with the darkest side of religion. On the other hand, 9-11 opened the eyes of many to the horrors that religious fanaticism can lead to.
*
TUBERCULOSIS AND OTHER DISREGARDED PANDEMICS
~ When hundreds of members of the American Legion fell ill after their annual meeting in Philadelphia in 1976, the efforts of epidemiologists from the Centers for Disease Control to explain the spread of this mysterious disease and its newly discovered bacterial agent, Legionella, occupied front-page headlines. In the years since, however, as the 1976 incident faded from memory, Legionella infections have become everyday objects of medical care, even though incidence in the U.S. has grown ninefold since 2000, tracing a line of exponential growth that looks a lot like COVID-19’s on a longer time scale. Yet few among us pause in our daily lives to consider whether we are living through the slowly ascending limb of a Legionella epidemic.
Nor do most people living in the United States stop to consider the ravages of tuberculosis as a pandemic, even though an estimated 10 million new cases of tuberculosis were reported around the globe in 2018, and an estimated 1.5 million people died from the disease. The disease seems to receive attention only in relation to newer scourges: in the late twentieth century TB co-infection became a leading cause of death in the emerging HIV/AIDS pandemic, while in the past few months TB co-infection has been invoked as a rising cause of mortality in the COVID-19 pandemic. Amidst these stories it is easy to miss that on its own, tuberculosis has been and continues to be the leading cause of death worldwide from a single infectious agent. And even though tuberculosis is not an active concern of middle-class Americans, it is still not a thing of the past even in this country. More than 9,000 cases of tuberculosis were reported in the United States in 2018—overwhelmingly affecting racial and ethnic minority populations—but they rarely made the news.
These features of the social lives of epidemics—how they live on even when they seem, to some, to have disappeared—show them to be not just natural phenomena but also narrative ones: deeply shaped by the stories we tell about their beginnings, their middles, their ends. At their best, epidemic endings are a form of relief for the mainstream “we” that can pick up the pieces and reconstitute a normal life. At their worst, epidemic endings are a form of collective amnesia, transmuting the disease that remains into merely someone else’s problem.
Epidemics carry within them their own tempos and rhythms: the slow initial growth, the explosive upward limb of the outbreak, the slowing of transmission that marks the peak, plateau, and the downward limb. This falling action is perhaps best thought of as asymptotic: rarely disappearing, but rather fading to the point where signal is lost in the noise of the new normal—and even allowed to be forgotten.
Recent history tells us a lot about how epidemics unfold, how outbreaks spread, and how they are controlled. We also know a good deal about beginnings—those first cases of pneumonia in Guangdong marking the SARS outbreak of 2002–3, the earliest instances of influenza in Veracruz leading to the H1N1 influenza pandemic of 2009–10, the outbreak of hemorrhagic fever in Guinea sparking the Ebola pandemic of 2014–16. But these stories of rising action and a dramatic denouement only get us so far in coming to terms with the global crisis of COVID-19. The coronavirus pandemic has blown past many efforts at containment, snapped the reins of case detection and surveillance across the world, and saturated all inhabited continents. To understand possible endings for this epidemic, we must look elsewhere than the neat pattern of beginning and end—and reconsider what we mean by the talk of “ending” epidemics to begin with.
And yet, like World War One, with which its history was so closely intertwined, the influenza pandemic of 1918–19 appeared at first to have a singular ending. In individual cities the epidemic often produced dramatic spikes and falls in equally rapid tempo. In Philadelphia, as John Barry notes in The Great Influenza (2004), after an explosive and deadly rise in October 1918 that peaked at 4,597 deaths in a single week, cases suddenly dropped so precipitously that the public gathering ban could be lifted before the month was over, with almost no new cases in the following weeks. A phenomenon whose destructive potential was limited by material laws, “the virus burned through available fuel, then it quickly faded away.”
As Barry reminds us, however, scholars have since learned to differentiate at least three different sequences of epidemics within the broader pandemic. The first wave blazed through military installations in the spring of 1918, the second wave caused the devastating mortality spikes in the summer and fall of 1918, and the third wave began in December 1918 and lingered long through the summer of 1919. Some cities, like San Francisco, passed through the first and second waves relatively unscathed only to be devastated by the third wave. Nor was it clear to those still alive in 1919 that the pandemic was over after the third wave receded. Even as late as 1922, a bad flu season in Washington State merited a response from public health officials to enforce absolute quarantine as they had during 1918–19. It is difficult, looking back, to say exactly when this prototypical pandemic of the twentieth century was really over.
In Kano, Nigeria, a ban on polio vaccination between 2000 and 2004 resulted in a new national polio epidemic that soon spread to neighboring countries. As late as December 2019 polio outbreaks were still reported in fifteen African countries, including Angola and the Democratic Republic of the Congo. Nor is it clear that polio can fully be regarded as an epidemic at this point: while polio epidemics are now a thing of the past for Europe, the Americas, Australia, and East Asia, the disease is still endemic to parts of Africa and South Asia. A disease once universally epidemic is now locally endemic: this, too, is another way that epidemics end.
From a strictly biological perspective, the AIDS epidemic has never ended; the virus continues to spread devastation through the world, infecting 1.7 million people and claiming an estimated 770,000 lives in the year 2018 alone. But HIV is not generally described these days with the same urgency and fear that accompanied the newly defined AIDS epidemic in the early 1980s. Like coronavirus today, AIDS at that time was a rapidly spreading and unknown emerging threat, splayed across newspaper headlines and magazine covers, claiming the lives of celebrities and ordinary citizens alike. Nearly forty years later it has largely become a chronic, endemic disease, at least in the Global North. Like diabetes, which claimed an estimated 4.9 million lives in 2019, HIV/AIDS became a manageable condition—if one had access to the right medications. ~
https://bostonreview.net/articles/jeremy-greene-dora-vargha-how-epidemics-end-or-dont/
*
ALZHEIMER’S: “MICROGLIA BECOME KILLERS, NOT JUST JANITORS”
~ If scientists are to one day find a cure for Alzheimer's disease, they should look to the immune system.
Over the past couple decades, researchers have identified numerous genes involved in various immune system functions that may also contribute to Alzheimer’s.
Some of the prime suspects are genes that control humble little immune cells called microglia, now the focus of intense research in developing new Alzheimer's drugs.
Microglia are amoeba-like cells that scour the brain for injuries and invaders. They help clear dead or impaired brain cells and literally gobble up invading microbes. Without them, we'd be in trouble.
In a normal brain, a protein called beta-amyloid is cleared away through our lymphatic system by microglia as molecular junk.
But sometimes it builds up. Certain gene mutations are one culprit in this toxic accumulation. Traumatic brain injury is another, and, perhaps, impaired microglial function.
One thing everyone agrees on is that in people with Alzheimer's, too much amyloid accumulates between their brain cells and in the vessels that supply the brain with blood.
Once amyloid begins to clog networks of neurons, it triggers the accumulation of another protein, called tau, inside of these brain cells. The presence of tau sends microglia and other immune mechanisms into overdrive, resulting in the inflammatory immune response that many experts believe ultimately saps brain vitality in Alzheimer’s.
THE GENE SCENE
To date, nearly a dozen genes involved in immune and microglial function have been tied to Alzheimer’s.
The first was CD33, identified in 2008.
"When we got the results I literally ran to my colleague's office next door and said you gotta see this!" says Harvard neuroscientist Rudolph Tanzi.
Tanzi, who goes by Rudy, led the CD33 research. The discovery was quickly named a top medical breakthrough of 2008 by Time magazine.
Over time, research by Tanzi and his group revealed that CD33 is a kind of microglial on-off switch, activating the cells as part of an inflammatory pathway.
Microglia normally recognize molecular patterns associated with microbes and cellular damage as unwanted. This is how they know to take action – to devour unfamiliar pathogens and dead tissue. Tanzi believes microglia sense any sign of brain damage as an infection, which causes them to become hyperactive.
Much of our modern human immune system, he explains, evolved many hundreds of thousands of years ago. Our lifespans at the time were far shorter than they are today, and the majority of people didn't live long enough to develop dementia or the withered brain cells that come with it. So our immune system, he says, assumes any faulty brain tissue is due to a microbe, not dementia. Microglia react aggressively, clearing the area to prevent the spread of infection.
“They say, We better wipe out this part of the brain that's infected, even if it's not. They don't know,” quips Tanzi. "That's what causes neuroinflammation. And CD33 turns this response on. The microglia become killers, not just janitors.”
A BRAKE ON THE OVERACTIVE GLIA
If CD33 is the yin, a gene called TREM2 is the yang.
Discovered a few years after CD33, TREM2 reins in microglial activation, returning the cells to their role as cellular housekeepers.
Neurologist David Holtzman of Washington University in St. Louis, who studies TREM2, agrees that where you find amyloid, tau or dead brain cells, there are microglia, raring to go and ready to scavenge.
"I think at first a lot of people thought these cells were reacting to Alzheimer's pathology, and not necessarily a cause of the disease," he says.
It was the discovery of TREM2 on the heels of CD33 that really shifted the thinking, in part because it produces a protein that, in the brain, is found only in microglia. (Genes are stretches of DNA that code for the proteins that literally run our bodies and brains.)
"Many of us [in the field] immediately said 'Look, there's now a risk factor that is only expressed in microglia. So it must be that innate immune cells are important in some way in the pathogenesis of the disease," he adds.
Holtzman sees microglial activation in impending dementia as a double-edged sword. In the beginning, microglia clear unwanted amyloid to maintain brain health. But once accumulated amyloid and tau have done enough damage, the neuroinflammation that comes with microglial activation does more harm than good. Neurons die en masse and dementia sets in.
Not all researchers are convinced.
Serge Rivest is a professor in the Department of Molecular Medicine at the Laval University Medical School in Quebec. Based on his lab's research, he believes that while impaired immune activity is involved in Alzheimer's, it's not the root cause. "I don't think it's the immune cells that do the damage, I still think it's the beta-amyloid itself," he says. "In my lab, in mouse studies, we've never found that immune cells were directly responsible for killing neurons."
He does believe that in some Alzheimer's patients microglia may not be able to handle the excess amyloid that accumulates in the disease, and that developing treatments that improve the ability of microglia and the immune system to clear the protein could be effective.
MICROGLIAL MEDICINES
The biological cascade leading to Alzheimer's is a tangled one.
Gene variants influencing the accumulation and clearance of amyloid are likely a major contributor. But immune activity caused by early life infection might also be involved, at least in some cases. This infectious theory of Alzheimer's was first proposed by Tanzi's now-deceased colleague Robert Moir. Tanzi's group even has evidence that amyloid itself is antimicrobial, and evolved to protect us from pathogens, only to become a problem when overactive and aggregated.
And the same goes for microglia, cells whose over-ambition might cause much of the brain degeneration seen in Alzheimer’s.
In theory, if a treatment could, say, decrease CD33 activity, or increase that of TREM2, doctors might one day be able to slow or even stop the progression of dementia. Instead of going after amyloid itself – the mechanism behind so many failed investigational Alzheimer's drugs – a therapy that quells the immune response to amyloid might be the answer in treating dementia.
"There are number of scientists and companies trying to figure out how to influence genes like TREM2 and CD33, and to both decrease amyloid and act on the downstream consequences of the protein," says Holtzman. "All of this is to say that somewhere in the biology that causes Alzheimer's the immune system is involved.”
It seems that in many cases the most common form of dementia might be due to a well-intentioned immune cell going rogue.
"I think you'd hear this from basically any researcher worth their salt," says Tanzi. "I feel strongly that without microglial activation, you will not get Alzheimer's disease.” ~
https://www.npr.org/sections/health-shots/2022/01/30/1076166807/how-a-hyperactive-cell-in-the-brain-might-trigger-alzheimers-disease
Oriana:
Nothing here contradicts the theory that Alzheimer's is an autoimmune disorder initially caused by an infection with Herpes simplex (or another pathogen).
It has already been shown, at least in animal studies, that immunosuppressive drugs such as rapamycin can significantly delay the onset of Alzheimer's.
*
HIGHER BRAIN GLUCOSE MAY MEAN MORE SEVERE ALZHEIMER’S
~ Led by Madhav Thambisetty, M.D., Ph.D., researchers looked at brain tissue samples at autopsy from participants in the Baltimore Longitudinal Study of Aging (BLSA), one of the world’s longest-running scientific studies of human aging. The BLSA tracks neurological, physical and psychological data on participants over several decades.
Researchers measured glucose levels in different brain regions, some vulnerable to Alzheimer’s disease pathology, such as the frontal and temporal cortex, and some that are resistant, like the cerebellum. They analyzed three groups of BLSA participants: those with Alzheimer’s symptoms during life and with confirmed Alzheimer’s disease pathology (beta-amyloid protein plaques and neurofibrillary tangles) in the brain at death; healthy controls; and individuals without symptoms during life but with significant levels of Alzheimer’s pathology found in the brain post-mortem.
They found distinct abnormalities in glycolysis, the main process by which the brain breaks down glucose, with evidence linking the severity of the abnormalities to the severity of Alzheimer’s pathology. Lower rates of glycolysis and higher brain glucose levels correlated to more severe plaques and tangles found in the brains of people with the disease. More severe reductions in brain glycolysis were also related to the expression of symptoms of Alzheimer’s disease during life, such as problems with memory.
While similarities between diabetes and Alzheimer’s have long been suspected, they have been difficult to evaluate, since insulin is not needed for glucose to enter the brain or to get into neurons. The team tracked the brain’s usage of glucose by measuring ratios of the amino acids serine, glycine and alanine to glucose, allowing them to assess rates of the key steps of glycolysis. They found that the activities of enzymes controlling these key glycolysis steps were lower in Alzheimer’s cases compared to normal brain tissue samples. Furthermore, lower enzyme activity was associated with more severe Alzheimer’s pathology in the brain and the development of symptoms.
Next, they used proteomics – the large-scale measurement of cellular proteins – to tally levels of GLUT3, a glucose transporter protein, in neurons. They found that GLUT3 levels were lower in brains with Alzheimer’s pathology compared to normal brains, and that these levels were also connected to the severity of tangles and plaques. Finally, the team checked blood glucose levels in study participants years before they died, finding that greater increases in blood glucose levels correlated with greater brain glucose levels at death.
“These findings point to a novel mechanism that could be targeted in the development of new treatments to help the brain overcome glycolysis defects in Alzheimer’s disease,” said Thambisetty.
The researchers cautioned that it is not yet completely clear whether abnormalities in brain glucose metabolism are definitively linked to the severity of Alzheimer’s disease symptoms or the speed of disease progression. The next steps for Thambisetty and his team include studying abnormalities in other metabolic pathways linked to glycolysis to determine how they may relate to Alzheimer’s pathology in the brain. ~
https://www.nih.gov/news-events/news-releases/higher-brain-glucose-levels-may-mean-more-severe-alzheimers
Oriana:
Alzheimer’s has been called “Type III diabetes.” But a study back in 2013 showed that a person need not be diabetic to be at high risk of the disease. It’s enough to have above-normal blood sugar. The higher the blood glucose levels, the higher the risk. For those already diagnosed with Alzheimer’s, the higher the glucose, the more beta-amyloid plaque.
The keto diet, even if not taken to the extreme that would produce true ketosis, seems a promising preventive treatment. Fasting (it can be intermittent fasting -- simply skipping breakfast) seems especially effective at making the brain run on ketones rather than glucose.
Or don't even call it keto: it just needs to be a low-carbohydrate, moderate (or low) protein diet. (Protein needs to be restricted because any excess is converted to glucose; only fat cannot be converted to glucose.) So, an avocado a day? MCT oil, anyone? Maybe in combination with extra-virgin olive oil, which nurtures the microbiome, supremely important in brain function and immune function. (Come to think of it, there was a TED talk on the benefit of coconut oil in lessening symptoms of dementia.) (We might also think of lysine, a ketogenic amino acid.)
Meanwhile, given our growing understanding of how our microbiome controls our immune system, it would make sense to try to nourish our good bacteria with soluble fiber and fermented foods. Another answer may lie in switching from predominantly glucose-based metabolism to ketone-based energy production. The most efficient way to achieve this metabolic shift is fasting.

The interesting thing is that fasting need not be prolonged to produce significant benefits. Just a little fasting may do it. It can be as easy as skipping breakfast — or, if fasting doesn’t make you utterly miserable, fasting one day a week.
So, those are the benefits of intermittent fasting. And there’s berberine, the miraculous supplement that in most cases lowers blood sugar and improves the cholesterol profile. (High cholesterol levels raise the risk of dementia; statins lower that risk.)
In any case, “Let them eat cake” is a recipe for disaster, especially in older age.
*
ending on beauty:
I am out with lanterns
looking for myself.
~ Emily Dickinson
Image: Emil Nolde: The Sea, c. 1950