Saturday, December 4, 2021

THERE IS NO “REAL YOU”: WE MAKE UP FALSE MEMORIES; BLINDSIGHT AND CONSCIOUSNESS; ACHTUNG BABY; SEX HORMONES AND SCHIZOPHRENIA; GUT BACTERIA AND SCHIZOPHRENIA; THE HISTORY OF BREAKFAST

 

“A fire broke out backstage in a theater. The clown came out to warn the public; they thought it was a joke and applauded. He repeated it; the acclaim was even greater. I think that's just how the world will come to an end: to general applause from wits who believe it's a joke.” ~ Søren Kierkegaard, Either/Or, Part I

*
WE ARE PEOPLE WHO LOVE OUR MIRACLES

we are people who love our miracles,
believe in them,
fervent in praying for them, like,
in the middle of a drought, praying
that it won’t rain and ruin
the picnic planned
for the afternoon, and, through God’s loving grace,
it doesn’t
hallelujah, hallelujah
hosannas on the highest
the subject came up yesterday
at a men’s bible study group that meets here
at the restaurant every Thursday (usually
they meet in a separate room, but for reasons
unimportant to the poem, they met in the main room
near me this week) about fifteen or so
middle-aged to older men, and the skinny old priest
I see here often in the morning, whose call, apparently,
is to have free breakfast with boring people
a couple of times a week…
none of the men look particularly hard-up, though
I'm sure each has his own personal challenges like
we all do, but all apparently all are
21st century prosperous, business-types, mid-career
to retired, which made their discussion of charity
interesting, they apparently never having any need
for that kind of stuff, certain that the $30 a week or so
they put in the collection plate could take care
of all the world’s needs if the damn government
would just get out of the way…

but, beside the point…

one of the fellas
from down at the end of the table
made an interesting point
about miracles…
what if Jesus’ miracles never actually
happened? he asked,
what if that talk of miracles
was just a way to get people to pony up
some of their own resources
for a good cause, kind of like, lookee you,
at what Jesus did, surely you can help him
by dropping an extra couple of bucks into the plate
next Sunday, and, anyway, he said, those miracles weren’t
really such big things, like the loaves and fishes and water to wine
thing, big deal, he fed the multitudes for a day,
a real miracle, he said,
betraying his main street republican bottom-line good sense,
would have been if he had made a loaves and fishes, water-to-wine
machine that would feed the multitudes for years…
and I have to admit this comment led me to a whole new
understanding of miracles, reminding me
of the miracles of McCormick and his reaper, Morse
and his telegraph, Ford and his Model-T, Salk and his vaccine,
Jobs and his Apple…
all of a sudden the theory of miracles makes sense to me…

~ Allen Itz


Photo: Allen Itz

*

Normally I wouldn’t choose such a prosy, “unpoetic” poem,  but this one aligned so well with my beliefs and values that I couldn’t resist it.

Yes, Jesus cured this or that particular leper, but he never produced a cure for leprosy (and anyway, leprosy, blindness, etc. were all punishment for your sins). Think, by contrast, of Jonas Salk and his polio vaccine. Of course we can’t blame an ancient healer for not knowing about bacteria, viruses, and so forth . . . The point is not to depreciate ancient healers, some of whom had amazing intuition (e.g. Hippocrates: All disease starts in the gut), but to illuminate the collective benefits of modern “miracles”:

a real miracle, he said, . . . .
would have been if he had made a loaves and fishes, water-to-wine
machine that would feed the multitudes for years…
and I have to admit this comment led me to a whole new
understanding of miracles, reminding me
of the miracles of McCormick and his reaper, Morse
and his telegraph, Ford and his Model-T, Salk and his vaccine,
Jobs and his Apple…
all of a sudden the theory of miracles makes sense to me…

Absolutely! Add to this the improvements in public hygiene: flush toilets, sewers, garbage collection (rather than throwing trash into the street), doctors washing their hands, safe drinking water, and on and on. Yes, it took centuries, and the cumulative work of thousands of brave, visionary individuals and those who helped them in a variety of ways.

Of course there is nothing “supernatural” about the polio vaccine, and the part of us that’s fascinated with anything that seems supernatural (or at least inexplicable) is disappointed by natural, “materialist” explanations (though if you study them deeply enough, more mysteries open up). And yet, when we ponder the past, these could indeed be called miracles.

As with everything, there are both positive and negative sides, and someone else might focus on environmental destruction and other negative consequences of those modern miracles. No doubt, and yet who’d really want to live centuries ago? Give me modern miracles any time. 

*
THE CONTINUING APPEAL OF “PEANUTS”

~ ”Peanuts” was manifestly predicated on the constancy of things; the knowable, predictable, reliable status quo of characters and what we glean from them. Over the years, the cast changed almost not at all: the indefatigable, fantastical Snoopy, who never spoke, was a would-be author, and from him we learned to embrace and enjoy and celebrate the joys and mysteries of life; the perpetually wishy-washy, forlorn, to-be-sympathized-with, black and yellow zig-zag–shirted Charlie Brown, from whom we learn about perseverance in the face of adversity; the fussbudget, faux-psychiatrist Lucy, from whom we learn about independence, strength, and honesty—sometimes a brutal variety of it; philosopher-king Linus with his omnipresent blanket, from whom we learn about faith, hope, and spiritual peace; and across the whole rest of the gang, each with his or her signatures and hallmarks.

It’s not just that the characters themselves are always there, but also the themes, which so often have to do with trying, failing, and trying again; being hopeful, having those hopes dashed, and somehow reconstituting that hope. Often, that hope takes the form of the strip’s many manifestations of unrequited love, from Charlie Brown and the little red-headed girl to Lucy and Schroeder to Sally and Linus, and so on. But it also shows up each time Charlie Brown thinks that Lucy will, just once, let him kick that football, and each time Linus hopes—against all hope?—that the Great Pumpkin will finally appear in the patch.

And yet, consistency alone could not explain the enduring, universal appeal of “Peanuts.” Had it not evolved, it would doubtless have become arcane, anachronistic. But it did change, in subtle ways, and the themes that did not change continue to resonate because the strip continues to speak to what it is to be human: the anxieties and anger, the joys and loves of life, in our quotidian existence.

Schulz’s willingness to let the lives of his characters reflect the social changes around them also deepened admiration for “Peanuts.” When Franklin was introduced to the strip in 1968, after Schulz was made aware of how narrow the cartoon’s representation had been, the United States was going through a period of intense social turbulence over desegregation; shared spaces, even the shared panels of a comic strip, were still a matter of substantial controversy.

Despite criticisms of how Franklin was conceived and handled (those who call the addition tokenistic often note that he was treated with too much caution, given little depth and none of the flawed, human qualities of the other characters, a reflection of the judgments of the time), introducing Franklin was no small step for a nationally syndicated comic strip in the middle of race-related civil unrest. It can still be argued that such changes brought “Peanuts” a step closer to being a realistic and enduring symbol of American life.

“Peanuts” possesses and projects a critical, palpable, uncanny sense of humanity. It intuits and mirrors so much of what it is to be alone, small, and vulnerable. Human. There is a poetry to the strip, to the children, their problems, and the way that they try to work through them. And what was invented seven decades ago now speaks to us as strongly as ever.

When I asked Schulz about writing a book of his own, he said, as he always maintained, that anything that he had to say, he said through the strip. He died within 24 hours of the last day that the strip ran. He was it, and it was him. And he added, so poignantly, as the coda, of a kind, to the last strip: ”Charlie Brown, Snoopy, Linus, Lucy . . . how can I ever forget them. . . .” How can we, then, ever forget the world that he gave us? We can’t. We shan’t. We are in his and their debt for always. ~

https://lithub.com/the-spiritual-message-at-the-heart-of-peanuts/

Oriana:

My favorite statement here:

“Peanuts” possesses and projects a critical, palpable, uncanny sense of humanity. It intuits and mirrors so much of what it is to be alone, small, and vulnerable. Human.

*
SOCIAL DEVOLUTION



*
ACHTUNG BABY: HOW GERMAN PARENTS RAISE INDEPENDENT CHILDREN

~ In a memorable scene of Sara Zaske’s guide to German-style parenting, Achtung Baby: An American Mom on the German Art of Raising Self-Reliant Children, Zaske sends her 4-year-old daughter Sophia to her Berlin preschool with a bathing suit in her bag. It turns out, however, that the suit is unnecessary: All the tykes at Sophia’s Kita frolic in the water-play area naked. Later that year, Sophia and the rest of her Kita class take part in a gleefully parent-free sleepover. A sleepover! At school! For a 4-year-old! These two snapshots of life as a modern German child—uninhibited nudity; jaw-dropping independence—neatly encapsulate precisely why Zaske’s book is in equal turns exhilarating and devastating to an American parent.

Zaske argues that thanks in large part to the anti-authoritarian attitudes of the postwar generation (the so-called “68ers”), contemporary German parents give their children a great deal of freedom—to do dangerous stuff; to go places alone; to make their own mistakes, most of which involve nudity, fire, or both. This freedom makes those kids better, happier, and ultimately less prone to turn into miserable sociopaths. “The biggest lesson I learned in Germany,” she writes, “is that my children are not really mine. They belong first and foremost to themselves. I already knew this intellectually, but when I saw parents in Germany put this value into practice, I saw how differently I was acting.” 

Yes, Zaske notes, we here in the ostensible land of the free could learn a thing or zwei from our friends in Merkel-world. It’s breathtaking to rethink so many American parenting assumptions in light of another culture’s way of doing things. But it’s devastating to consider just how unlikely it is that we’ll ever adopt any of these delightful German habits on a societal level.

This is not just because Americans pride ourselves on eschewing the advice of outsiders, though that certainly doesn’t help. Our political and social institutions are so firmly entrenched that no amount of wise Germanic advice can help us. “We’ve created a culture of control,” Zaske laments. “In the name of safety and academic achievement, we have stripped kids of fundamental rights and freedoms: the freedom to move, to be alone for even a few minutes, to take risks, to play, to think for themselves.” It’s not just parents who are responsible for this, says Zaske, “it’s culture-wide,” from the “hours of homework” to the “intense” focus on competitive sports and extracurriculars; it’s also the “exaggerated media that makes it seem like a child can be abducted by a stranger at any time,” though stranger kidnappings in the United States are actually exceedingly rare. I mean, this is America, where the simple act of feeding an infant in public is enough to set off mommy warfare—allowing the entire nude body of a child in the out-of-doors is enough to warrant calling Child Protective Services or the cops.

Achtung Baby is organized in roughly chronological order, beginning with Zaske’s arrival, toddler in tow, in the midst of a frigid German January for the start of her husband’s job in a small town outside of Berlin. The family eventually settles in the dynamic, child-adoring German capital, and although Zaske isn’t working full time, new friends encourage her to enroll Sophia in Kita; when she does, her crash course in German parenting begins in earnest, moving to the advanced level with the German-style midwife birth of her second child, Ozzie.

The chapters progress through Sophia’s Einschulung (AYN-shool-oong, or start of school) and the family’s eventual repatriation to the States, each brimming with examples—both anecdotal and research-based—of why the German approach, focused on childhood independence, is more humane and respectful than the prevailing American bourgeois ethos of sequestered play dates and recess-bereft schooldays.

Zaske’s vignettes—and especially the research that backs them up—also exemplify everything that is maddening about this particular era in the American parenting milieu. As with Pamela Druckerman’s Bringing Up Bébé, much of this consternation stems from the dramatic disparities in government support for parents. I mean, we can’t even secure emergency insurance for terminally ill children, much less subsidies for preschool—which in Germany are, of course, standard and generous.

Although Zaske does end every chapter with well-meaning suggestions for how American parents and governments (ha) might deutsch-ify their approaches, the book’s many eye-popping (but fun-sounding) stories—solo foot commutes for second-graders; intentionally dangerous “adventure playgrounds”; school-sanctioned fire play; and my personal favorite, a children’s park that consists solely of an unattended marble slab and chisel—just remind me of all the reasons my American compatriots will double down on their own car-clown garbage lifestyles. I found myself frustrated into tears while reading Achtung Baby, because the adoption of any German customs stateside would require nothing less than a full armed revolution.

For example, when Sophia starts first grade, school administrators remind parents that under no circumstances should they drop children off in an automobile. Could you imagine? I can’t. In the contemporary United States, even in larger cities (with New York being the only notable exception), school is so synonymous with the interminable “drop-off line” that its vicissitudes are the subject of bestselling mom-book rants.

In open defiance of this custom, I ride my daughter the 4½ miles to preschool on a bike—she gets pulled along the mean streets of St. Louis in a Burley trailer—only to get yelled at by moms in idling SUVs outside the school. A few weeks ago, all of us parents even got a sternly worded email from the director, chastising the few who do pick up their children on foot for blocking the valet-style “carpool line” with “pedestrian traffic.” This is unsurprising; most children in the U.S. do not walk to school, even if they live close enough to do so, to the detriment of their physical fitness, independence, and joy, and of course also the environment. (Zaske experiences this culture shock in reverse when her family moves back to the U.S. and she makes the unheard-of suggestion of a solitary “walk to school day” at Sophia’s new San Francisco elementary.)

This is America. We arrest mothers who let their kids go to the park alone; we restrict play to expensive registered classes and parent-present “dates”; our playgrounds, meanwhile, are lawsuit-proof and correspondingly stultifying—though who cares, when we have to drive our kids miles to the nearest park anyway? 

This is America, where we would sooner die than allow our 5-year-old to go naked in our front yard. This is America, where many parents of a certain demographic will surely enjoy Achtung Baby but probably ignore most of its best advice.

While well-intentioned liberal parents (aka this book’s audience) will find numerous aspects of the German style superior—and many of our own trends duly worrying—most of the substantial change Achtung Baby suggests requires a large-scale shift in both prevailing attitude and state funding, neither of which will be forthcoming in this country for the foreseeable future. There’s only so much one American parent can do—I and my sad little bike commute can certainly attest to that. And what’s more, there’s only so much one American parent, slammed with work and barely hanging on, will want to do. Achtung Baby is a great read, but it may leave the American reader feeling helpless rather than inspired—a sentiment all too common in, if you’ll pardon the expression, the current Zeitgeist.

https://slate.com/human-interest/2018/02/achtung-baby-by-sara-zaske-reviewed.html

Mary:

I think we are robbing our children of the freedom to play, explore, act independently, and develop the maturity to make wise decisions for themselves. My mother was considered overprotective and  unwilling to give us too much leeway. But we were out in the neighborhood all day, unguided and unsupervised. There weren't many trees, but we climbed lots of fences, walked across rooftops,  invented our own games, walked everywhere, to libraries, parks and museums, made our own friends, and visited their homes. We were also given responsibilities...tasks and errands, including caring for younger siblings, cooking simple meals and washing up. Work came before play. Homework was our job to do...no "help" from parents there.

We weren’t “regulated” — weren’t  micromanaged. Our freedoms were balanced by responsibility. If we shirked tasks, didn’t do homework, got into scuffles, the consequences were our own to suffer. No one, especially not our parents, was going to apologize for or excuse us. I think children with the current "helicopter parents" ordering their lives, making their choices, and clearing away obstacles and challenges, are left without the ability, without the habit, of making decisions and taking actions for themselves. They come into adult life at a huge disadvantage since they don't even know how to be independent.


Oriana:

Thanks to good public transportation, all of Warsaw was mine, so to speak. Besides, my parents were too busy to be escorting me, or otherwise “helicoptering” me. But here . . . the distances are so large, for one thing. And some relic of Puritan prudery, perhaps, forbids nudity and too much frolicking. As the author sadly observes, her daughter can’t even walk to school. Independence comes later, from the “school of hard knocks.”

*

OCTOPODES NOT WAITRONS; HOW ENGLISH BECAME SO STRANGE

~ We can blame the barbarian invaders who brought the Germanic tongue to what became England. They gave us the vowel gradations in the verbs we call irregular—as when “sing” becomes “sang” in the past tense. Old English had different rules of inflection for different classes of verbs, and the rule that added “-ed” (“walk/walked”) became the dominant one, likely because it accommodated the arrival of French words after the Norman Conquest. The vowel-gradation rule fell away, but the stubborn holdovers survive as our irregular verbs. They endured, Okrent posits, because “words like ate, drank, took, found, knew, and spoke” are such fundamental parts of daily life that their frequent use made them resistant to change. Arika Okrent laments that “chode,” the past tense of “chide,” was lost forever, and I agree.

We can also blame the people of France, an activity which has become a national pastime for English speakers. After the Battle of Hastings brought Norman rule to England in 1066, French became the language of power and administration, giving us legal, political, and economic terms like “court,” “govern,” “appeal,” and “tax.” The English lexicon swelled. The two languages lived cheek-by-jowl, spawning synonym pairs that sometimes reflected the class divide between peasants and elites. The animals that Anglo-Saxon farmers called “calf” or “pig,” both Germanic words, became known under the Latin-derived French terms “veal” or “pork” when Anglo-Normans ate them. This distinction lives on today, when the same creature takes a different name, depending on whether we find it in a field or on our fork.

The printing press muddied the waters too. When William Caxton planted the first one on English soil in 1476, it began to ossify English spelling, even as pronunciation was undergoing the huge change apocalyptically called the “Great Vowel Shift.” Okrent writes that, when the dust settled, the written language still contained “spellings that represented pronunciations that were sometimes hundreds of years out of date.” English speakers originally pronounced “food” and “blood” the same way, something like “foad” and “bload.” But the dissemination of printed texts froze the spelling before the vowel sounds landed where they are today. Even Caxton’s staffing decisions shaped our language. He employed Flemish typesetters who sometimes applied Flemish spelling to English words, as when they added the ‘h’ to the word “ghost.”

We should also blame the snobs. The English we speak and write today bears the fingerprints of satirists in the vein of Bierce, dictionary makers like Samuel Johnson or Noah Webster, and countless pedants, columnists, academics, and teachers. Renaissance scholars enamored of Latin roots added the now-silent consonants to “salmon,” “doubt,” and “debt,” English words that had already come in through the French without that orthographic baggage. Nineteenth-century grammarians further shamed English-speakers into Latinate plurals: up to that point, the plural of “fungus” was “funguses.” And with the characteristic social insecurity of our human species, we overcompensated. “Octopus,” whose ending looks deceptively Latinate, is actually Greek, and should therefore be pluralized not as “octopi” but as the delightful “octopodes.” Okrent concludes that “Octopuses is perfectly fine.”

We must finally blame ourselves, Okrent concludes. For it is always individual human beings who make linguistic choices in real time, even when influenced by geopolitical shifts or grammarian snobs. We make these decisions in a marvelous tension between habit and innovation that makes language less like a steered vessel and more like Dr. Dolittle’s fabulous pushmi-pullyu, with heads at both ends, tugging in opposite directions. A language too static becomes a fossil, failing to express our experience of the real—but innovate too far and you won’t get buy-in from English speakers on the ground. So when things were getting stuffy we got Walt Whitman’s “barbaric yawp.” But when language ideologues pushed their luck with the gender-neutral “waitron” in the 1980s, nobody, and I mean nobody, would use it. Literally.

https://newcriterion.com/issues/2021/12/octopodes-not-waitrons

*

THE “REAL YOU” IS A MYTH — WE CONSTANTLY CREATE FALSE MEMORIES TO COMPOSE THE SELF WE WANT
    
~ We all want other people to “get us” and appreciate us for who we really are. In striving to achieve such relationships, we typically assume that there is a “real me”. But how do we actually know who we are? It may seem simple – we are a product of our life experiences, which can be easily accessed through our memories of the past.

Indeed, substantial research has shown that memories shape a person’s identity. People with profound forms of amnesia typically also lose their identity – as beautifully described by the late writer and neurologist Oliver Sacks in his case study of 49-year-old Jimmy G, the “lost mariner”, who struggles to find meaning as he cannot remember anything that’s happened after his late adolescence.

But it turns out that identity is often not a truthful representation of who we are anyway – even if we have an intact memory. Research shows that we don’t actually access and use all available memories when creating personal narratives. It is becoming increasingly clear that, at any given moment, we unconsciously tend to pick and choose what to remember.

When we create personal narratives, we rely on a psychological screening mechanism, dubbed the monitoring system, which labels certain mental concepts as memories, but not others. Concepts that are rather vivid and rich in detail and emotion – episodes we can re-experience – are more likely to be marked as memories. These then pass a “plausibility test” carried out by a similar monitoring system which tells whether the events fit within the general personal history. For example, if we remember flying unaided in vivid detail, we know straight away that it cannot be real.

But what is selected as a personal memory also needs to fit the current idea that we have of ourselves. Let’s suppose you have always been a very kind person, but after a very distressing experience you have developed a strong aggressive trait that now suits you. Not only has your behavior changed, your personal narrative has too. If you are now asked to describe yourself, you might include past events previously omitted from your narrative – for example, instances in which you acted aggressively.

FALSE MEMORIES

And this is only half of the story. The other half has to do with the truthfulness of the memories that are picked and chosen each time to become part of the personal narrative. Even when we correctly rely on our memories, they can be highly inaccurate or outright false: we often make up memories of events that never happened.

Remembering is not like playing a video from the past in your mind – it is a highly reconstructive process that depends on knowledge, self image, needs and goals. Indeed, brain imaging studies have shown that personal memory does not have just one location in the brain, it is based on an “autobiographical memory brain network” which comprises many separate areas.

A crucial area is the frontal lobes, which are in charge of integrating all the information received into an event that needs to be meaningful – both in the sense of lacking impossible, incongruent elements and in the sense of fitting the idea the remembering individual has of themselves. If it is not congruent or meaningful, the memory is either discarded or undergoes changes, with information added or deleted.

Memories are therefore very malleable, they can be distorted and changed easily, as many studies in our lab have shown. For example, we have found that suggestions and imagination can create memories that are very detailed and emotional while still completely false. Jean Piaget, a famous developmental psychologist, remembered all his life in vivid detail an event in which he was abducted with his nanny – she often told him about it. After many years, she confessed to having made the story up. At that point, Piaget stopped believing in the memory, but it nevertheless remained as vivid as it was before.

We have assessed the frequency and nature of these false and no-longer-believed memories in a series of studies. Examining a very large sample across several countries, we discovered that they are actually rather common. What’s more, as for Piaget, they all feel very much like real memories.

This remained true even when we successfully created false memories in the lab using doctored videos suggesting that participants had performed certain actions. We later told them that these memories never actually happened. At this point, the participants stopped believing in the memory but reported that the characteristics of it made them feel as if it were true.

A common source of false memories is photos from the past. In a new study, we have discovered that we are particularly likely to create false memories when we see an image of someone who is just about to perform an action. That’s because such scenes trigger our minds to imagine the action being carried out over time.

But is all this a bad thing? For a number of years, researchers have focused on the negatives of this process. For example, there are fears that therapy could create false memories of historical sexual abuse, leading to false accusations. There have also been heated discussions about how people who suffer from mental health problems – for example, depression – can be biased to remember very negative events. Some self-help books therefore make suggestions about how to obtain a more accurate sense of self. For example, we could reflect on our biases and get feedback from others. But it is important to remember that other people may have false memories about us, too.

Crucially, there are upsides to our malleable memory. Picking and choosing memories is actually the norm, guided by self-enhancing biases that lead us to rewrite our past so it resembles what we feel and believe now. Inaccurate memories and narratives are necessary, resulting from the need to maintain a positive, up-to-date sense of self.

My own personal narrative is that I am a person who has always loved science, who has lived in many countries and met many people. But I might have made it up, at least in part. My current enjoyment of my job, and my frequent travels, might taint my memories. Ultimately, there may have been times when I didn’t love science and wanted to settle down permanently. But clearly it doesn’t matter, does it? What matters is that I am happy and know what I want now.

https://theconversation.com/the-real-you-is-a-myth-we-constantly-create-false-memories-to-achieve-the-identity-we-want-103253

MEMORY IS A TRAITOR

“Now memory is a traitor: gilding, altering. The word is, in sad fact, meaningless, based as it is on the false assumption that identity is single, soul continuous. A man has no more right to set forth any self-memory as truth than to say ‘Maratt is a sour-mouthed University cynic’ or ‘Dnubietna is a liberal and madman.’” ~ Thomas Pynchon, V

 *

BLINDSIGHT AND CONSCIOUSNESS

~ Imagine being completely blind but still being able to see. Does that sound impossible? Well, it happens. A few years ago, a man (let’s call him Barry) suffered two strokes in quick succession. As a result, Barry was completely blind, and he walked with a stick.

One day, some psychologists placed Barry in a corridor full of obstacles like boxes and chairs. They took away his walking stick and told him to walk down the corridor. The result of this simple experiment would prove dramatic for our understanding of consciousness. Barry was able to navigate around the obstacles without tripping over a single one.

Barry has blindsight, an extremely rare condition that is as paradoxical as it sounds. People with blindsight consistently deny awareness of items in front of them, but they are capable of amazing feats, which demonstrate that, in some sense, they must be able to see them.

In another case, a man with blindsight (let’s call him Rick) was put in front of a screen and told to guess (from several options) what object was on the screen. Rick insisted that he didn’t know what was there and that he was just guessing, yet he was guessing with over 90% accuracy.

Blindsight results from damage to an area of the brain called the primary visual cortex. This is one of the areas, as you might have guessed, responsible for vision. Damage to the primary visual cortex can result in blindness – sometimes total, sometimes partial.

So how does blindsight work? The eyes receive light and convert it into information that is then passed into the brain. This information then travels through a series of pathways through the brain to eventually end up at the primary visual cortex. For people with blindsight, this area is damaged and cannot properly process the information, so the information never makes it to conscious awareness. But the information is still processed by other areas of the visual system that are intact, enabling people with blindsight to carry out the kind of tasks that we see in the case of Barry and Rick.

Blindsight serves as a particularly striking example of a general phenomenon, which is just how much goes on in the brain below the surface of consciousness. This applies just as much to people without blindsight as people with it. Studies have shown that naked pictures of attractive people can draw our attention, even when we are completely unaware of them. Other studies have demonstrated that we can correctly judge the color of any object without any conscious awareness of it.

WHAT DOES BLINDSIGHT TELL US ABOUT CONSCIOUSNESS?

Exactly how you answer this question will heavily depend on which interpretation you accept. Do you think that those who have blindsight are in some sense conscious of what is out there or not?

If they’re not, then blindsight provides an exciting tool that we can use to work out exactly what consciousness is for. By looking at what the brain can do without consciousness, we can try to work out which tasks ultimately require consciousness. From that, we may be able to work out what the evolutionary function of consciousness is, which is something that we are still relatively in the dark about.

On the other hand, if we could prove that people with blindsight are conscious of what is in front of them, this raises no less interesting and exciting questions about the limits of consciousness. What is their consciousness actually like? How does it differ from more familiar kinds of consciousness? And precisely where in the brain does consciousness begin and end? If they are conscious, despite damage to their visual cortex, what does that tell us about the role of this brain area in generating consciousness?

In my research, I am interested in the way that blindsight reveals the fuzzy boundaries at the edges of vision and consciousness. In cases like blindsight, it becomes increasingly unclear whether our normal concepts such as “perception”, “consciousness” and “seeing” are up to the task of adequately describing and explaining what is really going on. My goal is to develop more nuanced views of perception and consciousness that can help us understand their distinctly fuzzy edges.

To ultimately understand these cases, we will need to employ careful philosophical reflection on the concepts we use and the assumptions we make, just as much as we will need a thorough scientific investigation of the mechanics of the mind. ~

https://theconversation.com/blindsight-a-strange-neurological-condition-that-could-help-explain-consciousness-141625



The visual cortex

Mary:

Blindsight is fascinating, and the question of consciousness, what it is, how it works, and how unconscious processes may guide behavior, raises essential questions about identity. In my own experience, 45 years ago I underwent a series of ECT treatments — the older kind, bilateral, with higher amps than they use now. I remember "coming to" sitting in a lounge watching TV on a Sunday evening. The time before that moment was a blank. From that moment, time started for me again — that is, I eventually remembered events leading up to my hospitalization, but then, nothing, until that Sunday evening.

The unremembered interval was 4 weeks of treatments, three times a week. My family would visit, and they said I knew them, but kept asking for a toothbrush, never recognizing the one they brought yesterday. However, I knew where my room was, and didn’t get lost in the halls. I remembered none of this. But I was acting and talking and going to my room...or someone was. Like the blind man with blindsight, some other mechanism, not my consciousness of myself as an individual, was doing these things. Things that were never remembered, never integrated into my conscious identity. I had them only secondhand, from what I was told.

I have always found this whole experience unsettling: if I can act without knowing I am acting, who is it that acts? How can we be so layered, so divided, so sealed off from these other, unconscious selves? I imagine this is somewhat like the experience of sleepwalkers, or those unfortunate enough to have episodes of driving while asleep as an untoward effect of some medications. 

Add these questions about consciousness to our invention of false memories, and identity becomes amazingly elusive and complex.


Oriana:

What scared me no end was this year’s episode of food poisoning with dehydration severe enough to cause temporary dementia. I suddenly lived in the past, when my husband was still alive. I was waiting for him to come home from work as usual.

Later I thought, “So this is what it’s like to have dementia — you go into the past.” And sure enough, that’s what I read in the descriptions of what happens to dementia patients, incapable as they are of making new memories — but still having some function in their frontal lobes, where some old memories are preserved. And you see that in the dementia that often develops as death draws near.

And yes, given that those memories may be largely fiction, what does that do to the idea that “memory is where we live”? It seems that we are the unreliable narrator of the novel of our lives that we keep on creating and changing. I'm glad that I preserved some memories in poems — of course in a selected, literary fashion. If not for that record, such as it is, even larger regions of my life would be blanks.

But perhaps we shouldn’t be pondering identity so much. It keeps on dissolving, even in our youth. It’s more important to be kind, and to create some sort of beauty — be it poetry or gardening. 

Van Gogh: Garden at the asylum of Saint-Remy, 1889.

*
INFECTIOUS DISEASE AND COLONIZATION OF NORTH AMERICA

~ The Europeans who began colonizing North America in the early 17th century steadfastly believed that God communicated his wrath through plague. They brought this conviction with them – as well as deadly disease itself.

Plague brought by early European settlers devastated Indigenous populations during an epidemic in 1616-19 in what is now southern New England. Upwards of 90% of the Indigenous population died in the years leading up to the arrival of the Mayflower in November 1620.

It’s still unclear what the disease behind the epidemic actually was [Oriana: evidence points chiefly to smallpox]. But this was the first of many plagues that swept through Algonquian territory – Algonquian being the linguistic term used to describe an array of Indigenous peoples stretching, among other places, along the northeastern seaboard of what is now the US.

The 1620 Charter of New England, given by King James I, mentioned this epidemic as a reason why God “in his great goodness and bountie towards us and our people gave the land to Englishmen”. Plague supported property rights – it informed the back story of Plymouth Colony that was founded after the arrival of the Mayflower.

The English believed God communicated through plague. But my research argues that declaring “God willed the plague” simply opened, rather than closed, the debate. Rulers, explorers and colonists in the 17th century had an interest in pinpointing the cause of disease. This was partly because plague was used to procure land deemed empty, and even to clear it of inhabitants.

JUSTIFICATION FOR ENTERING THE LAND

Many colonists described New England as an “Eden”. But in 1632 the early colonist Thomas Morton said the epidemic of 1616-19 had rendered it “a new found Golgotha” – the skull-shaped hill in Jerusalem described in the Bible as the place of Christ’s death. Most pilgrims and puritans viewed plague as a confirmation of divine favor toward the English, in part because few of the colonists died in comparison to the Algonquians of New England. Colonists often referred to Indigenous peoples’ bodies as more healthy and fit than European ones, and this sense of physical disparity made the subsequent decline of Algonquians seem all the more striking.

John Winthrop, the first governor of the Massachusetts Bay Colony, argued in 1629 that God providentially removed most of the original inhabitants before the colony was planted. A few years later, in 1634, he wrote that God continued to “drive out the natives” and that God was “deminishinge them as we increase”. The right to possess a previously occupied land rested in part on the belief that God had personally removed the original inhabitants. Arguments similar to Winthrop’s litter the landscape of early colonial reflections.

Yet, reactions to the epidemic are far more complex than a simple narrative of land acquisition. Some thought God plagued Algonquians and that it was their duty to try to save their lives and souls. In one 1633 account, compassionate acts for the afflicted coexisted with thankfulness that God was clearing the land – however mutually exclusive those two emotions seem.

Some Algonquians connected the plague with the English and their God. According to Edward Winslow’s Good Newes from New-England in 1624, some thought the English had buried the plague in their storehouses and could use it against them at will. The English tried to dispel the notion that the plague was a weapon they wielded.

Over the 17th century, additional plagues swept through different Algonquian regions at different times. These waves of disease upset indigenous power relations and contributed to the Pequot War of 1636–38 – a conflict between the English and their Mohegan allies and the Pequot which resulted in the massacre and enslavement of the Pequot.

After the war, the English took a more active role in “civilizing” and evangelizing Algonquians, for example founding an Indian College at Harvard in the mid-1650s. The inclusion of Algonquians into Christianity seemed to contradict the colonists’ earlier view that God had evicted them from the land through epidemic. Some now argued American Indians descended from Israel and their conversion would usher in God’s kingdom on earth.

Decades of disease also influenced Native American spirituality. The trauma of the previous decades – plague being only one factor – made some Algonquians receptive to evangelistic efforts. Some shifted loyalty (at least in part) to the English and their God and their split allegiance undermined traditional authority structures and exacerbated tensions with the English.

JUSTIFICATION FOR CLEARING THE LAND

English attitudes towards land acquisition ranged from contract to conquest. Most Englishmen thought taking land from Algonquians was wrong, but over time land transactions gave way to conquests.

It was the emptiness of the land due to plague that justified initial settlements – and over the decades the English purchased additional lands that were occupied. But this arrangement proved insufficient as the decades wore on and tens of thousands of immigrants from Europe wanted more and more land. Roger Williams – a defender of Indigenous people and founder of Rhode Island – critiqued what he called the growing worship of “God Land”.

The early colonists mainly viewed themselves as passively being pulled by God into a void left by plague. Over time they transitioned to viewing themselves as more actively involved in repelling Algonquians, clearing the land of inhabitants with God’s help.

King Philip’s War in 1675-78, a conflict that involved almost all of the European and Indigenous inhabitants of New England, was disastrous for the English victors and much worse for the defeated Algonquians. After the earlier Pequot War, many colonists had come to believe their destiny was tied to the well-being of Indigenous Americans. But after King Philip’s War, destiny seemed to pull them apart.

The growth of racial theories coupled with the recent conflict fed the belief that the English and Algonquian could not coexist. This belief, in turn, led to the myth of the “vanishing Indian” – Indigenous populations declined through plague and war as God strengthened the English. Evangelism receded. Slavery increased.

Expulsion of Indigenous Americans from their lands became more widely accepted after the mid-1670s. The English increasingly saw themselves as pushing American Indians out, with divine approval. This shift would have profound implications for the long and deadly history of white expansion in North America.

Throughout the 17th century, plague invisibly reshuffled the relationship between colonization, “civilization”, evangelization and racism behind the scenes. In doing so it played an important role in altering the political and religious landscape of America.

https://bigthink.com/the-past/plague-colonial-new-england/

Smallpox and measles, 16th century Aztec drawing

From Wiki:

~ Smallpox was the disease brought by Europeans that was most destructive to the Native Americans, both in terms of morbidity and mortality. The first well-documented smallpox epidemic in the Americas began in Hispaniola in late 1518 and soon spread to Mexico. Estimates of mortality range from one-quarter to one-half of the population of central Mexico.

Smallpox was lethal to many Native Americans, resulting in sweeping epidemics and repeatedly affecting the same tribes. After its introduction to Mexico in 1519, the disease spread across South America, devastating indigenous populations in what are now Colombia, Peru and Chile during the sixteenth century. 

The disease was slow to spread northward due to the sparse population of the northern Mexico desert region. It was introduced to eastern North America separately by colonists arriving in 1633 to Plymouth, Massachusetts, and local Native American communities were soon struck by the virus. It reached the Mohawk nation in 1634, the Lake Ontario area in 1636, and the lands of other Iroquois tribes by 1679. Between 1613 and 1690 the Iroquois tribes living in Quebec suffered twenty-four epidemics, almost all of them caused by smallpox. By 1698 the virus had crossed the Mississippi, causing an epidemic that nearly obliterated the Quapaw Indians of Arkansas. ~

https://en.wikipedia.org/wiki/Native_American_disease_and_epidemics

*

LAMED-VAVNIKS

The charm of mythology understood as mythology -- an imaginative fount of moral lessons, but not at the price of forcing yourself to believe in nonsense, and even cowering in fear before a fictitious judge and ruler. I’m thinking of the notion of a Lamed-Vavnik, and how 36 righteous men (I guess women don’t count) keep god from destroying the world. It’s charming as long as we don’t believe in it. If we take it literally (I doubt that anyone does, but let’s suppose), it’s monstrous -- what if one generation becomes short of just one Lamed-Vavnik? And the dangerous lunatic up there is constantly counting and re-counting . . . But if we take it metaphorically -- yes, the world is sustained by the righteous -- if we think of examples of people who are indeed very kind -- then no harm is done and our hearts are uplifted.

I like the idea that a Lamed-Vavnik doesn’t realize he’s one of the 36 Righteous Men; what pressure that kind of knowledge would be!

*

“Those who have suffered much become very bitter or very gentle.” ~ Will Durant

Oriana:

Those who become very gentle are along the lines of Lamed-Vavniks. I wonder, though; could we do anything to improve the well-being of the very bitter? For instance, if they are healthy enough, could they not be given some useful employment? It seems that meaningful work resolves many problems. Thanks to my long experience with depression I've discovered that "work works." I also love the saying that rather than pursue happiness, we should pursue usefulness.

Indeed my own journey happened to be "from bitterness to usefulness."

But I don’t mean to insist on the need for socially recognized useful employment. The very gentle know the secret. Even a smile can be a gift, not to mention "random acts of kindness" — though ideally, just our being is the greatest gift we can offer.

Will and Ariel Durant

*

SCHIZOPHRENIA IS DIFFERENT IN WOMEN

~ Men typically show schizophrenic symptoms at younger ages. The onset for most schizophrenic men is during their late teens or early twenties.

Although some schizophrenic women develop the disease in their late teens or early twenties, others don’t see symptoms until their 40s or even their 60s.  Women are twice as likely to present with symptoms after age 40. That means a woman with late-onset schizophrenia might live most of her life without any indication that she will one day be schizophrenic. 

“It could be the first mental illness at the onset,” says Nicola Cascella, an assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.  

It’s a diagnosis that many people don’t expect, and women are often misdiagnosed with mood disorders or early-onset Alzheimer’s disease. 

“The diagnosis is very complicated,” says Abigail Donovan, an associate psychiatrist at Massachusetts General Hospital. 

Scientists are finding that the disease and its causes are complicated, too.   

THE ESTROGEN THEORY

Schizophrenia is a nonaffective disorder with symptoms that include hallucinations, delusions, apathy, and social withdrawal, according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-5).

About 1 percent of the population has schizophrenia, according to the National Institute of Mental Health.

Late-onset schizophrenia begins after age 45. Scientists think it might be timed with a woman’s decreasing estrogen as she enters perimenopause. 

“The idea is that estrogen exerts protective effects so during the time when women have a lot of estrogen in their bodies, they are protected from developing schizophrenia,” Donovan says. 

The estrogen theory would also suggest that women with lower levels of estrogen are more vulnerable to the disease, including women who have naturally lower levels of estrogen as well as postpartum women experiencing hormonal fluctuations. 

The estrogen theory is still being explored, and although Donovan says many scientists support it, they want to study it more and “prove it beyond a shadow of a doubt.”

COGNITIVE CHANGES

Scientists, however, are more in agreement that the younger a person develops schizophrenia, the more cognitive impairment the person will experience. A teenager who develops schizophrenia, for example, will see a significant and permanent decline in their IQ. 

“You lose 10 to 15 points from your baseline,” Cascella says. 

Processing speed and working memory are affected, which means the person experiences disorganization in their thinking. Cascella says it’s not uncommon to see newly diagnosed college students leave school due to a loss in academic abilities. 

Late-onset patients appear to have less of a disruption to their cognitive abilities. “With the late-onset women, there is some cognitive impairment, but it appears to be less disruptive than what people are experiencing if they were to develop schizophrenia in their late teens or early twenties,” Donovan says. 

Women with late-onset schizophrenia have the same symptoms as people who develop it earlier in life, but scientists currently believe these symptoms tend to be milder. 

SHOWING SYMPTOMS

Schizophrenia is divided into “positive” symptoms, which add hallucinations or delusions to a person’s life, and “negative” symptoms, which cause a person to experience a loss of well-being through apathy, depression, and lack of enjoyment in life.  


For women with late-onset schizophrenia, negative symptoms can appear to their family or friends like a personality change. 

“In that way, when people experience those negative symptoms, it does feel as though their personality has changed,” Donovan says. “They aren’t talking the same way they once were, they aren’t engaging with the same connection, and they aren’t motivated to go out and do the things they once did.” 

These symptoms often appear years prior to onset, which can mean a physician doesn’t have enough information to diagnose the patient. Studies show the time between first symptoms and the diagnosis, called the prodromal period, averages five years. 

Cascella says he currently has a patient who was diagnosed in her senior year of college. She remembers suddenly feeling anxious in her junior year of high school about speaking in class. “There were changes happening,” Cascella says. “What I suspect in this patient is she started to be anxious as a result of her suspiciousness, thinking that ‘people are going to listen to me talk in class and I don’t know what intention they might have.’”

Women with late-onset schizophrenia also have changes occurring years before diagnosis. Women who present with negative symptoms such as apathy and depression can be misdiagnosed with a mood disorder. 

It’s a complex disease, and Donovan says scientists have a lot they would like to learn about why a person develops schizophrenia — at any stage in life. They would like to be better at predicting who will develop schizophrenia so that early interventions can be made. And as a clinician who treats patients of all ages, she says she’d like to see more progress in treatment options.  

“We have medication to treat those positive symptoms, and much of the time, those symptoms will get better,” she says. “But we don’t have medication for treating negative or cognitive symptoms. I wish we had more to offer.”

https://www.discovermagazine.com/mind/why-schizophrenia-is-different-for-women

Oriana:

It’s important to bear in mind that in schizophrenia we have both psychotic and affective (depressive) symptoms. Estrogen appears to protect against the psychotic symptoms. That’s of course a huge thing right there, and it raises the possibility of using hormones in the treatment of schizophrenia, at least in women.


More men than women suffer from schizophrenia, and men appear to have more severe symptoms. Is it because schizophrenia is in some ways a dopamine-based disorder, and testosterone increases dopamine? My search revealed no clear answers.

By contrast, the estrogen theory has gained a lot of support. Hormone replacement therapy has been in use for a long time, and if it’s appropriate to give it to postmenopausal women to prevent osteoporosis, for instance, it should be OK to at least try it out  against the psychotic symptoms in women. (Natural [bio-identical] progesterone is neuroprotective as well.)

It is very telling that the onset of schizophrenia in women is 3-4 years (or more) later than in men, that the severity of women's symptoms varies according to the menstrual phase (being lowest when both estrogen and progesterone are high) — and particularly that there is a second, smaller peak in the onset of symptoms after age 45, which for most women coincides with peri-menopause.

Women already diagnosed with schizophrenia experience a worsening of symptoms as they enter menopause. “Psychotic symptoms of schizophrenia such as hallucinations and delusions become active as women approach menopause. This is clinically meaningful because in men, at the corresponding age, these symptoms generally become quiescent.” https://www.psychiatryadvisor.com/home/schizophrenia-advisor/implications-of-menopause-in-schizophrenia-treatment/

Of course it’s terrific that at last psychiatry recognizes the impact of sex hormones on the brain. I remember the Dark Ages when schizophrenia (along with all mental illness) was blamed on bad mothers. The brain had nothing to do with it, much less sex hormones. But then even menopause was seen as causing symptoms only in neurotic women. It may be hard for younger readers to believe this, but I remember the indignation I caused when first writing about schizophrenia as a brain disease.

But that’s totally minor considering that some menopausal women used to be locked up on a mental ward when what they needed was hormone replacement. But that concept didn’t yet exist.

As for the very old observation that mental illness runs in the family, it does seem true: a family history of schizophrenia is associated with an eight-fold risk of schizophrenia. But while there is nothing surprising about a strong genetic component of schizophrenia, I admit I had no idea that the differences between men and women and the (not yet fully explored) role of sex hormones were so profound.

The obvious question is: what about the role of progesterone? We know that it’s usually a very beneficial, neuroprotective hormone. So why aren’t sex hormones used in the treatment of schizophrenia?

The role of the immune system and of viral infections in schizophrenia should also be explored. Again, it would not be surprising to find gender differences.

Then there is a possible role of obesity and diabetes. And of course the topic of schizophrenia and the microbiome needs to be further explored. What about autoimmunity? What exactly is the link between autoimmune diseases and schizophrenia? Could schizophrenia itself be an autoimmune disease? Should we be thinking in terms of a vaccine? The questions seem to multiply with time. Obviously schizophrenia, though it affects only about 1% of the population, continues to generate enormous interest because it remains such a mystery.



*

SCHIZOPHRENIA AND THE MICROBIOME

~ Researchers say they’ve discovered that the way to heal schizophrenia might be through the gut. There’s an ecosystem of bacteria and microbes that live in our digestive tracts, known as the gut microbiome. And these may lead to some features of schizophrenia, an international team of scientists announced this week in the journal Science Advances. The discovery could revolutionize treatment options for schizophrenia.

The idea that the gut is connected to mental health has been gaining traction in recent years. And research is now pointing to a link between the gut microbiome and an array of mental health disorders, including anxiety and the memory and motor deficits of Parkinson’s disease. A separate study, published Monday in the journal Nature Microbiology, connected missing gut microbes to depression. Those mounting associations extend to schizophrenia as well: the mental illness often co-occurs with gastrointestinal disorders marked by atypical gut bugs.

To explore the connection, Wong and colleagues sequenced the genetic material in stool samples from patients with schizophrenia, as well as healthy individuals recruited from the First Affiliated Hospital of Chongqing Medical University in China.

They found that patients with schizophrenia had less diverse gut microbiomes than patients without schizophrenia, the researchers report. The microbiomes from schizophrenic patients also harbored unique kinds of bacteria. They were so distinct, in fact, that the researchers were able to tell patients with schizophrenia apart from healthy controls based just on the bacteria in their guts.

THE ALL NEW APPROACH

The most intriguing evidence came when the researchers gave germ-free mice fecal transplants from the schizophrenic patients. They found that “the mice behaved in a way that is reminiscent of the behavior of people with schizophrenia,” said Julio Licinio, who co-led the new work with Wong, his research partner and spouse. Mice given fecal transplants from healthy controls behaved normally. “The brains of the animals given microbes from patients with schizophrenia also showed changes in glutamate, a neurotransmitter that is thought to be dysregulated in schizophrenia,” he added.

The discovery shows how altering the gut can influence an animal’s behavior, and it also provides a new target for drug treatment.

“This would give us a completely new pathway toward treating schizophrenia,” Licinio said. “No treatments that we give today are based on a change of the microbes in the gut. So if you could show that would change behavior in a positive way, we would have a whole new way to approach schizophrenia.” ~

https://www.discovermagazine.com/health/researchers-find-further-evidence-that-schizophrenia-is-connected-to-our-guts
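
A note on “less diverse”: microbiome researchers typically summarize how varied a stool sample is with an alpha-diversity measure such as the Shannon index, which is high when many bacterial taxa are present in roughly even proportions and low when a few taxa dominate. The little Python sketch below is only an illustration of that idea; the taxon counts are invented, and the Science Advances study may well have relied on other metrics.

import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p * ln p), where p is the relative abundance of each taxon."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# Hypothetical taxon counts for two stool samples (purely illustrative numbers):
even_community   = [25, 20, 20, 15, 10, 10]  # many taxa in similar amounts
skewed_community = [70, 20, 5, 3, 1, 1]      # a few taxa dominate

print(round(shannon_index(even_community), 2))    # 1.74 -- higher value = more diverse
print(round(shannon_index(skewed_community), 2))  # 0.92 -- lower value = less diverse

Lower values of this kind are what “less diverse gut microbiomes” refers to in the excerpt above.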

*

SCHIZOPHRENIA, THE IMMUNE SYSTEM, AND COVID

~ People with a schizophrenia spectrum diagnosis faced more than two and a half times the average person’s risk of dying from Covid-19, even after controlling for the many other factors that affect Covid-19 outcomes, such as cardiovascular disease, diabetes, smoking, obesity, and demographic factors — age, sex, and race.

Since then, more studies have come out — as well as meta-studies pooling the conclusions of those studies — showing worse Covid-19 outcomes among people with diagnosed mental health disorders including depression, bipolar disorder, and schizophrenia.

It’s not necessarily that people with schizophrenia or mood disorders are more likely to become infected with Covid-19. Rather, once they are infected, the outcomes are worse. 

Psychiatrists who study these mental illnesses say the culprit might lie in a connection between mental health and the immune system. They’re finding that mental health stressors could leave people more at risk for infection, and, most provocatively, they suspect that responses in the immune system might even contribute to some mental health issues.

People living with mental illnesses like schizophrenia, bipolar disorder, and major depression tend to have shorter-than-average life spans and worse health overall. They’re more at risk for heart disease and obesity; they smoke at higher rates. All these risk factors put people with these mental health issues — particularly schizophrenia — at higher risk of death from many causes, including severe infections.

The scientific literature does find links between mental health and immune system health. The biggest one: Studies have reported that many people with depression, bipolar, and schizophrenia have higher levels of inflammation throughout the body.

Inflammation is one of the body’s responses to dealing with dangerous invaders like the SARS-CoV-2 virus. Inflammation is literally a flood of fluids containing immune system cells. They get released from the blood into body tissues to help clear infections. This is why infected areas of the body get swollen.

When inflammation is short-lived, it can help clear out an infection. When it is chronic, it can cause problems. It wears on the heart and can contribute to illnesses like diabetes. When it comes to Covid-19, scientists suspect that underlying inflammation — or underlying dysregulation of the immune system — is what causes some patients’ bodies to overreact to the virus, causing the worst symptoms that can land people in hospitals and lead to death.

It’s possible that the immune system plays a role in generating schizophrenia. “There’s a theory that viral exposure while in utero is closely tied with developing psychotic illness or schizophrenia later on,” says Ellen Lee, a psychiatrist and researcher with the University of California San Diego. It’s possible that the mother’s immune response during the infection leaves a lasting impact on the child’s brain and immune system. Other studies have suggested that having a prior autoimmune disorder puts a person at risk for schizophrenia. But, Lee stresses, “There’s so much that we don’t fully understand.”

The bigger point, Lee says, is to recognize that schizophrenia is “a whole-body disorder.” “We see inflammation increase in the brain and we see inflammation increase throughout the body.” That leaves people with schizophrenia at risk of a whole host of chronic illnesses. “The inflammation worsens metabolic health, which then in turn usually leads to obesity and worse inflammation,” Lee says. “So it’s all kind of a cycle.”

An enormous study of the health records of 3.56 million people born between 1945 and 1996 in Denmark showed that a history of infection and autoimmune disorders predicted later diagnosis of mood disorders. More specifically, the study found that the more infections a person had, the more at risk they’d be for mental health issues later on; there could be a causal pathway here. That makes it seem like the infections themselves are a risk factor.

If scientists can use the pandemic to learn even more about the nature of these mental illnesses and how they interact with the immune system, more future lives could be saved, too. ~

https://www.vox.com/science-and-health/22783685/covid-19-depression-mental-health-risks-immunology?fbclid=IwAR1V2qqOSWV_krbjCej78oJQB_Ju_RoqUFknq1x-LxwCBZgPEWqZCZf4IZY

*
THE HISTORY OF BREAKFAST

For much of history, breakfast existed in the shadows. While beautifully preserved cookbooks and food histories tell us about lavish meals in ancient times, this simple morning meal has often been overlooked in the annals of history. But we’ve clearly always been eating something in the morning. From an etymological perspective, the English word “breakfast” is a plain compound: to break the night’s fast. French arrived at the same idea by way of Latin: déjeuner goes back to the Latin disieiunare, meaning “to un-fast.”

That same Latin root, contracted in Old French by the 11th century to disner, also gave English its “dinner.” The usage of “dinner” later coincided with a shift of the main meal of the day from midday to evening. The French déjeuner became the midday meal (lunch), and other languages followed a similar pattern, relegating the early morning meal to the petit déjeuner, literally a “small lunch.” It is almost as if, linguistically, we were intent on de-emphasizing the morning meal by making it a diminutive of an existing one.

However named, it’s clear that people were breaking their nighttime fast in some fashion. Many ancient Romans ate three meals a day, though the timing of those meals shifted for a myriad of reasons. The morning meal (jentaculum) consisted of bread, cheese, olives, salad, nuts, dried grapes, and cold meat. If resources permitted, it might also include milk, eggs, and a mixture of wine and honey. The poet Martial notes that breakfast likely included pastries. And there seems to have been a separate offering for children too young to feed themselves: they were given puls, a porridge with a millet base. The Roman adoption of the meal is reflected in the arts as well. In the play Curculio, Phaedromus gathers wine, olives, and capers for a pre-dawn breakfast with his mistress. Even earlier, in Greek literature, Homer refers to ariston, an early morning meal; in both The Iliad and The Odyssey, characters partake of ariston before attending to the day’s chores and needs.

Somewhere along the path to the Middle Ages, breakfast fell out of favor. The meal was labeled an indulgence as the Catholic Church gained ground. And the label stuck when Thomas Aquinas asserted that praepropere—eating too soon—was one of the ways to commit gluttony. Two meals were sufficient unless you were a child, old, ill, or a laborer. Needing breakfast reflected on you as a person: you lacked either the stamina or the means to go without the additional calories. In the former case, you were too weak to last until the first real meal of the day. In the latter, you were poor and needed energy for labor. In either case you were not treated to a lavish meal; at best you would get a bit of bread and cheese and some ale. By the 1500s, physicians were actually warning against the morning meal—a far cry from the much-publicized message of “the most important meal of the day” that we have come to know. Instead they recommended a morning walk to get the blood flowing and trigger bodily functions.

Still, there were people who continued to break their fast in the morning. Queen Elizabeth I, for example, was an early riser and was known to take a morning meal of white bread, ale, beer, wine, and a stew. Other nobles followed suit, so that by the seventeenth century breakfast was more readily accepted and, in some regards, celebrated: the Dutch immortalized it in their still lifes, and the wealthy began to build breakfast rooms specifically for the meal. In early America, colonists (depending on their means) filled themselves with oatmeal porridge, fish, poached eggs, broiled ham or bacon, buttered bread, marmalade or fruit, and tea, coffee, or cocoa. Building a nation was hungry work, and this meal was meant to carry them through to lunch at the very least.

So far in this story, the state of breakfast had largely been driven by religious or economic needs. There was a divide between who could eat the meal and why, and which foods were appropriate for it. This remained true in the 1800s. In America, breakfast continued to signal status: laborers rose early and attended to tasks (which may have included preparing breakfast for others) before sitting down to eat, while the wealthier classes could sleep in and have a later, larger meal. People were encouraged to keep the meal small and to avoid warm foods and tea (though coffee, milk, water, and cocoa were acceptable). This is where breakfast cereal enters the story.

The American breakfast landscape was forever altered during this period, when a growing health awareness swept the nation. Suddenly people were obsessed with the pursuit of unprocessed foods and clean eating, which was a cornerstone of the treatments offered in the sanitariums of the day. In 1863 Dr. James Caleb Jackson invented “granula,” the ancestor of granola: crisped and crumbled nuggets of bran-rich Graham flour. It was a great source of fiber and was intended to combat the digestive pains that came with eating too much meat, but it required soaking overnight to be edible.

In 1894, Dr. John Harvey Kellogg accidentally created a flaked cereal when a pot of cooked wheat went stale. Kellogg tried to save the wheat by putting it through a roller; it dried in thin flakes, and flaked cereal was born. Corn flakes followed once the same process was applied to corn. The new cereal fit the regimen Kellogg prescribed: exercise, fresh air, lots of baths, and simple, bland foods—like cereal.

But cereal was also accidentally convenient. It didn’t need to be cooked, it had a relatively long shelf life, and it fit the changing world. In cities where people were reporting for shift work instead of working the fields, a lighter breakfast was not only suitable but economical: it saved time. The larger meals of bacon and eggs didn’t disappear; they were relegated to the weekend, when time was less of a pressing factor. This division stands in stark contrast to the soul food of the American South, which maintains its ties to caloric foods meant to power laborers through their workday. That contrast is another manifestation of the way wealth and status determine who gets to eat, and when.

More than any other meal, breakfast has been susceptible to the rise and fall of dietary fashions. Eggs, for example, were blamed for high cholesterol for many years and shunned at the breakfast table. Carbs similarly came under fire from those watching their sugar intake. Today the sugary cereals on the shelves are cast as the villain, and steel-cut oats as the virtuous choice. Breakfast was marketed as “the most important meal of the day” by the cereal companies themselves. They leveraged the American focus on healthy eating, but they never fully eradicated our fascination with the petit déjeuner: the idea that breakfast is a small meal.

The coffee cart takes no sides in this debate. The variety of its goods allows us to break our fast with whatever we think will best serve our needs. Marketers know that breakfast loyalty is lasting: we will eat the same cereals and foods we ate as children, and we’re likely to offer them to our own children. But weekday breakfasts, which may be eaten at your desk or during your commute, may need to match what others around you are eating, or to be as unobtrusive as possible. Perhaps we’ve settled on that coffee and pastry because we’re not sure what we can eat, or because it imparts a certain status within our group.

https://getpocket.com/explore/item/the-breakfast-economy?utm_source=pocket-newtab

Oriana:

And now the emerging trend is to skip breakfast in order to prolong the period of fasting -- we know that fasting is good for our health. Do not "break the fast." Keep fasting until lunch. People are amazingly willing to go hungry when given a health rationale for it: the best food is no food!

So, what is next? Maybe a salad for breakfast. After all, we can't seem to get enough of the leafy greens, with a bit of protein tossed in.

But, who knows . . . if people ever grasp again the fact that egg yolks are fabulously nutritious, the equivalent of a multivitamin, except better, perhaps it will be back to eggs -- or maybe a salad with egg and avocado slices.


*

ending on beauty:

Job 30:29*

No angel wings for me.
Instead, a barred owl’s.
I long to be able to move
through day and night
in silence.

~ Laura Kaminsky

*“I am a brother to dragons, and a companion to owls.” The verse may tie in with the way great suffering can embitter a person. In this instance, though, the poet chose to identify with the owl and to see being an owl in the light of silent beauty.


