Sunday, October 3, 2021

CRITICIZING ISLAM WITHOUT DEMONIZING MUSLIMS; ULTIMA THULE; THE NEXT PANDEMIC? HOW THE BRAIN DEFENDS ITSELF AGAINST AGING; PREVENTING FATTY LIVER DISEASE; BILLION-YEAR GAP IN THE FOSSIL RECORD

Georgia O’Keeffe: Jack in the pulpit

*
ULTIMA THULE

With favoring winds, o'er sunlit seas,
We sailed for the Hesperides,
The land where golden apples grow;
But that, ah! that was long ago.

How far, since then, the ocean streams
Have swept us from that land of dreams,
That land of fiction and of truth,
The lost Atlantis of our youth!

Whither, ah, whither? Are not these
The tempest-haunted Orcades,
Where sea-gulls scream, and breakers roar,
And wreck and sea-weed line the shore?

Ultima Thule! Utmost Isle!
Here in thy harbors for a while
We lower our sails; a while we rest
From the unending, endless quest.

~ H.W. Longfellow

Islands north of Norway

Oriana:


Ultima Thule — in antiquity, the northernmost region of the earth, the limit of travel and discovery.

I admit to harboring (ahem) a lot of affection for some of Longfellow’s classics. Ultima Thule is a contrast between the sunlit dream and the stormy reality where “wreck and sea-weed line the shore.” It could easily be the contrast between youth and older age, the heroic ego project and the diminished expectations that follow the shattering of dreams.

As Longfellow puts it:

With favoring winds, o'er sunlit seas,
We sailed for the Hesperides,
The land where golden apples grow;
But that, ah! that was long ago.

How far, since then, the ocean streams
Have swept us from that land of dreams,
That land of fiction and of truth,
The lost Atlantis of our youth!

As I say in one of my poems, Horizontal Rain:

Mountains I haven’t climbed
I would no longer climb.
The arson of passion
lay smoldering behind me.

There’s the heroic ego quest of the first half of life — to put it more simply, the dreams of youth, which can be outrageously ambitious (I’ve discovered that it’s more the rule than the exception) — and the “diminished expectations” of the second half of life. One way or another, we try to make the best of what we come to know as reality rather than the fantasy we started out with. And yes, after the school of hard knocks we need some rest and recovery. Then, with luck, new meanings unfold, and the quest, though transformed, does indeed seem endless.

From my personal perspective, there’s something to be said for reaching Ultima Thule: it’s a place of rest. It’s not a “stepping stone” to something else. You no longer have to prove yourself. Now you can have a much more mellow relationship with reality. And actually it’s a lot more interesting to have a relationship with reality rather than with a dream.

The first stanza is especially meaningful to me, in a personal sense: the dream of America.

This is the dream speaking:

The untold want by life and land ne'er granted,
Now, voyager, sail thou forth, to seek and find.
~ Walt Whitman

But paradise is wherever you are, if you are attuned to beauty:

“Look around you—
the clear sky, the pure air, the tender grass,
the birds; nature is beautiful and sinless,
and we, only we, are foolish
and we don’t understand that life is heaven,
for we have only to understand that
and it will at once be fulfilled in all its beauty,
we shall embrace each other and weep.”

~ Fyodor Dostoevsky, The Brothers Karamazov

*

As for the actual island that may have inspired the phrase “Ultima Thule,” Thule’s enigmatic and complicated history begins in the fourth century BCE, when the ancient Greek explorer Pytheas left the port city of Massalia—now Marseille, France—in search of new trading opportunities in the Far North. 

Pytheas and his commercial backers had a special interest in finding amber, used as a form of currency, as well as tin, a key ingredient in manufacturing bronze. Sailing at first west, then north, Pytheas arrived at and mapped the coastline of Prettanikē—now the British Isles—and then boldly headed farther north into uncharted territory. 

And there the journey entered an unworldly realm. After a few days’ sail, Pytheas reached a place he described as neither earth nor sea, “but instead a sort of mixture of these similar to a marine lung, in which the earth and the sea and all things together are suspended, and this mixture is … impassable by foot or ship.” Pytheas landed nearby, on an island whose name he heard as Thule. Eventually he returned to Massalia and wrote his masterwork, On the Ocean, an account of his voyage and a treatise of enormous influence in the ancient world.

photo: Charles Sherman

*
TELLING FORTUNES

My mother used to tell fortunes from cards — not Tarot, but ordinary playing cards. She saw no contradiction between being a scientist and telling fortunes — though of course she had the out, as we all do, of saying it was just for fun, to ponder the unknowns. “If hard times should hit again, I’ll put on a scarf and set myself up as a gypsy,” she liked to joke — though I sensed it wasn’t entirely a joke — she always had Plan B.

I never “believed” in the occult, but something about it was very attractive. I’ll never forget how, during a long break in a Jungian lecture, the woman next to me turned out to be an astrologer — and in no time I was telling her things that I’d never told even my closest friends. Her interpretation was of course completely astrological: “He has to do that, he is a Cancer” — and that made me completely forgiving toward that particular man, completely accepting.

And again I experienced the insight that I'm sure I had before, and more than once: people will tell things to their psychic or astrologer that they will not tell to their therapist. The difference is unconditional acceptance. Religion talked about unconditional love, but delivered condemnation instead. It saw people as evil. But here, no matter how nonsensical the basis, you were like a dear child — you were an Aries, what else could you have done? A true Scorpio Rising, of course you didn’t tell anyone, except now, confiding in this dubious stranger with her soothing voice — a mother figure who understood all, forgave all.

*
THE GREAT CHICAGO FIRE OF 1871 AND OTHER NEARBY FIRES

~ The 1871 fire killed an estimated 300 people. It turned the heart of the city, wood-frame buildings quickly constructed on wooden sidewalks, into ruins, and left 100,000 people homeless.

Our family has an engraving from the London Illustrated News of Chicagoans huddled for their lives along an iron bridge. The reflection of flames makes even the Chicago River look like a cauldron.

Like the Great Fire of London in 1666, the San Francisco Earthquake of 1906, and Hurricane Katrina in 2005, the Great Chicago Fire reminds us that big, swaggering cities can still be fragile. 

But that same night, about 250 miles north of Chicago, more than 1,200 people died in and around Peshtigo, Wis. It was the deadliest wildfire in U.S. history. Survivors said the flames blew like hurricanes, jumping across Green Bay to light swaths of forest on the opposite shore. A million and a half acres burned. 

Chicago's fire came to be seen as a catastrophe that also ignited the invention of steel skyscrapers, raised up on the city's ashes. It has overshadowed the Peshtigo fire. And for years, the two were seen as separate, almost coincidental disasters.

Many of those houses and sidewalks that burned in Chicago had been built with timbers grown around Peshtigo, in forests conveniently owned by William Ogden, Chicago's first mayor. He owned the sawmill too. 

Chicago's fire was long blamed — falsely — on an Irish-immigrant family's cow kicking over a lantern. Some people thought the Peshtigo fire started when pieces of a comet landed in the forest, which has never been proven. 

What we understand better today is that the Midwest was historically dry in the summer of 1871. When a low-pressure front with cooler temperatures rolled in, it stirred up winds, which can fan sparks into wildfires. The fires themselves churn up more winds. Several parts of nearby Michigan also burned during the same few days; at least 500 people were killed there.

150 years later, all of those fires on an autumn night in 1871 might help us see even more clearly how rising global temperatures and severe droughts, from Australia to Algeria to California, have made forests more tinder-dry, fragile, and flammable, and people more vulnerable to the climate changes we've helped create. ~

The Randolph Street Bridge during the Great Chicago Fire

*
ARE WE ALREADY BARRELING TOWARD THE NEXT PANDEMIC?

~ America’s frustrating inability to learn from the recent past shouldn’t be surprising to anyone familiar with the history of public health. Almost 20 years ago, the historians of medicine Elizabeth Fee and Theodore Brown lamented that the U.S. had “failed to sustain progress in any coherent manner” in its capacity to handle infectious diseases. With every new pathogen—cholera in the 1830s, HIV in the 1980s—Americans rediscover the weaknesses in the country’s health system, briefly attempt to address the problem, and then “let our interest lapse when the immediate crisis seems to be over,” Fee and Brown wrote. The result is a Sisyphean cycle of panic and neglect that is now spinning in its third century. Progress is always undone; promise, always unfulfilled. Fee died in 2018, two years before SARS-CoV-2 arose. But in documenting America’s past, she foresaw its pandemic present—and its likely future.

It might seem ridiculous to think about future pandemics now, as the U.S. is consumed by debates over booster shots, reopened schools, and vaccine mandates. Prepare for the next one? Let’s get through this one first! But America must do both together, precisely because of the cycle that Fee and Brown bemoaned. Today’s actions are already writing the opening chapters of the next pandemic’s history.

Internationally, Joe Biden has made several important commitments. At the United Nations General Assembly last week, he called for a new council of national leaders and a new international fund, both focused on infectious threats—forward-looking measures that experts had recommended well before COVID-19.

More Americans have been killed by the new coronavirus than by the influenza pandemic of 1918, despite a century of intervening medical advancement. The U.S. was ranked first among nations in pandemic preparedness but has among the highest death rates in the industrialized world. It invests more in medical care than any comparable country, but its hospitals have been overwhelmed. It helped develop COVID-19 vaccines at near-miraculous and record-breaking speed, but its vaccination rates plateaued so quickly that it is now 38th in the world. COVID-19 revealed that the U.S., despite many superficial strengths, is alarmingly vulnerable to new diseases—and such diseases are inevitable. As the global population grows, as the climate changes, and as humans push into spaces occupied by wild animals, future pandemics become more likely. We are not guaranteed the luxury of facing just one a century, or even one at a time.

On September 3, the White House announced a new strategy to prepare for future pandemics. Drafted by the Office of Science and Technology Policy, and the National Security Council, the plan would cost the U.S. $65 billion over the next seven to 10 years. In return, the country would get new vaccines, medicines, and diagnostic tests; new ways of spotting and tracking threatening pathogens; better protective equipment and replenished stockpiles; sturdier supply chains; and a centralized mission control that would coordinate all the above across agencies. The plan, in rhetoric and tactics, resembles those that were written before COVID-19 and never fully enacted. It seems to suggest all the right things.

But the response from the health experts I’ve talked with has been surprisingly mixed. “It’s underwhelming,” Mike Osterholm, an epidemiologist at the University of Minnesota, told me. “That $65 billion should have been a down payment, not the entire program. It’s a rounding error for our federal budget, and yet our entire existence going forward depends on this.” The pandemic plan compares itself to the Apollo program, but the government spent four times as much, adjusted for inflation, to put astronauts on the Moon. Meanwhile, the COVID-19 pandemic may end up costing the U.S. an estimated $16 trillion.

*

In 1849, after investigating a devastating outbreak of typhus in what is now Poland, the physician Rudolf Virchow wrote, “The answer to the question as to how to prevent outbreaks is quite simple: education, together with its daughters, freedom and welfare.” Virchow was one of many 19th-century thinkers who correctly understood that epidemics were tied to poverty, overcrowding, squalor, and hazardous working conditions—conditions that inattentive civil servants and aristocrats had done nothing to address. These social problems influenced which communities got sick and which stayed healthy. Diseases exploit society’s cracks, and so “medicine is a social science,” Virchow famously said. Similar insights dawned across the Atlantic, where American physicians and politicians tackled the problem of urban cholera by fixing poor sanitation and dilapidated housing. But as the 19th century gave way to the 20th, this social understanding of disease was ousted by a new paradigm.

When scientists realized that infectious diseases are caused by microscopic organisms, they gained convenient villains. Germ theory’s pioneers, such as Robert Koch, put forward “an extraordinarily powerful vision of the pathogen as an entity that could be vanquished,” Alex de Waal, of Tufts, told me. And that vision, created at a time when European powers were carving up other parts of the world, was cloaked in metaphors of imperialism, technocracy, and war. Microbes were enemies that could be conquered through the technological subjugation of nature. “The implication was that if we have just the right weapons, then just as an individual can recover from an illness and be the same again, so too can a society,” de Waal said. “We didn’t have to pay attention to the pesky details of the social world, or see ourselves as part of a continuum that includes the other life-forms or the natural environment.”

Germ theory allowed people to collapse everything about disease into battles between pathogens and patients. Social matters such as inequality, housing, education, race, culture, psychology, and politics became irrelevancies. Ignoring them was noble; it made medicine and science more apolitical and objective. Ignoring them was also easier; instead of staring into the abyss of society’s intractable ills, physicians could simply stare at a bug under a microscope and devise ways of killing it. Somehow, they even convinced themselves that improved health would “ultimately reduce poverty and other social inequities,” wrote Allan Brandt and Martha Gardner in 2000.

But here is public health’s bind: Though it is so fundamental that it can’t (and arguably shouldn’t) be tied to any one type of emergency, emergencies are also the one force that can provide enough urgency to strengthen a system that, under normal circumstances, is allowed to rot. When a doctor saves a patient, that person is grateful. When an epidemiologist prevents someone from catching a virus, that person never knows. Public health “is invisible if successful, which can make it a target for policy makers,” Ruqaiijah Yearby, the health-law expert, told me. And during this pandemic, the target has widened, as overworked and under-resourced officials face aggressive protests. “Our workforce is doing 15-hour days and rather than being glorified, they’re being vilified and threatened with bodily harm and death,” Harrison told me. According to an ongoing investigation by the Associated Press and Kaiser Health News, the U.S. has lost at least 303 state or local public-health leaders since April 2020, many because of burnout and harassment.

Even though 62 percent of Americans believe that pandemic-related restrictions were worth the cost, Republican legislators in 26 states have passed laws that curtail the possibility of quarantines and mask mandates, as Lauren Weber and Anna Maria Barry-Jester of KHN have reported. Supporters characterize these laws as checks on executive power, but several do the opposite, allowing states to block local officials or schools from making decisions to protect their communities. Come the next pandemic (or the next variant), “there’s a real risk that we are going into the worst of all worlds,” Alex Phelan, of Georgetown University, told me. “We’re removing emergency actions without the preventive care that would allow people to protect their own health.” This would be dangerous for any community, let alone those in the U.S. that are structurally vulnerable to infectious disease in ways that are still being ignored.

But inequity reduction is not a side quest of pandemic preparedness. It is arguably the central pillar—if not for moral reasons, then for basic epidemiological ones. Infectious diseases can spread, from the vulnerable to the privileged. “Our inequality makes me vulnerable,” Mary Bassett, who studies health equity at Harvard, told me. “And that’s not a necessary feature of our lives. It can be changed.”

In this light, the American Rescue Plan—the $1.9 trillion economic-stimulus bill that Biden signed in March—is secretly a pandemic-preparedness bill. Beyond specifically funding public health, it also includes unemployment insurance, food-stamp benefits, child tax credits, and other policies that are projected to cut the poverty rate for 2021 by a third, and by even more for Black and Hispanic people. These measures aren’t billed as ways of steeling America against future pandemics—but they are. Also on the horizon is a set of recommendations from the COVID-19 Health Equity Task Force, which Biden established on his first full day in office. “The president has told many of us privately, and said publicly, that equity has to be at the heart of what we do in this pandemic,” Vivek Murthy, the surgeon general, told me.

Last year, “for a moment, we were able to see the invisible infrastructure of society,” Sarah Willen, an anthropologist at the University of Connecticut who studies Americans’ conceptions of health equity, told me. “But that seismic effect has passed.” Socially privileged people now also enjoy the privilege of immunity, while those with low incomes, food insecurity, eviction risk, and jobs in grocery stores and agricultural settings are disproportionately likely to be unvaccinated. Once, they were deemed “essential”; now they’re treated as obstinate annoyances who stand between vaccinated America and a normal life.

The pull of the normal is strong, and our metaphors accentuate it. We describe the pandemic’s course in terms of “waves,” which crest and then collapse to baseline. We bill COVID-19 as a “crisis”—a word that evokes decisive moments and turning points, “and that, whether you want to or not, indexes itself against normality,” Reinhart told me. “The idea that something new can be born out of it is lost,” because people long to claw their way back to a precrisis state, forgetting that the crisis was itself born of those conditions.

Better ideas might come from communities for whom “normal” was something to survive, not revert to. Many Puerto Ricans, for example, face multiple daily crises including violence, poverty, power outages, and storms, Mónica Feliú-Mójer, of the nonprofit Ciencia Puerto Rico, told me. “They’re always preparing,” she said, “and they’ve built support networks and mutual-aid systems to take care of each other.” Over the past year, Ciencia PR has given small grants to local leaders to fortify their communities against COVID-19. While some set up testing and vaccination clinics, others organized food deliveries or educational events. One cleaned up a dilapidated children’s park to create a low-risk outdoor space where people could safely reconnect. Such efforts recognize that resisting pandemics is about solidarity as well as science, Feliú-Mójer told me.

The panic-neglect cycle is not irresistible. Some of the people I spoke with expressed hope that the U.S. can defy it, just not through the obvious means of temporarily increased biomedical funding. Instead, they placed their faith in grassroots activists who are pushing for fair labor policies, better housing, health-care access, and other issues of social equity. Such people would probably never think of their work as a way of buffering against a pandemic, but it very much is—and against other health problems, natural disasters, and climate change besides. These threats are varied, but they all wreak their effects on the same society. And that society can be as susceptible as it allows itself to be. ~

https://www.theatlantic.com/health/archive/2021/09/america-prepared-next-pandemic/620238/?utm_source=pocket-newtab


from another source:

Rural Americans are dying of Covid at more than twice the rate of their urban counterparts — a divide that health experts say is likely to widen as access to medical care shrinks for a population that tends to be older, sicker, heavier, poorer and less vaccinated.

While the initial surge of Covid-19 deaths skipped over much of rural America, where roughly 15 percent of Americans live, nonmetropolitan mortality rates quickly started to outpace those of metropolitan areas as the virus spread nationwide before vaccinations became available, according to data from the Rural Policy Research Institute.

Since the pandemic began, about 1 in 434 rural Americans have died from Covid, compared with roughly 1 in 513 urban Americans, the institute’s data shows. And though vaccines have reduced overall Covid death rates since the winter peak, rural mortality rates are now more than double those of urban areas — and accelerating quickly.

https://www.nbcnews.com/health/health-news/covid-killing-rural-americans-twice-rate-people-urban-areas-n1280369

*
THE GENTLEMANLY END OF THE AMERICAN REVOLUTION

~ The American War for Independence lasted six and a half years, from April 19, 1775—when “the shot heard round the world” was fired at the battle of Concord—to the British surrender at Yorktown on October 19, 1781. The two sides could have hardly looked more different. Britain had a highly professional expeditionary force; the revolutionaries had an improvised assortment of Continentals and militiamen. The British, in order to win, had to vanquish Patriot armies and persuade Loyalists that Britain was determined to protect them and hold onto its American outpost. The Americans had the advantage of fighting on their own ground, and they had time on their side.

The conflict began in New England, stalled on the Middle Atlantic coast, and ended in the South. There it pitted one of the eighteenth century’s greatest commanders—Lieutenant General Charles, Second Earl Cornwallis—against Americans of varying competence, from Horatio Gates at the low end all the way up to Nathanael Greene. On paper Cornwallis ought to have won, and in the field he often did. Yet he fell short in the end.

The earl was not the only Englishman responsible for losing America, but to Southerners, Cornwallis remains the archetypal Redcoat. The conflict that consumed America from 1861 to 1865 with its attendant cloud of slavery does not monopolize war memory in the South. Before Sherman burned Columbia, Cornwallis and the British captured Charleston in a classic siege. At Camden, South Carolina, they destroyed American forces in a set-piece engagement of line warfare. At Guilford Courthouse in North Carolina, they held the field against America’s best troops and commanders. In dozens of backcountry skirmishes and guerilla actions in the Carolinas—Waxhaws, Blackstock’s Plantation, Fish Dam Ford, Musgrove’s Mill, Eutaw Springs—Crown forces and Patriot bands furiously battled each other in swamps and scrublands that remain hallowed grounds to this day. No single instance of such irregular combat was decisive, but it wearied the British and reminded Patriots that at least they had not yet lost.

With things bogged down in the North, the British had decided to roll up the rebellion from the South: from Georgia to South Carolina, North Carolina, and finally Virginia. It was a plausible plan that nearly worked. Scholars tell us why it didn’t: a weak chain of command from the civil authority in London to theater and field commanders in America; faulty assumptions about the Loyalists’ readiness to join Britain’s fight; a British force that, however seasoned and professional, was still too small to carry out an aggressive clear-and-hold strategy; and failure to retain naval supremacy at a critical moment. It was on this last point that the conflict finally turned. Had the British and not the French won the Battle of the Capes off Cape Henry and Cape Charles in the early autumn of 1781, enabling Cornwallis to resupply or possibly evacuate to a stronger position, the war might not have ended as precipitously as it did at Yorktown. The French had not promised their American allies to stay on forever, and the Redcoats still had plenty of fight left in them, which made surrender, when it came so suddenly, all the more ignominious.

Defeat in America, however, did little damage to Cornwallis’s career. He triumphed in a pamphlet war in England with Sir Henry Clinton, Britain’s North American commander-in-chief, over which man was most to blame for the American debacle. Two years later, with American independence formally recognized in the Treaty of Paris, the military controversy fell from the headlines. From Britain’s perspective, Cornwallis is most remembered as the highly regarded governor-general of India (1786–93), where he brought in major administrative reforms and in battle defeated the Sultan of Mysore, assuring British dominion. He later served as Lord Lieutenant, or Viceroy, of Ireland (1798–1801), where he crushed the Wolfe Tone rising and repelled French invaders. And he was a key figure in the Acts of Union of 1800, which created the United Kingdom of Great Britain and Ireland.

The Indian parallel with America is telling. In India, Cornwallis was not the last Redcoat, but among the first, who still had a long run ahead of them to 1947. It was how that run ended that besmirched much of the good that had gone before. Winston Churchill condemned Britain’s clamorous exit as a shameful scuttle with horrific consequences: largely reputational for the British, but quite real for the Indians. In contrast, when the last Redcoat quit the Virginia field 166 years earlier, it was with honor intact. By eighteenth-century standards, Cornwallis was an exceptionally humane commander, and it is history’s judgment that few could have done better with the resources at hand in a difficult war so far from home. Under his command, the action in the closing southern campaign was well fought, and defeat, when it came, was accepted as such.

The surrender scene at Yorktown that October 1781—matched only by the stillness at Appomattox in April 1865 and on the battleship Missouri in Tokyo Bay in September 1945—is etched deep in the mythology of American nationhood. Hopelessly outnumbered and outgunned by combined Franco-American forces under George Washington and the Comte de Rochambeau, Cornwallis chose honorable surrender, which was acceptable under the laws of war of the day. Washington dictated that surrendering combatants be treated as prisoners of war and then paroled. At noon on October 19, the British garrison lined up to ground arms in a field along the Williamsburg Road. As Cornwallis was disabled with malaria and dysentery, the ceremony was carried out between the seconds-in-command. The British General Charles O’Hara surrendered his sword to the American General Benjamin Lincoln, the same officer who had surrendered Charleston to the British in May 1780, at the start of the southern campaign.

Britain’s presence in the thirteen colonies concluded with an orderly withdrawal. Some Loyalists departed with the Redcoats, but most remained behind in their homeland and blended into the new order. For the Americans, on whom fortune happened to shine, the conflict had been all about a cause: independence from Britain and the establishment of a new American nation. For the British, the motive for conflict had been more mundane and defensive: to maintain an imperial presence deemed vital to Britain’s interests and security. Success required a combination of institutional, administrative, and indeed intestinal fortitude—as well as patience—that late-eighteenth-century Britain could not yet muster, though the heirs of those who directed the war against America later learned to do so. Such capability now appears in retreat across the West. Cornwallis at Yorktown merely pointed the way; all the Redcoats are gone now. ~

https://newcriterion.com/blogs/dispatch/the-last-redcoat?fbclid=IwAR1RP355V9Is6t4PMoR24XSWuGVcRTfakWdc1iIrqAAH9yQAwM4AQ55PvnU

The Surrender of Lord Cornwallis by John Trumbull, 1820

*
THE PUZZLING BILLION-YEAR GAP IN THE FOSSIL RECORD

~ It is called the "Great Unconformity," and it has puzzled geologists ever since it was first noticed in the walls of the Grand Canyon by famed geologist John Wesley Powell in 1869. In geology, time is marked by layers of rock deposited atop each other, little by little over time. The thing about the Great Unconformity is that about a billion years of rock appear to be missing between 3 billion-year-old sediment and relatively young, 550-million-year-old stuff sitting directly on top of it.

Intriguingly, that 550-million-year date is just a few million years before the Cambrian explosion — the widespread appearance of complex life on Earth.

Although the Great Unconformity is easy to spot in the Grand Canyon, a similar disruption is apparent in lots of other places, and some geologists have hypothesized that whatever caused them was some kind of global event.

Now a new study suggests that, first, there may not have been just one unconformity but rather a series of them roughly coincident around the world. Second, they all may have had to do with an ancient supercontinent named Rodinia that formed about a billion years ago.

The new research is based on a dating technique called thermochronology. With thermochronology, a close examination of atoms inside rock samples allows geologists to construct a history of the stone based on how hot or cold it was at different times.

The researchers analyzed rock samples from a Great Unconformity site in Pikes Peak, Colorado, where the lower layer is from about a billion years ago and the rock above it from no earlier than 510 million years ago. Thermochronology revealed that the lower layer had been thrust upward to the surface about 700 million years ago, at which point it would have been subjected to erosion that scoured away its upper layers of rock.

Erosion is a powerful force. Consider the Grand Canyon. As study co-author Rebecca Flowers says, "Earth is an active place. There used to be a lot more rocks sitting on top of Mount Everest, for example. But they've been eroded away and transported elsewhere by streams."

BLAME RODINIA

It is believed that the supercontinent Rodinia — which pre-dated the better-known Pangaea — formed through a process called extrovert assembly, in which pieces of a prior supercontinent that had broken apart meet again after having traveled all the way around the planet. During their extended journey, the edges of the pieces experience significant erosion before smashing back together.

"At the edges of Rodinia," says Flowers, "where you have continents colliding, you'd see these mountain belts like the Himalayas begin to form. That could have caused large amounts of erosion." In addition, the researchers speculate that the birth and death of Rodinia may have wreaked havoc all over the world as its pieces first came together and then eventually broke apart.

Flowers concludes, "We're left with a feature that looks similar across the world when, in fact, there may have been multiple great unconformities, plural. We may need to change our language if we want to think about the Great Unconformity as being more complicated, forming at different times in different locations and for different reasons."

Other research teams, such as the one at University of California-Santa Barbara, have been coming to similar conclusions. "It's a messy process," says the school's Francis Macdonald. "There are differences, and now we have the ability to perhaps resolve those differences and pull that record out."

Solving Darwin's dilemma

Considering the Great Unconformity's temporal proximity to the Cambrian explosion, a final solution to the puzzle may have implications beyond geology. "The Cambrian explosion," says Macdonald, "was Darwin's dilemma. This is a 200-year-old question. If we can solve that, we would definitely be rock stars."

https://bigthink.com/surprising-science/great-unconformity


Pikes Peak, Colorado: an example of Great Unconformity

*
BARBARA EHRENREICH ON AGING

~ To her, aging is “an accumulation of disabilities”, which no amount of physical activity or rigorous self-denial can prevent. If she has symptoms, she’ll have them investigated. But when a doctor tells her there could be an undetected problem of some kind, she won’t play along.

Experience has taught her that standard health checks are at best invasive and at worst a scam. Overdiagnosis has become an epidemic. Bone density scans, dental x-rays, mammograms, colonoscopies, CT scans: she questions them all. Preventive medical care, in the US at least, has become a lucrative industry. Many doctors profit financially from the tests and procedures they recommend. And celebrity-driven campaigns for more screening increase the demand. People are being made sick in the pursuit of wellness. An estimated 70-80% of thyroid cancer surgeries performed on American, Italian and French women in the first decade of this century are now judged to have been unnecessary, she claims. And then there are all the elderly who “end up tethered by cables and tubes to an ICU bed”, their life needlessly prolonged and demeaned.

There’s an argument that health checks have value as rituals, that beeping machines in sterile rooms provide the kind of reassurance to modern western consumers that shamanistic drumming and animal horns do in more “primitive” cultures. Ehrenreich quotes from a 1950s spoof anthropology paper, Body Ritual Among the Nacirema (“American” spelled backwards), in which supplicants lie on hard beds within temples, while magic wands are inserted in their mouths and needles jabbed in their flesh. Modern medicine invokes science in its defense. But whereas science is “evidence-based”, medicine tends to be “eminence-based”, with patients in thrall to the doctor’s superior prestige. It’s no coincidence, Ehrenreich thinks, that most American medical schools still insist on the dissection of cadavers. That’s how living patients are expected to be – as passive and silent as corpses.

Ehrenreich’s skepticism about the medical profession is informed by her feminism and dates back to a moment in late pregnancy when she asked the male obstetrician who had just removed his speculum from her vagina whether her cervix was beginning to dilate: “Where did a nice girl like this learn to talk like that?” he said to the nurse standing nearby. That kind of misogyny may be on the wane but it hasn’t gone away. Women are still needlessly forced into humiliating positions by men in white coats. Gynecological examinations “enact a ritual of domination and submission”, with the patient made to undress and be open to penetration, much as in the criminal justice system, “with its compulsive strip searches”.

Deprived of agency in her encounters with the medical profession, Ehrenreich found an alternative by taking up physical exercise, which offers greater promise of control. Initially mortified by the feebleness of her body, she developed a scary competitiveness and graduated from a women-only gym to a unisex version, where at her zenith she would outdo young men and “draw spectators for my leg presses at 270 pounds and lunges while holding a 20-pound weight in each hand”. She still works out, but no longer sees gym-going as a means of empowerment. Large businesses once notorious for exposing their workers to unhealthy conditions now promote corporate wellness programs. But the benefits are dubious (their coercive nature may even be a source of workplace stress) and the idea that if you’re less than fit you’re less than human is pernicious.

Ehrenreich has fun at the expense of health sages and fitness gurus, with their mantras about the “wisdom of the body”. It would be unfair to describe her as gleeful when she lists some noted casualties – among them Apple co-founder Steve Jobs (who died at 56), US social activist Jerry Rubin (56), The Complete Book of Running author Jim Fixx (52) and the author of the book Younger Next Year: Live Strong, Fit and Sexy – Until You’re 80 and Beyond, Henry S Lodge (58). Still, she can’t resist wryly concluding: “If this trend were to continue, everyone who participated in the fitness culture – as well as everyone who sat it out – will at some point be dead.”

That we’ll all be dead, sooner or later, is no longer indisputable. Death-deniers are a growing body; the richer the individual (invariably a man), the more hubristic their claim to immortality, whether it’s Russian billionaire Dmitry Itskov with his plans to surpass Methuselah by living to 10,000 or Larry Ellison, co-founder of Oracle, who finds mortality “incomprehensible” (“Death makes me very angry”). The diseases of old age used to be seen not just as inevitable but as kindly, even altruistic. Now they’re regarded as cruel, abnormal and, according to one expert (the author of Younger Next Year), “an outrage”.

“Every cell is on your side,” goes the holistic slogan, but Ehrenreich disproves it. Against the utopian presumption that our cells work in harmony, “like citizens of a benign dictatorship” or a smoothly running machine, she presents the body as a site of constant conflict and deadly combat, with cells pitted against each other as well as against external invaders. Those seeming good guys, the macrophages, turn out to be cheerleaders for death, encouraging cancer cells to do their worst.

Philosophically, she concedes, it’s hard to imagine immune cells being accomplices in destruction; some scientists still dispute it and she owns up to “simplifying to an extent that would annoy many cellular immunologists”. But she makes the case persuasively, with 20 pages of notes and citations to back her up.

More controversially, she sees cells acting as though with a mind of their own – not following instructions but doing as they please. “Cellular decision-making” is the term for it, though you could also call it “free will”. Not that cells possess consciousness, but they are capable of acting in ways that are neither predetermined nor random, with an outcome of inflammation and disease. It’s a pessimistic scenario but at the end of the book Ehrenreich offers a glimmer of light. For such agency to exist at a microscopic level in our bodies points to a universe swarming with activity – and to mysteries beyond our ken. To recognize that, and see oneself humbly as a transient cell in a larger animistic order of being, makes the prospect of death easier to accept.

Or so Ehrenreich concludes. After her earlier sendup of New Age platitudes, this quasi-Buddhist celebration of non-selfhood sounds less than convincing. As does her recommendation of psilocybin (the active ingredient in magic mushrooms) as a way to abolish the self and approach death with equanimity: by taking a trip on a psychedelic drug, you’ll appreciate the beauty of the universe and go more gently into that good night. Really? Might you not be keener to stick around? And more resentful of the world going on without you? Like most polemicists, Ehrenreich is more persuasive when on the attack than when it comes to offering solutions.

Still, she is one of our great iconoclasts, lucid, thought-provoking and instructive, never more so than here. That PhD in cellular immunology, left behind while she went on to write books and run campaigns, has proved useful after all.

https://www.theguardian.com/books/2018/apr/12/natural-causes-by-barbara-ehrenreich-review

From another source:

~ Ehrenreich’s concern is that medicine has grown both so powerful and so profitable that many procedures have become practically automatic and unquestionable. In particular, patients have grown accustomed to undergoing "tests and examinations that, in sufficient quantity, are bound to detect something wrong or at least worthy of follow-up" -- procedures whose reliability is sometimes dubious, with serious consequences from false positives.

An example is the extreme rate of overdiagnosis for thyroid cancer: some 70 to 80 percent of the surgery for it performed on women in the U.S., France and Italy in the '00s is now determined to have been unnecessary, leaving patients with a lifelong dependence on thyroid hormones. Just as excessive reliance on antibiotics gave rise to "nightmare germs" they are ineffective to treat, the effort to steal a march on every medical vulnerability as you age can boomerang.

But overtesting is, by Ehrenreich's account, ultimately more a symptom than the real problem. Moving from social criticism to scientific popularization and contemplative digressions on how we situate ourselves in the natural order, she finds that both medical science and the self-help culture are prone to exaggerating the possibilities for human control over life, health and long-term success in the pursuit of happiness. She puts it this way:

"Our predecessors proceeded from an assumption of human helplessness in the face of a judgmental and all-powerful God who could swoop down and kill tens of thousands at will, while today’s assumption is one of almost unlimited human power. We can, or think we can, understand the causes of disease in cellular and chemical terms, so we should be able to avoid it by following the rules laid down by medical science: avoiding tobacco, exercising, undergoing routine medical screening and eating only foods currently considered healthy. Anyone who fails to do so is inviting an early death. Or to put it another way, every death can now be understood as suicide."

Which is ridiculous, of course, as is the point. In a section of the book that reads as if the author is planting dynamite under every "holistic" institution ever to promote wellness, she challenges the popular understanding of the body as "a smooth-running machine in which each part obediently performs its tasks for the benefit of the common good." There is evidence to the contrary in the behavior of the immune system, and we might do better to picture a norm of "conflict within the body … carried on by the body’s own cells as they compete for space and food and oxygen. We may influence the outcome of these conflicts-- through our personal habits and perhaps eventually through medical technologies that will persuade immune cells to act in more responsible ways -- but we cannot control it.”

Control is a short-term proposition, at best, while our long-range chances were best put by whoever designed that T-shirt that reads "Exercise. Eat Right. Die Anyway."
~

https://www.insidehighered.com/views/2018/04/06/review-barbara-ehrenreich-%E2%80%98natural-causes-epidemic-wellness-certainty-dying-and


 Rembrandt: Old Bearded Man

Mary: THE EQUIVALENCE OF HEALTH AND VIRTUE IS DEEPLY EMBEDDED

Ehrenreich is describing a pervasive attitude in our culture about health and wellness that is not only untrue but destructive and dangerous. The idea that you can control your "wellness" through "healthy choices" is not only misleading, it puts the blame on the sick for their own sickness. If you are sick, the implication goes, you didn't make the choices and do the work to stay well.

This attitude is pervasive. You deserve no sympathy, only a severe lecture. And instructions for better behavior, better "choices." You are doing everything wrong, and that's why you're sick.

This is similar to the Evangelicals' assumption that the rich are those rewarded for their virtue, so poverty is a moral failure by the poor. In terms of health the assumption is that the ones scrupulous about diet and activity, keeping "fit,"  up to date on their "preventive screenings," and avoiding bad, unhealthy habits, are virtuous and deserving. The sick have sinned against those dictates, causing their own sickness, and deserve judgement more than help.

This is simply not true. There is no set of rules to guarantee health and longevity -- witness that list of early deaths in the devoted "wellness" camp. You may improve your odds, but you can't win...your death is inevitable. We do know how to avoid and treat many ills, and we have steadily increased average lifespans, but often that increased time can be measured in misery and disability. You can do everything right and still get sick...I think we need to stop judging this as an individual failure. This won't be easy, as the equivalence of health and virtue is deeply embedded. Think of how we speak of people with cancer....they put up a fight, then are defeated. This demands that the victim fight fiercely and see death as a failure, as defeat. Defeat because the person did the wrong things, wasn't a good fighter, gave up, disappointed everybody.

The pandemic itself should teach us the limits of our control: something unexpected can come and overwhelm our resources, and while we can learn strategies to lessen its toll, it can play havoc with things like average life expectancy, and even with our ability to use the weapons we have to address other severe problems, as when surgeries must be cancelled, or patients left untreated because all resources are tied up. This current crisis certainly involves virtue judgement as well. The unvaccinated carry the burden of condemnation, their refusal the sin causing their own ills and prolonging/worsening the hold of the pandemic on the rest of us. It's hard not to want to see them punished, and many have voiced their desire to see them refused care altogether.

Of course there are many other factors determining the situation...class, education, location, vocation, politics, ethnicity, each part of our present fix, each only partially understood, and addressed feebly at best.

A word on "preventive medicine" and "screenings." Screenings have become available through technology, and are now almost universally "required." Our doctors and health organizations promote them unflaggingly, and, again, those who refuse are seen as irresponsible and careless. But while they may be useful in revealing disease at an early and treatable stage, they also come up with both false positives and questionable results. Some discovered cancers are so small and slow-growing that the patient might live out his life without significant threat.

But in all cases the procedures generate both income and treatment, primarily surgeries, which also generate income. So I'm sure there are many, many unnecessary interventions...at the same time, there is uncertainty enough that most would go with intervention anyway, "just to be sure." Among those who have medical coverage (an important caveat), those who refuse preventive screenings are rare. Among those whose screenings show something positive, those who refuse treatment are even rarer.

The medical issues got me carried away here! So many truly hard dilemmas.

Oriana:

I’ve posted on Ehrenreich before. What she says has become so controversial that it’s worth discussing again and again.

There is indeed a self-righteous attitude in both mainstream medicine and alternative health care. There is a rejection of obvious verities, e.g. the most important risk factor for cancer, heart disease and many other nasty diseases is age — the underlying process of aging itself. And you certainly can’t blame someone for being, say, 85. No amount  of diet and exercise can prevent the cumulative damage of aging. To some extent, we can slow down that damage, but we must be aware that ultimately we are not in control.

I like Ehrenreich’s attitude of being “old enough to die” — and needing to arrange one’s life accordingly. Do we want to spend our last years in the medical gulag, going from one specialist to another, having one invasive test after another, undergoing treatments that destroy the quality of any remaining life and eventually fail anyway? Sometimes non-testing and non-treatment is the most rational and humane approach — also in terms of the cost to society, which in the US has reached absurd levels.

(And Ehrenreich also says something that’s very hard to reconcile with the platitudes about the  “wisdom of the body”  — namely, it’s our immune system that turns against us and kills us in the end.)

Having said that, we can do some things that help, without harboring the delusion of immortality. Una’s example showed me that having something to live for — in her case, poetry and participation in the poetry community — can keep you alive into your nineties, still enjoying life despite disability. And having something to live for is probably the most important thing, and that lies outside of medicine. Humans need to feel that life is worth living, a challenge to many in their post-retirement years.

Avoiding junk food is important — it’s what we *don’t eat* that seems to matter more than any specific diet. Too many supplements can wreck your liver, but a few are worth taking: berberine, whose action is equivalent to that of metformin, switching the body to a more beneficial type of metabolism (berberine is actually better than metformin because it also dramatically improves the lipid profile); OMAX curcumin (unfortunately only this brand seems to work) has a potent effect on a master switch of inflammation; Vitamin K2 protects blood vessels from calcification; ubiquinol (the active form of CoQ10) protects the heart and lowers blood pressure. I'm also watching for new studies of quercetin and astaxanthin — mainly for their anti-inflammatory effect (aging and inflammation are inseparable — and note that the crucial feature of inflammation is the involvement of the immune system; and note again that Ehrenreich has a PhD in immunology).

Ehrenreich has empowered some readers to refuse incessant testing and dubious interventions. I started refusing some of these long before reading Ehrenreich’s book, simply because the waste of time and the torture involved did not seem worth it. Thanks to me my mother avoided the torture of mammography (what sense does it make past the age of 85? If they find something, the woman won't survive the treatment). Now I feel all the more empowered, as well as convinced that our medical system needs a serious re-thinking. 


Galega officinalis, or goat's rue, contains galegine, the compound that served as the template for the anti-diabetic and life-extending drug metformin. Metformin began to be used in France in 1957; the U.S. approved it in 1995. It is now the most widely prescribed drug treatment for diabetes.

*

But here is some good news about the brain:

*
HOW THE BRAIN DEFENDS ITSELF AGAINST AGING

~ The brain is a well-designed machine. If it’s working well—like when it’s reading the words on this page—we don’t notice it at all. At night when we sleep, the brain takes our consciousness offline so it can start its real work: sorting through the day’s information, storing the important parts, and cleaning out the gunk that accumulated.

The brain is so well-designed, in fact, that we hardly even notice when it’s breaking down.

Like the rest of our organs, the brain undergoes its own aging process. And yet the majority of adults don’t experience major cognitive decline—the kind that severely limits their ability to live independently—over time.

That’s because the brain is one of the most resilient organs in the body. Yes, dementia affects about 5.6% of the world’s population, a share that includes the devastating burden of Alzheimer’s disease. But in normal aging, even as parts of the brain shrink and neurons lose connection with each other, those changes only have a minor effect on our daily lives. It may be frustrating to forget where you put your keys, but you can still learn that you’re prone to forgetting them, and pick up the habit of writing notes for yourself. 

For adults who remain neurologically healthy into their later years, the brain constantly adapts and even thrives under new conditions. But how it pulls it off is a mystery scientists are still trying to solve. The hope is that if researchers can understand how healthy brains stay resilient, they can identify what’s happening when these systems fail—often, leading to dementia. 

NEVER CONSTANT

The brain’s incredible resilience comes, at least in part, from its plasticity. The rest of the body’s organs carry out roughly the same job from the moment we’re born—albeit on a larger scale as we grow. The heart pumps blood, the liver and kidneys filter, and the stomach churns food.

Not the brain.

Babies’ brains are equipped with billions of neurons, but they have to be warmed up and molded to be useful. Over as many as 25 years, neurons form hundreds of thousands of connections as we learn and make memories. Some of these connections are cropped as they’re not needed; others grow stronger as we learn to reason in the abstract, mitigate impulsive and risky behavior, and plan ahead for the future. 

Shortly after the brain finishes fully forming, though, it starts to wear down. 

“Aging is a lifelong biological process,” says Kristin Kennedy, a neuroscientist at the University of Texas at Dallas who studies healthy cognitive aging. There’s some disagreement about exactly when the brain starts to show signs of wear and tear. Some of the limited research available suggests it happens around middle age, some suggests our 30s, and some even in our 20s. But the consensus is that some shrinkage is inevitable and normal. Specifically, the prefrontal cortex and medial lobes—areas involved with high-level functions like planning, emotional processing, learning, and memory—get a little smaller, says Elizabeth Zelinski, a neuroscientist and gerontologist at the University of Southern California. 

Research hasn’t shown that brain shrinkage causes those mild memory changes, though. And on the whole, the physical changes in the brain barely affect our daily lives. Take, for example, the results of the Long Beach Longitudinal Study, which Zelinski started in 1978 to track the cognitive health of hundreds of healthy adults. One of its findings was that people forget roughly one tenth of a word per year. In other words, Zelinski says, if you could remember a series of 17 words in one year, it’d take a full decade for you to only be able to recall 16—a decline most people wouldn’t even notice. The brain’s plasticity, on average, keeps our memories delightfully intact. 

Only about 5% to 8% of adults go on to develop dementia, which is characterized as severe cognitive decline that inhibits everyday function. Scientists still aren’t sure what goes wrong in these cases, and why these individuals’ brains are vulnerable in ways that normal, healthy brains are not.

The problem, says Jonathan Hakun, a psychologist at Penn State University, is that “nothing’s really entirely normal.” 

RESISTANT OR RESILIENT? THE GREAT DEBATE

Advances in neuroimaging have both helped and hindered the study of aging in the brain. It turns out, even when older adults carry out the same tasks as younger adults, their brains may be functioning completely differently. “The functional changes are wild,” Hakun says.

When we complete mentally stimulating tasks, we engage neurons all across our brains. Distant neurons talk to each other through axons, which extend, tentacle-like, from the body of one neuron to tickle another with an electrical signal. Scientists refer to the network of axons in our brains as “white matter,” while the bodies of neurons and other cells in the brain are “grey matter.”  

Starting in the early 2000s, imaging tools like MRIs gave scientists the insight that some cortical shrinking is common in older adults. But they uncovered another change in older brains, too: Imaging showed the structural integrity of the white matter in brains was a little weaker. In order to send electrical signals across brain regions, these neurons had to work harder than they would in a younger brain.

“It’s the concept of less wiring, more firing,” says Hakun. 

This phenomenon is called “hyperactivation,” and it’s not clear that it’s pathological—it could just be a normal part of aging. Practically, hyperactivation across older neurons means that they could get tired out faster than younger neurons. It may be why some healthy older adults experience gradual cognitive decline over time; their neural networks just can’t communicate through axons the way they used to.  

But other older adults with hyperactivated brains can complete tasks just as well as younger adults. “That’s a functional reorganization effect,” says Hakun. Somehow, the brain has been able to reconfigure itself to carry out the same task despite the changes in its white matter.

With the advent of the PET scan shortly after the MRI, researchers gained another insight into older brains. PET scans show scientists which proteins are present in different areas of the brain. Normally, brains keep themselves pretty clean, flushing out waste proteins that result from day-to-day cellular activities. 

But some get a little messier over time—unbeknownst to their owner. Researchers have found that some perfectly healthy individuals have higher levels of a protein called amyloid-beta in their brains than others. It’s not clear why. It could be that their brains are making more amyloid than they should, or that their brains aren’t cleaning it out properly. In either case, extra amyloid usually isn’t great; it can inflame and eventually kill neurons, and it’s one of the main culprits of Alzheimer’s disease. 

And yet having higher levels doesn’t necessarily mean a person will go on to develop the disease. Some brains have gotten used to operating at the same level with more amyloid around, or have found ways to cope with its accumulation. 

Which brings up an interesting question: What does “resilience” in the brain actually mean? Should scientists say resilient brains are those that seem to be able to cope with hyperactivation or buildups of protein that should impair their function, but don’t? Or should the resilient ones be those that are somehow resistant to these changes in the first place? 

In a way, it doesn’t matter how the brain stays healthy—just that it does however it can. “The brain has a lot of redundant systems,” says Zelinski. “So basically, if one system gets knocked out, other parts will compensate over time.” 

Scientists aren’t sure how, but it appears that experiences earlier in life that involve connecting more networks of neurons—classroom learning is a classic example—make it easier for brains to adapt to physical changes that occur later in life. Frustratingly, though, not knowing how these cognitive reserves form and keep the brain resilient means it’s difficult to harness this capacity into a treatment for dementia.

SUPER AGERS

There is one other route that researchers could go to try to understand normal cognitive aging: the so-called Super Agers. 

According to Emily Rogalski, a neuroscientist at Northwestern University, Super Agers are those who are at least 80 years old but perform on cognitive memory tests the way you’d expect someone in their 50s or 60s to perform. Rogalski and her team have spent over a decade recruiting Super Agers to her lab and studying them over time. To date, she’s had about 80 participants, some of whom have passed away but donated their brains to the research team.

It’s hard to say how many Super Agers are out there. Rogalski’s lab merely advertises for them, and in some cases recruits them; it’s impossible to do a population estimate based on the people who come to her. However, she suspects they’re rare—which, for her research, would ultimately be a good thing. “It’s easier to see if they have some features in common,” she says.

Superficially, Super Agers aren’t a homogenous group. Rogalski’s set spans socioeconomic backgrounds, education levels, and ethnicities. Some have gone through significant trauma; one is a Holocaust survivor, and others have lost children at a young age. Some drink alcohol, some exercise regularly, and some need walkers or wheelchairs to get around. 

They share some deeper features, though. For one thing, brain scans have shown that the front part of their brains, the frontal cortex, is a lot larger than that of their normally aging peers. It still appears to shrink—but at a much slower rate than in normal agers. They also seem to have more of a specific type of neuron, called von Economo neurons. Little is known about these cells, but they’re less common in people with severe mental illness, like schizophrenia—it could be that they play a role in social connections, Rogalski says. Super Agers tend to have stronger social networks than their peers, too. 

This last trend is the most actionable for adults worried about their cognitive health. Scientists don’t know why having social connections would contribute to the brain’s resilience, but loneliness and social isolation can be early signs of dementia.
 
Given the complexity of the brain, however, there’s likely not one secret to its success. And each redundant system that keeps it functioning is an opportunity for dementia researchers to someday find a successful treatment.

https://getpocket.com/explore/item/how-the-human-brain-stays-young-even-as-we-age?utm_source=pocket-newtab

*

OUR STRANGE BRAIN

Blindsight

~ “When the rear portion of the cerebral cortex, called the occipital or visual cortex, is destroyed by trauma, tumor, or stroke, patients become completely blind, in that they are not consciously aware of any visual stimuli. This condition, called cortical blindness, does have one very strange side effect in some patients, however—blindsight.

Because “unconscious” parts of the visual system (such as the superior colliculus of the midbrain which controls visual orientation) are preserved in cortical blindness, completely “blind” patients can walk across a room, weaving around obstacles in their way, and get to the other side unscathed, without ever consciously being aware that they are seeing anything. Similarly, when a ball is tossed at a person with blindsight, often that person can grab the ball in mid-flight, again, without consciously sensing anything.

Paris Syndrome

Some foreign tourists in Paris, almost always from Japan, and almost always with “typical” mental health, experience depersonalization, hallucinations, delusions, paranoia, racing heart, nausea, and vomiting when they visit Paris. French psychiatrists speculate that about 20 Japanese tourists a year suffer these symptoms because the reality of Paris (a normal, bustling big city with normal-looking people) diverges radically from the romantic Paris portrayed in Japanese media, in which all Parisians are pencil-thin fashion models, and the very air of Paris is suffused with magic. In other words, otherwise healthy first-time Japanese tourists in Paris undergo an extreme case of culture shock and decompensate.” ~

https://www.psychologytoday.com/us/blog/long-fuse-big-bang/201805/six-brain-phenomena-should-be-impossible-arent

Oriana:

There is also the “Jerusalem Syndrome.” It happens when a man arrives in Jerusalem and promptly announces he's the Messiah. Israeli security agents are prepared to deal with that, and the would-be Messiah is escorted to a psychiatric hospital.


*

CRITICIZING ISLAM WITHOUT DEMONIZING MUSLIMS

Sean Illing: This is not an easy book to write. You’re exposing yourself to a lot of criticism on all sides. So why write it?

Ali Rizvi: I grew up in a moderate to liberal Muslim family in three Muslim-majority countries that were culturally very different. I developed certain perspectives about the religion and the Muslim experience that most others didn’t have. I’m not just talking about Islam itself, but also the Muslim experience, which is more personal and more to do with identity rather than ideology or belief.

Like most issues, in the United States especially, the conversation around this issue — about Islam, Muslims, and terrorism — eventually diverged into the left and the right. You had the liberals with their view, and the conservatives with their view, and I felt both of them were really missing the mark. They were both conflating “Islam” the ideology and “Muslim” the identity. Islam is a religion; it’s a set of beliefs, a bunch of ideas in a book. It's not human. Muslims are real, living, breathing people, and to me, there's a big difference between criticizing ideas and demonizing human beings.

Sean Illing: And your sense was that both the left and the right were failing to capture this distinction?

Ali Rizvi: Neither side was making that distinction. On the left, people were saying that if you have any criticism against Islam, then you were a bigot against all Muslims. On the right, it was like, there are a lot of problematic things in Islamic scripture, so everyone who is Muslim must be banned, or profiled, or demonized. Both sides weren't making that distinction between challenging ideas, which has historically moved societies forward, and demonizing human beings, which only rips societies apart.

Sean Illing: How does your book split this difference?

Ali Rizvi: I think all of us have the right to believe what we want, and we must respect that right, but that doesn't necessarily mean we have to respect the beliefs themselves. That's what this book is about. It’s about making that distinction between Islamic ideology and Muslim identity, and exploring how we can have an honest conversation about ideas and beliefs without descending into bigotry against those who might challenge or hold them.

Sean Illing: I think a lot of what you’re saying leads back to a fundamental question about whether Islam (or really any religion) is essentially a culture — or where the line between the two is drawn.

Ali Rizvi: There’s definitely some interplay between the two. But culture is always evolving. If you look at secular societies like the United States, the way it was in the 1950s is very different from the way it is now. It's moved a lot, culturally. But religion freezes culture in time. Religion dogmatizes culture and arrests its evolution.

Sean Illing: You might also say that religion helps to create and reinforce culture, but I take your point.

Ali Rizvi: Sure, and there are aspects of this that can be positive. There are many of us who are atheists but retain some cultural elements of the religion. For example, I still enjoy the Eid holiday and the fast-breaking iftar feasts of Ramadan with my family. I have pleasant childhood associations and memories with these things.

This is true for other religions too. Richard Dawkins himself, who is a ... well, you don't get more atheist than Richard Dawkins. Yet he has also described himself as a cultural Christian. He even says he prefers singing the religious Christmas carols like “Silent Night” to the others, like “Jingle Bells” and “Rudolph the Red-Nosed Reindeer.” I think we should be able to enjoy some of these rituals without the burden of belief.

Sean Illing: There’s a lot more to be said about this, but I want to refocus us on the political questions. I’ll be honest: I came to this conversation with some trepidation. I’m of the left, but I do believe there is an element of the left that struggles to talk honestly about the problems in the Muslim world, in part because so many feel obliged (rightly, I think) to beat back the bigotry on the right and also because religion is rarely the only variable driving behavior.

But when I saw your tweet the other day claiming that the left was wrong about Islam and the right was wrong about Muslims, that felt like a good way into this difficult debate. Can you tell me what you meant by that?

Ali Rizvi: I think the left has a blind spot when it comes to Islam and the right has a blind spot when it comes to Muslims. When Christian fundamentalists like Pat Robertson say something that's homophobic or misogynistic, people on the left descend on them like a ton of bricks. They’re very comfortable with criticizing and satirizing fundamentalist Christianity. But when it comes to Islam, which has many of the same homophobic and misogynistic teachings, they throw their hands up, back off, and say, whoa, hold on, we must respect their religion and culture.

Sean Illing: You seem to applaud the intent here but still think it’s ultimately counterproductive.

Ali Rizvi: I get that it comes from a good place. I’m a liberal myself, and I vote liberal. It’s part of our liberal conscience to protect the rights of minorities, as they should be protected. But that doesn’t mean we must protect and defend all of their beliefs as well, many of which are just as illiberal as the beliefs of Christian fundamentalists.

This is very frustrating to our liberal counterparts in Muslim-majority countries, who are fighting fundamentalist Islam the same way that liberals here fight fundamentalist Christianity, and they’re even risking their lives for it. Many have died for it. Yet they hear their liberal counterparts in the West calling their ideas “Islamophobic.” This is a devastating double standard for them.

Sean Illing: And what of the right?

Ali Rizvi: Those on the right paint all Muslims with the same brush. The title of my book speaks to millions of people in the Muslim world who are atheist or agnostic but must publicly identify as Muslim or they’d be disowned, ostracized, or even killed by their families and governments. They’re atheist in thought but Muslim by presentation. They’re living a contradictory existence. Hence the title of the book.

They retain the Muslim label because the governments and Islamist groups in their countries won’t let them shake it off. Well, now, with Trump’s Muslim ban, especially the first one he proposed as a candidate in 2015, Trump won’t let them shake it off either. Blanket bans like that include many people like me, because we have Muslim names and come from Muslim-majority countries.

Islam isn’t a religion of war or peace

Sean Illing: I don’t believe Islam is inherently or necessarily violent, and I think a broad view of history justifies that claim. But there is, at this moment, an inordinate amount of chaos springing out of the Muslim world. Much of that is due to political and economic and social and historical factors, and I’m sure some of it has to do with specific religious doctrines. I don’t feel equipped to assign weights to these causes, and I’m about as far from an authority on Islam as one can get, so I struggle to say anything definitive or useful about these problems.

Ali Rizvi: I'm going to paraphrase my friend Maajid Nawaz on this. He says Islam is neither a religion of war nor a religion of peace. It's just a religion, like any other religion. Sure, the scriptures of these religions have inspired a lot of people to do good things, but they have also inspired a lot of people to do bad things as well.

Look at it this way. Do you know Jewish people who eat bacon? Almost all of my Jewish friends eat bacon. Now, does that mean that Judaism is suddenly okay with bacon?

This is the difference between religion and people. You can’t say, hey, I have a lot of Jewish friends who eat bacon, so Judaism must be okay with pork. It doesn't make sense. So when I say that most Muslims I know are very peaceful and law-abiding, that they wouldn't dream of violence, that doesn't erase all of the violence and the calls for martyrdom and jihad and holy war against disbelievers in Islamic scripture. Most of my Muslim friends, both in Pakistan and here, had premarital sex and drank alcohol too. That doesn’t mean Islam allows either of those things.

The hard truth is there is a lot of violence endorsed in the Quran, and there are other terrible things, as there are in the Old Testament. But there are many people in the world — even if they’re a minority of Muslims — who take their scripture seriously. It’s dishonest to say that violent Muslim groups like ISIS are being un-Islamic.

Sean Illing: And how do you account for all the other external factors that conspired to create the conditions of unrest in these countries?

Ali Rizvi: Foreign policy is a factor. It wasn’t long ago that the United States was hailing the Afghan mujahedeen as heroes for fighting against the Soviets. The word “mujahedeen” literally means people who wage jihad. That was a good thing for America in the 1980s. Bin Laden was among these fighters, and himself was a recipient of US funding and training. We’ve seen how that turned out.

But there’s also this — if you're a young Iraqi man and your family was bombed by the US, your reaction may be to become anti-American. You might say, okay, I'm going to fight these guys. But would your reaction to US foreign policy be to start enslaving and raping 9-year-old Yazidi girls? Or forcing local non-Muslim minorities to pay a tax or convert to Islam, or be crucified publicly, as commanded in the Quranic verses 9:29-30 and 5:33? Or beheading Shias or apostates who have left Islam? Or throwing gays off rooftops?

That isn’t simply a reaction to US foreign policy. These are things they're doing to their own people: killing apostates and taking sex slaves. So the question about weighting and how much it matters, it's a good question. But these people tell us why they do what they do. There are terrorists who, after a terrorist attack, will say, “This is our revenge for what you're doing to our lands and our people.” And then there are other times that they’ll put out statements saying, “This is what the Quran says.” ISIS often puts out very accurate statements quoting the Quran that completely fit their actions.

Sean Illing: Right, but again, it becomes awfully tempting to analyze this disorder in a vacuum. When states fail and societies collapse, you often see tribal and ethnic and religious violence depending on how the fault lines are drawn, and so it’s never as easy as isolating a text or some doctrines as the chief cause.

Ali Rizvi: Fair enough. The thing is, we have had a lot of discussion about US foreign policy and how that has caused problems in the Muslim world, but we somehow shy away from talking about the equally important religious, doctrinal basis for these terrorist acts. We shouldn’t deny either. I’m convinced that one of the main reasons we haven’t resolved this problem is that we are afraid to make the complete diagnosis.

The appeal of fundamentalism

Sean Illing: I wonder if a complete diagnosis is even possible. Step back and take a broader historical view. The contents of these texts haven’t changed — it’s the political and social and economic conditions that have changed. So the question then becomes what is it about these conditions that produces certain interpretations or leads to certain doctrines becoming more manifest?

Ali Rizvi: A lot of this has to do with the lure of belonging to a group and the search for an identity. In my book, I discuss the ideas of Erik Erikson, who coined the term “identity crisis,” and James Marcia, who wrote at length about how young people go about resolving it in terms of exploration and commitment to a set of values.

Identity achievement is characterized by high exploration and high commitment, meaning you expose yourself to a variety of options and then commit to a set of values that represents you best. Identity foreclosure is low exploration, high commitment. These are people who commit to a set of values without much exploration — such as those from strict religious upbringings who adopt their parents’ teachings without much questioning. Identity moratorium is high exploration, low commitment, marked by indecisiveness. And identity diffusion is low exploration, low commitment. These are your wandering souls.

If you look at how we as human beings resolve our identity crises as adolescents and young adults, you see that some of these processes, such as identity foreclosure in this case, lend themselves better to explaining what might cause a young person to join, or resist, violent ideologies like jihadism. I think it’s a much better model to both understand it and counter it. It also acknowledges the role of the ideology and doctrine itself, rather than deflecting from it.

Sean Illing: I agree that in many cases we’re talking about existentially adrift people, people pining for something grand or noble or meaningful in their lives. And in a lot of ways, ISIS or Islamic extremism is the biggest game in town on that front. These movements or groups offer a singularly purposeful struggle, and it’s hard to overstate the appeal of that.

Ali Rizvi: Yeah. I think that's actually very legitimate. A lot of these people are just wandering souls. They're just trying to find a place for themselves. But the more interesting question for me is why is Islam, why is this particular religion, so appealing to them? Why do people prone to violence find Islam so appealing for their purpose?

The way we think about this is strange. We try really, really hard to dance around it. When someone tells us they did something for political reasons, we accept it easily. “Sure, they did it for politics." When someone says, "I did this for money," we believe them. Even when people say, "I played Doom, the video game, and I listened to Marilyn Manson," we take it at face value and have all these cultural conversations about the role of video games and music in violence. 

But when people say, “I'm doing this in the name of Allah,” and quote verse 8:12, which says, “Strike the disbelievers upon the neck and strike from them every finger tip," and we see them doing exactly what those words say, we look at that and go, "No, no, it's got to be politics. It’s got to be for money. Let's see what video games they were playing.”

That's the only thing I have a problem with. I acknowledge the other causes. I have explored them in my book. Yes, there are political grievances, and there are foreign policy grievances. We never deny those. So why do we deny that religion itself, the scripture itself, can drive these atrocities?

Sean Illing: Those are fair points. I’ve often found myself struggling to argue that people can be confused about what’s actually motivating them, or at least blind to the root causes. But this is a difficult case to make in this context. In any case, we obviously need a nuanced conversation, and it’s just not happening.

Take someone like Sam Harris, who I think makes a decent point when he talks about the link between ideas and actions. Harris often understates the extent to which religious ideas can be props or justifications for behaviors that are motivated by nonreligious grievances. On the other hand, though, there are a lot of people who just deny such a connection altogether, which is absurd. Again, what’s interesting to me is what makes specific ideas attractive at specific periods of history? We have to isolate those conditions and causes.

Ali Rizvi: I think it's more complicated than that. Think of the [National Rifle Association] slogan, “Guns don’t kill people; people kill people.” The typical liberal response to that, and rightly so, is no, don’t downplay the deadliness of guns. You can’t take them out of the equation. Even if they’re just a tool or prop, they’re central to it.

Now replace “guns” in that statement with “religion” or “beliefs.” Religion is a much worse prop in this case, because it’s got ideological roots. There are words in the scripture that command, verbatim, exactly the kinds of violent acts we see Islamic militant groups do. They’re not quoting Islamic Studies professors at Al-Azhar University. They’re quoting the Quran and Hadith.

And yes, in some cases Islam is used by nonreligious people for other motives. A good example of this is when the Pakistani government banned YouTube in the country after a film mocking Islam and Muhammad went viral. This helped the government because it deprived political dissenters of a huge platform. Now, if they’d said, “We’re banning YouTube because we want to quash political dissent,” the entire country would’ve risen up against them. But when they said they wanted to do it to stop blasphemy against our beloved prophet, the masses supported them. So they used religious reasons for nonreligious purposes.

But this still doesn’t take away from the point. It still stands that religion — and I say religion in general this time because while Islam is especially dangerous today, the other Abrahamic religions have served the same purpose when they were dominant — lends itself extremely well to the goals and whims of authoritarians, tyrants, and the violent everywhere, whether it’s being used as a prop or driving them by belief.

The Trump factor

Sean Illing: Trump and Trumpism adds a whole other layer of urgency to this conversation. I think we have to find a way to talk about these problems in honest and productive ways, and that is all the more difficult against the backdrop of an explicitly anti-Muslim administration.

Ali Rizvi: That's what the book is about. The book is my answer to that question. How do we have an honest conversation about this without descending into bigotry, and how do we do it in a morally responsible way? I won't completely blame liberals for the rise of Trump — I think the far right owns a lot of that. But liberals aren’t blame-free. They left a vacuum.

The failure of liberals to address Islamism from an honest and moral position left a void that allowed the Trumpian right to opportunistically address it from a position of xenophobia and bigotry.

Harris warned of this — the hijacking of the conversation by irrational actors on the far right — over 10 years ago. And disagree with him as much as you want, but he has always been mindful of that distinction between criticizing Islam and demonizing Muslims. His book with Maajid, Islam and the Future of Tolerance, is evidence of that. This is a point that we just need to drive home and keep repeating. Unless we do that, we can't have a responsible conversation about it.

Sean Illing: Agreed. But that’s why someone like Harris, who I do think is occasionally unfairly criticized, makes a more productive conversation less likely. This came up recently in his podcast with Fareed Zakaria. If we say that a religion is reducible to the concretized doctrines in its holy text, then we don’t leave much room for evolution or reformation. As you said, a religion at any moment is essentially what its believers decide it is. The Bible is riddled with terrible Bronze Age dogmas, but most Christians don’t take those parts seriously any longer. The same can be true for any religion.

Ali Rizvi: Now we're getting into the idea of reform, and this is what Maajid Nawaz, who actually helped change Sam's view on this as well, talks about.

I don’t think anyone’s saying that a religion is reducible to the concretized doctrines in its holy text. I know Sam doesn’t think that either. But we are saying that those texts are a huge, huge part of the religion. In Islam, the divinity and infallibility of the Quran is the only thing that every sect and denomination agrees on. And again, no matter how many Jews start eating pork, the religion of Judaism will never be okay with swine flesh.

One thing Christians and Jews don’t always understand, because it’s hard to relate to, is that most Muslims do revere their holy text very differently from them. It’s not just divinely inspired or written by men of God. It is written by God himself, every letter, every punctuation mark. It’s literal, and it’s infallible. You can’t even touch the book unless you’ve performed an ablution ritual. It’s very serious.

What a reformation looks like

Sean Illing: Your book is partly a call for reformation. Given what you just said, what is it that you think should be done?

Ali Rizvi: I say that the first step to reform in Islam is rejection of infallibility. This seems outrageous to some. They say it’ll never happen. But it has happened in the past. 

Reform Jews today make up a majority of American Jews. None of them believe the Torah is the literal word of God anymore. But for a long time, that was the deal — the Torah was revealed to Moses at Mount Sinai and the Tabernacle like the Quran was to Muhammad starting in the Cave of Hira at Mecca. It was error-free. Suggesting otherwise was blasphemy — and look up Leviticus 24:16 to learn the consequences of that. [“And he that blasphemeth the name of the Lord, he shall surely be put to death, and all the congregation shall certainly stone him: as well the stranger, as he that is born in the land, when he blasphemeth the name of the Lord, shall be put to death.”]

Amazingly, in the last 10 or 15 years, I've started seeing younger Muslims start to doubt the absolute infallibility of the Quran. They say, you know, it was compiled so long after the prophet’s death by his companions, pieced together from their collective memories, something could’ve been left out or added in, and you can only say it’s divinely inspired, not purely divine. Now, that seems like a small demotion — but it's actually huge.

This is why I say I believe in Muslim reform, not Islamic reform. I don’t think using mental gymnastics to reinterpret scripture is convincing. You can’t keep saying “kill” actually means “love,” or “beat your wife” is misinterpreted and actually means “kiss your wife,” and stay credible. In the internet age, everything is exposed. It's online, you can look it up in a dozen languages, multiple translations, the context and syntax and etymology of every word — any 12-year-old can dig that up today.

But when you look at the entire book as a whole and you say, "Well, is this divine or is this just divinely inspired? What is the likelihood that God really said this? If God created binary pulsars and time dilation and tectonic plate shifts, all these amazing things, why would he care if I eat pork or who I have sex with?” That you can work with. Don’t change the way Islam reads, but try and change the way young Muslims think, how they approach and process information. Skepticism, empirical analysis, critical thinking.

Sean Illing: I think your perspective here is desperately needed, though I have no idea how likely it is to resonate. But I remain convinced that telling Muslims their religion is bullshit and built on false claims won’t make the world any better. Your book does a wonderful job of showing how religions are about a lot more than ideas. Any time you’re talking about religion, you’re also talking about identity and culture and ritual and community, and any approach that condemns Islam as such will no doubt alienate the vast majority of Muslims.
 
Ali Rizvi: Well, my book is pretty hard on the religion too, but it’s not about telling people their religion is bullshit — it’s about how you tell them that. I say in the book that setting the stage for the conversation is often more important than the conversation itself. I’ve had this conversation with my Muslim friends and family for a very long time, and I’ve often had to find creative ways to have it in countries where saying it like it is can have horrible consequences. I’ve always wanted to figure out the best way to have it, a way that is both honest and constructive.

The thing is, most Muslims don’t really know too much about Islam. They were born into Muslim families, so being Muslim is a lot like a birth identity for them. And when you criticize Islam or a problematic verse in the Quran, or joke about Muhammad, they take it personally as an attack on them, on their identity.

In my book, I deliberately tried to first establish a connection based on shared identity, and then move to the ideas. I talked about how I was raised, all the rituals my family and I participated in, all the little things that happen when you grow up Muslim — with the message that I’ve been where you’ve been. I was raised the same way. I respect how important this identity feels and how real this experience is. We come from the same place.

And once that’s locked down, I’ve noticed, in nearly every case, that people are much more receptive to criticism of their beliefs. It’s really amazing. It’s because now they know you’re talking about ideas and beliefs, and you’re not attacking them as people. And when that happens, I notice that many more people have doubts about their beliefs than you’d think.

Sean Illing: Are there times when being honest might be counterproductive, and if so, how do we balance that tension?

Ali Rizvi: There's an argument about what's productive versus what is honest, and how do we balance that — how do we be constructive while also being honest. When you talk very seriously about something like, you know, a man living inside a fish, there's no way to really talk about that without sounding like you're mocking something. That's just one example. I’m just saying how difficult it can sometimes be to have honest conversations about beliefs.

But I would urge liberals to have this conversation openly, honestly, and responsibly. It’s already happening within the Muslim world. Several white Western liberals have confided to me that they agree with what I say, but won’t say it themselves because they’re afraid they’ll be labeled bigots or Islamophobes. I call that “Islamophobo-phobia,” the fear of being called Islamophobic. It’s a great way to shut down the conversation and silence people with colonial or white guilt. 

I get that. That’s why the Muslim Brotherhood loves the term so much. It conflates legitimate criticism of Islam with anti-Muslim bigotry. And it exploits victims of anti-Muslim bigotry by using their experiences for the political purpose of censoring criticism of Islam. When you fall for that, when you hold back from standing up for your liberal values, you’re not helping to curb terrorism. You’re already a victim of it.

Liberals today enjoy the benefits of the Enlightenment, which their predecessors brought about through great acts of blasphemy and rebellion, often at deadly cost to their lives and livelihoods. Today, this conversation and this movement is happening within the Muslim world. It doesn’t just include the hijab-wearing women and bearded men you see on your TV. It includes the beer-drinking Muslim colleague you work with; it includes the Muslim girl at college who had doubts about her religion’s views on women; it includes agnostics, atheists, and free thinkers like me who want the freedom to change our minds without literally having to lose our heads. There are many voices in this conversation, and you don’t have to choose. 

Just let it happen. 

Be an ally, not a savior. 


*

FATTY LIVER DISEASE: HOW TO HEAL YOUR LIVER (HINT: EAT EGGS) 

Do you experience pain in your upper right abdomen? It could be your liver trying to let you know it’s being infiltrated by fat. Non-alcoholic fatty liver disease is very common — according to one source, 80% of women over forty have some degree of it. Being post-menopausal is a risk factor. Fatty liver disease can also be caused by being overweight, especially with high blood levels of triglycerides. Another factor contributing to fatty liver can be hypothyroidism, very common among older women.

Advanced cases (for instance, when you develop reddish palms and yellowish skin and whites of the eyes) definitely require medical help. When your symptoms are still mild (just abdominal discomfort on the right side), you can actually reverse fatty liver disease.

Here is an enlightening article:

Choline Metabolism Provides Novel Insights into Non-alcoholic Fatty Liver Disease and its Progression

~ Choline is an essential nutrient and the liver is a central organ responsible for choline metabolism. Hepatosteatosis and liver cell death occur when humans are deprived of choline. In the last few years there have been significant advances in our understanding of the mechanisms that influence choline requirements in humans and in our understanding of choline’s effects on liver function. These advances are useful in elucidating why non-alcoholic fatty liver disease (NAFLD) occurs and progresses sometimes to hepatocarcinogenesis [liver cancer].

Choline biology

Choline is a constituent of cell and mitochondrial membranes and of the neurotransmitter acetylcholine. Given its essentially ubiquitous incorporation into cellular components and pathways, it is not surprising that this nutrient influences diverse processes such as lipid metabolism, signaling through lipid second messengers, methylation-dependent biosynthesis of molecules (including epigenetic regulation of gene expression), activation of nuclear receptors, enterohepatic circulation of bile and cholesterol, plasma membrane fluidity, and mitochondrial bioenergetics.

Recent findings

Humans eating low choline diets develop fatty liver and liver damage. This dietary requirement for choline is modulated by estrogen and by single nucleotide polymorphisms (SNPs) in specific genes of choline and folate metabolism. The spectrum of choline’s effects on liver ranges from steatosis to development of hepatocarcinomas, and several mechanisms for these effects have been identified. They include abnormal phospholipid synthesis, defects in lipoprotein secretion, oxidative damage caused by mitochondrial dysfunction, and endoplasmic reticulum (ER) stress. Furthermore, the hepatic steatosis phenotype can be characterized more fully via metabolomic signatures and is influenced by the gut microbiome. Importantly, the intricate connection between liver function, one-carbon metabolism, and energy metabolism is just beginning to be elucidated.

Summary

Choline influences liver function, and the dietary requirement for this nutrient varies depending on an individual’s genotype and estrogen status. Understanding these individual differences is important for gastroenterologists seeking to understand why some individuals develop NAFLD and others do not, and why some patients tolerate total parenteral nutrition and others develop liver dysfunction.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3601486/




Oriana: HOW TO PREVENT FATTY LIVER DISEASE

The best dietary source of choline is EGGS. True, chicken liver is an even richer source, but I know how unpopular organ meats are. Steak, salmon, and pork are also good sources, but let’s face it, in terms of easy preparation, nothing beats eggs. 


As with everything, moderation is encouraged: you don’t need to live on eggs and chicken livers. Listen to your body.

Nuts and seeds also provide choline, as those who prefer plant-based nutrition probably already know.

CRUCIFEROUS (CABBAGE-FAMILY) VEGETABLES

Here we go again: kale, Brussels sprouts, broccoli, bok choy, cabbage, cauliflower, arugula, collard greens (surprisingly tasty). They provide the indole compounds that result in less fat being deposited in the liver. 

BURSTS OF EXERCISE

Though there is nothing wrong with a sustained workout, many of us find our lifestyle is not compatible with spending time at the gym. But practically everyone can take a quick walk, do some arm circles, sit-ups, or whatever works. These five-minute exercise breaks can be a life-saver for the sedentary.

TRY TO SUBSTITUTE CERAMIC OR GLASS CONTAINERS FOR PLASTIC

The liver works hard to get rid of environmental toxins. It can become overwhelmed.

BEWARE OF CERTAIN SUPPLEMENTS

Certain herbal extracts, especially the high-dose green tea extract used in weight-loss supplements, can injure the liver. Have a cup or two of green tea instead — it will have an appetite-suppressing effect without overwhelming your liver.

Try not to overload on supplements. Reduce them to the ones you’ve found to be essential.

ALCOHOL — THE MAIN CAUSE OF LIVER DAMAGE

This goes especially for postmenopausal women, who have a poor capacity to detox alcohol. Even if your alcohol use seems “moderate,” if you notice a swelling of the liver and frequent discomfort in your right abdomen, it’s a sign to cut out alcohol — completely at first, to let the liver regenerate. Then try to keep it minimal. There is an inexpensive supplement that seems to protect against the toxic effects of alcohol, and that is N-Acetyl-Cysteine (NAC).

Another supplement that protects the liver is S-Adenosyl-Methionine (SAMe).

Both NAC and SAMe work by increasing the levels of glutathione, an important detoxifying antioxidant. NAC is very inexpensive, and I recommend it to everyone, even if you don't drink.

*
ending on humor:
 


