Saturday, March 19, 2016



I knew I’d never get there
but kept following the signs —
hiked as far as the meadow plateau,
the necklace of narrow lakes

linked by a darkling stream —
a tuft of mimulus, an eighteen-carat
gold evening primrose;
something purple, something blue —

and where the streamside trees
deepened into twilight,
beaver dams — and below, between
the afternoon-hot boulders, a whistling

marmot fattening herself for winter.
Did I really hear that whistle —
is that its echo bouncing off
the cooling boulders of my mind?

I knew I’d never get there
but kept following the signs —
I didn’t need to stand
on a wind-tormented summit

admiring my own strength,
straining to hear the silent
applause of the clouds.

I’d like to show

one of those weather-worn
sun-bleached trail signs to a god —
not sulky Yahweh or gore-smeared Jesus,
but a foolish yet dear ancient god,

as forgotten as I will be in time —
slouched like a boulder, sweating,
swatting at mosquitoes — to show
him or her what my life was about.

~ Oriana © 2016

A new poem! Where did the inspiration come from? It’s almost embarrassing to admit: in a book I was reading at breakfast, I came across the word “Kearsarge” — and my thoughts drifted back to the Eastern Sierra, those glorious mountains I will never see again.

For the few of you who more or less know the area, yes, you take the Onion Valley exit from Independence, not far from Lone Pine and Whitney Portal. The first part of the trail is murder, steep and exposed. Then it gradually becomes paradise.

You may be wondering why a committed atheist like me is invoking a god. Of course all gods and religions are human inventions, but they can be a useful metaphor in poetry. A dilemma, I know, but in life there is no "purity." I always wished for a group of kindred minds; among other things, we might discuss what in our lives stands out as important, or at least memorable. I don't have such a group, so for the sake of this poem, I reach for an imaginary sympathetic listener (other than you, reader).

If a god, then a part of nature. One of the worst aspects of Yahweh is his being apart from nature. He may have started as the god of the storm, but then became more and more abstract, until in Christianity he became the Orwellian “Eye in the Sky,” spying on people’s sins, reading their thoughts for thought crimes like lust — since to “lust in your heart” is allegedly as bad as committing adultery (what idiocy). What a relief to acknowledge that this is just fiction made up to scare believers, to “keep them in line”!

Heaven was — at least for some weeks, while the thought still had novelty — the bliss of knowing that the Monster did not exist and there was no 24/7 spying going on. It was radical relief from the obsession with sin and punishment. And when I think about it again, and the certainty deepens (if we don’t have to be forever “agnostic” about the existence of leprechauns, then it’s the same with an invisible guy in the sky), the bliss returns.

I'm not saying that atheists are blissfully happy all the time — there are only blissful moments, as for everyone. For me it's enough to look at the moon — that fills me with happiness. Maybe that's ridiculous, but that's it. Every night I walk out and look at the stars and the moon — if visible. If not, the white clouds are enough. So beautiful!

I simply have no need for any deity. The world is enchanting enough. I know I'm incredibly privileged to live in California, surrounded by beauty — much of it human-created — all the gardens, the clusters of palm trees, the bougainvilleas. I have found a purple one at last and am planning to build a little hill for it, so the riot of blossoms can pour down in fuller glory. Have I rejected god's love? Can you reject something that doesn't exist? I have accepted what I can see and touch, and human love, and dogs' affection, and the beauty and innocence of animals in general. And trees, the trees of life.

Perceptive reader, I hear you say: Yes, but why do you imagine showing a trail sign, as a symbol of your life, to an old nature god? Ideally, I wouldn’t feel lonesome for some such being. In an optimal world, there would be a group of kindred minds with whom we could have a serious conversation about what is important in life, and what we think we have and have not accomplished — our joys, our regrets. Those “kindred minds” would be accepting rather than judging, and old enough to have the wisdom of experience. I imagine they would say that no one gets everything they want in life, but what a privilege to accomplish and experience anything.

And rather than tell a long story, perhaps the idea would be an object or a word or two. And if I were to bring one word — all right, two — plucked from my whole life, it just might be “Kearsarge Pass,” high in the Sierra Nevada. 


~ “Those working the longest throughout their lives (even taking part time jobs after retirement) and working in the most stressful jobs lived the longest.”

“Cognitive ability predicts mortality.”

(Note: this study was based on the high-IQ children followed up through life by Lewis Terman. It started in 1921. Because all subjects had a high IQ, other variables came to light, especially conscientiousness.)

Those who were the most cheerful and optimistic, on average, lived shorter lives than those who were less cheerful and joking. According to the researchers, optimistic people tended to take more risks overall: going to more parties, using more drugs and alcohol, and getting into more accidents. Friedman notes that “fun can be overrated.”

Who lived the longest? Those who were the most conscientious and committed to their jobs, friends, and community lived the longest. In fact, those working the longest throughout their lives (even taking part time jobs after retirement) and working in the most stressful jobs lived the longest. Those working in low-status jobs were far more likely to die before the age of 60 than those working in higher status jobs. According to the researchers,

“It was the most prudent and persistent individuals who stayed healthiest and lived the longest.”

They also found an effect of divorce. While early parental loss didn't have an effect on longevity, early parental divorce was a very strong predictor of mortality in adulthood. The authors note how traumatic and painful divorce was for the children. There were also important gender differences. Men who remarried improved their odds of a long life, whereas women who stayed single after divorce were just about as well off as if they had stayed married.
Exercise also played a role, but not the kind you may think. People who lived the longest weren't obsessed with health and exercise. They didn't have structured regimes, but just tried to live as active a life as they could.

Also counterintuitively, kids who started first grade at too early an age had problems later in life and lived shorter lives. Early school entry was associated with less educational attainment and worse midlife adjustment. Also, while early reading ability was associated with academic success, precocious reading was less associated with lifelong educational attainment and was hardly related to midlife adjustment at all. Parents may want to re-think whether they push their child to enter school too soon.

The rapidly advancing field of cognitive epidemiology is showing that across a much broader range of IQ levels and demographics, cognitive ability predicts mortality, even after controlling for a number of related variables such as education and socioeconomic status. This research is pointing to the inescapable conclusion that cognition is related to health and longevity. Of course, the causal path is still unclear, but research in the coming years will get us closer to understanding why there is such a strong relationship.

The conclusion seems pretty banal: self-control and interpersonal stability leads to a long life. The finding that optimistic and cheerful people died younger is surprising though." ~


Ask anyone what predicts longevity, and they are likely to say, “Diet and exercise.” In fact, the two best predictors of longevity are high IQ and a high degree of autonomy (bosses live longer than subordinates). The first surprising finding (surprising back then) in longevity studies was that Harvard professors live much longer than former Harvard athletes. The professors continued to lead an active professional life even after retirement — writing books and articles in their field, attending conventions.

By “stressful jobs” I think the authors mean high-status jobs, those with the most autonomy. Studies have repeatedly shown that being the boss rather than a subordinate strongly correlates with better health and longevity.

IQ is a broad term. What seems important for longevity is EXECUTIVE FUNCTIONS, especially self-control. This means the ability to choose long-term benefits over short-term gratification. And yes, you see differences between children as early as the age of five. I suspect that self-control can be improved with training.

What was not surprising was the finding that marriage benefits men more than women. Is it that men enjoy more autonomy in marriage, being more often the dominant partner? I wonder, if that finding becomes more widely known, along with the widely confirmed finding that having children lowers self-rated happiness, will women be less motivated to get married and have children? This already seems to be the trend, though socioeconomic status plays an increasing role: educated people tend to get married later in life, and also to stay married; divorced women and widows tend not to remarry, and staying single doesn’t seem to have a bad impact. Indeed it’s striking that we speak about the “merry widow” but never of the “merry widower.”

Of course we need to bear in mind the fact that correlation does not equal causation. Perhaps those who work hardest and longest are those who are exceptionally healthy to start with.

And again we need to bear in mind that “life isn’t fair”: ultimately nothing beats coming from a long-lived family. Being a health nut means nothing if people in your family tend to die of cancer before the age of sixty, and sometimes sooner. It’s a classic story: he was a vegetarian who ran 3 miles every day, and died at 45 of pancreatic cancer.

But that’s an extreme. The take-away lesson is that it’s not diet and exercise, but rather intelligence, self-control, and leading an active working life — for as long as possible.


I remember the exact moment of my shift . . . On the outside, nothing changed; on the inside, everything did.

“Most psychologists agree that if you define wisdom as maintaining positive well-being and kindness in the face of challenges, it is one of the most important qualities one can possess to age successfully — and to face physical decline and death.

An impediment to wisdom is thinking, “I can’t stand who I am now because I’m not who I used to be,” said Isabella S. Bick, a psychotherapist who, at 81, still practices part time out of her home in Sharon, Conn. She has aging clients who are upset by a perceived worsening of their looks, their sexual performance, their physical abilities, their memory. For them, as for herself, an acceptance of aging is necessary for growth, but “it’s not a resigned acceptance; it’s an embracing acceptance,” she said.

Dr. Clayton says there’s a point in life when a fundamental shift occurs, and people start thinking about how much time they have left rather than how long they have lived. Reflecting on the meaning and structure of their lives, she said, can help people thrive after the balance shifts and there is much less time left than has gone before.”

Here is a minister who after 30 years discovered that he simply didn’t believe in Jesus any more than he believed in Santa Claus. And it struck him that it’s too late in life to keep serving the wrong institution:

“If there really is a God, I no longer want to serve him. If he is as awful as he is depicted in the Old Testament, then I will oppose him. In the New Testament, Jesus said God is loving, kind, and forgiving. He said that his Holy Spirit resides in us and guides us. I have loved that idea but I have never seen it or felt it and I’ve waited long enough. Time to move on.”

“I got older, lonelier, and more tired, and have grown rather intolerant of bullshit. I have come to the place where I don’t want to waste my remaining years saying things I don’t believe and propping up the failing institution we call the church.”

"I don't want to waste my remaining years doing X" — this is the basic formula that changed my life when I finally felt cornered by mortality. A lot of people do something radical when they reach that point, astonishing others by leaving hated spouses or jobs, moving to Australia or Bali — all kinds of things they wanted to do for 30-40 years but just didn't have the courage to reach for, until it's almost too late (but never too late, except if you want to be a ballerina — and even then, you can still dance just for the pleasure of it).


In a different vein: I know a story of a woman who only on her deathbed revealed that her uncle had raped her when she was thirteen. It’s easy enough to understand why she stayed silent when she was a young girl. But why stay silent in adulthood? She was, in effect, protecting a pedophile — exactly what pedophiles and rapists always count on.

The shame, and not wanting to cause trouble in the family, were of course still there as she grew older. I think this story presents an extreme case of how mortality can make us take action. Only on her deathbed did this woman realize that it was her last chance to tell the truth. She could die having told the truth, or she could die never having told it — and she made her choice.

(Much too late, that’s true. We need places where a rape victim can go for emotional help and counseling — places not connected with the police. That may be the next step, but first, let’s provide a place of safety and support.)


The problem of existence and the desirable approaches to the inevitability of death are well-known in philosophy. We are alive only for a while. Shall we give up in resignation and depression? Pretend we will live forever under a cloud of anxiety? Or make the best of things with the time we have remaining?

Ding ding ding: the answer is C, make the best of things with the time remaining. We should not live like the gardener who told Zorba the Greek that he lives each day as if he will never die; nor should we live like Zorba, who tells the gardener that he lives as if he will die each day. The correct way to live is based on a reasoned estimate of how much time we have left. If you’re thinking of learning a foreign language, you really need to know how long it will take, how much fun it will be, and how long you will have to live to enjoy whatever you learn.

Watney (the astronaut stranded on Mars in The Martian) understands this, and the first thing he does is assess his life expectancy to estimate what he can accomplish in the time he has. Then, like a wise consumer of the serenity prayer (changing the things he can), he considers what he can reasonably do to extend his life expectancy. Thus, the first important psychological trait he displays in the face of existential despair is reason. After all, it’s reason that makes us aware of our own death, so the least it can do is start us off on a way to think productively about the time we have left.

Watney is blessed with a secure attachment. Although abandoned by his colleagues on a distant planet, he understands deeply that it’s not their fault, that he is loved, that if he manages his existential abyss, there will be a payoff in human relationships. This understanding helps to motivate him, but more importantly, it helps him not to ruminate. Resentment about reality is often the greatest impediment to improving things. Horney teaches us that the essence of neurosis is investing in how things should be instead of in how they are, and suspicions about injustice when the only villain is randomness is one of the main distractions from how things are.

One sign of dealing with the way things are instead of the way things should have been is a focus on the problems in front of you that can be solved instead of problems that can’t be solved or the problems that are brewing. Watney sees life as a series of puzzles and predicaments and addresses them as they arise.

Watney is undoubtedly a much more intelligent person than most of us, and this gives him an edge. But even more important than his level of intelligence is the use to which he puts it. Many people use whatever intellectual ability they have to make excuses, refine accusations, curse fate, or show off. Watney uses his intelligence to solve his problems. When he makes mistakes, large or small, he tries to learn from them.

Humor is central to Watney’s ability to muster his other assets to face the truth of his existence. The effort to live within reality and to avoid despair and depression on the one side and denial and anxiety on the other is best supported by a comedic or ironic frame. Like two independent, aggressive, selfish humans purporting to live for each other, it’s not sustainable if they really mean it. Only an ironic frame can sustain romantic love. In parallel, a person capable of imagining infinitude and perfection but settling for what is real cannot do so successfully if he or she really settles. That is just another route to despair. But a comedic or ironic frame around the settling enables us to make the most of our limited time on our planet, winking at ourselves as we indulge our petty desires before what Janna Goodwin calls the “glorious indifference” of the universe.

Presumably, if Watney’s life expectancy were too short to develop a plan to get off the planet, he would have devoted himself to making the most of a more limited time frame. Buddha tells a parable about a monk who is running from a tiger and comes to the edge of a cliff. He lowers himself down a vine, but there is another tiger at the base of the cliff. Mice emerge above him and start eating the vine. With only moments to live, the monk notices a strawberry growing in a crevice. Buddha reports, “How sweet it tasted!”



Reading this brief review stunned me because of a parallel with what happened as a result of my sudden awareness of life expectancy. Once I fully grasped how little time was left, I was cured of lifelong depression. It was the greatest event of my adult life, the only one that compares in magnitude and significance with my leaving Poland for America.

By the way, long before I had my own insight, I helped a friend reach a life-changing decision. She kept complaining about how much she hated her job. Since she was around sixty and didn’t actually need to work, I asked only, “How much longer do you think you’ve got to live?” The question stunned her into silence. Within a month, she was training her replacement. The next time I talked with her, she was playing with the city orchestra. She was radiant. In the past she’d been so busy complaining, she never even managed to tell me she loved music.

Bosch: Concert in an Egg, 1564


“Isaac was to be a whole burnt offering, meaning after Abraham slaughtered Isaac, he was supposed to burn him. The smoke from burnt offerings was to rise up to heaven and be a pleasing aroma. This would point to the totality of the sacrifice and the rising up of the essence of whatever it was toward heaven. There’s not going to be a body, bones or anything else left to be “raised,” and the writer of Hebrews doesn’t seem to pay any heed to that little detail… I’m merely pointing out that it would be incredibly unnatural for Abraham to conclude that a pile of ashes would be raised back to life. Such a belief would require a highly developed theology that’s completely foreign to the Old Testament and unprecedented in any Biblical example of resurrection… and most importantly, there is no mention in the text of Genesis itself that Abraham believed that God would raise Isaac from the dead… The writer of Hebrews is either offering this resurrection belief up as his own supposition or is repeating some other tradition, but it’s nowhere in the text of Genesis.

In fact, there is nothing at all in the entire Old Testament that would give us any indication whatsoever that people in Abraham’s day even had a kind of bodily resurrection theology at all. It’s not until Daniel 12 (after coming in contact with Persian/Zoroastrian theology) that we even find a clear, overt reference to the idea of a bodily resurrection from the dead following the lapse of any time…”

Caravaggio: Abraham and Isaac, 1604


In a previous post, I related that it suddenly occurred to me that the story got sanitized by later scribes: most likely, Abraham really did kill Isaac. It was interesting to learn that in a few medieval midrashim, Isaac does get killed. But this detail — that, just like an animal sacrifice, he was supposed to be burned so that the smoke would rise up to heaven — makes this story, always difficult for me to stomach, even more difficult.

The official term for a sacrifice completely consumed by fire is holocaust.

I took everything literally, “historically” (as we were supposed to), until the age of 14. Only then, around my birthday (I remember that lilacs were in bloom), did a thought rise up that changed my life: “It’s just another mythology.” Yet already during my first religion lesson I thought we were being told a fairy tale (I didn’t yet know words like mythology), and the years between 12 and 14 were perhaps the most terrible of my life: I was constantly tormented by the question of the veracity of those stories and the existence of god, and tried to suppress the thoughts that I was sure would send me to hell forever. I lived in terror.

By the way, I don't think our nun went into the detail of Lot offering his virgin daughters — or maybe she rushed over it. But it wasn't possible to skip over the story of Isaac.

Of course, by our modern understanding, someone hearing the alleged voice of god telling him or her to kill a child (or anyone) is experiencing psychosis. The woman in Texas who drowned her five children had that type of command hallucination. But way, way back, human sacrifice was not an exception but part of regular worship. And what about Jephthah's daughter? There was no last-minute reprieve. By the way, that story was part of our religious instruction; the nun said the moral was “not to make rash vows.”

Giovanni Antonio Pellegrini (1675-1741), The Return of Jephthah


~"News and people traveled faster than anyone had ever experienced. The cost of moving products and services plummeted in the same way Amazon or cloud-based apps have driven down distribution costs. Such forces made it easier for big companies in one place to serve customers everywhere. The technology “made possible a division of labor and specialization of production for ever larger and more distant markets,” wrote James McPherson in Battle Cry of Freedom, his epic Civil War history. So by 1850, factories were making certain types of craftsmen obsolete, department stores were driving local shops to close, and people found themselves losing jobs to someone far away.

Much like today, money in the early 1800s flowed to the new economy and away from the old economy. Capitalists who owned production got richer, and laborers lost power. The gap between rich and poor widened.

Cue the kind of anger Donald Trump is tapping into now.

Slavery turned into a flashpoint issue, but the real unrest boiled up from this giant economic rift. Technology transformed the North into an industrial economy while the South was anchored in an agricultural economy, one that couldn't operate without slavery. The North had a population that saw the advantage in embracing technology and progressive ideas (including that slavery was bad) and moving forward. The South's way of life and economic fortunes rested on keeping things as they'd been. The South viewed the North as a threat.

Look today at red states vs. blue, or even Trump supporters vs. “establishment” Republicans. Those divisions broadly define where digital-cloud-mobile technology and the modern economy work in favor of the population vs. where they work against them. Trump says “make America great again,” which, to his supporters, means “make America what it used to be.” To people whose livelihoods have suffered because of economic shifts ushered in by technology, moving backward looks better than moving forward—not just in economic issues but in social mores as well.

The big difference between now and then is that instead of that shift from agriculture to industry in 1850, today we’re seeing a shift from industry to software. The more that software can leverage the work of fewer humans, the fewer humans are needed for work, and the more profits flow to owners of the software. One industrial company, United Technologies, provides an example. At 218,300 employees, the company’s workforce hasn’t grown in seven years, even while revenue jumped from $42.7 billion in 2005 to $57.7 billion in 2012. That’s $15 billion not being spent on more employees. Productivity created by technology tends to put more earnings into fewer hands.

The lives of many of the people in tech hubs such as Silicon Valley, Seattle, Boston, New York and Washington, D.C., are going one way. The lives of many people in industrial or rural areas are going another. It may not be a North-South divide, but you can see a break widening between the coasts and the nation’s interior.

Trump has become the voice of those technology has hurt. He’s not just a protest vote; he’s a rebel vote. It’s a rebellion against Republican leaders who failed to conserve industrial jobs and a more traditional society. It’s not that different from the Whigs in 1850, when the party split between “Conscience” Whigs, who were pro-industry and anti-slavery (and thus threatened the whole Southern economic house of cards), and “Cotton” Whigs, who would fight to preserve an increasingly outmoded way of life.

The current rift in America isn’t going to mend if Trump wins, or loses. Look at what’s coming. Autonomous vehicles will eat driving jobs of every kind. Artificial intelligence will eat rules-based white-collar jobs like accounting. Block-chain technology will result in software-based contracts that eliminate the need for mortgage brokers and lots of lawyers. Factory work will be diminished by 3-D printing. The total disruption of the 20th-century way of life is inevitable and far from over.

Of course, like the tech revolution of 1850, ours should eventually create enormous opportunities we never dreamed possible. It is the path to wealth and comfort for every part of the country and every level of society. The best news is that, like in 1850, the U.S. leads the world in all of the important technologies. If we as a people can get through this, we won’t make America great “again”—we’ll make it into something cooler and better than it’s ever been." ~


Brian Wansink rejects the notion of good calories and bad calories — within reason, he believes, what we eat matters less than how much we eat. Indeed, researchers at the National Institutes of Health recently found that adults placed on balanced diets containing processed carbs from foods like white bread, instant rice, and fruit packed in sweet syrup fared just as well — at least in terms of cardiovascular risk factors — as those who got their carbs from apples, whole grains, and steel-cut oats. But eating fewer carbs and overall calories made a difference.

He and his grad students had planned to dump Wheat Thins and M&M's into large Ziploc bags, but by mistake they also brought some tiny, snack-sized ones. Since there weren't enough large bags to go around, some moviegoers got four small ones instead.

Something surprising happened: Most people who received the four small bags finished only one or two. In a follow-up questionnaire, Wansink asked the participants how much more they would pay for snacks that came in lots of small packages instead of one big one. A majority said they'd spend 20 percent more.

In the snack food aisle of a local supermarket, Wansink stops in front of the chips to tell me about a recent study he did with cans of Pringles. At intervals of either 7 or 14 chips (it didn't matter much which), his team inserted a Pringle dyed with red food coloring. Lab subjects who got these subtle reminders consumed 50 percent fewer chips on average than control snackers who got regular Pringles.

Outside the boundaries of the lab, Wansink did take on one major private client: McDonald's. In 2008, he'd independently funded a study on Happy Meals, spending three weeks watching kids dine. He found that it didn't matter much what McDonald's put in the meal. Kids mainly cared about the toy—in fact, most stopped eating once they'd unwrapped it. Three years later, McDonald's hired Wansink to determine whether some changes it had made to Happy Meals—ditching the caramel sauce that accompanied the apple slices and promoting milk instead of soda—had actually prompted kids to eat more nutritious food at its restaurants. (Wansink found that they had.) "What makes Happy Meals happy and fun is not the food, it's the atmosphere and the toys," he says. "McDonald's wins because parents feel less guilty about taking their kids there.”

Many parents won't be surprised to learn that Wansink found children to be exquisitely sensitive to food presentation. One of his studies, in 2011, determined that serving fruit in colorful bowls instead of metal trays more than doubled fruit consumption at school. In another, from 2013, he found that schools that switched from whole to sliced apples saw 48 percent fewer apples wasted and a 73 percent increase in students eating more than half of their apples. It also turned out that giving vegetables fun names — like "X-Ray-Vision Carrots" or "Silly Dilly Green Beans" — persuaded kids to eat 35 percent more veggies.

So far, some 17,000 schools have used the Smarter Lunchrooms training. Many report success. Jessica Shelly, director of food services for Cincinnati's public schools, implemented a few simple changes, such as placing the plain milk before the flavored milk in the line, changing food names, and adding a toppings station. "It's so awesome to see a student who went over to the salad bar to put some cumin on their chicken soft taco also end up adding some red pepper strips and broccoli florets to their plate," Shelly told me via email. Lunch attendance increased, and her once-struggling program climbed out of the red. In 2013, it turned a $2.7 million profit.

He tells me about a study he did with Birds Eye on how to get people to eat more frozen vegetables. Two sets of participants were told different versions of a story about a woman named Valerie. In the first one, she has a busy day, and when she gets home she serves her family a dinner of pasta, warmed-up leftover chicken, bread, and green beans from the freezer. The second version is exactly the same — minus the green beans.

When the researchers then asked study participants to describe Valerie, they were shocked at the difference in the responses. "People will rate Valerie when she uses beans as, 'Oh, she's a good mother, she is stressed out, but you can see that she cares for her family; she's really a good cook,'" Wansink says. "If you don't have the beans, people are like, 'Oh my God, this lazy excuse for a woman. What is she doing? It's all about herself; she is so self-centered.’”


Many people believe that they can eat bread as long as it's whole-grain, or that granola cereal is OK, etc. But all those are fattening because they raise blood sugar, which leads to the release of insulin, the fattening hormone. To lose weight, you have to keep your blood sugar low-normal.

Almost all carbs are fattening, even the “good carbs,” i.e. those with nutritional value. Exceptions include raw celery sticks, raw spinach, lettuce, and a few other things that have hardly any calories and sometimes take more calories to process than they provide. Usually absence of sweet taste is a reliable guide. Usually doesn’t mean always, e.g. whole-grain bread may not taste sweet but is in fact just as fattening as white bread.

And yet another problem: fructose (which does betray its presence with sweetness — in fact it’s sweeter than glucose) is particularly fattening, though it acts through a mechanism different from raising blood sugar.

Ending on wisdom

A story of the Hasidim: Rabbi Moses from Kobryn said, “When you speak a word before God, enter into that word with your whole self.” One of his listeners asked, “How on earth can a big man enter a little word?” “Anyone who thinks he’s bigger than a word,” the Tzadik replied, “is not the person of whom we speak.” ~ from the Notebooks of Anna Kamieńska

