Saturday, February 17, 2018


Vermeer: Mistress and Maid, 1667. It's interesting that we get to see more of the maid than of the mistress.
Let us forget with generosity those who cannot love us. ~ Pablo Neruda



Ten years after your suicide,
this is the moment I love best:
in silence you take my hand
and put your arm around my waist.

We take narrow steps as though
on a crowded dance floor,
our rhythm perfect,
the same silence leading us both.

We turn in tight circles,
we are almost formal. No
kissing, no: we dance as if
still only dreaming of each other.

We feel each other’s breathing,
our bodies’ boundaries of warmth. 
Slowly we dance without music —
unless we are the music —

How else can I explain
that in such silence we don’t hear
the shot that travels farther and farther
into the past, while we dance.

~ Oriana


I hate to start with the trite “This really happened,” but I feel this statement is important. Once, the long-ago lover who later committed suicide did take my hand and put his arm around my waist, and we began to dance — in silence. It was brief and magical. Yet because of the tragedy that eventually took place, it took a while for this memory to rise up in my mind — and when it did, it made me smile with pleasure. And I realized that it would always bring me pleasure, even though I knew what happened later. 

This was quite a psychological discovery: that an enchanting moment could be retrieved and enjoyed in spite of the knowledge of the unhappy ending. Ultimately, the pain could not cancel that unique moment. It has its own being, inviolate, untouched by the tragedy, the shock, the grief. It has its own kind of eternity — at least for as long as my memory lasts.

And perhaps most important of all, it’s a memory of tenderness. 

 A peach orchard in Georgia; Hayley Hyatt


~ “Because we all want to expand beyond ourselves. Psychologist Arthur Aron at Stony Brook University has conducted studies suggesting that a primary motive for us as humans is to “expand the self and to increase our abilities and our effectiveness.” [what I elsewhere called “personality enlargement” — we explore the interests and knowledge of our new partner, learning new things, changing in surprising ways]

Good eye contact. Arthur Aron again (see #1). He conducted a study that encouraged strangers of the opposite sex to discuss intimate details about themselves for 90 minutes. At the end of that time, each couple stared into each other’s eyes for four minutes in silence. The results? Many of the couples said they felt a deep attraction to each other, even though they’d never met before. Two of the couples ended up married.

Because of inner and outer synchronicity. We fall in love, says psychologist Mark B. Kristal of the University at Buffalo College of Arts and Sciences, when processes in our bodies align with appropriate triggers from the outside world. He speaks of “visual, regular olfactory, auditory and tactile cues” happening in “the proper time, order and place.” He said:

    There are several types of chemistry required in romantic relationships. It seems like a variety of different neurochemical processes and external stimuli have to click in the right complex and the right sequence for someone to fall in love.

Because we like the way they smell. Many studies have shown that smell plays a role in love. But we’re not just talking about the ordinary smell of your lover’s dirty T-shirts (dirty T-shirts, by the way, have been the stock-in-trade of smell studies), but also those other, perhaps odorless, signals that enter the brain through the olfactory system. That’s right, pheromones. Volumes have been written on the subject of smell and pheromones in attraction, love and marriage, and don’t we all know it’s true?

Because we like the way they kiss. Kissing has an element of smell to it, obviously, but kissing all by itself can determine if the relationship holds promise. Sheril Kirshenbaum, author of the book The Science of Kissing, told EarthSky that a kiss, and especially a first kiss, plays a big role in determining the future of a relationship, according to scientific studies. She said:

    Fifty-nine percent of men and 66 percent of women say they have ended a budding relationship because a kiss didn’t go well. It’s your body’s way of saying, look elsewhere.

Because of our hormones. You know how your heart pounds and your mouth goes dry when your new lover rings the doorbell? It’s basically a stress response. Romantic, eh? Adrenaline, dopamine and serotonin all come into play in love’s early stages. Love-struck couples also have high levels of the neurotransmitter dopamine, which stimulates an intense rush of pleasure, essentially the same effect on the brain as taking cocaine.

Because sex is good for us. Sex relieves stress, boosts immunity, burns calories, boosts heart health, improves intimacy … and so much more.

To make and raise babies, together. Martie Haselton, a psychologist at UCLA, believes love is a “commitment device,” a mechanism that encourages two humans to form a lasting bond to ensure the “long-term health of children.” Haselton and her colleagues conducted experiments, asking people to think about how much they love their partners while suppressing thoughts of other attractive people. They then had the same people think about how much they sexually desire their partners while suppressing thoughts about others. It turns out that love does a much better job of pushing out potential rivals than sex does. This is what you’d expect, Haselton says, if love were a drive to form a long-term commitment.

Because love is a drug. Neuroscientist Thomas Insel and colleagues at Emory University in Atlanta conducted studies showing that monogamous pair bonding among prairie voles (small rodents that mate for life) affects the same brain reward circuits that are responsible for addiction to cocaine and heroin.

Okay, so now we know some of what the world of science has to offer on the subject of falling in love. Meanwhile, what’s the best way to stay in love? Psychologist Arthur Aron says the best predictor for lasting long-term relationships is kindness.” ~


Only some people's kisses feel just right — delicious. I wonder if chemical/genetic compatibility is involved in some indirect way — or if it’s accidental. Sometimes the partner is appealing in terms of personality, intelligence, kindness, etc. — and then the kissing doesn’t work. And at other times there are red flags — but the kissing is ecstatic. This is perverse, so I suspect physiology.

Still, even with imperfect kissing, what I call “personality enlargement” is probably the best part of falling in love. We start exploring the new partner's knowledge and interests, learn new things, try new foods, meet new people —  our world expands.


Of course in the eyes of the world, the answer is obvious: easy access to assault weapons. No need for long articles on the psychopathology of the shooters or the “toxic culture of violence” when this answer blazes its stark truth. But since there is little hope for change, we might as well delve into the psychology and culture . . .

~ “Obsessed with revenge, those aspiring to mass murder draw from the archetypal American hero who relies on gun violence to right wrongs and overturn oppressive institutions. Those who transition from fantasy to action are those who rationalize no other option than murder-suicide by ‘going out in a blaze of glory’. No doubt this rationalization represents a distinct kind of tunnel vision, distorting the traditional US hero into an anti-hero who regards society as the enemy.

In psychiatry, a ‘culture-bound syndrome’ is an idiosyncratic, locale-specific pattern of behavior that represents a culturally sanctioned expression of distress if not a mental illness per se. In Malaysia, for example, the culture-bound syndrome amok involves episodes of mass violence committed by an individual following a period of brooding. Unfortunately, in addition to borrowing the word amok in our own lay speech, it would appear that the US, along with other Western societies, has developed our own brand of running amok in the form of mass shootings. Once the cultural mythology of such mass murder has been firmly planted into public consciousness, a select few distressed individuals will look to this model to guide their own behavior, creating the problem of copycat killings.

Perhaps we need to look at these elements within the context of the culture itself. The US was born out of violent revolt, and the idea of the underdog responding with force to defeat an aggressor has been an archetype for the US hero ever since. As a nation, Americans see themselves as promoters of armed rebellion in the name of freedom and democracy around the globe.

In defiance of stereotypes, most mass shooters are not psychotic, delusional, ‘crazy’, or ‘insane’. A 2002 US Secret Service report found that the majority of school shooters have had a history of ‘feeling extremely depressed or desperate’ (not the same as having a clinical diagnosis of major depression) and nearly 80 per cent had considered or attempted suicide in the past. Almost all had experienced a major loss such as a perceived failure, loss of a loved one or romantic relationship, or a major illness prior to the shooting, and about 70 per cent perceived themselves as wronged, bullied or persecuted by others.

Revenge was a motive in the majority of incidents. Christopher Ferguson, a psychologist at Stetson University in Florida whose work has contributed to the debunking of the link between violent video games and violence, recently summarized the most salient features of a typical mass shooter, noting that risk factors for mass murder are similar for both adults and children. These include antisocial traits, depressed mood, recent loss, and a perception that others are to blame for their problems.

And herein lies the rub – while this kind of profile implies that mental illness could be an important risk factor, what we’re really talking about are negative emotions, poor coping mechanisms and life stressors that are experienced by the vast majority of us at one time or another. These risk factors are not necessarily the domain of mental illness, but rather the ‘psychopathology of everyday life’.

Therefore, it appears that the most important risk factors aren’t those that set mass murderers apart from the rest of us; instead, they are simply appropriated from culturally sanctioned patterns of aggression.

If mass shootings are difficult to predict, potentially self-perpetuating, and result not from easily eliminated sources but rather from untimely interactions between normal instincts, culturally sanctioned patterns of behavior and entrenched features of modern society, is there a rational approach to prevention? Inasmuch as marginalization seems to lie at the heart of the mass murderer’s grievances, further attempts to screen, identify, remove and effectively punish those with the potential to commit such violence are doomed to fail. [Instead,] we should reach out to those who have fallen away from mainstream society, bringing them back to the herd before they come to see only a single, deadly alternative.

Let’s also consider re-assessing some of our cultural values and teach our children about different kinds of heroes, how to resolve conflicts, and cope with loss. And, as a recent report from the Making Caring Common Project suggests, let’s prioritize raising children who are kind. The real solution is not about blame, but opportunity. According to the 2002 Secret Service report, mass shootings are not sudden, impulsive acts. They occur with planning that is known to at least one other person in more than 80 per cent of cases. This means that there’s time to reach out — not to a murderer, loser or weirdo; but to someone’s son, student, classmate and neighbor”.

Richard Pousette-Dart, The Blood Wedding, 1958 

~ “Over two decades ago, I traveled to a city in the Russian provinces called Rostov-On-Don to interview a psychiatrist named Alexander Bukhanovsky.

Bukhanovsky, now deceased, was famous. If you've seen the movie Citizen X, about the capture of serial killer Andrei Chikatilo, Bukhanovsky was the guy played by Max von Sydow. He was the Soviet Union's first criminal profiler.

One of the first things he said was that both Russia and America produced disproportionate shares of mass killers.

"Giant militarized countries," he said, "breed violent populations.”

Bukhanovsky at the time was treating a pre-teen who had begun killing animals. He told me this young boy would almost certainly move on to killing people eventually. He was seeing more and more of these cases, he said.

The people who point at pop culture as the reason disturbed kids and lone-wolf madmen go on killing sprees are half right. But images of violence are less the problem than the messages behind them, which are profoundly intertwined with deep-seated cultural ideas about the virtue of military supremacy and the political efficacy of violence.” ~


~ “According to research, the sorts of individuals who commit mass murder often are either not mentally ill or do not recognize themselves as such. Because they blame the outside world for their problems, mass murderers would likely resist therapies that ask them to look inside themselves or to change their behavior.

A study of convicted murderers in Indiana found that just 18 percent had a serious mental-illness diagnosis. Killers with severe mental illnesses, in that study, were actually less likely to target strangers or use guns as their weapon, and they were no more likely than the mentally healthy to have killed multiple people.

“If we were able to magically cure schizophrenia, bipolar disorder, and major depression, that would be wonderful,” Jeffrey Swanson, a professor of psychiatry and behavioral sciences at the Duke University School of Medicine, told ProPublica. “But overall violence would go down by only about 4 percent.”

After studying mass shooters for decades, Northeastern University criminologist James Alan Fox concluded that the killers have more mundane motivations: revenge, money, power, a sense of loyalty, and a desire to foment terror.
“Revenge motivation is, by far, the most commonplace. Mass murderers often see themselves as victims—victims of injustice. They seek payback for what they perceive to be unfair treatment by targeting those they hold responsible for their misfortunes. Most often, the ones to be punished are family members (e.g., an unfaithful wife and all her children) or coworkers (e.g., an overbearing boss and all his employees).”

“The thing about mass killers is that they externalize blame,” Fox told me. “All the disappointments, all the failures, the broken relationships, are because other people treated them wrong. They don’t see themselves as being inadequate and flawed.”

Other experts have echoed Fox’s view. Michael Stone, a forensic psychiatrist at the Columbia College of Physicians and Surgeons and author of The Anatomy of Evil, on the personalities of murderers, recently conducted a study that found that a fifth of mass killers had a serious mental illness. “The rest had personality or antisocial disorders or were disgruntled, jilted, humiliated, or full of intense rage,” as The Washington Post’s Michael S. Rosenwald wrote last year. “They were unlikely to be identified or helped by the mental-health system.”

As Fox notes, mass killers tend to share a few characteristics—“depression, resentment, social isolation, the tendency to externalize blame, fascination with graphically violent entertainment, and a keen interest in weaponry”—that are common in the general population. Attempting to flag so many angsty, un-self-aware young males as potential future killers might push them closer toward violence, rather than away from it.

Instead, a better way of predicting whether someone might be predisposed to violence is if they have a history of violence, as Swanson told ProPublica. For example, Spencer Hight, who killed his ex-wife and seven others at a football-watching party in Plano, Texas, earlier this month, had been violent at least twice, reportedly slamming his wife’s face against a wall.

Compared to those with no criminal record, handgun purchasers who have at least one misdemeanor conviction are seven times more likely to be charged with a new offense after they buy their gun. Right now, only 23 states restrict people with a history of violent misdemeanors from owning firearms.

from another source:


~ “Violence is not a product of mental illness. Nor is violence generally the action of ordinary, stable individuals who suddenly “break” and commit crimes of passion. Violent crimes are committed by violent people, those who do not have the skills to manage their anger. Most homicides are committed by people with a history of violence. Murderers are rarely ordinary, law-abiding citizens, and they are also rarely mentally ill. Violence is a product of compromised anger management skills.

In a summary of studies on murder and prior record of violence, Don Kates and Gary Mauser found that 80 to 90 percent of murderers had prior police records, in contrast to 15 percent of American adults overall. In a study of domestic murderers, 46 percent of the perpetrators had had a restraining order against them at some time. Family murders are preceded by prior domestic violence more than 90 percent of the time. Violent crimes are committed by people who lack the skills to modulate anger, express it constructively, and move beyond it.” ~


Why so many mass killings in the US? It is fairly obvious that it's not due to mental illness, a red herring that allows the interests of the gun industry and the NRA, backed by the substantial contributions they pour into the coffers of politicians, to remain in place and undisturbed even as the massacres continue, and their frequency accelerates. In fact, the gun industry and its lobbyists actually propose MORE guns as a solution. Arm the teachers! Allow students to carry guns! That this is insane is both obvious and unacknowledged.

We will only really understand the positions of these groups by examining how very, very profitable the gun industry is, and how powerful that money makes them as they use it to control politicians and lawmakers. They have no shame, and the piles of bodies mean nothing weighed against profit and power. They aren't even embarrassed about tossing the sop of "thoughts and prayers" to the grieving relatives.

The single gunman with an assault weapon is not insane, he's angry — he's looking for revenge, on as big a scale as he can manage, a bloody vengeance, too big for anything but the most efficient weapons, ones made to kill as many as possible as quickly as possible. The gunman will have his “blaze of glory” — his moment at the center of the world’s attention — his apotheosis.

As to why this is a particularly American thing — who are our heroes, and what are our stories? There is that mythos of the Wild West, with its outlaws and gunslingers and shootouts. Skill with a gun was idolized, guns were power, the “equalizers.” We all grew up playing cowboys and watching westerns. The violence was always there, it has just become more graphic, more extreme . . . movies, TV, videogames.

However, I think it's going at things backwards to say the violence creates the culture — it is more that the violence is created by, and is a reflection of, the culture. Violent movies don’t make people violent. Violent culture will create violent movies. Of course I'm not speaking of individuals, but of the violence embedded in the culture itself. We relish our heroes, who resolve their dilemmas by eliminating them with superior firepower.

But now this bloody fantasy is being acted out in the real world again and again. The victims pile up, sacrifices on the altar of narcissistic rage and our most lethal fantasies. The fear of losing our guns trumps the fear of losing our children. We long ago decided that the mentally ill are disposable, and relegated them to the streets and prisons. Better care and regulation of the mentally ill, besides being a false solution to the shootings, is unlikely to come to anything anyway. There is no profit in it.

I have so much frustration and grief and anger about these issues it is overwhelming. The deliberate refusal to see that this is a problem here and nowhere else has convinced me nothing useful will be done.


As the article states, if all mental illness were magically eliminated, we’d get only 4% less violence. Besides, there is as much or more mental illness in other countries, but — isn’t this strange? — no problem with mass shootings.

I’d like to learn more about what’s happening in Russia, since the statement “Giant militarized countries breed violent populations” makes intuitive sense. But we are not likely to get accurate information. Still, if American-style mass shootings were happening, there’d be leaks. And Putin would not tolerate civilian access to military-style weapons. So easy access to the most efficient killing firearms remains the most likely answer. Combine it with anger and raging desire for vengeance, and in some cases extreme ideology, and it’s just a matter of time until the next shooting, while we stand by helpless. 

Still, anger, “toxic masculinity,” glorification of violence, vicious ideology, and whatever other reasons have been given for the shootings don’t result in multiple dead bodies within minutes unless the right lethal weapon is available. And it wouldn’t be so available without the profits. 

If liability insurance were at least required . . . 

But I don’t have any hope either. Think of the effect on young people as they see corrupt adults willing to sacrifice their lives for the sake of continued NRA and gun-industry donations. Even individuals on the no-fly list — i.e. suspected terrorists — can legally buy high-capacity weapons! The evil of it is beyond words. And we live with it, hoping that one day at the mall or office, campus or church (church!!), it won’t be us who get mowed down.


“Have no fear, folks. The Republican Congress is praying for us.” ~ Stuart Balcomb 

THE TWO-SANTA GOP STRATEGY (long but enlightening)

~ “Republican strategist Jude Wanniski’s 1974 “Two Santa Clauses Theory” has been the main reason why the GOP has succeeded in producing our last two Republican presidents, Bush and Trump (despite losing the popular vote both times). It’s also why Reagan’s economy seemed to be “good.”

Here’s how it works, laid out in simple summary:

First, when Republicans control the federal government, and particularly the White House, they spend money like a drunken sailor and run up the US debt as far and as fast as possible.  This produces three results — it stimulates the economy thus making people think that the GOP can produce a good economy, it raises the debt dramatically, and it makes people think that Republicans are the “tax-cut Santa Claus.”

Second, when a Democrat is in the White House, they scream about the national debt as loudly and frantically as possible, freaking out about how “our children will have to pay for it!” and “we have to cut spending to solve the crisis!” This will force the Democrats in power to cut their own social safety net programs, thus shooting their welfare-of-the-American-people Santa Claus.

Think back to Ronald Reagan, who more than tripled the US debt from a mere $800 billion to $2.6 trillion in his 8 years. That spending produced a massive stimulus to the economy, and the biggest non-wartime increase in the debt in history. Nary a peep from Republicans about that 218% increase in our debt; they were just fine with it.

And then along came Bill Clinton. The screams and squeals from the GOP about the “unsustainable debt” of nearly $3 trillion were loud, constant, and echoed incessantly by media from CBS to NPR.  Newt Gingrich rode the wave of “unsustainable debt” hysteria into power, as the GOP took control of the House for the first time lasting more than a term since 1930, even though the increase in our national debt under Clinton was only about 37%.

The GOP “debt freakout” was so widely and effectively amplified by the media that Clinton himself bought into it and began to cut spending, taking the axe to numerous welfare programs (“It’s the end of welfare as we know it” he famously said, and “The era of big government is over”).  Clinton also did something no Republican has done in our lifetimes: he supported several balanced budgets and handed a budget surplus to George W. Bush.

When George W. Bush was given the White House by the Supreme Court (Gore won the popular vote by over a half-million votes) he reverted to Reagan’s strategy and again nearly doubled the national debt, adding a trillion in borrowed money to pay for his tax cut for GOP-funding billionaires, and tossing in two unfunded wars for good measure, which also added at least (long term) another $5 to $7 trillion. 

There was not a peep about the debt from any high-profile in-the-know Republicans then; in fact, Dick Cheney famously said, essentially ratifying Wanniski’s strategy, “Reagan proved deficits don't matter. We won the midterms [because of those tax cuts]. This is our due.” Bush and Cheney raised the debt by 86% to over $10 trillion (although the war debt wasn’t put on the books until Obama entered office).

Then comes Democratic President Barack Obama, and suddenly the GOP is hysterical about the debt again.  So much so that they convinced a sitting Democratic president to propose a cut to Social Security (the “chained CPI”). Obama nearly shot the Democrats’ biggest Santa Claus program.  And, Republican squeals notwithstanding, Obama only raised the debt by 34%.

Now we’re back to a Republican president, and once again deficits be damned. Between their tax cut and the nearly-trillion dollar spending increase passed on February 8th, in the first year-and-a-month of Trump’s administration they’ve spent more stimulating the economy (and driving up debt by more than $2 trillion, when you include interest) than the entire Obama presidency. 

Consider the amazing story of where this strategy came from, and how the GOP has successfully kept it out of the news; even generally well-informed writers for media like the Times and the Post – and producers, pundits and reporters for TV news — don’t know the history of what’s been happening right in front of us all for 37 years.

Republican strategist Jude Wanniski first proposed his Two Santa Clauses strategy in 1974, when Richard Nixon resigned in disgrace and the future of the Republican Party was so dim that books and articles were widely suggesting the GOP was about to go the way of the Whigs.  There was genuine despair across the Party, particularly when Jerry Ford began stumbling as he climbed the steps to Air Force One and couldn’t even beat an unknown peanut farmer from rural Georgia for the presidency.

Wanniski was tired of the GOP failing to win elections.  And, he reasoned, it was happening because the Democrats had been viewed since the New Deal as the Santa Claus party (taking care of people’s needs and the General Welfare), while the GOP, opposing everything from Social Security to Medicare to unemployment insurance, was widely seen as the party of Scrooge.

The Democrats, he noted, got to play Santa Claus when they passed out Social Security and Unemployment checks – both programs of the New Deal – as well as when their "big government" projects like roads, bridges, and highways were built, giving a healthy union paycheck to construction workers and making our country shine.

Democrats kept raising taxes on businesses and rich people to pay for things, which didn't seem to have much effect at all on working people (wages were steadily going up, in fact), and that added to the perception that the Democrats were a party of Robin Hoods, taking from the rich to fund programs for the poor and the working class.

Americans loved the Democrats back then. And every time Republicans railed against these programs, they lost elections.

Wanniski decided that the GOP had to become a Santa Claus party, too.  But because the Republicans hated the idea of helping working people, they had to figure out a way to convince people that they, too, could have the Santa spirit.  But what?

“Tax cuts!” said Wanniski.

To make this work, the Republicans would first have to turn the classical world of economics – which had operated on a simple demand-driven equation for seven thousand years – on its head. (Everybody understood that demand – aka “wages” – drove economies because working people spent most of their money in the marketplace, producing demand for factory output and services.)

In 1974 Wanniski invented a new phrase – "supply side economics" – and suggested that the reason economies grew wasn't because people had money and wanted to buy things with it but, instead, because things were available for sale, thus tantalizing people to part with their money.

To help, Arthur Laffer took that equation a step further with his famous napkin scribble. Not only was supply-side a rational concept, Laffer suggested, but as taxes went down, revenue to the government would go up!  Neither concept made any sense – and time has proven both to be colossal idiocies – but together they offered the Republican Party a way out of the wilderness.

Ronald Reagan was the first national Republican politician to fully embrace the Two Santa Clauses strategy.  He said straight out that if he could cut taxes on rich people and businesses, those tax cuts would cause them to take their surplus money and build factories, and that the more stuff there was supplying the economy the faster it would grow.

But Wanniski had been doing his homework on how to sell “voodoo” supply-side economics.

In 1976, he rolled out to the hard-right insiders in the Republican Party his "Two Santa Clauses" theory, which would enable the Republicans to take power in America for the next forty years.

Democrats, he said, had been able to be "Santa Clauses" by giving people things from the largesse of the federal government. From food stamps to new schools to sending a man to the moon, the people loved the “toys” the Democrats brought every year.

Republicans could do that, too, the theory went — spending could actually increase without negative repercussions. Plus, Republicans could be double Santa Clauses by cutting people's taxes!

For working people it would only be a small token – a few hundred dollars a year on average – but would be heavily marketed. And for the rich, which wasn’t to be discussed in public, it would amount to hundreds of billions of dollars in tax cuts. 

The rich, Reagan, Bush, and Trump told us, would then use that money to import or build more stuff to market, thus stimulating the economy and making average working people richer. (And, of course, they’d pass some of that money back to the GOP, like the Kochs giving Paul Ryan $500,000.00 right after he passed the last tax cut that gave them billions.)

There was no way, Wanniski said, that the Democrats could ever win again. They'd be forced into the role of Santa-killers by raising taxes, or anti-Santas by cutting spending. Either one would lose them elections.

When Reagan rolled out Supply Side Economics in the early 80s, dramatically cutting taxes while exploding spending, there was a moment when it seemed to Wanniski and Laffer that all was lost. The budget deficit exploded and the country fell into a deep recession – the worst since the Great Depression – and Republicans nationwide held their collective breath.

But David Stockman came up with a great new theory about what was going on – they were "starving the beast" of government by running up such huge deficits that Democrats would never, ever in the future be able to talk again about national health care or improving Social Security.

And this so pleased Alan Greenspan, the Fed Chairman, that he opened the spigots of the Fed, dropping interest rates and buying government bonds, producing a nice, healthy goose to the economy.

Greenspan further counseled Reagan to dramatically increase taxes on people earning under $37,800 a year by doubling the Social Security (FICA/payroll) tax, and then let the government borrow those newfound hundreds of billions of dollars off-the-books to make the deficit look better than it was.

Reagan, Greenspan, Wanniski, and Laffer took the national debt from under a trillion dollars in 1980 to almost three trillion by 1988, and back then a dollar could buy far more than it buys today. They and George HW Bush ran up more debt in eight years than every president in history, from George Washington to Jimmy Carter, combined.

Clinton was the anti-Santa Claus, and the result was an explosion of Republican wins across the country as Republican politicians campaigned on a platform of supply-side tax cuts and pork-rich spending increases. State after state turned red, and the Republican Party rose to take over, ultimately, every single lever of power in the federal government, from the Supreme Court to the White House.

Looking at the wreckage of the Democratic Party all around Clinton by 1999, Wanniski wrote a gloating memo that said, in part: "We of course should be indebted to Art Laffer for all time for his Curve... But as the primary political theoretician of the supply-side camp, I began arguing for the 'Two Santa Claus Theory' in 1974. If the Democrats are going to play Santa Claus by promoting more spending, the Republicans can never beat them by promoting less spending. They have to promise tax cuts…”

Two Santa Clauses had gone mainstream. Never again would Republicans worry about the debt or deficit when they were in office; and they knew well how to scream hysterically about it as soon as Democrats took power.

George W. Bush embraced the Two Santa Claus Theory with gusto, ramming through huge tax cuts – particularly a cut to the capital gains tax rate on people like himself who made their principal income from sitting around the mailbox waiting for their dividend or capital gains checks to arrive – and blew out federal spending.

Bush, with his wars, even out-spent Reagan, which nobody had ever thought would again be possible. And it all seemed to be going so well, just as it did in the early 1920s when a series of three consecutive Republican presidents cut income taxes on the uber-rich from over 70 percent to under 30 percent.

In 1929, pretty much everybody realized that instead of building factories with all that extra money, the rich had been pouring it into the stock market, inflating a bubble that — like an inexorable law of nature — would have to burst.

In reality, Bush’s tax cuts did what they have always done over the past 100 years — they initiated a bubble economy that would let the very rich skim the cream off the top just before the ceiling crashed in on working people. Just like today.

The Republicans got what they wanted from Wanniski's work. They held power for thirty years, made themselves trillions of dollars, and cut organized labor's representation in the workplace from around 25 percent when Reagan came into office to around 6 percent of the non-governmental workforce today.

Over time, and without raising the cap, Social Security will face an easily-solved crisis, and the GOP’s plan is to force Democrats to become the anti-Santa, yet again. If the GOP-controlled Congress continues to refuse to require rich people to pay into Social Security (any income over $128,000 is SS-tax-free), either benefits will be cut or the retirement age will have to be raised to over 70.

When this happens, Democrats must remember Jude Wanniski, and accept neither the cut to disability payments nor the entree to Social Security “reform.” They must demand the “cap” be raised, as Bernie Sanders proposed and the Democratic Party adopted in its 2016 platform.

And, hopefully, some of our media will begin to call the GOP out on the Two Santa Clauses program. It’s about time that Americans realized the details of the scam that’s been killing wages and enriching billionaires for nearly four decades.” ~



~ “One of the central embarrassments of Christianity arises from one of the most central errors of its founding figurehead. Jesus Christ was convinced that the next world — a radically different world from the observable reality of Roman Judea in which he found himself — was, as he continuously put it, “at hand.” He was the prophet of this change in the exact same way John the Baptist had been the prophet of his own coming — that is, as a roadside herald, trumpet in hand, declaring the coming of something extremely imminent. Jesus repeatedly tells his listeners that he is a divisive figure, an enemy of complacency. He repeatedly tells people they must choose sides, this dusty live-a-day world all around them, or the next world, which is just about to dawn and change everything.

The problem with this particular mistake (the world didn’t change, the kingdom of Heaven didn’t arrive, the Romans kept nailing troublemakers to scaffolding) is that it elicits some of Jesus’ most straightforward comments — none more so than Matthew 19:21, when the Master is confronted by a rich young man who is righteous and God-abiding (when he’s given a list of commandments, he comments that he’s been following them his whole life – in other words, crucially, he’s not a sinner). The young man asks what he must do to gain eternal life, and Jesus’ answer hits him right between the eyes: “If thou wilt be perfect, go and sell that thou hast, and give to the poor, and thou shalt have treasure in heaven: and come and follow me.”

The young man refuses and goes away disappointed, and that’s when Jesus utters his famous imprecation that it’s easier for a camel to pass through the eye of a needle than for a rich man to enter the kingdom of Heaven.

Hardly any rich Christians have wanted to do what their Savior explicitly commands them to do. The text from Matthew provides the title of Peter Brown’s dense, magnificent new book (with its gigantic sub-title), Through the Eye of a Needle: Wealth, the Fall of Rome, and the Making of Christianity in the West, 350-550 AD, and the subject — the way early Christians got around the embarrassment of not wanting to be poor — is explored in 500 pages of fascinating, engaging prose and 100 pages of close-packed and amazingly comprehensive notes. The conflict between the sacred calling of Christianity and the more mundane concerns of spes saeculi, the hope of advancement in this world, is here given an examination like it’s never had before, with money at the heart of it all.

Also at the heart of it all is that pivotal figure, St. Augustine, and readers who’ve already encountered Brown’s justly revered Augustine of Hippo will know to expect fine writing and fine insight into the figure who, more than anybody, tried to work out a theocratic framework that would allow his congregation to be wealthy if only they avoided avarice. Blatant double-talk like that would come in very handy to Christians of every subsequent century.

~ Augustine’s justification of wealth came at the right time. In a world that had been unexpectedly shaken by renewed civil war and by barbarian invasion, there was no point in denouncing the rich for the manner in which they had gained their wealth. Those whose wealth had survived the shocks of this new crisis were unlikely to feel guilty about what little of it was left to them. The radical critiques of wealth and the wealthy associated with the preachings of Ambrose and with the Pelagian De divitiis were out-of-date. Such radicalism had been the product of an age of affluence. It had played on the disquiet of the comfortable rich of the fourth-century age of gold. It had less effect on persons who now faced the prospect of losing everything.” ~


Since the world was about to end, it made sense to divest oneself of wealth. The end of the world makes the pursuit of money not just irrelevant but downright sinful. There was time only for acts of generosity and kindness.

But time kept passing, and the central promise of early Christianity turned out to be false — or, as true believers insist, delayed (indefinitely, it seems). Hence the need for a heavy spin on the question of wealth and the pesky problem of passing through the eye of a needle. And sure enough, such spin has been found — already by St. Augustine, himself no stranger to the comforts of wealth.

St. Augustine, 6th century fresco. A “doctor of the church,” he was a real “Dr. Spin.”



Dog bites are a problem. According to the American Veterinary Medical Association, 4.5 million Americans are bitten by dogs each year, and every day nearly 1,000 individuals show up in hospital emergency rooms because of dog attacks. The annual cost of medical treatment for dog bites (including 27,000 reconstructive surgeries) is over $250 million, and insurance companies fork out $530 million a year in dog bite claims. Then there are the 26 Americans who were killed by dogs last year.

How Many Dog Bites?

But how many people are really bitten by dogs and who is most likely to be bitten by a dog? Researchers at the University of Liverpool realized that a lot of dog bite victims do not actually see a doctor. They figured that the best way to estimate rates of dog bites would be to ask everyone in a community if they had ever been bitten by a dog. Their results have just been published in the Journal of Epidemiology and Community Health, and there are some surprises. (You can read the full text of their article here.)

Led by Dr. Carri Westgarth of the University of Liverpool, the research team attempted to contact people living in all 1,280 households in a semi-rural town near Liverpool. While they did not reach everyone, they did have a high degree of cooperation and were able to obtain information from 767 residents. In addition to questions about dog bites, the researchers also asked about basic demography (sex, age, etc.) and the participants took a short test that measures the well-known Big Five personality traits.

Here’s what the researchers found:

    25 percent of the participants had been bitten by a dog.
    Only one in three victims received medical attention.
    Men were nearly twice as likely to have been bitten as women.
    People who owned multiple dogs were three times more likely to be bitten than non-dog owners.
    Children are at higher risk: 44 percent of the bites occurred when the victim was younger than 16.
    In 55 percent of cases, the person had never before seen the dog that bit them.
    But the most interesting finding was related to personality: people with higher scores on the Big Five trait of emotional stability were 22 percent less likely to have been bitten by a dog than were individuals who were less emotionally stable.

What Is The Link Between Personality and Dog Bites?

This is the first study to link dog attacks to the personalities of victims. Low emotional stability is also called neuroticism, and it is associated with insecurity, fear, self-consciousness, anxiety, and being temperamental. But why is this personality trait related to dog bites? Neuroticism is linked to a slew of mental and physical health problems, including drug and alcohol dependency, panic disorders, cardiovascular disease, asthma, and irritable bowel syndrome. In their article, Westgarth and her colleagues suggested it is possible that some unknown pattern of behavior in emotionally unstable people makes them especially prone to dog bites. But they also point out that other factors might be involved. For example, anxious people might be more likely to have nervous dogs. Or the causal arrow could even point in the other direction: being bitten by a dog could make people more fearful and anxious.


But the effect of “personality” isn't that large. People can be emotionally stable and still get bitten.

What should have gotten emphasis is that men are twice as likely to get bitten as women, and owners of multiple dogs three times as likely. Children are also at a higher risk.

Advice on how to prevent being bitten:

    “Don’t approach an unfamiliar animal.
    Do not run from a dog, panic or make loud noises.
    If an unfamiliar dog approaches you, remain motionless. Do not run or scream. Avoid direct eye contact.
    Don’t disturb a dog while they’re eating, sleeping, or taking care of their puppies.
    Allow a dog to sniff and smell you before you attempt to pet it. Afterward scratch the animal under the chin, not on the head.
    Report strays or dogs displaying strange behavior to your local animal control.
    If knocked over by a dog, roll into a ball and remain motionless. Be sure to cover your ears and neck with your hands and arms. Avoid eye contact and remain calm.
    Don’t encourage your dog to play aggressively.”

The breed that does the most biting: the Chihuahua. Bulldogs and pit bulls come next. Of course Chihuahua bites never killed anyone. That’s unfortunately not true of the larger breeds. Most deaths are caused by pit bulls — more than all the other breeds combined.


The latest good news: A study recently published in Neurology finds that healthy seniors who had daily helpings of leafy green vegetables — such as spinach, kale and collard greens — had a slower rate of cognitive decline, compared to those who tended to eat little or no greens.

"The association is quite strong," says study author Martha Clare Morris, a professor of nutrition science at Rush Medical College in Chicago. She also directs the Rush Institute for Healthy Aging.

The research included 960 participants of the Memory and Aging Project. Their average age is 81, and none of them have dementia. Each year the participants undergo a battery of tests to assess their memory. Scientists also keep track of their eating habits and lifestyle habits.

To analyze the relationship between leafy greens and age-related cognitive changes, the researchers assigned each participant to one of five groups, according to the amount of greens eaten. Those who tended to eat the most greens comprised the top quintile, consuming, on average, about 1.3 servings per day. Those in the bottom quintile said they consume little or no greens.

After about five years of follow-up, "the rate of decline for [those] in the top quintile was about half the decline rate of those in the lowest quintile," Morris says.

So, what's the most convenient way to get these greens into your diet?

"My goal every day is to have a big salad," says Candace Bishop, one of the study participants. "I get those bags of dark, leafy salad mixes."

A serving size is defined as a half-cup of cooked greens, or a cup of raw greens.

Many factors play into healthy aging — this study does not prove that eating greens will fend off memory decline. With this kind of research, Morris explains, scientists can only establish an association — not necessarily causation — between a healthy diet and a mind that stays sharp.

Still, she says, even after adjusting for other factors that might play a role, such as lifestyle, education and overall health, "we saw this association [between greens and a slower rate of cognitive decline] over and above accounting for all those factors.”

Some prior research has pointed to a similar benefit. A study of women published in 2006 also found that high consumption of vegetables was associated with less cognitive decline among older women. The association was strongest with greater consumption of leafy vegetables and cruciferous vegetables — such as broccoli and cauliflower.

What might explain a benefit from greens?

Turns out, these vegetables contain a range of nutrients and bioactive compounds, including vitamins E and K, lutein, beta carotene and folate.

"They have different roles and different biological mechanisms to protect the brain," says Morris. More research is needed, she says, to fully understand their influence, but scientists know that consuming too little of these nutrients can be problematic.

For instance, "if you have insufficient levels of folate in your diet you can have higher levels of homocysteine," Morris says. This can set the stage for inflammation and a buildup of plaque, or fatty deposits, inside your arteries, which increases the risk of stroke. Research shows elevated homocysteine is associated with cognitive impairment among older adults.

Another example: Getting plenty of Vitamin E from foods in your diet can help protect cells from damage and also has been associated with better cognitive performance.

"So, when you eat leafy greens, you're eating a lot of different nutrients, and together they can have a powerful impact," Morris says.


Don’t forget to put a generous amount of extra virgin olive oil on your leafy greens. The oil will actually help you absorb the micronutrients, besides having neuroprotective benefits of its own.

ending on beauty:

To be alive: not just the carcass
But the spark.
That's crudely put, but . . .

If we're not supposed to dance,
Why all this music?

    ~ Gregory Orr 

Downtown San Diego; Gwyn Henry
