Saturday, May 29, 2021


Anne Boleyn’s Book of Hours


To train myself to find, in the midst of hell
what isn’t hell.

The body, bald, cancerous, but still
beautiful enough to
imagine living the body
washing the body
replacing a loose front
porch step the body chewing
what it takes to keep a body

this scene has a tune
a language I can read
this scene has a door
I cannot close I stand
within its wedge
I stand within its shield

Why write love poetry in a burning world?
To train myself, in the midst of a burning world,
to offer poems of love to a burning world.

~ Katie Farris


How wonderful that she finds the body “still beautiful enough”!

The body, bald, cancerous, but still
beautiful enough to
imagine living the body
washing the body
replacing a loose front
porch step the body chewing
what it takes to keep a body
. . .
Why write love poetry in a burning world?
To train myself, in the midst of a burning world,
to offer poems of love to a burning world.




~ Buffett learned a long time ago that the greatest commodity of all is time. He simply mastered the art and practice of setting boundaries for himself. That’s why this Buffett quote remains a powerful life lesson: 

The difference between successful people and really successful people is that really successful people say no to almost everything. 

We have to know what to shoot for to simplify our lives. It means saying no over and over again to the unimportant things flying in our direction every day, and remaining focused on saying yes to the few things that truly matter. ~


Following up on this advice, I unsubscribed from a dozen or so mailing lists that kept littering my mailbox every day. 

I always forget how important the empty days are, how important it may be sometimes not to expect to produce anything, even a few lines in a journal. A day when one has not pushed oneself to the limit seems a damaging day, a sinful day. Not so! The most valuable thing we can do for the psyche, occasionally, is to let it rest, wander, live in the changing light of a room, not try to be or do anything whatever. ~ May Sarton

Frederick Frieseke: Sun Room


“Time is how you spend your love.”  ~ Zadie Smith


For me, it’s writing (including the blog) and gardening. Of course I have to take care of all kinds of chores as we all do, but life wouldn’t be worth living if that happened to be all. I need to do things I love doing. It’s my “life support.” 


“Maybe forgiveness is just that. The ability to admit someone else's story.” ~ Lidia Yuknavitch

A Green Heart by Catrin Welz-Stein  


~ In 1991, with America gripped by a struggle between an increasingly liberal secular society that pushed for change and a conservative opposition that rooted its worldview in divine scripture, James Davison Hunter wrote a book and titled it with a phrase for what he saw playing out in America’s fights over abortion, gay rights, religion in public schools and the like: “Culture Wars.”

Hunter, a 30-something sociologist at the University of Virginia, didn’t invent the term, but his book vaulted it into the public conversation, and within a few years it was being used as shorthand for cultural flashpoints with political ramifications. He hoped that by calling attention to the dynamic, he’d help America “come to terms with the unfolding conflict” and, perhaps, defuse some of the tensions he saw bubbling. 

Instead, 30 years later, Hunter sees America as having doubled down on the “war” part—with the culture wars expanding from issues of religion and family culture to take over politics almost totally, creating a dangerous sense of winner-take-all conflict over the future of the country.

“Democracy, in my view, is an agreement that we will not kill each other over our differences, but instead we’ll talk through those differences. And part of what’s troubling is that I’m beginning to see signs of the justification for violence,” says Hunter, noting the insurrection on January 6, when a mob of extremist supporters of Donald Trump stormed the U.S. Capitol in an attempt to overthrow the results of the 2020 election. “Culture wars always precede shooting wars. They don’t necessarily lead to a shooting war, but you never have a shooting war without a culture war prior to it, because culture provides the justifications for violence.”

What changed? In the latter half of the 20th century, the culture war was, on some level, a “cultural conflict that took place primarily within the white middle class,” says Hunter, who now leads the University of Virginia’s Institute for Advanced Studies in Culture. But, today, as that conflict has grown, “instead of just culture wars, there’s now a kind of class-culture conflict” that has moved beyond the simple boundaries of religiosity.

“The earlier culture war really was about secularization, and positions were tied to theologies and justified on the basis of theologies,” says Hunter. “That’s no longer the case. You rarely see people on the right rooting their positions within a biblical theology or ecclesiastical tradition. [Nowadays,] it is a position that is mainly rooted in fear of extinction.”

In 1991, politics still seemed like a vehicle through which we might resolve divisive cultural issues; now, politics is primarily fueled by division on those issues, with leaders gaining power by inflaming resentments on mask-wearing, or transgender students competing in athletics, or invocations of “cancel culture,” or whether it’s OK to teach that many of the Founding Fathers had racist beliefs. And this reality—that the culture war has colonized American politics—is troubling precisely because of an observation Hunter made in 1991 about the difference he saw between political issues and culture war fights: “On political matters, one can compromise; on matters of ultimate moral truth, one cannot.”

Where does that leave us? What does it portend for the decades to come? Is there a way to bridge these cultural impasses? And, amid all of this, is there a source for optimism?

Q: Let’s start with a basic question: Whether we’re talking about 2021 or back in 1991, when your book, “Culture Wars,” came out, what is it that we mean by the term “culture war”?

James Davison Hunter: Well, in a world that has politicized everything, there’s a sense that politics is both the root cause of the problems we face and, ultimately, the solution. But the larger argument that I make is that politics is an artifact of culture. It’s a reflection: Culture underwrites our politics.

When it comes to “culture war,” there are two ways of thinking about it. 

One — probably the most prevalent way—is to think of it as a political battle over certain kinds of cultural issues, like abortion, sexuality, family values, church-state issues, and so on. And therefore, the “culture war” is really about the mobilization of political resources —of people and votes and parties—around certain positions on cultural issues. In that sense, a “culture war” is really about politics.

But the bigger story is about the cultures that underwrite our politics, and the ways in which our politics become reflections of deeper cultural dispositions—not just attitudes and values—that go beyond our ability to reason about them.

When we talk about “culture war,” it’s really about both things. 

In simpler terms, I would make the distinction between the weather and the climate. Almost all journalists and most academics focus on what’s happening in the weather: “Today, it’s cold. Tomorrow, it’s going to be warm. The next day, it’s going to rain.” I find the climatological changes that are taking place to be much more interesting. And it’s those that are really animating our politics and polarization, animating dynamics within democracy right now. 

Q: The changes you looked at in “Culture Wars” had largely happened over the 30 years prior—basically since the early 1960s, with the civil rights movement, sexual revolution, the gay rights movement, women’s lib and the backlashes that followed. It’s now 30 years since that book came out. How has the culture war changed in that time?

An important demographic and institutional structural shift took place [in recent decades]. Modern higher education has always been a carrier of the Enlightenment, and, in that sense, a carrier of secularization. What happened in the post-World War II period was a massive expansion of higher education and the knowledge-based economy. And with that came a larger cultural shift: What used to be the province of intellectuals now became the province of anyone who had access to higher education, and higher education became one of the gates through which the move to middle class or upper middle class life was made.

With that came profound cultural change. The ’60s revolution and the political, cultural and sexual protests at the time essentially became institutionalized, and it challenged fundamental notions of what was right, decent, good, fair and so on. And in a way, what you had in the late 1970s into the ’80s and ’90s was a reaction against the challenge represented by that structural change. Conservatives—especially conservative Christians, whether Catholic or Protestant—found themselves on the defensive against progressive notions of family structure, “family values,” sexuality; abortion was a—or maybe the—critical issue. 

Martin E. Marty, the church historian from Chicago, once said that after the Volstead Act and the Scopes trial, evangelical Protestants became a cognitive minority—a minority within intellectual realms—but remained a social and behavioral majority—they basically owned middle America. What we have seen since is a continuation of those structural changes. The Enlightenment and post-Enlightenment culture got carried by universities [and] other important cultural institutions, and these cultural institutions are dominated by supermajorities of progressives. 

Conservatives see this as an existential threat. That’s an important phrase: They see it as an existential threat to their way of life, to the things that they hold sacred. So while the earlier culture war really was about secularization, and positions were tied to theologies and justified on the basis of theologies, that’s no longer the case. You rarely see people on the right rooting their positions within a biblical theology or ecclesiastical tradition. [Nowadays,] it is a position that is mainly rooted in fear of extinction.

Q: Are the dividing lines of the “culture war” different now than they were, say, 30 years ago?

I would argue that what abortion was to [culture wars in] the ’70s, ’80s and ’90s and maybe even beyond, when it was really the critical issue, I think that’s now being replaced by race. The earlier culture wars were a cultural conflict that took place primarily within the white middle class. It’s not that minorities didn’t have positions [on those issues] or weren’t divided themselves, but race was never a very prominent part of that conflict. And I think it has reemerged in part because just as the earlier manifestations of the culture war were ultimately a struggle to define the meaning of America, this is also. Latent within these struggles is a conflict over the meaning of America.

2008 was a really important year, insofar as the Great Recession accentuated an important distinction within the white middle class. It drove a wedge between the middle and lower-middle or working class and the highly trained, professionally educated managers, technocrats and intellectuals—basically, between the top 20 percent and the bottom 80 percent. 

And that meant [there] were now class differences that were overlaid upon some of these cultural differences. And in surveys that we've done here at the Institute [for Advanced Studies in Culture at the University of Virginia], we’ve tracked that. In 2016, the single most important factor in determining a Trump vote was not having a college degree.

So now, instead of just culture wars, there's now a kind of class-culture conflict. With a sense of being on the losing side of our global economy and its dynamics, I think that the resentments have just deepened. That became obvious, more and more, over the four years of Trump, and part of Trump’s own genius was understanding the resentments of coming out on the losing side of global capitalism.

And I think this is reflected, too, in the ways in which progressives speak about the downtrodden: Most of the time, it is in terms of race and ethnicity, immigration and the like; it is not about the poor, per se. I think that’s a pretty significant shift in the left’s self-understanding. 

Q: What do you think is behind that shift?

Well, if you became an advocate for the working class, you’d be an advocate for a lot of Trump voters. Again, I think there's a class-culture divide: a class element that overlays the cultural divide. And they [white non-college-educated voters] voted en masse for Trump. And I think that’s an element of it. They’re also the carriers of what [some on the left] perceive to be racist and misogynist, sexist understandings and ways of life. That’s my guess. 

Straightforward, materialist social science would say that people are voting their economic interests all the time. But they don’t. The seeming contradiction of people voting against their economic interests only highlights that point: That, in many respects, our self-understanding as individuals, as communities and as a nation trumps all of those things. 

Q: Along those lines, there can be a tendency, especially on the political left, to talk about “culture war” issues as being “distractions” that are raised in order to divide people who might otherwise find common cause around, say, shared economic interests. What do you make of that view?

We are constituted as human beings by the stories we tell about ourselves. The very nature of meaning and purpose in life are constituted by our individual and collective self-understandings. How that is a “distraction” is beyond me. 

You know, people will fight to the death for an idea, for an ideal. I was criticized in the early ’90s for using the word “war” [in the term “culture war”]. But I was trained in phenomenology, in which you are taught to pay attention to the words that people themselves use. And in interviews I did [with those on the front lines of “culture war” fights], people would say, “you know, it feels like a war”—even on the left. 

I talk about this sense of a struggle for one’s very existence, for a way of life; this is exactly the language that is also used on the left, but in a much more therapeutic way. When you hear people say that, for instance, conservatives’ very existence on this college campus is “a threat to my existence” as a trans person or gay person, the stakes — for them — seem ultimate. 

The question is: What is it that animates our passions? I don’t know how one can imagine individual and collective identity—and the things that make life meaningful and purposeful—as somehow peripheral or as “distractions.” 

Q: There’s a passage you wrote 30 years ago that seems relevant to this point: “We subtly slip into thinking of the controversies debated as political rather than cultural in nature. On political matters, one can compromise; on matters of ultimate moral truth, one cannot. This is why the full range of issues today seems interminable.”

I kind of like that sentence. [Laughs] I would put it this way: Culture, by its very nature, is hegemonic. It seeks to colonize; it seeks to envelop in its totality. The root of the word “culture” is Latin: “cultus.” It’s about what is sacred to us. And what is sacred to us tends to be universalizing. The very nature of the sacred is that it is special; it can’t be broached. 

Culture, in one respect, is about that which is pure and that which is polluted; it is about the boundaries that are often transgressed, and what we do about that. And part of the culture war—one way to see the culture war—is that each has an idea of what is transgressive, of what is a violation of the sacred, and the fears and resentments that go along with that. 

Q: It feels like the universe of things that might be considered part of the “culture war” has grown considerably over the past 30 years, such that it seems to now envelop most of politics. In that situation, how does democracy work? Because when the stakes are existential, it would seem like compromise is impossible. Can you have a stable democracy without compromise?

No, I don’t think you can. Part of our problem is that we have politicized everything. And yet politics becomes a proxy for cultural positions that simply won’t brook any kind of dissent or argument. 

You hear this all the time. The very idea of treating your opponents with civility is a betrayal. How can you be civil to people who threaten your very existence? It highlights the point that culture is hegemonic: You can compromise with politics and policy, but if politics and policy are a proxy for culture, there’s just no way. 

In the original book, I had a short chapter about the technologies of communication and discourse, and the ways they’ve accentuated polarization. I argued that because of these technologies, our public culture is more polarized than we, as a people, are. And the technology I was talking about is just going to sound really funny nowadays: It was direct mail. This was [1991], before social media.

So, take the role of some of the extraordinary advances in social media and the ways in which these multiply the anonymity, the extremism of rhetoric, the absence of any kind of accountability in our public speech. They take what is already a shallow discourse—you know, the trading of slogans, and the like—and make it even more difficult to find any kind of depth.

How do you compromise when that becomes the dominant form of discourse? I think that there are ways in which serious and substantive democratic discourse is made difficult, if not impossible, by the democratization and proliferation of free speech. That seems like a strange thing to say, but ...

Q: On that front, I think one of the difficulties is that there is sometimes a very clear calculation made on the part of people involved in politics that conflict leads to attention, and media attention leads to political power. That feels like a cycle difficult to break out of.

Democracy, in my view, is an agreement that we will not kill each other over our differences, but instead we’ll talk through those differences. And part of what's troubling is that I’m beginning to see signs of the justification for violence on both sides. 

Obviously, on January 6, we not only saw an act of violence—I mean, talk about a transgression—but one that the people who were involved were capable of justifying. That’s an extraordinary thing.

If I could draw a parallel, it’s not unlike the Civil War. There was a culture war for 30 years prior to the Civil War. The Civil War was—without question—about slavery and the status of Black men and women, and, yes, the good guys won —at the cost of 4 out of 10 Southern males dying and 1 out of 10 Northern males dying. 

But think about what happened: Dred Scott was an attempt to impose a consensus by law; it took the Civil War and the 13th, 14th and 15th Amendments to overturn Dred Scott. And yet that was also an imposition of solidarity by law and by force. The failures of Reconstruction and the emergence of Jim Crow and “Black Codes” and all of that was proof that politics couldn’t solve culture; it couldn't solve the cultural tensions, and so what you end up with is a struggle for civil rights.

My view is that the reason why we’re continuing to see this press toward racial reckoning is because it's never been addressed culturally. 

In other words, racial justice failed by succeeding. The international slave trade ended in 1808. And it created this sense of complacency: “Oh, we’ve dealt with that.” Yet the slave trade and number of slaves grew astronomically over the next 50 years. Then the Civil War was fought and won: “Oh, we’ve dealt with that. Now we can move on.” It created complacency. 

I think that’s what happened after the civil rights movement and [the Rev. Martin Luther] King’s martyrdom: It was a tremendous success at one level, but created complacency, especially among whites—“We’ve dealt with that. We don’t need to deal with this anymore”—when, in fact, ongoing discrimination is still happening. It represents, again, the attempt to generate a kind of cultural consensus through political means. And that doesn’t seem to work.

Q: What would it look like to actually reckon with that issue, culturally?

Well, I’m going to sound really old-fashioned here, but I think that this work takes a long time and it’s hard. I think you talk through the conflicts. Don’t ignore them; don’t pretend that they don’t exist. And whatever you do, don’t just simply impose your view on anyone else. You have to talk them through. It’s the long, hard work of education. 

The whole point of civil society, at a sociological level, is to provide mediating institutions to stand between the individual and the state, or the individual and the economy. They're at their best when they are doing just that: They are mediating, they are educating. I know that argument is part of the “old” liberal consensus view, the “old” rules of public discourse. But the alternatives are violence. And I think we are getting to that point.

The book that I followed “Culture Wars” with was called “Before the Shooting Begins: Searching for Democracy in America’s Culture War.” And the argument I made was that culture wars always precede shooting wars. They don’t necessarily lead to a shooting war, but you never have a shooting war without a culture war prior to it, because culture provides the justifications for violence. And I think that's where we are. The climatological indications are pretty worrisome.

Q: Given that, do you feel optimistic about the outlook for things in the United States in this era of constant “culture war,” with so much of it being fed by the Internet?

Look: Not to hope—to give in to despair—is never an option, in my opinion. That’s an ethical position I think one has to take. But I also don’t think that you tell a patient that they have a bad cold when, in fact, they have a life-threatening disease. 

In this tangle between very powerful institutions and very powerful cultural logics, there are serious problems that are deeply rooted. The great democratic revolutions of Western Europe and North America were rooted in the intellectual and cultural revolution of Enlightenment; the Enlightenment underwrote those political transformations. If America’s hybrid Enlightenment underwrote the birth of liberal democracy in the United States, what underwrites it now? 

What is going to underwrite liberal democracy in the 21st century? To me, it’s not obvious. That’s the big puzzle I’m working through right now. But it bears on this issue of culture wars, because if there's nothing that we share in common—if there is no hybrid enlightenment that we share—then what are the sources we can draw upon to come together and find any kind of solidarity? 

Q: It’s as though there are no unifying national myths. And those that once occupied that place in American life are now subject to debate and the culture war.

That’s exactly right. And the myths that do seem to exist are mainly technocratic and dystopian. So … I think we’re in trouble. But I’m not sure you should end with that. 

Q: Well, I’ll end with this, then: Is there something unique about America that makes it especially prone to culture war, or is this kind of par for the course?

Part of what has made it especially acute in the United States is the proliferation of nonprofit special-interest groups. You don't find that in Europe; you don't find it in England or Germany. Those are more statist regimes, and have much greater control over the nonprofit space. [Whereas, in the U.S.] you have the proliferation of special-interest groups that take sides. And a lot of our charitable money—which is a massive amount compared to other countries—gets channeled through these charitable organizations that exist with a take-no-prisoners policy; that define the enemy, that define a devil, that define transgressions in certain ways. 

They’re all in battle. And it’s, again, part of what you described earlier: It’s just more expansive. The range of the culture war seems to be all-encompassing.

I have this old-fashioned view that what we’re supposed to do is to understand before we take action, and that wisdom depends upon understanding. That basically makes me a conservative today—but it also makes me a progressive by conservative standards.


It seems to me the “culture wars” have taken a direction dangerously close to violence. If democracy depends on honest and reasonable discourse that allows compromise, we seem to be moving steadily in the wrong direction. Politics is not the originator of these conflicts but has become their primary expression. Most significant is the division between the college-educated, progressive, white-collar, secular, and liberal 20 percent “upper white middle class” and the blue-collar, traditionalist, conservative, anti-intellectual “lower white middle class.”

Trumpism, and its opposite, line up along this divide.

The progressives think of their opponents as the “basket of deplorables,” another iteration of “the great unwashed”: uneducated yahoos, too easily dismissed. The Right, the majority of Trump followers, both resent and fear these godless, secular elitists whose ideas threaten the basics of their way of life. Each side sees the other as the negation of what is good, as evil, and you can't compromise with evil. The basis for talk and argument disappears; you end up with a stalemate that seems only a step away from violence.

The degeneration of much public discourse, the dissolution of trust in facts, the voicing of “alternative facts,” lead to a point where any discourse becomes impossible. We can't talk to each other; we can't find any common ground. Every issue championed by the progressives comes to be felt as an existential threat, beyond all reason and impossible to reason with. Wedding cakes for gay couples, who uses which bathroom, transgender persons in sports: all become huge issues. No matter how harmless, they are cast as advances in an evil agenda that aims to destroy all that is holy and good.

What is holy and good is defined as Traditional Values, or Family Values: marriage between a man and a woman; refusal of a woman's right to reproductive choice and control of her body (no abortion, no contraception, no sex education); condemnation of any sexuality outside the traditional male/female pairing (no homosexuality, no cross-dressing or transgender identities); and making Evangelical Christian tenets and values part of US law and culture (the US as a Fundamentalist Christian nation).

For these right-wing conservatives, no interpretation of history can be allowed other than the one that sees Confederate leaders as heroes and leaves their monuments in places of honor. Any recognition of the persistence of ingrained racism in our social structure and institutions becomes a threat to the conservatives’ very existence and must be denied. Any protest that might become unruly or threaten property is dangerous and must be met with force. Law and order must be preserved, and that means no questions or protests, period.

The core of opposition to secularism and progressive ideas is an absolute terror of change, and the fear that change will mean the eradication of themselves and their way of life — what they are, what they believe, what they love, and what they need. I can see this terror leading to violence and repression. It is much harder to see, at this point, a positive way forward. And yet, I agree that we can't give up hope.


When I came to this country I was told it was a “classless society” — and tried my best not to burst out laughing, having at that point seen both the slums and the rich neighborhoods. That delusional belief was another odd similarity with the Soviet Union as idealized in “Soviet Life”-type magazines meant to fool naive Western readers. At this point, though, it’s impossible to deny just how divided the U.S. has become. It is a “cold Civil War.”

And it’s also impossible to deny that during the long years of the Trump administration, the progressives hurt keenly, seeing what had taken years to achieve overturned with one executive order. And names like “Americans Against the Republican Party” showed the no-compromise attitude. We’re still fighting the battles we’ve been fighting for decades.

Nevertheless, America has finally had a Black president, a fact that can’t be erased. Prejudice against women runs deeper, but it’s no longer an impossible dream that eventually the country will have a woman president, or an openly gay president. We know which way the wind of history blows. Just as absolute monarchy fell and democracy spread, so will “Fundamentalist Christian America” crumble into insignificance.

The possibility of violence is certainly there, and we saw some of it on January 6 . . . But these are the last efforts of a cornered animal. The men who bare their chests to storm the Capitol or march with torches chanting “Jews will not replace us” are too much at odds with reality. 

Joe: I’m Not Saying That Science Should Replace Faith

The white conservatives cannot be taken lightly. They constitute a powerful force that is deeply rooted in the racism of the Republican base. To discuss the culture wars, one needs to speak honestly: the term “culture wars” is largely a euphemism for white supremacy, and it has always carried a racist connotation.

Until recently, conservatives rarely discussed the racially prejudiced legislation that they introduced. Ronald Reagan said they should never discuss how they aimed their policies at the Black community. When he talked about inner-city problems, it was obvious that he meant the Black community. Often, he implied that drugs and sex led to crime in minority neighborhoods.

While scientific studies indicated that poverty and prejudice led to crime and drugs, Reagan dismissed this finding and linked the problems to morality. Therefore, conservative Christians need to be included in any discussion of culture wars.

The major Christian churches backed the Republicans because their leaders liked the way Republicans cloaked White Privilege in religious terms.

Christian preachers recognized that faith is more a matter of collective suggestion than of individual conviction. Churchgoers tend to submit to the communal rhythm. They come to believe it is absurd to believe differently from their religious body. According to J.D. Hunter, today Christians rarely root their positions in biblical theology or ecclesiastical traditions.

Their positions are mainly rooted in their fear of extinction. They say their values and lifestyle are under attack. I ask: are they being hunted and murdered in the street by police? Are their classes being defunded, the way African-American studies or Chicano history have been?

Another example comes from a University of Michigan Institute for Healthcare Policy and Innovation study, which found that deaths from complications during childbirth rose nine percent in rural (white) areas: the same states where Republican governors and legislatures limited access to the ACA. Despite their lack of health care, rural people still vote to limit health care. One reason people support the Republicans is that churches connected abortion access in the inner city to Obamacare.

Many rural people believe that organizations like Planned Parenthood only help minorities. This became obvious from their complaints when Republicans attempted to cancel the ACA. Many conservative Christians said, “Don’t destroy my health care, just do away with Obamacare.” Didn’t they know that the ACA and Obamacare were identical?

The white conservative church made removing Obamacare a matter of faith. Going against the church is hard because doing so is seen as denying the historical relevance of the Bible. To see the world from another viewpoint is to exist on the outskirts of the Christian community. Thus, it's hard to challenge the claim that science is antagonistic to the conservative Christian community.

Science should be seen as a practical means of improving life and health, not as a replacement for faith. It has developed effective measures against smallpox, typhus, syphilis, and COVID-19. Dragging science into the culture wars promotes the Republicans as protectors of the White Privilege that conservatives fear losing.


Perhaps instead of chanting, "Jews will not replace us," they should have been chanting, "Unitarians will not replace us." That's a church that's antithetical to white supremacy.

The influence of Fundamentalists on politics, though, is a considerable drag, disproportionate to their number. I agree that we must take them seriously.


“Evil cannot be punished, but it is self-destructive.” ~ Ágnes Heller



The type of phone and the fashion (dress, hair) indicate that it's the late fifties or early sixties or so. And even in the late sixties, when I was already in this country, customer service was soooo different! Polite, almost deferential. The customer needed to be pleased, almost wooed: “The customer is always right.”

And a live person answered right away. Not a recording. I understand that it might be difficult for younger readers to imagine that, but yes, people were hired specifically to answer phones so that the customer didn’t have to wait to receive service. In the case of businesses, it was obviously more than politeness; in the case of non-profit organizations, I think it was politeness, and simply respect for the caller.

That’s how it was done. It was a culture of respect.


For millennials, the age group born between 1981 and 1996, some of the products America once couldn't live without are now being ditched altogether.

1. Cars (and Gas): In the last eight years, the number of drivers age 18-25 was down nearly 25%.
2. Fabric Softener: Between 2007 and 2015, fabric softener sales dropped by 15 percent.
3. Traditional Gyms
4. Cereal: According to the New York Times, 40 percent of millennials surveyed said they do not eat cereal because it is “inconvenient.”
5. Business Suits
6. Homes: It's true that millennials typically rent longer than earlier generations. In 2018, millennial home ownership was at a record low. But now, millennials make up the largest share of home buyers in the US, according to a 2020 survey from the National Association of Realtors.
7. Regular Milk: Increasingly, millennials are becoming vegan or choosing a more environmentally friendly option. Others are choosing alternative milks as a healthier option. Milk sales have dropped 40% since 1970.
8. Weddings and Diamonds: In the 1980s, two-thirds of people age 25-34 had married. Today, more than half of the people in that age group are single. Millennials are saying “I Don’t” to saying “I Do.” And along with the decline in marriage rates, there has been a decline in diamond sales.
9. Movie Theaters
10. Bulk Groceries
11. Domestic Brand Beers
12. Cruises
13. Beef
14. Napkins
15. Mayonnaise: mayo sales have been dropping for the last few years.
16. Irons: fabrics that usually require ironing are on the outs. And besides, there is no wrinkle that simply putting an item of clothing in the dryer for a second won’t fix!
17. Land Lines: A survey says that 66 percent of millennials live in a totally wireless home. Forty-one percent have no landline phone (that number would be higher, but many Internet and cable companies provide a landline for free), and 83 percent of millennials sleep next to their cellphones.
18. Lottery Tickets: A Gallup survey found that while 61 percent of people ages 50 to 64 played the lottery, only about a third of people between the ages of 18 to 29 are doing the same. The generation has simply scratched lottery cards off their list of must-dos.
19. Postcards: Once upon a time, 20 million postcards were sold every year. Today, there are only about 5 million sold.
20. Stilettos
21. Life Insurance and Stocks: Seventy-five percent of millennials do not have life insurance simply because they cannot afford it. As far as stocks go, only 13 percent of millennials told Barron’s that they would invest in the stock market. Market experts think that this mainly has to do with witnessing a stock market crash at a young age.
22. Doorbells: First of all, many millennials are living in apartment buildings rather than standalone homes, where doorbells just are not a fixture. But even if there was a doorbell option, millennials probably would not use it. They are more likely to send their friends or family members a text saying they have arrived at their home rather than ringing the bell.
23. Cable Subscriptions
24. Fast Food
It is not just McDonald’s that is struggling to get Millennials into their doors. Fast food restaurants across the board are seeing a steep decline in their Millennial clientele, and that drop even carries over to their younger counterparts in Generation Z.
25. Hotels: Instead, young travelers are choosing to go for more “authentic” experiences. Sometimes the idea of living like a local means subletting an apartment or renting an AirBNB.
26. Golfing: There are a few reasons why Millennials are not teeing up at the golf course anymore. While some feel that the sport is a little overpriced or too expensive, others find the 18-hole sport to be a little boring.
27. Bars of Soap: According to a MarketWatch report, 60 percent of Millennials feel that bars of soap are crawling with germs and would rather use body wash to clean themselves.
28. Casual Dining: Either they are ordering delivery off of Uber Eats or Seamless, or they are deciding to wine and dine themselves with a reservation at a nice restaurant.
29. Department Stores: Millennials are choosing to get their one-stop shopping done right from the comfort of their own couches.
30. Designer Clothing
31. Wine with Corks: Millennials as consumers are focused on wine that is friendlier for bringing to friends’ homes or other gatherings.
32. Motorcycles: In the United States as a whole, we just do not see motorcycles as often as we did a few years ago. Overall, motorcycle sales have fallen over the last few years. But no age group has shown less interest in motorcycles than Millennials.


Yes, change is well on its way... just look at those things the millennials don't want!! And world population growth is waning, with China even deciding people can have more than two kids! The death throes of the old culture may still be very challenging to get through. It's a dicey situation, volatile to say the least.



~ The Moon has a smell. It has no air, but it has a smell. Each pair of Apollo astronauts to land on the Moon tramped lots of Moon dust back into the lunar module—it was deep gray, fine-grained and extremely clingy—and when they unsnapped their helmets, Neil Armstrong said, “We were aware of a new scent in the air of the cabin that clearly came from all the lunar material that had accumulated on and in our clothes.” To him, it was “the scent of wet ashes.” To his Apollo 11 crewmate Buzz Aldrin, it was “the smell in the air after a firecracker has gone off.”

All the astronauts who walked on the Moon noticed it, and many commented on it to Mission Control. Harrison Schmitt, the geologist who flew on Apollo 17, the last lunar landing, said after his second Moonwalk, “Smells like someone’s been firing a carbine in here.” Almost unaccountably, no one had warned lunar module pilot Jim Irwin about the dust. When he took off his helmet inside the cramped lunar module cabin, he said, “There’s a funny smell in here.” His Apollo 15 crewmate Dave Scott said: “Yeah, I think that’s the lunar dirt smell. Never smelled lunar dirt before, but we got most of it right here with us.”

Moon dust was a mystery that the National Aeronautics and Space Administration had, in fact, thought about. Cornell University astrophysicist Thomas Gold warned NASA that the dust had been isolated from oxygen for so long that it might well be highly chemically reactive. If too much dust was carried inside the lunar module’s cabin, the moment the astronauts repressurized it with air and the dust came into contact with oxygen, it might start burning, or even cause an explosion. 

(Gold, who correctly predicted early on that the Moon’s surface would be covered with powdery dust, also had warned NASA that the dust might be so deep that the lunar module and the astronauts themselves could sink irretrievably into it.)

Among the thousands of things they were keeping in mind while flying to the Moon, Armstrong and Aldrin had been briefed about the very small possibility that the lunar dust could ignite. “A late-July fireworks display on the Moon was not something advisable,” said Aldrin.

Armstrong and Aldrin did their own test. Just a moment after he became the first human being to step onto the Moon, Armstrong had scooped a bit of lunar dirt into a sample bag and put it in a pocket of his spacesuit—a contingency sample, in the event the astronauts had to leave suddenly without collecting rocks. Back inside the lunar module the duo opened the bag and spread the lunar soil on top of the ascent engine. As they repressurized the cabin, they watched to see if the dirt started to smolder. “If it did, we’d stop pressurization, open the hatch and toss it out,” Aldrin explained. “But nothing happened.”

The Moon dust turned out to be so clingy and so irritating that on the one night that Armstrong and Aldrin spent in the lunar module on the surface of the Moon, they slept in their helmets and gloves, in part to avoid breathing the dust floating around inside the cabin.

By the time the Moon rocks and dust got back to Earth—a total of 842 pounds from six lunar landings—the odor was gone from the samples, exposed to air and moisture in their storage boxes. No one has quite figured out what caused the odor to begin with, or why it was so like spent gunpowder, which is chemically nothing like Moon rock. “Very distinctive smell,” Apollo 12 commander Pete Conrad said. “I’ll never forget. And I’ve never smelled it again since then.”

In 1999, as the century was ending, the historian Arthur Schlesinger Jr. was among a group of people asked to name the most significant human achievement of the 20th century. In ranking the events, Schlesinger said, “I put DNA and penicillin and the computer and the microchip in the first ten because they’ve transformed civilization.” But in 500 years, if the United States of America still exists, most of its history will have faded to invisibility. “Pearl Harbor will be as remote as the War of the Roses,” said Schlesinger. “The one thing for which this century will be remembered 500 years from now was: This was the century when we began the exploration of space.” He picked the first Moon landing, Apollo 11, as the most significant event of the 20th century.

The leap to the Moon in the 1960s was an astonishing accomplishment. But why? What made it astonishing? We’ve lost track not just of the details; we’ve lost track of the plot itself. What exactly was the hard part?

The answer is simple: When President John F. Kennedy declared in 1961 that the United States would go to the Moon, he was committing the nation to do something we simply couldn’t do. We didn’t have the tools or equipment—the rockets or the launchpads, the spacesuits or the computers or the micro-gravity food. And it isn’t just that we didn’t have what we would need; we didn’t even know what we would need. We didn’t have a list; no one in the world had a list. 

Indeed, our unpreparedness for the task goes a level deeper: We didn’t even know how to fly to the Moon. We didn’t know what course to fly to get there from here. And as the small example of lunar dirt shows, we didn’t know what we would find when we got there. Physicians worried that people wouldn’t be able to think in micro-gravity conditions. Mathematicians worried that we wouldn’t be able to calculate how to rendezvous two spacecraft in orbit—to bring them together in space and dock them in flight both perfectly and safely.


On May 25, 1961, when Kennedy asked Congress to send Americans to the Moon before the 1960s were over, NASA had no rockets to launch astronauts to the Moon, no computer portable enough to guide a spaceship to the Moon, no spacesuits to wear on the way, no spaceship to land astronauts on the surface (let alone a Moon car to let them drive around and explore), no network of tracking stations to talk to the astronauts en route.

“When [Kennedy] asked us to do that in 1961, it was impossible,” said Chris Kraft, the man who invented Mission Control. “We made it possible. We, the United States, made it possible.”
Ten thousand problems had to be solved to get us to the Moon. Every one of those challenges was tackled and mastered between May 1961 and July 1969. The astronauts, the nation, flew to the Moon because hundreds of thousands of scientists, engineers, managers and factory workers unraveled a series of puzzles, often without knowing whether the puzzle had a good solution.

In retrospect, the results are both bold and bemusing. The Apollo spacecraft ended up with what was, for its time, the smallest, fastest and most nimble computer in a single package anywhere in the world. That computer navigated through space and helped the astronauts operate the ship. But the astronauts also traveled to the Moon with paper star charts so they could use a sextant to take star sightings—like 18th-century explorers on the deck of a ship—and cross-check their computer’s navigation. The software of the computer was stitched together by women sitting at specialized looms—using wire instead of thread. 

In fact, an arresting amount of work across Apollo was done by hand: The heat shield was applied to the spaceship by hand with a fancy caulking gun; the parachutes were sewn by hand, and then folded by hand. The only three staff members in the country who were trained and licensed to fold and pack the Apollo parachutes were considered so indispensable that NASA officials forbade them to ever ride in the same car, to avoid their all being injured in a single accident. Despite its high-tech aura, we have lost sight of the extent to which the lunar mission was handmade.

The race to the Moon in the 1960s was, in fact, a real race, motivated by the Cold War and sustained by politics. It has been only 50 years—not 500—and yet that part of the story too has faded.

One of the ribbons of magic running through the Apollo missions is that an all-out effort born from bitter rivalry ended up uniting the world in awe and joy and appreciation in a way it had never been united before and has never been united since.


The Apollo 11 spaceship that carried Michael Collins, Buzz Aldrin and Neil Armstrong from the Earth to the Moon was big: The command and service module and the lunar module, docked nose-to-nose, was 53 feet long. When Collins fired the service module engine to settle into orbit around the Moon—the big engine ran for 357.5 seconds to slow the ship, six long minutes—there was already another spaceship in orbit around the Moon waiting for them. It had arrived two days earlier, from the Soviet Union.

Luna 15 was a Russian unmanned robotic craft that was at the Moon on a mysterious mission. It was certainly no coincidence that at the moment the United States was getting ready to land people on the Moon’s surface, with the whole world watching, the Russians had decided to have a spacecraft at the Moon. Luna 15 had been launched on Sunday, July 13, before the Wednesday launch of Apollo 11, and the Russians said it was simply going to “conduct further scientific exploration of the Moon and space near the Moon.”

But from the moment of Luna 15’s launch, U.S. space scientists and NASA officials speculated that it was a “scooping” mission, designed to land on the Moon, extend a robotic arm, scoop up some soil and rocks, and deposit them in a compartment on the spacecraft, which would then zoom back to Earth and maybe, just maybe, arrive back on Russian soil with its cargo before the Apollo 11 astronauts could make it home.

Frank Borman, the commander of the Apollo 8 mission that had orbited the Moon, had just returned from a nine-day goodwill tour of Russia—the first visit by a U.S. astronaut to the Soviet Union—and appeared on the NBC news show “Meet the Press” the morning of Luna 15’s launch. “I would guess it’s probably an effort” to bring back a soil sample, Borman said. “I heard references to that effect [in Russia].”

NASA, at least publicly, was mostly concerned that Russian communications with Luna 15 might interfere with Apollo 11. In an unprecedented move, Chris Kraft, the head of Mission Control, asked Borman to call Soviet contacts from his just-finished trip and see if they would supply data on Luna 15. The Soviets promptly sent a telegram—one copy to the White House, one copy to Borman’s home near the Manned Spacecraft Center—with details of Luna 15’s orbit and assurances that if the spacecraft changed orbits, fresh telegrams would follow. It was the first time in the 12 years of space travel that the world’s two space programs had communicated directly with each other about spaceflights in progress. At a press conference, Kraft said Luna 15 and the Apollo spacecraft would not come anywhere near each other.

Luna 15, at least to start, succeeded in making sure the Soviet Union’s space program wasn’t overlooked while Apollo 11 dominated the news worldwide. The Soviet mission made the front pages of newspapers around the world. At the time, NASA and the public never did find out what Luna 15 was up to. Now we know it was a well-planned effort to upstage Apollo 11, or at least be onstage alongside the U.S. Moon landing, according to documents released and research done since the breakup of the Soviet Union and thanks to the rich and detailed history of the Soviet space program written by historian Asif Siddiqi, Challenge to Apollo.

When Luna 15 arrived in lunar orbit on July 17, two days ahead of Apollo 11, Siddiqi says, Russian space officials were surprised “by the ruggedness of the lunar terrain” where it was headed, and that the craft’s altimeter “showed wildly varying readings for the projected landing area.” As Armstrong and Aldrin stepped out onto the lunar surface, Luna 15 was still swooping around the Moon, and engineers back in the Soviet Union were still trying to find a landing site they had confidence in.

Two hours before the Eagle, with Armstrong and Aldrin aboard, blasted off the Moon, Luna 15 fired its retrorockets and aimed for touchdown. The legendary British radio telescope at Jodrell Bank Observatory, presided over by Sir Bernard Lovell, was listening in real time to the transmissions of both Apollo 11 and Luna 15. And Jodrell Bank was the first to report the fate of Luna 15. Its radio signals ended abruptly. “If we don’t get any more signals,” said Lovell, “we will assume it crash-landed.” Luna 15 was aiming for a site in the Sea of Crises, about 540 miles northeast of Eagle’s spot in the Sea of Tranquillity.

The Soviet news agency Tass reported that Luna 15 had fired its retrorockets and “left orbit and reached the Moon’s surface in the preset area.” Its “program of research...was completed.”
Despite taking almost a whole extra day to figure out the terrain issues, Soviet space scientists apparently missed a mountain in the Sea of Crises. On its way to the “preset area,” Luna 15, traveling 300 miles per hour, slammed into the side of that mountain.

At about 1:15 p.m. Eastern time Tuesday, the Apollo astronauts woke from a 10-hour rest period and were 12 hours into their 60-hour ride back from the Moon. As they got started on their day, astronaut Bruce McCandless, Mission Control’s official Capsule Communicator, radioed, “Apollo 11, this is Houston. If you’re not busy now, I can read you up the morning news.”

Replied Aldrin, “Okay, We’re all listening.”

A lot of the news was about Apollo 11. Reported McCandless, “Things have been relatively quiet recently in Vietnam. G.I.s on patrol were observed carrying transistor radios tuned to your flight.”

About one-third of the way through McCandless’ space newscast, slipped in between telling the astronauts that President Nixon would head to Romania after meeting them onboard their recovery aircraft carrier, and the Vietnam news, McCandless reported, “Luna 15 is believed to have crashed into the Sea of Crises yesterday after orbiting the Moon 52 times.”

If ever there was a moment that captured the crushing reversal in the performance of the world’s two space programs, that was it: Mission Control matter-of-factly reporting the crash-landing of the Soviet Union’s somewhat flailing robotic attempt to collect Moon rocks to the three American astronauts flying home from the first human landing on the Moon, with 47.5 pounds of Moon rocks. ~


Luna 15 crash-landed in the Sea of Crises. What fiction writer would dare make things so blatantly ironic? 

One reason that landing on the Moon was so incredibly important was that it showed humanity that the seemingly impossible can be done. Of course it took enormous cooperation among the thousands of people involved in the effort. After the Moon landing, we started hearing, "How come we can go to the Moon, but can't do such-and-such?" (e.g., cure the common cold). The answer is that the motivation is not the same. The best minds are working on different problems, and are prevented from cooperating by bureaucracy or patent laws, for instance. There is no will, there is no financing, there is no vision. 


So fascinating about the moon dust smell!! And the fact they didn’t know if the lunar module might just keep sinking in the dust until it disappeared, or if their activities would cause the dust to ignite. Of course none of that stopped them…just like the uncertainty about whether the nuclear bomb might start an unstoppable catastrophe never stopped its deployment.


The human capacity for taking risk seems way beyond my personal tolerance of risk. Of course the astronauts were exceptional human beings, more courageous than most, but still . . . the nightmarish possibility of sinking into the lunar dust forever . . . I realize more keenly now the courage it took to land on the moon for the first time, and the gigantic collective effort that went into that astonishing enterprise.



~ Collins, known as the ‘forgotten astronaut’, kept the command module flying while Neil Armstrong and Buzz Aldrin walked on the moon.

But his role in the three-man mission in 1969 was just as crucial: his task of keeping the module circling, piloting it as his teammates departed in the Eagle lander and then returned safely, was every bit as nerve-racking and exciting as the rest of the mission.

“Not since Adam has any human known such solitude as Mike Collins,” the mission log said.

Post-mission he said: “The thing I remember most is the view of planet Earth from a great distance. Tiny. Very shiny. Blue and white. Bright. Beautiful. Serene and fragile.”


~ In the late nineteenth century, somewhere between four million and eleven million people identified as Spiritualists in the United States alone. Some of the leaders back then were hucksters, and some of the believers were easy marks, but the movement cannot be dismissed merely as a collision of the cunning and the credulous. Early Spiritualism attracted some of the great scientists of the day, including the physicists Marie and Pierre Curie, the evolutionary biologist Alfred Russel Wallace, and the psychologist William James, all of whom believed that modern scientific methods, far from standing in opposition to the spiritual realm, could finally prove its existence.

So culturally prevalent was Spiritualism at the time that even skeptics and dabblers felt compelled to explore it. Mark Twain, Frederick Douglass, and Queen Victoria all attended séances, and although plenty of people declined to attend so much as a single table-turning, the movement was hard to avoid; in the span of four decades, according to one estimate, a new book about Spiritualism was published roughly once a week. These included scientific-seeming tomes purporting to offer evidence of the afterlife, as well as wildly popular memoirs such as “Evenings at Home in Spiritual Séance” and “Shadow Land; or, Light from the Other Side.” Meanwhile, more than a hundred American Spiritualist periodicals were in regular circulation, advertising public lectures and private séances in nearly eight hundred cities and towns across the country.

One of Spiritualism’s first major historians was the novelist Arthur Conan Doyle, who became so zealous a believer that he set aside Sherlock Holmes in order to focus on his research, ultimately writing more than a dozen books on the subject. His two-volume “History of Spiritualism” starts by situating the movement as “the most important in the history of the world since the Christ episode,” then proposes the Swedish mystic Emanuel Swedenborg, born in the sixteen-eighties, and the Scottish reformer Edward Irving, born in 1792, as forerunners of the Victorians.

But most accounts of Spiritualism don’t begin with great men or distant precedents. They start with little women on an exact date: March 31, 1848. On that night, as Emily Midorikawa details in her new book, “Out of the Shadows: Six Visionary Victorian Women in Search of a Public Voice” (Counterpoint), two sisters, fourteen-year-old Margaretta Fox and eleven-year-old Catherine, finally convinced some of their neighbors that an unsettling series of knockings and tappings in their home, near the south shore of Lake Ontario, was coming from the spirit world. Soon the whole town of Hydesville, New York, was gripped by the mysterious noises that haunted the Fox family.

Maggie and Kate, as the Fox sisters were known, claimed that they were able to communicate with the maker of those noises, which they said was a spirit called Mr. Splitfoot. From beyond the grave, the spirit answered their questions, first rapping back to respond with a simple yes or no, then using a more complicated series of raps to indicate letters of the alphabet. In this manner, the spirit allegedly revealed that he had been murdered for money some five years previously and been buried in the cellar of the Fox house. That revelation only further excited the residents of Wayne County—no strangers to new religious claims, since they had already welcomed the Shakers at Sodus Bay, witnessed the founding of Mormonism at Palmyra, and lately outlived the doomsday prophecies of the nearby Millerites.

The Foxes fled their haunted home, but the rapping followed the girls into other houses during the next few months, and their sensational story continued to spread. In the fall of 1849, four hundred people gathered at Corinthian Hall, in nearby Rochester, where the Foxes demonstrated what they had advertised as “WONDERFUL PHENOMENA” for a paying audience—the first of many during the next forty years. William Lloyd Garrison and James Fenimore Cooper came for séances with the girls, and Horace Greeley and his wife, Mary, not only visited with the sisters but boosted their celebrity in Greeley’s newspapers, including the New-York Daily Tribune, which would go on to cover the Spiritualist craze as dozens and then hundreds of others claimed that they, too, were capable of hearing “spirit rapping.”

According to Midorikawa, the Greeleys were representative of some of the earliest and most enthusiastic adherents of Spiritualism: affluent and progressive mothers and fathers who were desperate to communicate with sons and daughters who had died too young. In the mid-nineteenth century, an estimated twenty to forty per cent of children died before the age of five, and scholars often point to this fact to help account for the appeal of Spiritualism. But it was worse in the preceding centuries; for some time, the child mortality rate had been falling. What mattered more was that the average family size was shrinking, too, at the same time that modern ideas of childhood were taking hold—trends that combined to make the loss of any child seem that much more anguishing.

But it wasn’t only the death of children that brought people to Spiritualism, or kept them in the fold. Mary Todd Lincoln, who lost three of her four children, visited with mediums in Georgetown before hosting her own séances in the Red Room of the White House. She also hired the country’s most famous “spirit photographer” to take a picture of her with her husband after he was assassinated. 

Peter Manseau’s “The Apparitionists: A Tale of Phantoms, Fraud, Photography, and the Man Who Captured Lincoln’s Ghost” (Houghton Mifflin Harcourt) offers a fascinating account of that photographer, William H. Mumler, who worked as a jewelry engraver in Boston before taking a self-portrait that, when developed, revealed what became known as an “extra”: in his case, a young girl sitting in a chair to his right, whom he recognized as a cousin who had died a dozen years before. Mourning portraits—paintings of the recently dead—had long been popular, but spirit photographs offered something more: not just the memorialization of lost loved ones but confirmation of life after death.

In the years following the Civil War, when around three-quarters of a million dead soldiers haunted the country, spirit photographs were in high demand. After Spiritualism migrated to Europe, its prominence there tracked loosely to war, too, with a spike following the First World War. Mumler alone took dozens of spirit photographs, in which deceased friends or relatives appeared behind or beside their living loved ones. Other photographers focussed on capturing active séances, table-turnings, acts of levitation, and even ectoplasm—spiritual substances that mediums “exteriorized” from their own bodies, often their mouths, noses, or ears, but sometimes their stomachs or vaginas. Such substances could be clear or dark, pasty or gauzy, shapeless or in the form of appendages or faces.

Technological explanations for the rise of Spiritualism often cite the development of photography, which at the time was an inherently spooky medium, in that it could show things that were not actually there. Although it can be hard to remember in the age of deep fakes, photography was initially thought of not as a manipulable art but as a mirrorlike representation of reality, which made its role in Spiritualism seem probative. 

Other technologies similarly seemed to bridge such unfathomable gaps that the one between this world and the next appeared certain to collapse as well. The telegraph, for instance, offered access to voices from the beyond; how far beyond was anyone’s guess. The very word for those who could talk with spirits reflected all the new “mediums” through which information could be transmitted; spirit photographs were marketed alongside spirit telegraphs, spirit fingerprints, and spirit typewriters. Inventors such as Nikola Tesla and Thomas Edison even tinkered with uncanny radios and spirit telephones, inspired by some of the disembodied voices of their own experiments and curious about the supernatural implications of electromagnetism and other universal energies.

Still, like the appeal to mortality rates, this account of the rise of Spiritualism goes only so far. For one thing, no notable uptick in spiritualist beliefs accompanied earlier technological upheavals, including the entire Industrial Revolution, even though it altered our sense of time and set all kinds of things spinning and moving in previously unimaginable ways. For another, some of the most popular Spiritualist technologies were some of the oldest: the Ouija board was simply a branded, pencil-less version of the planchette, and forms of planchette writing had been around for centuries.

The use of technology to document spiritual phenomena was of interest not only to believers but also to skeptics, who pored over images looking for cheesecloth passing as ectoplasm, overexposures masquerading as ghostly apparitions, and wires or pulleys that could account for rappings and table-turnings. 

In one of the most publicized attempts to test the claims of Spiritualists, Scientific American offered five thousand dollars in prize money to anyone who could produce psychic phenomena sufficient to convince a committee that consisted of academics from Harvard and the Massachusetts Institute of Technology, psychic experts, and also Harry Houdini, who knew something about illusions and developed a sideline in exposing those which hucksters were trying to pass off as real. Armed with electroscopes and galvanometers, the committee tested all mediums who presented themselves for scrutiny, sometimes attending multiple séances before rendering a verdict.

Houdini’s debunking of one famous medium, Mina Crandon, is thoroughly recounted in David Jaher’s “The Witch of Lime Street: Séance, Seduction, and Houdini in the Spirit World” (Crown). Crandon was married to a prominent surgeon and attracted Boston’s élite to her performances, channeling her dead brother’s voice and even revealing his fingerprints from beyond the grave, while also levitating tables and producing ectoplasm from her mouth and from between her legs, often while naked. (The backlash against Spiritualism, which came partly from the clergy, stemmed not only from its challenge to orthodox ideas about Heaven and Hell but also from its scandalous exhibitionism.) 

Crandon’s case divided the Scientific American committee, with some members accusing others of having been sexually coerced into validating her fraud and even conspiring with her. Houdini had already exposed the deceptions of other mediums in his book “A Magician Among the Spirits,” and he never relented in his effort to discredit Crandon, publishing an entire pamphlet detailing her tricks, and going so far as to incorporate some of them into his own stage act in order to demonstrate their fraudulence.

Houdini prevented Crandon from winning the Scientific American prize, but her fame only grew, and her case later splintered another group of researchers. The American Society for Psychical Research, founded in 1885, a few years after its British equivalent, was devoted to the investigation of spiritual phenomena, which the society considered as worthy of careful study as fossils or electricity. In “Ghost Hunters: William James and the Search for Scientific Proof of Life After Death” (Penguin), Deborah Blum records the society’s investigations into everything from haunted houses to hypnotism. For the most part, those investigations only ever succeeded in disproving the phenomena they studied, but it was James, a founding member, who best articulated why they nonetheless continued their work. “If you wish to upset the law that all crows are black,” he said, “you mustn’t seek to show that no crows are; it is enough if you prove one single crow to be white.”

"My own white crow,” James announced in that same address to the Society for Psychical Research, “is Mrs. Piper.” He was referring to Leonora Piper, a Boston housewife turned trance medium who withstood years of testing and observation, her fees rising twenty-fold in the meantime and her fame extending all the way to England, where she went on tour. On one occasion, Piper impressed the James family by making contact with an aunt of theirs. Asked about the elderly woman’s health, the medium informed them that the woman had died earlier that day. “Why Aunt Kate’s here,” Piper said. “All around me I hear voices saying, ‘Aunt Kate has come.’ ” The Jameses received a telegram a few hours later confirming Aunt Kate’s death the night before.

Unlike Crandon, Piper was not fully discredited, though many people doubted her abilities, noting her failed readings and prophecies and offering convincing psychological explanations of those predictions and telepathic readings which seemed accurate. Her feats as a medium were not particular to the James family; in the course of her career, she claimed to channel, among others, Martin Luther and George Washington. As such efforts suggest, the allure of Spiritualism was not limited to consolation for the bereft: plenty of mediums worked as much in the tradition of the carnival barker as in that of the cleric, and Spiritualism was popular in part because it was entertaining. Its practitioners, some of them true connoisseurs of spectacle, promised not only reassurances about the well-being of the dearly departed but also new lines from Shakespeare and fresh wisdom from Plato.

Even more strikingly, from the perspective of the present day, early mediums offered encounters with the culturally dispossessed as well as with the culturally heralded. Piper, for instance, claimed to channel not only Washington and Luther but also a young Native American girl named Chlorine. And she was not alone in allegedly relaying the posthumous testimony of marginalized people. Enslaved African-Americans and displaced Native Americans were routinely channelled by mediums in New England and around the country. 

Whether race persisted in the afterlife was a matter of some dispute, but racially stereotyped and ethnically caricatured “spirit guides” were common, conjured with exaggerated dialects for audiences at séances and captured in sensational costumes by spirit photography. Flora Wellman, the mother of the novelist Jack London, claimed to channel a Native American chief called Plume; the Boston medium Mrs. J. H. Conant became associated with a young Piegan Blackfoot girl she called Vashti. Mediums with abolitionist sympathies passed on the stories of tortured slaves, while pro-slavery Spiritualists delivered messages of forgiveness from the same population and relayed visions of an afterlife where racial hierarchies were preserved.

For white mediums, communicating with spirits of other races could be a form of expiation, a way to confront violent histories and make cultural amends—or merely crude appropriation, garish performance art that was good for business. But Spiritualism was not only a white phenomenon. There were plenty of Black Spiritualists—including Sojourner Truth, who lived for a decade in the Spiritualist utopia of Harmonia before settling in Battle Creek, Michigan—and many Black mediums, including Paschal Beverly Randolph and Rebecca Cox Jackson, both of whom wrote books that included their work with spirits. Harriet E. Wilson, one of the first Black authors to publish a novel in the United States, later became a Spiritualist healer who was known, like some of her white counterparts, for summoning indigenous spirits, and who was described, in one of Boston’s Spiritualist newspapers, as “the eloquent and earnest colored trance medium.”

The lines between syncretism and appropriation were often fuzzy. If the initial Victorian wave of Spiritualism had a distinctly American character, later iterations took on global influences, as when the theosophists incorporated elements of Eastern religions, including belief in reincarnation and past lives. Immigration and translation brought sacred literatures into renewed contact with one another—the Bardo Thodol handed to readers of the Zohar, the Vedas and the Upanishads circulating alongside Julian of Norwich and Meister Eckhart. Occult practices melded with culturally blurry techniques of meditating and altering consciousness, and the roots of the esotericism that would eventually be known as New Age took hold.

In its Victorian incarnation, Spiritualism had provided ways for female mediums to lead and to profit. The medium Annie Denton Cridge became a newspaper publisher and wrote one of the earliest feminist utopian novels, wherein the narrator dreams first of a matriarchal government on Mars that oppresses men, and then that America has a female President; Victoria Woodhull, a clairvoyant turned suffragist, became, with her sister, one of the first women to start a brokerage firm on Wall Street and, later, the first to actually run for President of the United States; Emma Hardinge Britten, an opera-singing skeptic who set out to discredit the Spiritualists but ended up joining them, became one of the country’s most popular public speakers and helped Abraham Lincoln win reëlection.

But they and other Spiritualists faced a cultural backlash almost immediately. The religion scholar Ann Braude’s groundbreaking “Radical Spirits” (Beacon) situates spiritual work as social and political activism, since it gave women the opportunity to speak in public, and as a foundation of the women’s-rights movement, since it demonstrated the equality of the sexes. Such a framing helps explain why Spiritualism became so ridiculed, and why its opponents sought to discredit its female leaders most vigorously.

Not that those opponents needed a great deal of assistance. Much of the disillusionment came from the inside—including via the Fox sisters, the Hydesville girls credited with starting the Spiritualist craze. For years afterward, they entertained private gatherings and large public audiences in America and England. All the while, they endured examinations by physicians and gadflies, who strip-searched them, looking for bodily explanations or external assistance, and were attacked by mobs of Christians and secular skeptics alike, who threatened them with grenades and guns. 

Many people had tried to discredit them, but, in the end, they discredited themselves: in 1888, Maggie Fox, fulfilling the wishes of the late Arctic explorer Elisha Kane, whom she had allegedly married in secret, declared that the whole thing had been a hoax.

With her sister Kate watching from the audience, Maggie, now in her fifties, appeared onstage at the Academy of Music, on Fourteenth Street, put on a pair of glasses, and read from a prepared statement confessing “the greatest sorrow of my life”: namely, that she and her sister had collaborated in “perpetrating the fraud of Spiritualism upon a too confiding public.” After her reading ended, three doctors came to the stage and waited for her to begin cracking her big toe; each doctor then confirmed that the rappings were coming from the clicking of her joints, which grew louder and louder until finally she shouted, “Spiritualism is a fraud from beginning to end!”

But, the very next year, Fox recanted her recanting, leaving both sides to claim and reject the testimony of the sisters as they saw fit, a contest that was still unresolved when, a few years later, both sisters died poor.

Helped along by such scandals and the passage of time, Spiritualism eventually moved to the fringes. It became a kind of curiosity, a Victorian fad encountered chiefly in the biographies of artists such as Elizabeth Barrett Browning, who dabbled in mesmerism; in the footnotes to the modernist poetry of T. S. Eliot and W. B. Yeats, with their invocations of astrology, sorcery, and Madame Blavatsky; in museum exhibits of the mystical paintings of Hilma af Klint; in horror films like “Ouija” and “Things Heard & Seen.” 

Spiritualism is most often invoked only to be discredited, and cynical accounts routinely sneer at the sincerity or impugn the sanity of individual believers, unwilling or unable to imagine the appeal of a movement that dominated several decades of religious life both here and abroad.

But, if today’s Spiritualists have much in common with the Victorians, they also have something in common with the ancient Romans, who celebrated the festival of Lemuria by making food offerings to their restless dead, and with the Israelite King Saul, who consulted a medium in the Canaanite city of Endor. 

Arthur Conan Doyle’s long view may well be the right one, for, as he wrote, there is “no time in the recorded history of the world when we do not find traces of preternatural interference and a tardy recognition of them from humanity.” The dread of mortality has always inspired the dream of immortality, and the hopes that animated Victorian Spiritualism are eternal: to bridge the divide between ourselves and those we have lost, to know that they are safe and content, and to believe that they are thinking of us just as much as we are thinking of them. ~


The discussion in the blog about spiritualism brought forth a kind of shadowy memory for me of my grandmother and her siblings (my great-uncles and great-aunts) gathered in the dining room of the family-owned farm, sitting around the table. They would rub their hands briskly together, then all put their hands flat down on the table. Questions were asked, and the table would jump and thump. It was dim and mysterious and a little frightening. The kids would all be napping in the living room while this went on. Such a weird memory!!
I got to play with the Ouija board not long after coming to the U.S. While it was obvious to me that we, the questioners, already knew the answers, it was still odd to see how the disk seemed to move as if of its own volition.
Yes, spiritualism has a strange appeal, even to those of us who profess a scientific viewpoint. Still, I stand by the statement that we already know the answers, or at least can imagine them.
This reminds me somewhat of dreams in which I speak in a foreign language, which in my case is a language I don't know very well: German or Spanish or Italian. Sometimes I ask the other person, often a teacher, "How do you say such-and-such?" And if I truly didn't know the right word, the dream refused to give it to me. The person I hoped would teach me something new would simply ignore my question. 
Now, I am quite impressed by the theory that dreams are a way for the brain to perform therapy on itself. In the case of one of my recurring dreams, this seemed the perfect explanation: the dreams became less and less frightening, the exit easy, the gate open, the guard dozing. Some dreams, however, remained simply strange. Their memory mostly dissolved, since it wasn't useful. 
And unlike Jung or Freud, I don't think we should spend much time introspecting. The outer world is ultimately richer than the inner. At the very least, it's more challenging and unpredictable. 


“I don’t find any difference between Islam and Islamic fundamentalism. I believe religion is the root, and from the root fundamentalism grows as a poisonous stem. If we remove fundamentalism and keep religion, then one day or another fundamentalism will grow again. I need to say that because some liberals always defend Islam and blame fundamentalists for creating problems.

But Islam itself oppresses women. Islam itself doesn’t permit democracy and it violates human rights. And because Islam itself is causing injustices, so it is our duty to make people alert. It is our responsibility to wake people up, to make them understand that religious scriptures come from a particular period of time and a particular place.” ~ Taslima Nasrin, a Bangladesh-born Bengali physician and writer, the author of two best-selling novels, “Shame” and “French Lover,” poetry, memoirs, and non-fiction.

I think Taslima Nasrin has an excellent point. As long as religion exists, fundamentalism may spring from it at any time, trying to go back many centuries or even millennia (all the way to the Bronze Age if need be) to the archaic mentality that invented a particular religion.

Thousands, and even “mere” hundreds of years ago human understanding of the universe was extremely limited. Natural disasters were assumed to be “sent” by angry deities, unappeased by sufficient animal or human sacrifice. The air was crowded with demons. Society was extremely hierarchical, usually ruled by a king or emperor with absolute power, demanding absolute obedience. Democracy or human rights were not even on the horizon.

Of course all religions and all prophets are false, though some gems of wisdom can — and should — be salvaged from religious traditions. Above all, we need to be aware that all religions are a human invention, reflecting the culture of the time and place of origin. A full understanding of this fact — that religion is not an absolute truth, but a flawed human creation — should take away religion’s power to sprout malignant fundamentalism.

Again, it is important to remember that Abrahamic religions developed during an era of kings, emperors, chieftains, and warlords — that is, by any other name, absolute dictators. These dictators were often vengeful and bloodthirsty, traits regarded as masculine virtues. The Abrahamic deity was modeled after these absolute dictators — hence titles such as the “king of kings.”



Here is Mary, Jesus, and St. John “the Beloved Disciple” — in Cologne Cathedral, Germany.

 About St. John the Beloved Disciple:

He was one of the Twelve Apostles of Jesus according to the New Testament. Generally listed as the youngest apostle, he was the son of Zebedee and Salome. His brother was James, who was another of the Twelve Apostles. The Church Fathers identify him as John the Evangelist, John of Patmos, John the Elder and the Beloved Disciple, and testify that he outlived the remaining apostles and that he was the only one to die of natural causes. The traditions of most Christian denominations have held that John the Apostle is the author of several books of the New Testament. John was Jesus' favorite disciple.

Jesus sent only John and Peter into the city to make the preparation for the final Passover meal (the Last Supper). At the meal itself, the "disciple whom Jesus loved" sat next to Jesus. It was customary to recline on couches at meals, and this disciple leaned on Jesus. Tradition identifies this disciple as Saint John. After the arrest of Jesus, Peter and the "other disciple" (according to tradition, John) followed him into the palace of the high-priest.

John alone, among the Apostles, remained near Jesus at the foot of the cross on Calvary alongside myrrh bearers and numerous other women; following the instruction of Jesus from the Cross, John took Mary, the mother of Jesus, into his care as the last legacy of Jesus. After Jesus' Ascension and the descent of the Holy Spirit at Pentecost, John, together with Peter, took a prominent part in the founding and guidance of the church. (~ Wikipedia)


Hillcrest, the gay district in San Diego, has a Cathedral of St John the Beloved. It's tiny as cathedrals go, but symbolically it says a lot.

Giampietrino's copy of The Last Supper by Leonardo. St. John the Beloved Disciple is young and feminine in appearance. 




Mozzarella is a soft, white cheese with high moisture content. It originated in Italy and is usually made from Italian buffalo or cow’s milk.

Mozzarella is lower in sodium and calories than most other cheeses.

Mozzarella also contains bacteria that act as probiotics, including strains of Lactobacillus casei and Lactobacillus fermentum.

Both animal and human studies show that these probiotics may improve gut health, promote immunity, and fight inflammation in your body.

One study in 1,072 older adults found that drinking 7 ounces (200 ml) per day of fermented dairy containing Lactobacillus fermentum for 3 months significantly reduced the duration of respiratory infections, compared to not consuming the drink.

Therefore, dairy products like mozzarella that contain this probiotic may strengthen your immune system and help fight infections. However, more research is needed.


Feta is a soft, salty, white cheese originally from Greece. It’s typically made from sheep’s or goat’s milk. Sheep’s milk gives feta a tangy and sharp taste, while goat’s feta is milder. 

Since feta is packaged in brine to preserve freshness, it can be high in sodium. However, it is typically lower in calories than most other cheeses. 

Feta, like all full-fat dairy, provides conjugated linoleic acid (CLA), which is associated with reduced body fat and improved body composition.

One study in 40 overweight adults found that taking 3.2 grams per day of a CLA supplement for 6 months significantly decreased body fat and prevented holiday weight gain, compared to a placebo.

Thus, eating CLA-containing foods like feta may help improve body composition. In fact, feta and other cheeses made from sheep’s milk typically have more CLA than other cheeses.


Cottage cheese is a soft, white cheese made from the loose curds of cow’s milk. It’s thought to have originated in the United States.

Cottage cheese is much higher in protein than other cheeses.

Several studies indicate that eating high-protein foods like cottage cheese can increase feelings of fullness and help decrease overall calorie intake, which in turn may lead to weight loss.

A study in 30 healthy adults found that cottage cheese was just as filling as an omelet with a similar nutrient composition.


Ricotta is an Italian cheese made from the watery parts of cow, goat, sheep, or Italian water buffalo milk that are left over from making other cheeses. Ricotta has a creamy texture and is often described as a lighter version of cottage cheese.

The protein in ricotta cheese is mostly whey, a milk protein that contains all of the essential amino acids that humans need to obtain from food.

Whey is easily absorbed and may promote muscle growth, help lower blood pressure, and reduce high cholesterol levels.

One study in 70 overweight adults found that taking 54 grams of whey protein per day for 12 weeks lowered systolic blood pressure by 4% compared to baseline levels. However, this study focused on whey supplements rather than whey from dairy foods.


Cheddar is a widely popular semi-hard cheese from England. 

Made from cow’s milk that has been matured for several months, it can be white, off-white, or yellow. The taste of cheddar depends on the variety, ranging from mild to extra sharp.

In addition to being rich in protein and calcium, cheddar is a good source of vitamin K — especially vitamin K2.

Vitamin K is important for heart and bone health. It prevents calcium from being deposited in the walls of your arteries and veins.

Inadequate vitamin K levels can cause calcium buildup, inhibiting blood flow and leading to an increased risk of blockages and heart disease.

To prevent calcium deposits, it’s important to get enough vitamin K from foods. As K2 from animal foods is better absorbed than K1 found in plants, K2 may be especially important for preventing heart disease.

In fact, one study in over 16,000 adult women linked higher vitamin K2 intake to a lower risk of developing heart disease over 8 years.


As the name suggests, Swiss cheese originated in Switzerland. This semi-hard cheese is normally made from cow’s milk and features a mild, nutty taste.

Its signature holes are formed by bacteria that release gases during the fermentation process.

Since it is lower in sodium and fat than most other cheeses, Swiss cheese is often recommended for anyone who needs to monitor their salt or fat intake, such as people with high blood pressure.

What’s more, research shows that Swiss cheese hosts various compounds that inhibit angiotensin-converting enzyme (ACE).

ACE narrows blood vessels and raises blood pressure in your body — so compounds that stifle it may help lower blood pressure.

from another source

Gouda: Vitamin K2 powerhouse

Why K2 is essential to good health:

Cancer protective: K2 has been shown to reduce the risk of prostate cancer by 35 percent. K2 protects against leukemia and might even be used as a treatment for it. It has been shown to stop the growth and invasion of human hepatocellular carcinoma (liver cancer). K2 has also been shown to suppress the growth of lung and bladder cancers. 

Heart health: K2 protects us from heart disease by reducing calcium deposits in the arteries (some studies have even shown it can reverse arterial calcification). K2 basically takes the extra calcium in the blood and deposits it into our bones, where it should be.

Bone health: K2 helps form strong bones by promoting calcium deposition into the bones, and maintains bone mineralization by limiting the formation of osteoclasts (the cells that break down bone).

Skin health: K2 is associated with prevention of wrinkles, skin sagging, and varicose veins. K2 prevents calcification of the skin’s elastin, thus smoothing our lines and wrinkles. K2 is also necessary for vitamin A to do its job of maintaining proper skin cell proliferation.

Oral health: Weston A. Price talks extensively about the role of K2 and tooth health. K2 helps keep teeth cavity resistant by helping dentin produce osteocalcin, which deposits calcium into the enamel. Saliva has the second highest concentration of Vitamin K2 in the body.

Brain health: K2 promotes healthy brain function and is currently being studied for its role in the prevention and treatment of dementia.

Gouda cheese is higher in Vitamin K2 than liver, grass-fed butter, and even pastured egg yolks.


Hard cheeses in general are much higher in Vitamin K2 than soft cheeses. However, the best food source of Vitamin K2 is natto: soybeans fermented with Bacillus subtilis. 


Cottage cheese contains potassium, which acts as a fluid-balancing element in the body and is an important component in the neural activity of muscle and brain. It also relieves muscle cramps. Regular intake of potassium lowers the risk of stroke, since it lowers blood pressure and the contraction of blood vessels. It is also helpful in decreasing stress and anxiety. Potassium, like sodium, acts as an electrolyte, but without the side effects of sodium, such as increased blood pressure and cardiovascular stress.

Zinc found in cottage cheese is about 4% of the daily recommended value. In the human body, it is found in the brain, muscles, bones, kidneys, liver, prostate, and eyes. It helps in the metabolism of DNA and RNA. Zinc is one of the trace elements whose presence in our body helps in improving the immune system and digestion. It is also useful for relieving stress and anxiety, curing night blindness, improving ocular health, preventing appetite loss and prostate disorder, and fighting various infections. Moreover, it acts as an antioxidant too.

Cottage cheese contains phosphorus, which plays a major role in the formation of DNA and RNA. It is a major component in forming bones along with calcium. Phosphates also help in digestion, excretion, and in the production and extraction of energy in the cells.

Selenium is a trace element found in cottage cheese. It is required in very small quantities, not more than 50 to 70 mcg daily for adults. Selenium is useful as an antioxidant that protects cells and DNA from damage. According to a Harvard Health report, it is also believed that an optimum intake of selenium-rich foods reduces the risk of prostate cancer. A study conducted by a team of Harvard researchers found that a group of men who continued to receive selenium experienced a 49% lower risk of prostate cancer through a follow-up period that averaged 7.6 years.


People are not surprised to learn that yogurt has antioxidant properties, but tend to be surprised to learn that cheese (of any type) is likewise a fermented product, and has antioxidant benefits.

~ Cheese has higher antioxidant potential than other dairy products probably because of its higher protein content and the fermentation process: during ripening of cheese, breakdown of proteins to antioxidant peptides and microbial activity appear to increase the antioxidant content further.

Whole milk has higher antioxidant potential than reduced-fat milk because it contains more fat-soluble antioxidants such as CLA.

Other antioxidants contained in dairy products include coenzyme Q10, lactoferrin, carotenoids, and some minerals and trace elements.

Concerned with macular degeneration? Then you should know about this study:

~ Over the 15 years, decreased consumption of reduced-fat dairy foods was associated with an increased risk of incident late AMD, comparing the lowest to the highest quintile of intake (OR 3.10). Decreasing total dairy intake over the 15 years was also associated with an increased risk of developing incident late AMD. Additional cohort studies are needed to confirm these findings. ~


Of course other antioxidant foods are also recommended, e.g. dark-meat fish and leafy vegetables.


~ The French paradox - high saturated fat consumption but low incidence of cardiovascular disease (CVD) and mortality - is still unresolved and continues to be a matter of debate and controversy. Recently, it was hypothesized that the high consumption of dairy products, and especially cheese by the French population might contribute to the explanation of the French paradox, in addition to the "(red) wine" hypothesis. Most notably this would involve milk bioactive peptides and biomolecules from cheese moulds. Here, we support the "dairy products" hypothesis further by proposing the "alkaline phosphatase" hypothesis. First, intestinal alkaline phosphatase (IAP), a potent endogenous anti-inflammatory enzyme, is directly stimulated by various components of milk (e.g. casein, calcium, lactose and even fat). This enzyme dephosphorylates and thus detoxifies pro-inflammatory microbial components like lipopolysaccharide, making them unable to trigger inflammatory responses and generate chronic low-grade inflammation leading to insulin resistance, glucose intolerance, type-2 diabetes, metabolic syndrome and obesity, known risk factors for CVD. Various vitamins present in high amounts in dairy products (e.g. vitamins A and D; methyl-donors: folate and vitamin B12), and also fermentation products such as butyrate and propionate found e.g. in cheese, all stimulate intestinal alkaline phosphatase. Second, moulded cheeses like Roquefort contain fungi producing an alkaline phosphatase. Third, milk itself contains a tissue nonspecific isoform of alkaline phosphatase that may function as IAP. Milk alkaline phosphatase is present in raw milk and dairy products increasingly consumed in France. It is deactivated by pasteurization but it can partially reactivate after thermal treatment. ~

ending on beauty:

When Ikkyu started
His journey as a boy
He thought the journey
Would be the adventure

Only now sitting
Old and broken
In the dark forest
He realizes the adventure
Was behind him
At home with mother
And father

~ John Guzlowski