Saturday, May 25, 2019

THE WESTERN AND THE NOIR: HOW MOVIES SHAPED AMERICA; WHY GENERALS THOUGHT WW1 WOULD BE OVER IN A FEW MONTHS; KAFKA: GHOSTS; “YOU CAN’T TAKE IT WITH YOU”; WORLD WITHOUT JOBS?

Cat in Hagia Sophia; photo: Eugene Halpern

*
Passing in front of a small shop that sold
cheap and flimsy merchandise for workers,
he saw a face inside, a figure
that compelled him to go in, and he pretended
he wanted to look at some colored handkerchiefs.

He asked about the quality of the handkerchiefs
and how much they cost, his voice choking,
almost silenced by desire.
And the answers came back in the same mood,
distracted, the voice hushed,
offering hidden consent.

They kept on talking about the merchandise —
but the only purpose: that their hands might touch
over the handkerchiefs, that their faces, their lips
might move close together as though by chance —
a moment’s meeting of limb against limb.

Quickly, secretly, so the shop owner sitting at the back
wouldn’t realize what was going on.

~ Cavafy, He Asked about the Quality

Cavafy has a wonderful simplicity and concreteness. No explication is needed — the details say it all.

Thinking of the rotten sex scenes in recent movies — this poem has more erotic tension than all those ridiculous movie quickies combined.

"The rainbow Black Madonna"; Elżbieta Podleska (who faces two years in prison for "offending religious feelings")

“Poetry can repair no loss, but it defies the space which separates. And it does this by its continual labor of reassembling what has been scattered.” ~ John Berger
 
Oriana:


Yes, poetry makes us more aware of what we have in common as human beings. It makes us feel less separate. For instance, one need not be gay to feel sympathy toward the two gay men in Cavafy’s poem — two strangers who steal a bit of closeness under the pretense of talking about business. Straight couples also use similar ploys, at least now and then. Eros, which means “yearning,” has its own privacy, or even secrecy. Moments such as “almost touching” can later become precious memories — even sacred memories. The poem reminded me of that universality.
 
*
KAFKA: GHOSTS

“Obviously you’ve never spoken to a ghost. One never gets straight information from them. It’s just a hither and thither. These ghosts seem to be more dubious about their existence than we are, and no wonder, considering how frail they are.” ~ Kafka, Unhappiness

Please pardon my repeating the “double image” I used not long ago. I think the main ghosts in our lives are our younger selves. Sometimes I am so surprised at my younger self that it's difficult to believe that really was me. My evidence is poems, but poems are unreliable narrators — very selective.

 
*

YOU CAN’T TAKE IT WITH YOU
 

Recently I watched “You Can’t Take It with You” (1938) on video. Alas, it doesn’t have a fraction of the depth and imaginative brilliance of “It’s a Wonderful Life.” It’s still a very watchable movie with the kind of message we love, and it’s quite relevant today re: corporations versus the working people, and yes, the jail and courtroom scenes were pure joy, and the harmonica duet was a flash of genius, but . . . the eccentricity of the Sycamore family was overdone and repetitious, a few characters could be cut out and no one would miss them, etc.

I almost stopped watching 10 minutes into it, then 15, and so on — and yet, and yet . . . Finally the movie drew me in, and if “It’s a Wonderful Life” hadn’t set the bar so high, I’d probably give it a higher appraisal.

My favorite character was Mrs. Kirby. Yes, a stereotype, but the acting (Mary Forbes, who started out as a British stage actress) was such perfection that I enjoyed every minute she was on the screen. With the others, sometimes I felt like screaming, “Enough already!” — with her, I was left hungry for more.

Comedy doesn’t age well. It’s a genre that becomes quickly dated — all the more quickly when culture changes at the amazing speed we are witnessing now. What was perhaps funny back in 1938 is rarely funny now. And yet certain things remain funny: the slapstick comedy of Laurel and Hardy has a universal quality about it, as does the best of Charlie Chaplin or Groucho Marx. And perhaps it’s just me, but comedy based on the affectations of the upper class, especially the British upper class, also has that timeless quality.

Perhaps you have to be British to play someone of the upper class. Only the British actors seem to hit the perfect note every time. They somehow make the stereotype even more stereotypical and still incredibly funny down to a twitch of an eyebrow or the mouth opening just a bit in a mute scream. It's magic. It's art.

 Mary Forbes as the immaculate Mrs. Kirby

*
LOOKING AT FRACTALS REDUCES STRESS

~ “We’re finding that aesthetic images can induce staggering changes to the body, including radical reductions in the observer’s stress levels. Researchers are untangling just what makes particular works of art or natural scenes visually appealing and stress-relieving – and one crucial factor is the presence of the repetitive patterns called fractals.

My scientific curiosity was stirred when I learned that many of nature’s objects are fractal, featuring patterns that repeat at increasingly fine magnifications. For example, think of a tree. First you see the big branches growing out of the trunk. Then you see smaller versions growing out of each big branch. As you keep zooming in, finer and finer branches appear, all the way down to the smallest twigs. Other examples of nature’s fractals include clouds, rivers, coastlines and mountains.
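[A side note from me, not part of the quoted article: the tree description is easy to make concrete. Below is a minimal Python sketch in which each branch spawns two shorter copies of itself, so the same shape recurs at every scale; the angles, lengths and names are all just illustrative choices.]

```python
# A minimal sketch of the "tree" idea: a branching pattern that repeats
# itself at smaller and smaller scales. Each branch spawns two shorter
# branches, so the same shape recurs at every zoom level.

import math

def fractal_tree(x, y, angle, length, depth, segments):
    """Recursively collect the line segments of a simple binary tree."""
    if depth == 0 or length < 1e-3:
        return
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))
    # Two smaller copies of the same structure: self-similarity.
    fractal_tree(x2, y2, angle + 0.4, length * 0.7, depth - 1, segments)
    fractal_tree(x2, y2, angle - 0.4, length * 0.7, depth - 1, segments)

segments = []
fractal_tree(0.0, 0.0, math.pi / 2, 1.0, depth=8, segments=segments)
print(len(segments))  # 2^8 - 1 = 255 branches: the count doubles at each level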

The impact of nature’s aesthetics is surprisingly powerful. In the 1980s, architects found that patients recovered more quickly from surgery when given hospital rooms with windows looking out on nature. Other studies since then have demonstrated that just looking at pictures of natural scenes can change the way a person’s autonomic nervous system responds to stress.

Through exposure to nature’s fractal scenery, people’s visual systems have adapted to efficiently process fractals with ease. We found that this adaptation occurs at many stages of the visual system, from the way our eyes move to which regions of the brain get activated. This fluency puts us in a comfort zone and so we enjoy looking at fractals. Crucially, we used EEG to record the brain’s electrical activity and skin conductance techniques to show that this aesthetic experience is accompanied by stress reduction of 60 percent – a surprisingly large effect for a nonmedicinal treatment. This physiological change even accelerates post-surgical recovery rates.

Artists intuit the appeal of fractals


It’s therefore not surprising to learn that, as visual experts, artists have been embedding fractal patterns in their works through the centuries and across many cultures. Fractals can be found, for example, in Roman, Egyptian, Aztec, Incan and Mayan works. My favorite examples of fractal art from more recent times include da Vinci’s Turbulence (1500), Hokusai’s Great Wave (1830), M.C. Escher’s Circle Series (1950s) and, of course, Pollock’s poured paintings.

How artists create their fractals fuels the nature-versus-nurture debate in art: To what extent is aesthetics determined by automatic unconscious mechanisms inherent in the artist’s biology, as opposed to their intellectual and cultural concerns? In Pollock’s case, his fractal aesthetics resulted from an intriguing mixture of both. His fractal patterns originated from his body motions (specifically an automatic process related to balance known to be fractal). But he spent 10 years consciously refining his pouring technique to increase the visual complexity of these fractal patterns.

Pollock’s abstract expressionist colleague, Willem de Kooning, also painted fractals. When he was diagnosed with dementia, some art scholars called for his retirement amid concerns that it would reduce the nurture component of his work. Yet, although they predicted a deterioration in his paintings, his later works conveyed a peacefulness missing from his earlier pieces. Recently, the fractal complexity of his paintings was shown to drop steadily as he slipped into dementia. The study focused on seven artists with different neurological conditions and highlighted the potential of using art works as a new tool for studying these diseases. To me, the most inspiring message is that, when fighting these diseases, artists can still create beautiful artworks.” ~

https://www.smithsonianmag.com/innovation/fractal-patterns-nature-and-art-are-aesthetically-pleasing-and-stress-reducing-180962738/?
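A note on the phrase “fractal complexity” used above for de Kooning’s late paintings: the usual way to quantify it is the box-counting dimension. Cover a black-and-white version of the image with boxes of shrinking size, count how many boxes contain paint at each size, and take the slope of log(count) against log(1/box size). The studies cited don’t publish their code, so the Python below is only a generic, hedged sketch; the function and array names are mine, and it assumes the painting has already been reduced to a binary array.

```python
# A generic illustration (not the cited study's code) of how "fractal
# complexity" is usually quantified: the box-counting dimension of a
# binary image (True where paint is present).

import numpy as np

def box_counting_dimension(image, box_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the fractal (box-counting) dimension of a 2D boolean array."""
    counts = []
    for s in box_sizes:
        h, w = image.shape
        # Trim so the image tiles evenly into s x s boxes.
        trimmed = image[:h - h % s, :w - w % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        occupied = blocks.any(axis=(1, 3)).sum()
        counts.append(occupied)
    # Dimension = slope of log(count) versus log(1 / box size).
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Toy check: a completely filled square should come out close to dimension 2.
canvas = np.ones((256, 256), dtype=bool)
print(round(box_counting_dimension(canvas), 2))
```

For a solidly filled canvas the estimate comes out near 2; sparse, self-similar patterns fall between 1 and 2. A steady drop in a number like this is what the de Kooning finding above refers to.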


Mary:

It is no wonder we find fractals soothing, and produce them ourselves in art, since they are everywhere in nature. And I think the basics — repetition, rhythm, variation, symmetry — are also grounded in the body: its shape, the rhythms of heartbeat and breathing, the movement of the body in space (walking, dancing), the use of language (talking, chanting, singing). All of these exist in heightened and deliberate form in our creations: music, visual and verbal art, architecture, dance, the shapes of the stories we tell, and everything we build, from gardens to machinery. It would be both surprising and rare to find exceptions . . . and if we did, I'm sure we would find these exceptions unappealing, out of sync with both the world and our experience.


*

“Narratives are fractals; they connect, repeat, and expand. Inevitably there is symmetry.” ~ Erika Swyler


 

*


Coma Berenices: Berenice's Hair. From Wiki: “Coma Berenices is one of the few constellations to owe its name to a historical figure, in this case Queen Berenice II of Egypt, wife of Ptolemy III Euergetes (fl. 246 BC–221 BC), the king under whom Alexandria became an important cultural center.

In 243 BC, during the Third Syrian War, Ptolemy undertook a dangerous expedition against the Seleucids, who had murdered his sister. His newlywed bride, Berenice, swore to the goddess Aphrodite to sacrifice her long, blonde hair, of which she was extremely proud, if her husband returned safely. He did, so she cut her hair and placed it in the goddess's temple. By the next morning the hair had disappeared. To appease the furious king the court astronomer, Conon, announced that the offering had so pleased the goddess that she had placed it in the sky. He indicated a cluster of stars that have since been called Berenice's Hair.”

Queen Berenice II of Egypt (also spelled “Berenike” — note the similarity to Veronika; the name means “bringer of victory” [Nike])

*

THE WESTERN AND THE NOIR: HOW MOVIES SHAPED AMERICA



~ “Every story needs a space in which to unfold, of course, but the Western does more; it is in love with space; it foregrounds it, full-screen, whenever it can. The start of the cattle drive in Red River (1948): in two minutes, we get a static background (drovers and herd, at dawn, motionless against the landscape), a panoramic so powerful—this is our cattle, this is our land—not even a legendary continuity blunder can spoil it, a confident sense of direction (“Take them to Missouri, Matt”), and an explosion of joy. Beginnings are particularly good at evoking the immensity of this space: in The Man of the West (1958), a horseman appears on the horizon, looks at the empty expanse around him, and rides calmly off; in The Virginian (1929) and My Darling Clementine, a herd of cows disperses slowly in every direction; in Red River, The Man from Laramie (1955), and Rio Bravo (1959), it’s wagons that advance cautiously this way and that.

Cautiously, slowly, calmly: the initial tempo of the Western: Lento assai. The first ten minutes of Once Upon a Time in the West (1968): three men at a station, a fly buzzing, a wheel screeching, a drop of water hitting the rim of a hat. In no other form does waiting—for the train, the attack, the night, the stage, the cavalry . . . —play such a large role: a dilated sense of time, mirroring the enlargement of space. The Big Trail, The Big Sky, The Big Country. Big, and empty: in film after film, the first to “set eyes” on the land is a white man, who sees nothing but an uninhabited country. Native Americans—“Indians,” as the Western calls them—were of course already living in the West (and everywhere else in America, for that matter); but by routinely introducing them only after we have already become familiar with white characters, the Western makes them look like illegitimate intruders. In reality, they were there first; in fiction, they arrive always too late. Seldom has narrative lied so spectacularly about the history it claimed to narrate.

*



“Cinema is the specifically epic art,” wrote André Bazin in a famous essay on American film, and “the migration to the West is our Odyssey.” Epic, yes; Odyssey, no. That there is no return is the founding act of the genre. Home is a vague hope, distant in space and in time; for now, all there is is a wagon; two or three generations, together, surrounded by hundreds of other families; all different, and all leading exactly the same life. Life in the open, on unsteadily undulating stoops, under everybody’s eyes; because what matters, in these films, is not the private sphere of the individual family—we never see the inside of a wagon, and the intimacy of a sentimental conversation, or of a good wash, are often met with rough collective humor—but the amalgamation of everybody into a community. Into a nation.

Dreaming . . . But this is more like an obsession. The march of the wagon train can never stop: a hasty prayer, and the dead are buried and left forever behind; a child is born, and hours later is already on the move. Everyday life is both implacably everyday—always brewing coffee, always mending socks and washing their only passable shirt—and frightfully unpredictable: a danger that comes less from human enemies (although the conflict with “Indians” is present in most films of migration), than from the hostility of nature: it’s always too hot, too cold, too dry, too windy . . . rain, dust, snow, mountains, rapids . . . So much friction, in these films: not a journey in which a wagon doesn’t get stuck in the mud; not a scene in which they go downhill, for a change.

Rarely do fictional characters work as hard as in early Westerns: keeping the animals together, cutting down trees, crossing rivers, digging passages, overcoming crazy obstacles. After all this, they deserve the West. They have been a stubborn, single-minded human herd; which is the reason Red River, with its supremely unpromising storyline (moving ten thousand cows from Texas to Missouri, imagine that), is the greatest of all epic Westerns. Those cattle are the settlers: and in the film’s terrifying stampede, caused by a man who wants to eat sugar in the middle of the night, the destructive potential of the great migration erupts for a moment, earthquake-like, into the open.

*

Shadows: THE NOIR


Though just as haunted by death and killing as the Western, the linear geometry of the duel is unthinkable in film noir. The Lady from Shanghai places Rita Hayworth and Welles face to face, looking straight into each other’s eyes; a few seconds, and a third person emerges from his words (“I thought it was your husband you wanted to kill”), to be immediately multiplied by hers (“George was supposed to take care of Arthur, but he lost his silly head and shot Broome”). They are alone—but they are not; someone else is always between them. A few more seconds, and “Arthur” (Hayworth’s husband, played by Everett Sloane) shows up in person. Now it is he and Hayworth who face each other, guns in their hands; but in the “Magic Mirror Maze” where the scene is set, optics are deceptive: in a particularly baroque moment, Hayworth is aiming straight at the audience, Sloane diagonally, in the same general direction, but also—reflected as he is from several different angles—seemingly at himself: “You’d be foolish to fire that gun. With these mirrors it’s difficult to tell. You are aiming at me, aren’t you? I’m aiming at you, lover.”

As they start firing, and glass shatters everywhere, it’s impossible to say what is happening to whom (at a certain point, it even looks as if Welles is the one being hit); and even after Hayworth and Sloane die, we are left with the baffling memory of a shootout that adds a third person to the usual two. (The unlikeliness of this situation is the secret behind The Man Who Shot Liberty Valance, 1962.) But in fact, triangulation is as essential to the structure of the noir as the binary logic was to the Western. It’s the triangle of adultery, of course, as indeed in The Lady from Shanghai, or in George Macready’s toast “to the three of us”—himself; his wife, Hayworth (always her); and her secret ex-lover, Glenn Ford—in Gilda (1946). But beyond adultery, what emerges here is the fundamental figure of the social universe of the film noir: the Third.


The adulterous triangle is merely the starting point for an incessant proliferation of corpses.

https://lithub.com/western-vs-noir-how-two-genres-shaped-postwar-american-culture/?fbclid=IwAR0CUkOIssH8rfgeCvI4NrQS3RDJEEh9wve9qbGDXuH_gf3bOUBpf7YaZtI




Oriana:

Both genres are marked with intense, unapologetic violence. One interesting difference is that the Western is rural, while the noir tends to be urban. In a Western, we typically know who the good guys are; in the noir, things get “complicated.”

It’s interesting that love plots don’t play a central part in either genre. The Western is overwhelmingly male; the cowboy is in love with his horse. Women are important in the noir, but these are not loving women. A woman who looks like an angel will likely turn out to be a betrayer, a femme fatale.

Both genres, however, are interested in the idea of justice. The happy ending is not really happy in the sense of cheerful, but rather it’s “justice”: the bad guys (and gals) either get killed or are arrested and will be duly punished soon. But even the hard-boiled private detective is not exactly the supernaturally brave sheriff; he is likely a hard drinker, a cynic about human nature, a single man who may know lust and infatuation, but has never experienced deep and lasting love.

*

WHY GENERALS THOUGHT THAT WWI WOULD BE OVER IN MONTHS

 
"We think of the First World War as having its causes in Europe, where the greatest bloodshed and destruction would take place. But several of the illusions that propelled the major powers so swiftly into war had their roots in far corners of the world.

The biggest illusion, of course, was that victory would be quick and easy. “You will be home,” Kaiser Wilhelm II of Germany told his troops, “before the leaves have fallen from the trees.” The German campaign plan called for knocking France out of the war in 42 days. The Allies were not quite so arrogant, but were confident of triumph in months, not years.

A second illusion of those who marched proudly into battle in 1914 was that they would be shooting at the enemy, but that he would not be shooting back, or at least not effectively. How else to explain that most soldiers on both sides had no metal helmets? And that millions of French infantrymen, as well as the Austro-Hungarian cavalry, wore combat uniforms of brilliant red and blue? As the war began, troops from both sides advanced over open ground en masse, as if they were not facing repeating rifles and machine guns: bayonet charges by the French, and ranks of young Germans walking, arms linked, toward astonished British soldiers. The British would make plenty of similar suicidal advances of their own in the years ahead.

Where were these illusions born? They came from the way generals cherry-picked previous wars to learn from. A close look at the siege of Petersburg, Va., in the American Civil War, for instance, would have provided a lesson in trench warfare — and a sense of what it meant to be under fire from an early ancestor of the machine gun, the Gatling gun. A similar foretaste of both trench warfare and the power of the machine gun could be had by studying the siege of Port Arthur (now Dalian, China) in the Russo-Japanese War of 1904-5.

But the men who led Europe into the First World War found it more comforting to look elsewhere — at battles where victory was swift and the enemy had little firepower. In 1914 Europe had not had a major war in more than 40 years and, except for the Russians, almost all officers who had actually seen combat had done so in lopsided colonial wars in Africa and Asia.

Erich von Falkenhayn, for example, chief of the German General Staff for the first two years of the war, had been in the international force that suppressed the anti-Western Boxer Rebellion in China in 1900. Another veteran of that campaign — and of military service in Indochina and Algeria — was Robert Nivelle, later the French commander on the Western Front and the leader of a 1917 offensive that left 120,000 French soldiers dead or wounded and sparked a mutiny. Joseph Joffre, Nivelle’s predecessor, had served in Indochina and Madagascar, and had led an expedition across the Sahara to conquer Timbuktu. Most of the British generals had served in the colonies; when war broke out, Britain had more troops on active duty in India alone than in the British Isles.

Colonial wars seldom lasted long because the German, French and British Armies had modern rifles, machine guns and small mobile artillery pieces, as well as steamboats and railroads that could move men and weapons as needed. The Africans and Asians usually had none of these things.

In 1898, for example, a whole panoply of British officers (including Winston Churchill) who would later fight in Europe were on hand for a battle at Omdurman, in Sudan. The 50,000 Sudanese they faced were armed only with spears, swords and antiquated rifles. In a few hours, the six Maxim machine guns of the far smaller Anglo-Egyptian force fired half a million bullets, leaving nearly 11,000 Sudanese dead and some 16,000 wounded, many fatally. The battle determined the outcome of a war in less than a day.

Yet another illusion on both sides in 1914 was that a key force would be the cavalry. After all, hadn’t cavalry service been a path to military glory for more than 2,000 years? At the Cavalry Club on London’s Piccadilly Circus and its counterparts in Paris, Berlin, St. Petersburg and Vienna, officers eagerly anticipated more of the same. The initial German invasions of France and Belgium, for example, included eight cavalry divisions with more than 40,000 horses — the largest such body ever sent into battle in Western Europe. Tens of thousands of the unfortunate animals were laboriously shipped to the front over great distances: to the Middle East from New Zealand, to Belgium from Canada, to France from India.

Faith in the cavalry also sprang from colonial wars. British horsemen made a charge at Omdurman and did so far more spectacularly a year and a half later in another colonial conflict, the Boer War. Masked by an immense cloud of dust kicked up by thousands of galloping horses, the British successfully charged, almost unscathed, through Boer forces besieging the town of Kimberley, in present-day South Africa. “An epoch in the history of cavalry,” declared the London Times history of that war. “A staggering success,” read a German General Staff report on the battle.

None of the many military observers in the Boer War seemed to notice that one simple defensive measure could have stopped the great charge at Kimberley dead: barbed wire. On the Western Front in 1914, that, along with the machine gun, would spell doom for the cavalry and for the other illusions as well."

http://www.nytimes.com/2014/07/29/opinion/adam-hochschild-why-world-war-i-was-such-a-blood-bath.html?action=click&pgtype=Homepage&version=Moth-Visible&module=inside-nyt-region&region=inside-nyt-region&WT.nav=inside-nyt-region&_r=1


Mary:

Amazing that the generals were so far off the mark in their assumptions and strategies — and yet it rings true that this had a credible base in colonial experiences of combat, where the sides were so incredibly unevenly matched. Cavalry and bright red uniforms against guns, explosives and poison gas, the replacement of drawn battle lines with trench warfare: hard-learned lessons with enormous cost in human lives.

By WWII horses had been replaced with armored tanks, and aerial bombs were crucial . . . taking greater and greater numbers of civilian lives and reducing cities to rubble — Dresden, then Hiroshima and Nagasaki. The definitions of battlefield and enemy kept shifting and blurring, not only with the increasing power and deadliness of weaponry, but with changes in the political environment — the struggle against colonialism and the rise of superpowers with their constant jockeying for place.

This lag in adaptation was again obvious in Vietnam.

Conventional warfare strategy fails when there is no "front," no "battleline," and no distinction between populace and military. Guerrilla warfare is fairly unwinnable by conventional means. Of course, in the learning curve for all those in command, the burden of loss is paid by the soldiers and the general populace.

The current wars continue, with sophisticated weaponry that can be aimed and triggered remotely, like playing a video game, and with ground troops embroiled in urban guerrilla warfare.

Oriana:

~ “As early as 1929, Lieutenant Colonel J. L. Schley of the Corps of Engineers wrote in The Military Engineer: “It has been said critically that there is a tendency in many armies to spend the peace time studying how to fight the last war,” a sentiment repeated by the Dallas Morning News in 1937: “There is a partly justified criticism that peacetime generals are always fighting the last war instead of the next one.”

Every war, whether physical or metaphorical, has produced its own backward-looking buffoonery disguised as institutional prudence. From the seizure of shampoo bottles at airports after 9/11, to the mistimed austerity that prematurely forced Europe into grinding economic depression, this classic, all-too-human and ubiquitous mistake is so common that it has spawned more cliches than your average cognitive foible. Whether we call it “closing the barn door after the horse has bolted” or something else, the desire to see the present and the future in terms of the recent past is often a form of collective delusion.” ~

https://www.huffpost.com/entry/stop-preparing-for-war_b_2490775

There has been some success in using drones against insurgencies — but it has become obvious that the current wars are unwinnable. They don’t address the causes that make young men so susceptible to radical propaganda. And fighting a war using the model of playing a video game is precisely the opposite of trying to address the human side.

I’ll never understand the lack of interest in trying to understand the other side’s mentality. But I guess it’s an aspect of the racism that’s on display here — those guys are subhuman, bombs are the only language they understand.

I do remember reading about one general who broke away from that mentality and successfully defused local hostility precisely by working with the locals — talking with them, imagine!

The drones don’t unnerve me as much as what I witness at least once a week as I drive past the Navy shipyards in San Diego. The extremely expensive ships being built perhaps made sense decades ago, but do they now? Or the super-super-expensive manned bombers — who are they supposed to attack? The Chinese, who have wisely constrained their military spending in favor of developing their economy and extending their global influence? It was amazing to watch a Chinese official lecture the US: “How about you invest in your infrastructure?”

Meanwhile in Syria the Russians are using low-flying planes that appear to be more successful at locating and blowing up a convoy than high-flying planes that rely on sophisticated equipment. This reminds me of NASA’s trying to develop a ballpoint pen that would work in zero gravity, while the Russians simply used pencils. To give a more serious example, in WW2 the Germans had their armies prepared to wage short, high-speed campaigns. At first it worked — then Hitler idiotically decided to invade Russia. The Russians did what they had done time and again before — kept retreating until vast distances and the brutal climate did their work.

All this is wasted on the generals. Give toys to the generals, and they’ll want to use them. And they always demand more and more toys. And as they do so, they are trying to look like futurists. By now they are of course aware of the proverb that accuses them of “fighting the last war,” so they try to predict the future war — usually without success.

The only victory would be eliminating war. Every war is a defeat for humanity. 


*


WORLD WITHOUT JOBS?

 
~ “Work is the master of the modern world. For most people, it is impossible to imagine society without it. It dominates and pervades everyday life – especially in Britain and the US – more completely than at any time in recent history. An obsession with employability runs through education. Tech companies persuade their employees that round-the-clock work is play. Gig economy companies claim that round-the-clock work is freedom. Workers commute further, strike less, retire later. Digital technology lets work invade leisure.

 
In all these mutually reinforcing ways, work increasingly forms our routines and psyches, and squeezes out other influences. As Joanna Biggs put it in her quietly disturbing 2015 book All Day Long: A Portrait of Britain at Work, “Work is … how we give our lives meaning when religion, party politics and community fall away.”

And yet work is not working, for ever more people, in ever more ways. We resist acknowledging these as more than isolated problems – such is work’s centrality to our belief systems – but the evidence of its failures is all around us.

As a source of social mobility and self-worth, work increasingly fails even the most educated people – supposedly the system’s winners. In 2017, half of recent UK graduates were officially classified as “working in a non-graduate role”. In the US, “belief in work is crumbling among people in their 20s and 30s”, says Benjamin Hunnicutt, a leading historian of work. “They are not looking to their job for satisfaction or social advancement.” (You can sense this every time a graduate with a faraway look makes you a latte.)

Work is increasingly precarious: more zero-hours or short-term contracts; more self-employed people with erratic incomes; more corporate “restructurings” for those still with actual jobs. As a source of sustainable consumer booms and mass home-ownership – for much of the 20th century, the main successes of mainstream western economic policy – work is discredited daily by our ongoing debt and housing crises. For many people, not just the very wealthy, work has become less important financially than inheriting money or owning a home.

Unsurprisingly, work is increasingly regarded as bad for your health: “Stress … an overwhelming ‘to-do’ list … [and] long hours sitting at a desk,” the Cass Business School professor Peter Fleming notes in his new book, The Death of Homo Economicus, are beginning to be seen by medical authorities as akin to smoking.

Work is badly distributed. People have too much, or too little, or both in the same month. And away from our unpredictable, all-consuming workplaces, vital human activities are increasingly neglected. Workers lack the time or energy to raise children attentively, or to look after elderly relations. “The crisis of work is also a crisis of home,” declared the social theorists Helen Hester and Nick Srnicek in a paper last year. This neglect will only get worse as the population grows and ages.

Like an empire that has expanded too far, work may be both more powerful and more vulnerable than ever before. We know work’s multiplying problems intimately, but it feels impossible to solve them all. Is it time to start thinking of an alternative?

In 1930, the economist John Maynard Keynes predicted that, by the early 21st century, advances in technology would lead to an “age of leisure and abundance”, in which people might work 15 hours a week. In 1980, as robots began to depopulate factories, the French social and economic theorist André Gorz declared: “The abolition of work is a process already underway … The manner in which [it] is to be managed … constitutes the central political issue of the coming decades.”

 
For some of the [“post-work”] writers, this future must include a universal basic income (UBI) – currently post-work’s most high-profile and controversial idea – paid by the state to every working-age person, so that they can survive when the great automation comes. For others, the debate about the affordability and morality of a UBI is a distraction from even bigger issues.

Post-work may be a rather grey and academic-sounding phrase, but it offers enormous, alluring promises: that life with much less work, or no work at all, would be calmer, more equal, more communal, more pleasurable, more thoughtful, more politically engaged, more fulfilled – in short, that much of human experience would be transformed.

One of post-work’s best arguments is that, contrary to conventional wisdom, the work ideology is neither natural nor very old. “Work as we know it is a recent construct,” says Hunnicutt. Like most historians, he identifies the main building blocks of our work culture as 16th-century Protestantism, which saw effortful labour as leading to a good afterlife; 19th-century industrial capitalism, which required disciplined workers and driven entrepreneurs; and the 20th-century desires for consumer goods and self-fulfillment.

The emergence of the modern work ethic from this chain of phenomena was “an accident of history,” Hunnicutt says. Before then, “All cultures thought of work as a means to an end, not an end in itself.” From urban ancient Greece to agrarian societies, work was either something to be outsourced to others – often slaves – or something to be done as quickly as possible so that the rest of life could happen.

By the end of the 70s, it was possible to believe that the relatively recent supremacy of work might be coming to an end in the more comfortable parts of the west. Instead, the work ideology was reimposed. During the 80s, the aggressively pro-business governments of Margaret Thatcher and Ronald Reagan strengthened the power of employers, and used welfare cuts and moralistic rhetoric to create a much harsher environment for people without jobs. David Graeber, who is an anarchist as well as an anthropologist, argues that these policies were motivated by a desire for social control. After the political turbulence of the 60s and 70s, he says, “Conservatives freaked out at the prospect of everyone becoming hippies and abandoning work. They thought: ‘What will become of the social order?’”

Hunnicutt, who has studied the ebb and flow of work in the west for almost 50 years, says Graeber has a point: “I do think there is a fear of freedom – a fear among the powerful that people might find something better to do than create profits for capitalism.”

The work culture has many more critics now. In the US, sharp recent books such as Private Government: How Employers Rule Our Lives (and Why We Don’t Talk About It) by the philosopher Elizabeth Anderson, and No More Work: Why Full Employment Is a Bad Idea by the historian James Livingston, have challenged the dictatorial powers and assumptions of modern employers; and also the deeply embedded American notion that the solution to any problem is working harder.

Post-work has the potential to appeal to conservatives. Some post-workists think work should not be abolished but redistributed, so that every adult labors for roughly the same satisfying but not exhausting number of hours. “We could say to people on the right: ‘You think work is good for people. So everyone should have this good thing,’” says James Smith, a post-workist whose day job is lecturing in 18th-century English literature at Royal Holloway, University of London. “Working less also ought to be attractive to conservatives who value the family.”

The post-workists argue that it is precisely their work-saturated lives – and their experience of the increasing precariousness of white-collar employment – that qualify them to demand a different world. Like many post-workists, [Will] Stronge has been employed for years on poorly paid, short-term academic contracts. “I’ve worked as a breakfast cook. I’ve been a Domino’s delivery driver,” he told me. “I once worked in an Indian restaurant while I was teaching. My students would come in to eat, and see me cooking, and say: ‘Hi, is that you, Will?’”

Defenders of the work culture such as business leaders and mainstream politicians habitually question whether pent-up modern workers have the ability to enjoy, or even survive, the open vistas of time and freedom that post-work thinkers envisage for them. In 1989, two University of Chicago psychologists, Judith LeFevre and Mihaly Csikszentmihalyi, conducted a famous experiment that seemed to support this view. They recruited 78 people with manual, clerical and managerial jobs at local companies, and gave them electronic pagers. For a week, at frequent but random intervals, at work and at home, these employees were contacted and asked to fill in questionnaires about what they were doing and how they were feeling.

The experiment found that people reported “many more positive feelings at work than in leisure”. At work, they were regularly in a state the psychologists called “flow” – “enjoying the moment” by using their knowledge and abilities to the full, while also “learning new skills and increasing self-esteem”. Away from work, “flow” rarely occurred. The employees mainly chose “to watch TV, try to sleep, [and] in general vegetate, even though they [did] not enjoy doing these things”. US workers, the psychologists concluded, had an “inability to organize [their] psychic energy in unstructured free time”.

*

A vision of state-supported but liberated and productive citizens owes a lot to Ivan Illich, the half-forgotten Austrian social critic who was a leftwing guru during the 70s. In his intoxicating 1973 book Tools for Conviviality, Illich attacked the “serfdom” created by industrial machinery, and demanded: “Give people tools that guarantee their right to work with high, independent efficiency … from power drills to mechanized pushcarts.” Illich wanted the public to rediscover what he saw as the freedom of the medieval artisan, while also embracing the latest technology.

The disappearance of the paid job could finally bring about one of the oldest goals of feminism: that housework and raising children are no longer accorded a lower status. With people having more time, and probably less money, private life could also become more communal, [Helen Hester] suggests, with families sharing kitchens, domestic appliances, and larger facilities. “There have been examples of this before,” she says, “like ‘Red Vienna’ in the early 20th century, when the [social democratic] city government built housing estates with communal laundries, workshops, and shared living spaces that were quite luxurious.” Post-work is about the future, but it is also bursting with the past’s lost possibilities.

Despite being a Tory MP from the most pro-business wing of his party, Nick Boles accepts in his book that a future society “may redefine work to include child-rearing and taking care of elderly relatives, and finally start valuing these contributions properly”. Post-work is spreading feminist ideas to new places.

In some ways, we’re already in a post-work society. But it’s a dystopic one. Office employees constantly interrupting their long days with online distractions; gig-economy workers whose labor plays no part in their sense of identity; and all the people in depressed, post-industrial places who have quietly given up trying to earn – the specter of post-work runs through the hard, shiny culture of modern work like hidden rust.

Creating a more benign post-work world will be more difficult now than it would have been in the 70s. In today’s lower-wage economy, suggesting people do less work for less pay is a hard sell. As with free-market capitalism in general, the worse work gets, the harder it is to imagine actually escaping it, so enormous are the steps required.

But for those who think work will just carry on as it is, there is a warning from history. On 1 May 1979, one of the greatest champions of the modern work culture, Margaret Thatcher, made her final campaign speech before being elected prime minister. She reflected on the nature of change in politics and society. “The heresies of one period,” she said, always become “the orthodoxies of the next”. The end of work as we know it will seem unthinkable – until it has happened.” ~

https://getpocket.com/explore/item/post-work-the-radical-idea-of-a-world-without-jobs



John Guzlowski:

What would people do without jobs? I read somewhere recently that many people who are out of work and looking for a job spend about 50 hours a week watching TV. Work time has been replaced by TV time.


Oriana:

It is a problem, and the article does raise the issue that "post-work" is great for intellectuals and creatives, but leaves many others without a meaningful alternative. I was struck that the majority of people report being happier at work than at home. And some working mothers say, "Compared to home, the office is a piece of cake!" It's clean, structured, there's companionship . . .

John:

My mom always said that. Even when she worked on assembly lines.

 
Oriana:

Thanks for sharing this. Also, a sense of being useful is tremendously important to people. "I hope I can still be useful," I heard a checkout cashier say when a big retail store was closing. She was near tears -- and so was I, because she looked over fifty and I knew it would be difficult for her to find another job. But if the job situation is restructured, ideally nearly everyone would find  his or her “niche” of usefulness.

Matt:

It has been pointed out that primitive societies provided for their needs and still enjoyed more leisure than we do. One statistic we rarely see reported by the Bureau of Labor Statistics is the death toll from work.

Oriana:

Many people go through a crisis when they retire. They either try to do too much, or can’t figure out what to do. But eventually they find a new routine. And some discover new places to socialize — like going to the Y practically every day. Or they may take up ballroom dancing or a craft workshop. Or put more time into cooking and gardening. Crisis at first — then being quite happy and surprisingly busy.

I imagine that the change to the post-work world would be gradual — nor would jobs disappear 100%.  And perhaps new jobs would arise in the field of trying to save the environment. 


*

“The price of anything is the amount of life you exchange for it.” ~ Henry David Thoreau


*


GOETHE ON WHO NEEDS RELIGION
 
He who possesses science and art also has religion;
but he who possesses neither, let him have religion.
~ Goethe

I think the broader underlying concept here is having a rich mental life. Rich mental life = no need for religion. I think it was late April when I left the church — I’d just turned fourteen, lilacs were coming into bloom — I can’t swear to the accuracy of that, but I do know that the world looked gorgeous when it happened, flowers all around me and clouds like giant flowers in the sky. And there were so many books! There were cinemas and theaters, there was ballet . . . This list could go on and on, but I think my point is clear: there was no vacuum, there was no gap that needed to be filled. I still enjoyed walking into a church between the services, to steep for a while in the dusk and quiet — but the negativity had fallen away, the insane babblings about sin and hell and the Crucifixion (“Every time you sin, you drive a nail into the flesh of Jesus”).

If some kind of disembodied intelligence exists in the universe (my mother would LOL at the idea of brain-free mind — in fact I did see that happen; she didn't mean to be impolite but just couldn't control herself), my guess is that it would prefer a happy atheist to a self-flagellating Catholic any time.
 
*

“The man who prays is the one who thinks that god has arranged matters all wrong, but who also thinks that he can instruct god how to put them right.” ~ Christopher Hitchens, Mortality


*
KETO DIET TO COMBAT CANCER? THE ANSWER: IT DEPENDS

 
~ “Last year, Siddhartha Mukherjee, the Columbia University researcher and author of The Emperor of All Maladies, and his colleagues found that at least one particular chemotherapy drug can be made more effective by combining its use with eating a low-sugar, protein-and-fat-heavy “ketogenic” diet. In a paper in Nature, the researchers suggest that the effect was related to decreasing the levels of insulin that the pancreas releases into the blood in response to eating.

Around the same time, an international team of researchers concluded in the journal Science Signaling that “only some cancer cells are acutely sensitive to glucose withdrawal, and the underlying mechanism of this selective sensitivity is unclear.” In other words, a low-sugar diet could help combat some cancers, but it’s certainly not as simple as Cancers eat sugar, so low sugar stops cancer.

While the sugar-and-insulin angle has shown promise, more of the research has focused on dietary protein—or, specifically, individual amino acids that make up that protein. Studies have shown that the restriction of the amino acids serine and glycine can modulate cancer outcomes. According to a 2018 study in Nature, the chemotherapy drug methotrexate is affected by the amino acid histidine. Another, asparagine, is involved in the progression of breast cancer metastasis.

The most interest has gone to methionine, which is found in high levels in eggs and red meat. In 2018, a review of existing evidence from the Rutgers Cancer Institute of New Jersey deemed restricting methionine “a promising anti-tumor strategy.” That promise has also shown itself in brain tumors and melanomas, as the UC San Diego surgeon Robert Hoffman detailed in February. Methionine is made in normal cells—out of homocysteine, folate, and vitamin B12. However, many types of cancer cells lack the enzyme that makes cellular manufacturing of methionine possible. So they require extra methionine from outside the body—via food we eat—for survival. Cut off that supply, and it should help to slow the tumor without starving the person.

This month, Locasale and his colleagues at Duke released findings showing that restricting methionine decreased tumor growth in mice and human subjects. Locasale’s particular area of research, known as metabolomics, uses enormous data sets to quantify metabolic activity. This allows the controversial field of nutrition research to operate with new levels of precision, where specific metabolic pathways can be monitored. Most nutrition research relies on self-reported data, in which people who say they eat almonds are found to have lower rates of some sort of cancer, and the best we can do is assume these two things are related. Locasale’s paper, by contrast, is full of complex statistical calculus involving “Euclidian distances” and “multidimensional scaling.”

What really complicates the picture for Locasale is that the closest thing to a methionine-restricted diet is, in practice, a vegan diet. This would seem to be at odds with the cancer-fighting effects reported by Mukherjee and colleagues involving a “ketogenic” diet. But contrary to the dietary wars that plague the pages of popular media, Mukherjee was supportive of Locasale’s investigation. “More evidence about the fascinating connection between diet and cancer,” he tweeted of the Duke study. “It’s not ‘starving’ the cancer, but rather finding precise vulnerabilities that make metabolic therapies feasible.”

And so now I have begun referring to food as metabolic therapy.

Because cancer is a term that encapsulates many different diseases—with different changes in different metabolic pathways in different cells in different parts of the body—no single metabolic therapy is right for every person. What makes one cancer grow more slowly could conceivably hasten another.

In 2017, I reported on a provocative study of vitamin B12 supplements, which can prevent anemia in people who don’t get enough through food. In excessive amounts, though, using these supplements was associated with higher rates of lung cancer. Again, this seemed to be by way of a metabolic pathway that fuels the tumor cells.

Nutrients or vitamins are not simply good or bad, cancer-causing or cancer-fighting. If a book or blog recommends a single “cancer diet”—or even a supplement that promises to fight cancer—beware. It could end up making things worse. Especially if there is a person on the cover in a white coat with arms folded, and with teeth that look like they have never been used.

Food is medicine—or metabolic therapy. And no metabolic therapy is good or bad for everyone in every condition.

 
https://www.theatlantic.com/health/archive/2019/05/food-cancer/589714/?utm_source=facebook&utm_term=2019-05-20T16%3A39%3A33&utm_medium=social&utm_campaign=the-atlantic&utm_content=edit-promo&fbclid=IwAR3EazqOgJQAlmoeLfqN5eHClsUZAVRUYsGkzgFth63VhPo3SzS0TVQQGpU
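For readers wondering what “Euclidian distances” and “multidimensional scaling” mean in this context: in metabolomics each sample becomes a long vector of metabolite levels, and MDS compresses the pairwise distances between those vectors into two dimensions, so you can see at a glance whether, say, diet groups separate. The toy sketch below uses entirely made-up numbers; it is only meant to show the two terms in action, not to reproduce anything from the Duke paper.

```python
# A rough sketch (not from the paper) of Euclidean distances plus
# multidimensional scaling (MDS) applied to hypothetical metabolite profiles.

import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
# Hypothetical data: 10 samples x 200 metabolite levels, two diet groups.
control = rng.normal(0.0, 1.0, size=(5, 200))
restricted = rng.normal(0.5, 1.0, size=(5, 200))
profiles = np.vstack([control, restricted])

# With dissimilarity="euclidean", MDS computes the pairwise Euclidean
# distances between sample vectors and finds 2D coordinates that preserve
# those distances as well as possible.
coords = MDS(n_components=2, dissimilarity="euclidean", random_state=0).fit_transform(profiles)
print(coords.shape)  # (10, 2): one 2D point per sample, ready to plot
```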



from another source:

~ “We found that an ad libitum Keto Diet (8:1) with a fat content of 25% medium-chain triglycerides and 75% long-chain triglycerides produced a stronger anti-tumor effect compared to a KD (8:1) with all long-chain triglycerides, and was as efficacious against neuroblastoma as the above-described KD (2:1) combined with caloric restriction. These results stress the importance of an optimized KD composition to suppress tumor growth and to sensitize tumors to chemotherapy without requiring caloric restriction.” ~

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5842847/


Oriana:

We are barely beginning to learn the intricacies of which-diet-for-which-cancer — and which fats are the most effective in producing the anti-tumor effect. But one diet-related thing that’s just been discovered is that a compound called indole-3-carbinol (I3C), abundant in cruciferous vegetables (don’t forget the humble cabbage! sauerkraut and kimchi have the additional benefit of improving the microbiome), seems to protect against cancer by restoring function to a cancer-fighting gene.


Also, it’s worth noting that the keto diet warns against excess protein, which is easily turned into glucose. It’s possible — though not easy — to do a methionine-restricted keto diet (oddly enough, egg yolk contains only half the amount of methionine found in egg white). And it’s certainly feasible to add healthy vegetable fats such as avocado, avocado oil, coconut oil, MCT (medium-chain triglycerides), extra-virgin olive oil and olives to a vegan diet. True, that’s not the “classic” vegan diet, but so what if your life is at stake.

Yet another solution would be to prepare a protein bar or drink that specifically excludes methionine — or asparagine — or whichever amino acid a particular cancer requires.

But, as the article wisely warns, we still know too little about diet and various kinds of cancer to be able to give evidence-based advice. Eliminating junk food and other sources of fructose (including fruit-loaded smoothies — a lot of “health food” is actually bad for you) and adding cabbage-family veggies seems to improve health in general — always a good idea.


(For whatever it’s worth, the two countries with the highest cancer rates are Australia and New Zealand, followed by Ireland, Hungary, and the US. The countries with low cancer rates include Poland, Israel, Spain, Japan, Austria, Lebanon, and Bulgaria: https://www.wcrf.org/dietandcancer/cancer-trends/data-cancer-frequency-country)

(It’s also worth noting that while first-generation immigrants tend to have cancer rates reflecting their country of origin, the cancer rates of their descendants resemble those prevalent in the new country.)

ending on beauty:

 
And then I rose
in the dazzle of light, to the pine trees
plunging and righting themselves in a furious wind.

To have died and come back
raw, crackling,
and the numbness
stunned.

That clumsy
pushing and wheeling inside my chest, that ferocious
upturn —
I give myself to it. Why else
be in a body?

~ Chana Bloch (1940-2017), Afterlife


Photo: Susan Rogers
