Friday, September 26, 2014
‘Zhivago’, in the pre-revolutionary genitive case, means ‘the living one’. On the novel’s first page a hearse is being followed to the grave. ‘Whom are you burying?’ the mourners are asked. ‘Zhivago’ is the reply, punningly suggesting ‘him who is living’. After his first reading of the draft early chapters, at the British Embassy in Moscow in 1945, Berlin felt that he had seen a flare sent up from the survivor of a cataclysm. Swept away by the novel’s defiant personal claim for the indomitable Russian soul, he was sure that Bolshevism’s systematic programme of turning Russia away from Western civilisation couldn’t be completed as long as such writing existed. Before leaving his diplomatic post, he turned in a long memorandum – what he called, misleadingly, a ‘rambling discourse on the Russian writers’ – containing extended resumés of his meetings with Pasternak, Akhmatova, Chukovsky and others. It was a founding text of the Kulturkampf, as important in its way as George Kennan’s Long Telegram (also written in 1946) was to the shaping of the political Cold War. In a letter accompanying the report, Berlin requested that it be treated as ‘confidential’ because of ‘the well-known consequences to the possible sources of the information contained in it, should its existence ever become known to “them”’.
We’ll call this next chapter in the novel of the novel ‘The Alphabet Men’. It’s the bit where the CIA, MI6 and their little helpers at the FO, IRD, BBC, IOD, SRD, CCF, RFE, RL, VOA and BVD process the purloined microfilm of the Russian text into ‘combat material’ for the Cold War.
I am lived. I am died.
I was two-leafed three times, and grazed,
but then I was stemmed and multiplied,
sharp-thorned and caned, nested and raised,
earth-salt by sun-sugar. I was innerly sung
by thrushes who need fear no eyed skin thing.
Finched, ant-run, flowered, I am given the years
in now fewer berries, now more of sling
out over directions of luscious dung.
Of water crankshaft, of gases the gears
my shape is cattle-pruned to a crown spread sprung
above the starve-gut instinct to make prairies
of everywhere. My thorns are stuck with caries
of mice and rank lizards by the butcher bird.
Inches in, baby seed-screamers get supplied.
I am lived and died in, vine woven, multiplied.
by Les Murray
Contemporary cultural studies likes the concept of "borderlands" because it seems to fit our complex, interrelated and dynamic world and provides an alternative to the homogenizing logic of nationalism and the related ideal of mono-ethnicity. In recent decades, borderlands have been re-construed as contact zones, as systems of communication and as social networks. As geopolitically amorphous zones "in between", they generate hybrid identities and create political, economic and cultural practices that combine different, often mutually exclusive values. Moreover, borderlands are associated with multiculturalism, cultural authenticity and cosmopolitanism. Yet from the nation-building perspective, their ambiguity is nothing to be celebrated. Mixed and overlapping identities and multiple loyalties pose a challenge to the nationalizing agenda and potentially threaten the integrity of a nation-state.
These two approaches clashed over eastern Ukraine, a former Soviet heartland and since 1991 a new borderland. From the perspective of some Kyiv and Lviv intellectuals, the Russian-speaking population of eastern Ukraine – which voted for the Communists and for oligarchic parties and was indifferent or even hostile to the national idea – were post-Soviet "creoles" lacking Ukrainian identity.
Thursday, September 25, 2014
In his fascinating new book the developmental biologist Lewis Wolpert argues that there is actually hard science behind many of our stereotypical gender roles.
Lewis Wolpert in The Telegraph:
In My Fair Lady Professor Higgins sings a song about the difference between the sexes, “Why can’t a woman be more like a man?” It comes from an amusingly, ludicrously biased male point of view, but I have used it as the title for my new book on the subject to remind us that the differences between men and women remain a major issue.
I am a developmental biologist who has studied how embryos develop from the fertilised egg. Genes control the development of the embryo by providing the codes for making proteins, which largely determine how cells behave.
The cells in the human embryo give rise to the structure and function of our brains and bodies. These cells determine whether we are male or female, and I want to understand the extent to which important differences in the behaviour of men and women are controlled by their genes during development and by the action of hormones both in the womb and in later life.
Exactly how different men and women are is, of course, a controversial subject. The view that there are inborn differences between the minds of men and women is being challenged by others who call this the pseudoscience of “neurosexism”, and are raising concerns about its implications. They emphasise instead social influences, such as stereotyping, in determining the differences in the behaviour of the sexes.
Ariane Tabatabai in Bulletin of the Atomic Scientists:
Amid the thorny nuclear negotiations between Iran and six world powers, most observers have recognized two main sticking points: How much to limit Tehran’s ability to enrich uranium, and how sanctions will be lifted. But as meetings resume between Iran and its negotiating partners—the United States, the United Kingdom, France, Russia, China, and Germany, also known as the P5+1— an issue once thought settled is anything but.
The Arak Heavy Water Reactor, also known as IR-40, almost derailed the initial rounds of talks in 2013. Following the conclusion of the interim deal in November of that year, French Foreign Minister Laurent Fabius expressed his concerns over the reactor, telling the French newspaper Le Monde (link in French) that “Arak, the most proliferation-prone facility, the one producing the most plutonium, is not necessary for civilian use. But once it goes online, we cannot destroy it.”
Inspired by his shipboard reading of Baudelaire’s “Mon Coeur mis à nu” and Goethe’s Dichtung und Wahrheit, the forty-page logbook of “Mon Voyage en Amérique” contains the germ of all of Cendrars’s future autobiographical writings, staging as it does the phoenix-like (and deeply Christian) drama whereby the ashes of a former self are converted into a blaze of resurrection. As he glimpses the lighthouses of New York Harbor after three weeks at sea, the poet cries out rhapsodically: “C’est une nouvelle naissance! Je vois des feux briller, comme à travers l’épaisseur de la chair . . . . Je me souviens, je me souviens des splendeurs apparues . . . . Vais-je crier ainsi qu’un nouveau-né? . . .” He signed the first text he wrote on his arrival with his new baptismal name: “Blaise Cendrart” (the final “s” would appear a year later) – perhaps derived from the copy of Villon he carried with him (“A mal, être ars et mis en cendre”) or a reminiscence of Nietzsche’s Ecce Homo (“Und alles wird mir nur zur Asche / Was ich liebe, was ich fasse”). Beneath a quickly sketched self-portrait he scrawled “Je suis l’autre!” – the very phrase that Gérard de Nerval (another literary hero) had once inscribed next to a lithograph of his likeness. From which Cendrars drew the Schopenhauerian corollary, often quoted in his work: the world is my representation.
The Dutch director George Sluizer, who has died aged 82, made only one perfect work: The Vanishing. There are surprisingly few filmmakers who can even match that tally. This 1988 picture follows its own remorseless logic to the natural conclusion, and makes no compromises or concessions along the way. It is so unsettling and strange that to put it in the Thriller or Horror section, or to call it Psychological Drama, would be to diminish it, and give only the feeblest impression of its powers. At the beginning of the film, a woman disappears during a pit-stop that she and her boyfriend make in the middle of a long drive. Years later, the bereft man is contacted by the person who abducted her and offered a choice that is tantalising and terrifying.
Sluizer directs with unshakable calm throughout. Stanley Kubrick told him it was “the most horrifying picture I’ve ever seen”. When Sluizer asked whether it was even more horrifying than Kubrick’s own The Shining, the senior director replied that it was. Kubrick’s producer, Jan Harlan, explained: “The Vanishing was real. The Shining was a ghost film. A huge difference.”
The problem about Seneca is that it was always difficult to pin him down (and so it remains). What Tacitus is saying, in his carefully chosen words, is that in his last hours he was “shaping…still” an imago of himself that he had been working on, revising, and adjusting for most of his life, in many different forms. Like it or not, there is something elusive, even a whiff of “spin,” about Seneca.
Romm finds a vivid symbol of that elusiveness in the surviving likenesses of the philosopher (“images” in yet another sense). Before the nineteenth century, the favored image of Seneca (now demoted to “Pseudo-Seneca”) was “a gaunt, haggard, and haunted” portrait sculpture that has survived in several ancient versions. It is not named, but it so matched everyone’s preconceptions of what the elderly philosopher must have looked like that it was simply assumed to be him. In 1813, however, a double-sided portrait—showing two male heads, back to back—was unearthed in Rome, probably dating to the third century AD: one was clearly labeled, in Greek, “Socrates,” the other, in Latin, “Seneca” (“the two sages joined at the back of the head like Siamese twins sharing a single brain,” as Romm has it).
Tom Shroder in Salon:
The therapeutic properties of the synthetic compound MDMA, which would soon become known on the street as Ecstasy, were discovered by Alexander “Sasha” Shulgin, a leading researcher for Dow Chemical in the late 1950s and early 1960s who had been so awed by the psychoactive effects of mescaline that he decided to devote his life to experimenting with similar compounds, which he concocted in a backyard lab at his home in Lafayette, California. When he cooked up MDMA and “taste-tested” the drug in the 1970s, he thought he’d discovered a pleasant “no-calorie martini.” Then he increased the dose. The world cracked open.
“I am afraid to turn around and face the mountains,” he wrote in his lab notes, “for fear they will overpower me. But I did look, and I am astounded. Everyone must get to experience a profound state like this. I feel totally peaceful. I have lived all my life to get here, and I feel I have come home. I am complete. I feel absolutely clean inside, and there is nothing but pure euphoria. I have never felt so great, or believed this to be possible.” Shulgin urgently contacted his friend, the psychiatrist Leo Zeff, who following the lead of pioneering researchers in the 1950s and early 1960s, had been using psychedelic drugs like LSD, mescaline and psilocybin to assist in therapy with private patients. In 15 years of psychedelic practice, he hadn’t done any formal studies of his results, but his patients often said they felt they accomplished more in one session with Zeff than they had in years of traditional therapy. By the time Shulgin contacted him, Zeff was ready to retire — until he tried the MDMA.
Gulab Chand in PhysOrg:
India became the first Asian country to reach Mars on Wednesday when the unmanned Mangalyaan spacecraft entered the planet's orbit after a 10-month journey, all on a shoestring budget. The mission, which is designed to search for evidence of life on the planet, is a huge source of national pride for India as it competes with Asian rivals for success in space. India beat rival neighbour China, whose first attempt flopped in 2011 despite the Asian superpower pouring billions of dollars into its programme.
At just $74 million, India's mission cost less than the estimated $100 million budget of the sci-fi blockbuster "Gravity". It also represents just a fraction of the cost of NASA's $671 million MAVEN spacecraft, which successfully began orbiting the fourth planet from the sun on Sunday. India now joins an elite club of the United States, Russia and Europe who can boast of reaching Mars. More than half of all missions to the planet have ended in failure. No single nation had previously succeeded on its first go, although the European Space Agency, which represents a consortium of countries, pulled off the feat at its first attempt. Scientists presented the Mars photos on Thursday to Prime Minister Narendra Modi who was on hand in the command centre to witness the achievement. "The success of our space programme is a shining symbol of what we are capable of as a nation," a jubilant Modi said on Wednesday.
Today I'm Going to Start Living Like a Mystic
Today I am pulling on a green wool sweater and walking across the park in a dusky snowfall. The trees stand like twenty-seven prophets in a field, each a station in a pilgrimage—silent, pondering. Blue flakes of light falling across their bodies are the ciphers of a secret, an occultation. I will examine their leaves as pages in a text and consider the bookish pigeons, students of winter. I will kneel on the track of a vanquished squirrel and stare into a blank pond for the figure of Sophia. I shall begin scouring the sky for signs as if my whole future were constellated upon it. I will walk home alone with the deep alone, a disciple of shadows, in praise of the mysteries.
by Edward Hirsch
from Lay Back the Darkness
Knopf/Random House, Inc
Wednesday, September 24, 2014
At the crossroads of architecture and the comic is Mazzucchelli’s Asterios Polyp, the love story of architecture professor Asterios Polyp—an unwieldy, snobbish, weak-chinned scrap of a man—and his lovely wife Hana. Asterios is one of the paper architects of the 1980s and 1990s avant-garde, a tight-knit coterie of poststructuralist designers who took their cues directly from French philosopher Jacques Derrida’s understanding of architecture as a form of writing. Like Derrida’s one-time collaborator Peter Eisenman, Asterios’s reputation rests on “his designs, rather than on the buildings constructed from them.” Nothing he has designed has ever been built. Rather, his career is an accumulation of riddles, abstractions and analogues, systems and sequences “governed by their own internal logic.” They take little by way of inspiration from the material world and give next to nothing back. Asterios Polyp, we could conclude, is the story of a man who could have authored a savvier version of Yes Is More.
Mazzucchelli draws Asterios as an extension of his intellectual sensibilities, a not-so-subtle takedown of architectural theory that’s delightful to behold in comic form. At his most pedantic moments—lecturing a class on Apollonian versus Dionysian design, or boasting about his sexual prowess at a faculty meeting—Asterios’s body morphs into an artist’s mannequin, a cool blue assemblage of hollow geometries that bear no relationship to the world around him.
When Theodor Adorno declared, in 1949, that “to write poetry after Auschwitz is barbaric,” he could hardly have anticipated the ensuing quantity of poetry and prose that actually concerned itself with the Holocaust, still less its astonishing range and depth. The category now encompasses the densely narrated psychological-historical realism of André Schwarz-Bart and Imre Kertész, the Kafka-inspired dreamscapes of Aharon Appelfeld, and, later, the elliptical, deeply original fictions of W. G. Sebald. As the generations of firsthand witnesses give way to younger generations, literary works that confront the subject have often been more circumspect; recent novels by Susanna Moore and Ayelet Waldman achieve their emotional power by focussing upon characters peripheral to the terrible European history that has nonetheless altered their lives. The conflagration must be glimpsed indirectly, following Appelfeld’s admonition that “one does not look directly into the sun.”
Such circumspection has not been Martin Amis’s strategy in approaching the Holocaust. The Nazi death camps at Auschwitz provide a setting for Amis’s tour de force “Time’s Arrow: or The Nature of the Offense” (1991), in which the lifetime of a Nazi doctor-experimenter is presented in reverse chronological order, from the instant of his death (as the affable American Tod Friendly) to his conception (as the ominously named German Odilo Unverdorben), witnessed by a part of himself that seems to be his conscience, or his soul.
But why did India, a success story not so long ago, need to be Modi-fied at all? Throughout the 1990s and 2000s, sliced open by neoliberal knives into a realm of information technology, real estate and conspicuous consumption, the country was widely celebrated, both by its own elites and its Western boosters, as having entered the realm of true democracy. The four previous decades of postcolonial India were consigned to a conceptual darkness that was sometimes called “socialism” and sometimes, in a slightly more accurate reference to the heavy bureaucratic role of the centralized state, the “license-permit Raj.” In contrast to this was the celebration of the present: the new, market-friendly nation, tiger rising and “India Shining” (the latter a slogan coined by the BJP in its failed re-election bid in 2004), and particularly its growth as measured by GDP, averaging 8 to 9 percent throughout the first decade of the new millennium and peaking at 10.3 percent in 2010. Fed largely by flows of foreign capital and inherently weak, the tiger has since shrunk to the size of a goat, with growth having fallen to 4.7 percent in 2014—which goes some way toward explaining why both the Indian oligarchs and sections of the population turned against the Congress Party toward the end of its ten-year rule and began to clamor for Modi to take over.
Cass R. Sunstein in The New Republic:
As everyone knows, the Supreme Court ruled six–three for Al Gore in the great dispute over the Florida recount in 2000. As everyone also knows, Gore emerged as the ultimate victor in that recount, and with his poetic and moving inauguration address he managed to unify a badly divided nation. For a long period, the Gore years continued the peace and prosperity established under President Clinton, punctuated by the successful prevention of an apparent terrorist plot in 2001, by the enactment of health care reform in 2003 (mocked by critics as GoreCare), and by aggressive steps to reduce greenhouse gas emissions, culminating in the historic Copenhagen Protocol, ratified by the U.S. Senate in 2005.
It was not until the nation’s financial collapse, beginning in 2007, that Gore’s presidency started to unravel. Senator John McCain, a longtime critic of Gore’s “failure to respect free markets,” succeeded in convincing the American public that the collapse was partly a product of the Democratic Party’s “regulatory overreach,” and he was able to trounce Senator Joseph Biden in the 2008 election. Now in his second term, McCain has presided over a successful recovery (with unemployment levels down to 8 percent from their high of 13 percent in 2010). But his own legislative agenda, including repeal of GoreCare and immigration reform, has been stymied by what McCain calls the “do-nothing Senate,” which has a slim Democratic majority. Many insiders think that the Democratic nominee in 2016 will be Minnesota Senator Amy Klobuchar. According to University of Chicago law professor Barack Obama, a specialist on election law, “Klobuchar is perfectly positioned to win her party’s nomination—and to triumph in the general election as well. She’s audacious.”
What if Jesus had never been crucified? Can we imagine a world without Christianity? Suppose that Germany won World War II. What would Europe and the United States be like now? Imagine that Kennedy had not been assassinated. Would the Vietnam war have been avoided? Would the 1960s have been fundamentally different? Would Reagan have become president? Would the Soviet Union still exist?
David Hershkovits in Paper:
Fran Lebowitz loves to talk -- so much so that when Martin Scorsese made a documentary about her he called it Public Speaking. But before she was one of the world's greatest talkers, she made her name as a writer; first at Andy Warhol's Interview and then with two collections of acerbic essays, Metropolitan Life and Social Studies. While a long-running writer's block limited her to an occasional magazine piece and two children's books, it liberated her voice to keep talking and developing into the type of personality that could only exist in New York City, specifically Manhattan, the only place she will consider living. An original gangster by any standard, she's completely self-invented and did it her way -- sardonic, entertaining, insightful -- inspiring a generation of humorists who followed. Even though Lebowitz is back to writing again (working on a novel she's been incubating), that doesn't mean she's stopped talking. On a recent summer day she sounded off on everything from Lena Dunham to gay marriage, and we did the only thing you really can do when Fran starts talking -- we listened.
New York, Mike Bloomberg and Rich People In Politics
I would say that the changes in New York that I most object to came under Michael Bloomberg, and I would have objected to these if I was 20 or if I was 12. The second that Bloomberg appeared on the political scene, I objected to him. Most people didn't know who he was so they didn't object to him, but I did know who he was, and I did object to him. I object to people who are rich in politics. I don't think they should be allowed to be in politics. It is bad that rich people are in politics, it is bad for everybody but rich people, and rich people don't need any more help. Whenever people say, "Oh he earned his money himself," I always say the same thing: "No one earns a billion dollars. People earn $10 an hour, people steal a billion dollars."
Ferris Jabr in The New Yorker:
In 1990, while visiting a research camp in central Borneo, the primatologist Anne Russon saw an orangutan nicknamed Supinah attempt to make fire. Supinah sauntered toward an ashy fire pit, picked up a stick glowing with embers, and dipped it into a nearby cup full of liquid. Russon thought that the cup contained water, but it in fact held kerosene. Fortunately, that bath did little more than dampen the wood. Yet Supinah persisted: she got a second glowing stick, blew on it, fanned it with her hands, and rubbed it against other sticks. She never got the right steps in the right order to start a fire, but what foiled her was not her innate intelligence. She had a clear goal in mind and the right kind of brain to achieve it. She just needed a little more practice.
At the time, Russon was visiting Camp Leakey, which the anthropologist Biruté Galdikas established, in 1971, to study orangutans, just as Jane Goodall and Dian Fossey had done, in Africa, to observe chimpanzees and gorillas, respectively. Since then, Galdikas, Russon, and a handful of other orangutan specialists have learned firsthand just how intelligent and resourceful the animals really are. Some of their mental skills may exceed those of their great-ape brethren. Michelle Desilets, executive director of the Orangutan Land Trust, has summarized the unique intellect of orangutans like this: “They say that if you give a chimpanzee a screwdriver, he’ll break it; if you give a gorilla a screwdriver, he’ll toss it over his shoulder; but if you give an orangutan a screwdriver, he’ll open up his cage and walk away.”
Compared with chimpanzees, which are highly excitable, orangutans seem far more sober and considerate. They move deliberately and often spend a good deal of time silently watching before deciding how to act. At Camp Leakey, the orangutans had plenty of opportunity to observe and imitate people. They soon developed a habit of stealing canoes, paddling them downriver, and abandoning them at their destinations.
Simon Worrall in National Geographic:
In her new book, The Human Age, Diane Ackerman, best-selling author of The Zookeeper's Wife and A Natural History of the Senses, takes us on a journey into the Anthropocene: the era in which humans have both mastered and degraded the natural world. Ranging across the globe, she shows how our unique talent for self-awareness and our technological prowess can help us overcome today's global challenges. Speaking a few days before what may be the largest ever climate change demonstration, in New York, she talked about the Frozen Ark, where the DNA of vanishing species is being collected, introduced us to an orangutan with an iPad and a group of Alaskan Inuits threatened by rising sea levels, and expressed her optimism about the future.
Human beings have been on the planet for about 200,000 years, but if you think about it, most of the wonders we identify with contemporary life came about in the past 200 years. And in the past 20 years, they've been advancing at a mind-boggling pace. We've now changed the course of rivers, we've changed the outline of continents, we've created giant megacities, and we've even played golf on the moon. We've so dominated our landscape that a coalition of scientists believes we have to change the name of the era in which we're living. We're in the Holocene, a geologic era like the Jurassic for the dinosaurs. But [scientists would] like to change it to something that conveys more of our imprint on the planet, the Anthropocene, which translates as the human age. And I think it's a very good idea.
Scientists at the Salk Institute have discovered an on-and-off “switch” in cells that points to a way to encourage healthy cells to keep dividing and generating, for example, new lung or liver tissue — even in old age — and may hold the key to healthy aging. In our bodies, newly divided cells constantly replenish lungs, skin, liver and other organs. However, most human cells cannot divide indefinitely — with each division, a telomere (a cellular timekeeper at the ends of chromosomes) shortens. When this timekeeper becomes too short, cells can no longer divide, causing organs and tissues to degenerate, as often happens in old age. But there is a way around this countdown: some cells produce an enzyme called telomerase, which rebuilds telomeres and allows cells to divide indefinitely.
However, in a new study published September 19 in the journal Genes and Development, scientists at the Salk Institute have discovered that telomerase, even when present, can be turned off. “Previous studies had suggested that once assembled, telomerase is available whenever it is needed,” says senior author Vicki Lundblad, professor and holder of Salk’s Ralph S. and Becky O’Connor Chair. “We were surprised to discover instead that telomerase has what is in essence an ‘off’ switch, whereby it disassembles.” Understanding how this “off” switch can be manipulated, thereby slowing down the telomere shortening process, could lead to treatments for diseases of aging (for example, regenerating vital organs later in life).
The Last Chapter is The Longest
Every door opened and walked through recalls
all other doors —that first glimpse of the next room
its bric-a-brac reflecting what you knew and
how old you were and
what style shirt hung over a chair and
which program was on TV and
who else lay on the couch the night you met her and
when your first kiss led to which ceremony and
what gifts from then survived so many years and
why another door shut behind you that last time together and
how empty rooms seemed then without her to share those memories
of all the doors you opened together and
what color you decided to paint those walls after she had gone and
which car you drove to drop off the child at her place and
when you watched her door close and
how much time it took to find another door and
knock expectantly like you had before
by Michael Chrisman
from Little Stories, New Poems by Michael Chrisman
Tuesday, September 23, 2014
Ben Zimmer in the Visual Thesaurus:
As has become the custom for the LinguaFile series on Lexicon Valley, I presented the hosts Mike Vuolo and Bob Garfield with a mystery word. This time, I had them guess the word that Eminem discussed in a 2010 interview on "60 Minutes" with Anderson Cooper: "People say that the word ___ doesn't rhyme with anything, and that kind of pisses me off, because I can think of a lot of things that rhyme with ___." Bob figured out right away that it was orange, that eminently unrhymable word. Or not so unrhymable for Eminem, as he freestyles: "I put my orange four-inch door hinge in storage, and ate porridge with George." I was amused to find out that Eminem's quasi-rhyming of orange has its roots in versifying going back to Walter William Skeat in an 1865 issue of Notes and Queries (not to mention a couple of dirty limericks collected by the great folklorist Gershon Legman).
The question that immediately came up had a less-than-obvious answer: Which came first, the color orange or the fruit orange? Many people are tempted to say the color, because it seems so basic to our vocabulary, but its "basicness" is relatively recent in the history of English. In the 1969 book Basic Color Terms, Brent Berlin and Paul Kay posited a kind of evolutionary sequence of terms in a language. The sequence starts with white and black, then proceeds to red, then green and yellow, then blue, then brown, and eventually to orange and purple (both unrhymable in English, as it turns out). The earliest evidence for the use of orange as a color term in English comes from 1512, several centuries after the other terms had been established. In Old English, you would need to say "yellow-red" (ġeolu-rēad) to describe something orange-colored.