Tuesday, January 29, 2013
From The Guardian:
Nadeem Aslam was years into his second novel when the 11 September attacks took place. "Many writers said the books they were writing were now worthless," he recalls. Martin Amis, for one, felt his work in progress had been reduced to a "pitiable babble". But Aslam's saddened reaction to 9/11 was one of recognition. "I thought, that's Maps for Lost Lovers – that's the book I'm writing." The link might seem tenuous to a novel set many miles from the twin towers or Bin Laden's lair, in an almost cocooned urban community of Pakistani migrants and their offspring in the north of England, where Aslam grew up from the age of 14. The novel was almost pastoral in its tracing of the seasons, with riffs on jazz, painting and spectacular moths. Each chapter was as minutely embellished as the Persian and Mughal miniatures Aslam has in well-thumbed volumes on his coffee table. But the plot turns on a so-called honour killing, as an unforgiving brand of Islam takes hold. In his view, and above all for women, "we were experiencing low-level September 11s every day."
Maps for Lost Lovers, which took 11 years to write, and was published in 2004, won the Encore and Kiriyama awards (the latter recognises books that contribute to greater understanding of the Pacific Rim and South Asia). It was shortlisted for the Dublin Impac prize and longlisted for the Man Booker prize. His debut, Season of the Rainbirds (1993), set in small-town Pakistan, had also won prizes, and been shortlisted for the Whitbread first novel award. The books confirmed Aslam as a novelist of ravishing poetry and poise – admired by other writers including Salman Rushdie and AS Byatt.
More here. (Note: While I have read all his books, Maps for Lost Lovers remains my favorite. I strongly recommend it.)
Jared Diamond in The New York Times:
You see, falls are a common cause of death in older people like me. (I’m 75.) Among my wife’s and my circle of close friends over the age of 70, one became crippled for life, one broke a shoulder and one broke a leg in falls on the sidewalk. One fell down the stairs, and another may not survive a recent fall. “Really!” you may object. “What’s my risk of falling in the shower? One in a thousand?” My answer: Perhaps, but that’s not nearly good enough. Life expectancy for a healthy American man of my age is about 90. (That’s not to be confused with American male life expectancy at birth, only about 78.) If I’m to achieve my statistical quota of 15 more years of life, that means about 15 times 365, or 5,475, more showers. But if I were so careless that my risk of slipping in the shower each time were as high as 1 in 1,000, I’d die or become crippled about five times before reaching my life expectancy. I have to reduce my risk of shower accidents to much, much less than 1 in 5,475. This calculation illustrates the biggest single lesson that I’ve learned from 50 years of field work on the island of New Guinea: the importance of being attentive to hazards that carry a low risk each time but are encountered frequently.
I first became aware of the New Guineans’ attitude toward risk on a trip into a forest when I proposed pitching our tents under a tall and beautiful tree. To my surprise, my New Guinea friends absolutely refused. They explained that the tree was dead and might fall on us. Yes, I had to agree, it was indeed dead. But I objected that it was so solid that it would be standing for many years. The New Guineans were unswayed, opting instead to sleep in the open without a tent. I thought that their fears were greatly exaggerated, verging on paranoia. In the following years, though, I came to realize that every night that I camped in a New Guinea forest, I heard a tree falling. And when I did a frequency/risk calculation, I understood their point of view. Consider: If you’re a New Guinean living in the forest, and if you adopt the bad habit of sleeping under dead trees whose odds of falling on you that particular night are only 1 in 1,000, you’ll be dead within a few years. In fact, my wife was nearly killed by a falling tree last year, and I’ve survived numerous nearly fatal situations in New Guinea.
Beneath the white cap of Christmas frost.
The ones without hope, without shelter,
Shiver in the hollow of the cold.
Terrified at the hunger upon them,
Small birds peck at emptiness.
Here in the snow, redwings from the East
Search in the frosted absences.
From the dark heights of a fir tree
The magpie’s greedy eye observes
The songbirds’ growing panic
When a fat rat sends them scurrying.
It is the small bird that struggles
While the predator takes his ease.
In this blank hardness without mercy
Will they find even a worm’s worth of hope?
It is the berries of ivy and holly
Who give the wren its bed and board;
Buds glistening under the frosty cap
Are the waiting June where songbirds are.
by Bríd Ní Mhóráin
from Mil ina Slaoda
publisher: An Sagart, Dingle, 2011
translation: 2012, Thomas McCarthy
Monday, January 28, 2013
by Tom Jacobs
Nobody knows anyone. Not that well.
~ Miller’s Crossing
The midnight thoughts we have when we are kids are amongst the most profound we will ever have, even if we are not in a position to understand them at the time. How do I know the color I see as red is the same color of red for you? What happens after you die? How do I know that my life is not a dream?
These are ridiculously important and childish questions. The kind of questions that used to keep you up at night and that now seem safely relegated to the category of pointlessness (in part because possibly unanswerable…what evidence could one ever marshal to “prove” or even convincingly argue one’s case one way or the other?). But the heart, and somewhere in the back of one’s mind, the mind too, knows that these questions matter. They will not go away. But there’s work to do and subways to get and schedules to keep. Whether or not you really exist kinda fades into the shadows, along with one’s fear of ghosts. Hell, it’s not even in the background. It’s offstage, somewhere in the wings, occasionally whispering stage directions. But not much more than that. But still it whispers.
Sometimes the moon appears in the middle of the day, spang in the middle of the cerulean familiar. It’s always seemed a damned strange thing, this midday moon. It’s a nighttime thing, the moon, the sort of thing that draws out freaks and lunatics and people who are up to the devil’s business. And yet, the moon is there, hovering over the horizon, at midday no less, offering a kind of vague threat or prophecy. Geosynchronous with us, never letting us see its ass end. As Pink Floyd pointed out long ago, there’s a dark side to it, even if we never get to see it. And this is what creates and cultivates the notion of mystery. Things we know are there but have never seen. The substance of things hoped for, but have never felt or seen (to paraphrase).
My thumb is more or less exactly the same size as the moon is. My thumb is actually usually bigger than the moon, depending on how far it is from my head when I point it towards the sky. How, then, do I know that the moon is, at least in relation to my thumb, immense? How do I know this?
Faith, mostly, with a bit of reason and textbook understanding of physics and geometry thrown in. Even if I was born yesterday (which I wasn’t…I age, I age, and it fills me with a sense of Gnosticism, the felt sense that something has been lost, something important with the advent of consciousness), but even if I was born yesterday, I would never believe you when you tell me that the moon is a moon, orbiting in some unlikely revolution around our earth. And you tell me the earth is four billion years old? Get outta here. But I do trust people who tell me so and I believe them. Why is this so? Is it worth anyone’s time to try to worry over or try to verify these things? Pragmatism comes to the fore to point our attention to things worth thinking about, even if on some deeper level, questions remain.
That's All She Wrote
There sits my self
near a window in the sun
its feet up on a sill
There, beside the begonia
whose rose-tinged leaves are satin,
succulent and still
then, as now, taking down
and making up the tale of itself,
a concocting troubadour
in sight of a star above a pine,
past noon remembering,
telling the story of itself to itself
spinning its character
from threads of the old and
new seconds it stitches into
its suit of being,
as clear as the nose
on the face of itself
(but strange too as it tells and tells),
who reads between the lines of itself
following the story's lead
back to the start of itself
in the beginning
before which, and beyond the end leaf,
there's nothing to tell itself
of itself —that's all she wrote
more would be as silent
as a song without a note
by Jim Culleny 1/23/13
by Mara Jebsen
In August in Philadelphia, the sun leaks across red bricks and washes them down in foamy hot colors like a peach set on fire. Grown-ups sit barefoot on stoops and kids skip under rainbows of fire hydrant spray, which veil their bare arms in incandescent mist. This happened in 1985, perhaps it happens now. It keeps on happening in the diamond in my mind.
My mother and I encountered the city of Brotherly Love in 1983. It did not begin well. That year, her father, a splendid Norwegian gentleman who carried great mischief and light inside him; whose dark hair bristled around his bald pate like Caesar’s wreath, died on a tennis court. He was not yet sixty. This catastrophe blew the universe into grayness, into a sort of deep ash-color that billowed and swallowed even my mother’s golden head.
We’d been living in Benin, in West Africa. When we arrived in Philly I was six; she was thirty-one. We’d just spent two years being jolly and tropical and adventurous. There had been sand castles and palm trees and parties and villages, and chickens to chase. There had been hundreds of friends for both of us, and bright, homemade cotton dresses, paper hats, a parrot, puppet theaters, and my mother had gone dancing under the palm trees to zouk music in her strappy high-heel sandals.
But she was a serious person, basically, and had gotten a spot as a PhD candidate in folklore at the University of Pennsylvania. We took a one-bedroom in the ugliest little stucco building on the last ‘nice’ street between South Philly and Center City, so that I could go to the ‘good’ public school. For two years we were sour and sad and serious.
In 2006, in New York, the poet Philip Levine told me, with that wicked and often charming humor of his, that my poem was "very interesting," but that he “didn’t want to hear the memoirs of anyone under 40.” He didn’t say it mean. I was 26 and saw his point. I was getting ahead of myself—I wasn’t old enough to look back. Still I have that urge, because of the colors and the gemstone-feeling.
In 1986 these ‘colors’ came. I suppose this is acculturation? It really was as if the first two years were a muddy black and white, a dour Kansas, but 1986 was Oz. My mother got a part-time job as a delivery person for a florist. We drove every street in Philadelphia in a silver van crammed with Birds of Paradise. From about this moment my memories begin to form like a series of complicated kaleidoscopes, the red and yellow and green diamonds spinning.
I am not sure that this is entirely a trick of memory.
by Omar Ali
“Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past. The tradition of all dead generations weighs like a nightmare on the brains of the living.” (Karl Marx)
Shia killing in Pakistan started in earnest in the 1980s and proximate causes include the CIA’s Afghan project, the Pakistani state’s use of that project to prepare Jihadi cadres for other uses, the influence of Saudi Arabia and modern Takfiri-Salafist movements, the rivalry between Iran and its Arab neighbors and so on. Some aspects of this (especially in light of the history of Pakistan) are covered in an article I wrote earlier. Here I want to discuss a little more about the historical background to this conflict. The aim is to provide a brief overview of how this conflict has played out at some points in Islamic history and to argue that if both Shias and Sunnis are to live amicably within the same state, the state needs to be secular. The alternatives are oppression of one sect or endless conflict.
The origins of the Arab empire lie in the first Islamic state established in Medina under the leadership of the prophet Mohammed (this historical narrative has been criticized as being too quick to accept the various histories generated a century or more later in the Umayyad and Abbasid empires; skeptics claim that the early origins of the Umayyad empire and its dominant religion may be very different from what its own mythmakers later claimed. But this is a minority view and is not a concern of this article). The succession to the prophet became a matter of some controversy (primarily on the issue of Ali’s claim to the caliphate) and tensions between prominent companions of the Prophet eventually spilled over into open warfare (the first civil war). This civil war had not yet been finally settled when Ali was assassinated and Muavia, the Umayyad governor of Syria, managed to consolidate his rule over most of the nascent Arab empire. Ali’s elder son Hassan eventually renounced his claim and settled terms with Muavia, leading to a period of relative peace. But when Muavia died and his son Yazid took over in the Umayyad capital of Damascus, there was a challenge from Ali’s younger son Hussain. This ended with the famous events at Karbala, where Hussain and most male members of his extended family were brutally killed by a large Umayyad force. Supporters of Ali and opponents of the Umayyads (the two categories were not always synonymous) launched a series of revolts against various Umayyad rulers, including several led by different members of the extended family of Ali (and by extension, by Hashemites; since in tribal Arab terms, this was also a struggle between the Hashemite clan and the Umayyad clan). During this time the supporters of Ali and his family (Shia means partisan, as in partisan of Ali) developed their own version of Islamic history in which Ali was the rightful successor to the prophet and his right was usurped by the first three caliphs.
They also developed various notions about the special status of Ali and his family. Yazid and his Umayyad successors were thus (with varying intensity) regarded as illegitimate rulers and various Shia groups formed natural foci of opposition to Umayyad rule.
BETTER A DOG THAN YOUNGER BROTHER
When did I start seeing him father-figure?
It wasn’t an endearment shining shoes
for my older brother. Himalayas
shaped his character. I couldn’t call out
his name. “We don’t address our father
using first name,” Harry the shrink said.
“He can’t help but see you as kid brother
but, remember, he gets into his pants
as all men do, one leg follows the other.
Banish imaginary gods. Demolish
the ego, for a seed mingles into dust
before blooming. The world is vast. Plumb
your own universe. Forgive your father
for new wife younger than his daughter.”
More poems by Rafiq Kathwari here.
by Randolyn Zinn
Last week Martin Moran performed a private run-thru of All The Rage at a midtown rehearsal room for his director Seth Barrish, stage managers, assistants, a friend, and – me.
Moran is a well-known actor and memoirist who goes public with his private musings, seeking where the disparate threads of his life intersect, especially the doubts, guilts and misdeeds that trouble him. He discerns patterns and consequences and then presents them as questions in performance, checking in with the wider world beyond his personal preoccupations.
His latest solo performance piece All The Rage is now in previews at the Peter Jay Sharp Theatre in New York City. Moran has done this sort of thing before. In 2004 he brought his Obie award-winning The Tricky Part: A Boy’s Story of Sexual Trespass, A Man’s Journey to Forgiveness to the stage before it was published as a book.
After the run-thru (a compact 70 minutes), Martin and I walked to a nearby restaurant to chat about his process.
Randolyn Zinn: I was so moved by your story and how you tell it, the ease with which you make an audience feel focused and connected to your world. I suspect that your theatrical presence, while casual and charming, belies a highly sophisticated set of skills you've developed as an actor. And then there’s your terrific script. The piece moves effortlessly from topic to topic and locale to locale: from Manhattan to Denver to South Africa and back. How did the idea first present itself?
MARTIN MORAN: Every time I make a piece as a storyteller, it’s an imperative, like a knocking in my chest.
It all began with my stepmother. I started writing about my relationship with her because it’s the first time in my life that I actually felt such an outrageous hatred for another human being. That feeling frightened me. Around the same time, my home town newspaper ran a review of my book, The Tricky Part, and it felt like the village elder was saying Martin Moran has no testosterone, why does he not blame his abuser, why is he so mellow, how will this boy ever move on??? And that really threw me for a loop. When I handed my book to a radical feminist to blurb, she said something like Oh Marty your book is so beautiful but where is your anger? And audience members would say in talk-backs after that show, Where is your anger? It all really freaked me out. I thought I had explored my subject, but maybe, I thought, I’m not finished after all, because I skipped an entire realm of human emotion.
RZ: So this piece is a quest to understand anger, your anger...
MARTIN MORAN: Yes. And how anger and compassion can live side by side, like a dance. Of course, there are things worth being angry about and, in a strange way, anger can fuel understanding for how we’re one, connected. We’ve all been wounded somehow. Siba, the man seeking asylum I translated for, was a torture victim. I was abused as a kid. Everyone has something that has sliced through them. So that wound calls us to examine what it is to embrace the reality of why is it we hurt each other and/or why we reach a sublime place of understanding. Perhaps in this piece I’m trying to forgive myself for forgiving.
Sunday, January 27, 2013
AC Grayling in Prospect:
Frank Ramsey was 26 years old when he died after an operation at Guy’s Hospital in January 1930. In his short life, he had made lasting contributions to mathematics, economics and philosophy, and to the thinking of a number of his contemporaries, including Ludwig Wittgenstein.
When I taught at St Anne’s, Oxford during the 1980s, I was introduced by my colleague Gabriele Taylor to Ramsey’s sister, Margaret Paul, by then retired from teaching economics at Lady Margaret Hall college. As with anyone with some knowledge of the fields of enquiry Ramsey influenced, I was immediately recruited into helping with her research into his life and thought, though in a minor capacity; she had a formidable array of other helpers besides, from eminent philosophers like Taylor and PF Strawson onwards.
Frank Ramsey was 18 when Margaret was born, so her own memories of him were those of a little girl. A large part of her motivation in writing about him was to get to know him. In this quest she was equally tireless and scrupulous. Most aspects of his work require advanced technical competence, but she was determined to understand them; an afternoon at her house talking about him could be as gruelling as it was educative.
Her memoir has now been published. It is a remarkable book, a window not just into a prodigious mind—Ramsey translated Wittgenstein’s Tractatus as a second year Trinity undergraduate, simultaneously publishing original work in probability theory and economics—but into the amazingly rich intellectual world of his day. The book’s roll-call includes John Maynard Keynes, Bertrand Russell, GE Moore and Wittgenstein, and the mise-en-scène equals it: Ramsey’s father was president of Magdalene College, Cambridge, his famously bushy-eyebrowed brother, Michael, later became Archbishop of Canterbury, and Ramsey himself, after scholarships at Winchester and Trinity, became a fellow of King’s, aged 21.
Suffering unrequited love for a married woman drove Ramsey to Vienna to be psychoanalysed by one of Freud’s pupils. It was there that he met Wittgenstein, spending hours every day in conversation with him, and later helping Keynes to bring him back to Cambridge. In the last year of his life, the 26-year-old Ramsey was the 40-year-old Wittgenstein’s nominal PhD thesis supervisor, the thesis being the Tractatus Logico-Philosophicus itself.
Nathaniel Rich in the New York Review of Books:
The first dive to a depth of a thousand feet was made in 1962 by Hannes Keller, an ebullient twenty-eight-year-old Swiss mathematician who wore half-rimmed glasses and drank a bottle of Coca-Cola each morning for breakfast. With that dive Keller broke a record he had set himself one year earlier, when he briefly descended to 728 feet. How he performed these dives without killing himself was a closely guarded secret. At the time, it was widely believed that no human being could safely dive to depths beyond three hundred feet. That was because, beginning at a depth of one hundred feet, a diver breathing fresh air starts to lose his mind.
This condition, nitrogen narcosis, is also known as the Martini Effect, because the diver feels as if he has drunk a martini on an empty stomach—the calculation is one martini for every additional fifty feet of depth. But an even greater danger to the diver is the bends, a manifestation of decompression sickness that occurs when nitrogen gas saturates the blood and tissues. The problem is not in the descent, but the ascent. As the diver returns to the surface, the nitrogen bubbles increase in size, lodging in the joints, arteries, organs, and sometimes the brain or spine, where they can cause pain and potentially death. The deeper a diver descends, the more slowly he must ascend in order to avoid the bends.
Ann Jones in TomDispatch:
The euphemisms will come fast and furious. Our soldiers will be greeted as “heroes” who, as in Iraq, left with their “heads held high,” and if in 2014 or 2015 or even 2019, the last of them, as also in Iraq, slip away in the dark of night after lying to their Afghan “allies” about their plans, few here will notice.
This will be the nature of the great Afghan drawdown. The words “retreat,” “loss,” “defeat,” “disaster,” and their siblings and cousins won’t be allowed on the premises. But make no mistake, the country that, only years ago, liked to call itself the globe’s “sole superpower” or even “hyperpower,” whose leaders dreamed of a Pax Americana across the Greater Middle East, if not the rest of the globe is… not to put too fine a point on it, packing its bags, throwing in the towel, quietly admitting -- in actions, if not in words -- to mission unaccomplished, and heading if not exactly home, at least boot by boot off the Eurasian landmass.
Washington has, in a word, had enough. Too much, in fact. It’s lost its appetite for invasions and occupations of Eurasia, though special operations raids, drone wars, and cyberwars still look deceptively cheap and easy as a means to control... well, whatever. As a result, the Afghan drawdown of 2013-2014, that implicit acknowledgement of yet another lost war, should set the curtain falling on the American Century as we’ve known it. It should be recognized as a landmark, the moment in history when the sun truly began to set on a great empire. Here in the United States, though, one thing is just about guaranteed: not many are going to be paying the slightest attention.
Huw Price in the New York Times:
In Copenhagen the summer before last, I shared a taxi with a man who thought his chance of dying in an artificial intelligence-related accident was as high as that of heart disease or cancer. No surprise if he’d been the driver, perhaps (never tell a taxi driver that you’re a philosopher!), but this was a man who has spent his career with computers.
Indeed, he’s so talented in that field that he is one of the team who made this century so, well, 21st – who got us talking to one another on video screens, the way we knew we’d be doing in the 21st century, back when I was a boy, half a century ago. For this was Jaan Tallinn, one of the team who gave us Skype. (Since then, taking him to dinner in Trinity College here in Cambridge, I’ve had colleagues queuing up to shake his hand, thanking him for keeping them in touch with distant grandchildren.)
I knew of the suggestion that A.I. might be dangerous, of course. I had heard of the “singularity,” or “intelligence explosion”– roughly, the idea, originally due to the statistician I J Good (a Cambridge-trained former colleague of Alan Turing’s), that once machine intelligence reaches a certain point, it could take over its own process of improvement, perhaps exponentially, so that we humans would soon be left far behind. But I’d never met anyone who regarded it as such a pressing cause for concern – let alone anyone with their feet so firmly on the ground in the software business.
I was intrigued, and also impressed, by Tallinn’s commitment to doing something about it.
Genetic evidence suggests that, four millennia ago, a group of adventurous Indians landed in Australia
From The Economist:
The story of the ascent of man usually casts Australia as the forgotten continent. Both archaeology and the genes of aboriginal Australians suggest that a mere 15,000 years were required for humanity to spread from its initial toehold outside Africa, on the Arabian side of the straits of Bab el Mandeb, to the land of Oz. The first Australians thus arrived about 45,000 years ago. After that, it took until 1788, when Captain Arthur Phillip, RN, turned up in Sydney Cove with a cargo of ne’er-do-wells to found the colony of New South Wales, for gene flow between Australia and the rest of the world to be resumed.
This storyline was called into question a few years ago by the discovery, in some aboriginal Australian men, of Y chromosomes that looked as though they had come from India. But the details were unclear. Now a study by Irina Pugach of the Max Planck Institute for Evolutionary Anthropology, in Leipzig, and her colleagues, which has just been published in the Proceedings of the National Academy of Sciences, has sorted the matter out. About 4,000 years before Captain Phillip and his merry men arrived to turn the aboriginals’ world upside down, it seems that a group of Indian adventurers chose to call the place home. Unlike their European successors, these earlier settlers were assimilated by the locals. And they brought with them both technological improvements and one of Australia’s most iconic animals.
Hamish Johnston in Physics World:
The radius of the proton is significantly smaller than previously thought, say physicists who have measured it to the best accuracy yet. The surprising result was obtained by studying "muonic" hydrogen in which the electron is replaced by a much heavier muon. The finding could mean that physicists need to rethink how they apply the theory of quantum electrodynamics (QED) – or even that the theory itself needs a major overhaul.
A proton contains three charged quarks bound by the strong force and its radius is defined as the distance at which the charge density drops below a certain value. The radius has been measured in two main ways – by scattering electrons from hydrogen and by looking very closely at the difference between certain energy levels of the hydrogen atom called the Lamb shift. Until recently the best estimate of the proton radius was 0.877 femtometres with an uncertainty of 0.007 fm.
This Lamb shift is a result of the interactions between the electron and the constituent quarks of the proton as described by QED. These interactions are slightly different for electrons occupying the 2S and 2P energy levels and the resulting energy shift depends in part on the radius of the proton.
However, in muonic hydrogen the Lamb shift is much more dependent on the proton radius because the much heavier muon spends more time very near to – and often within – the proton itself.
Now an international team led by Randolf Pohl at the Max Planck Institute for Quantum Optics in Garching, Germany has measured the Lamb shift in muonic hydrogen for the first time and found the proton radius to be 0.8418 fm with uncertainty 0.0007 fm.
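Taking the two figures quoted here at face value, the disagreement is striking: the muonic-hydrogen result sits roughly five combined standard deviations below the older estimate. A quick check of that arithmetic (the radii and uncertainties are the article's numbers; combining the uncertainties in quadrature is my assumption about how to compare them):

```python
import math

# Proton radius estimates quoted in the article, in femtometres.
old_radius, old_err = 0.877, 0.007      # electron scattering / hydrogen Lamb shift
new_radius, new_err = 0.8418, 0.0007    # muonic-hydrogen Lamb shift (Pohl et al.)

# Size of the discrepancy in units of the combined (quadrature) uncertainty.
combined_err = math.hypot(old_err, new_err)
sigma = (old_radius - new_radius) / combined_err
print(round(sigma, 1))  # about 5.0
```

A five-sigma discrepancy is far too large to dismiss as a statistical fluke, which is why the result suggests either an error in how QED is applied or a problem with the theory itself.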
Her cart like a dugout canoe.
Had been an oak trunk.
Cut young. Fire-scoured.
What was bark what was heartwood : P u r e C h a r - H o l e
Adze-hacked and gouged.
Ever after (never not) wheeling hollow there behind her.
Up the hill toward Bennett Yard; down through Eight-Mile, the Narrows.
C o m e s C l a r y b y h e r e n o w
Body bent past bent. Intent upon horizon and carry.
Her null eye long since gone isinglassy, opal.
—The potent (brimming, fluent) one looks brown.
Co u r s e s C l a r y s u r e a s b a y o u t h r o u g h h e r e n o w
Bearing (and borne ahead by) hull and hold behind her.
Plies the dark.
Whole nights most nights along the overpass over Accabee.
Cr o s s e s C l a r y b l e s s h e r b a r r o w u p t h e r e n o w
Pausing and voweling there— the place where the girl fell.
Comes her cart like a whole-note held.
by Atsuro Riley
from Poetry, Vol. 192, No. 5
publisher: Poetry, Chicago, 2008
I thought about that moment last weekend, when my 12-year-old daughter was having a Harry Potter-themed sleepover with a few of her friends. One of the girls was recalling a moment in a Potter book and came up short as she groped for a word. She was looking for ferret, but what came out was faggot. Another girl immediately jumped in. “That’s a bad word,” she said. The first girl asked what it meant and after she was told, simply nodded her head at the nastiness of the thing. The girls, in effect, had gang-tackled the word, first by opprobrium, then by indifference—and then they went back to their playing. The slow, inexorable sunset of this most-used and most-loathed gay slur is by no means complete. It still burns brightly and horribly in far too many places and far too many lives, but its day is undeniably passing — a process only hastened by President Obama’s inaugural address, which included an explicit call for the rights of “our gay brothers and sisters” and memorably invoked the lessons of Seneca Falls, Selma and Stonewall. How this particular bit of hate speech finally dies will be a lesson both in the way a language and, more important, a culture matures.
The roots of the anti-gay f-word are not what most people think they are. Popular lore has it that suspected homosexuals were once put to death by fire, and that piles of sticks — or “faggots,” in the antiquated term — were used as kindling. The pile-of-sticks definition is correct, but everything else appears not to be. “There’s no historical evidence that this is how and why it originated,” says Ben Zimmer, language columnist for the Boston Globe and executive producer of the website Vocabulary.com. “Its first recorded use was in the early 20th century, when it was applied to women. As with words like queen, it then became an epithet for gay men.” But there’s value even in the etymological misconception. Gay people may never have been put to the torch, but the widespread belief that they were serves to sensitize people to the very real bigotry—and often very real danger—they’ve faced over the centuries. “Even if it has no historical truth it has a different kind of truth as a lesson,” Zimmer says. Epithets fade not just by public censure and growing disuse, but by appropriation. Queer used to pack a terrible punch of its own until gays picked it up and began using it in chants (“We’re here, we’re queer, get used to it!”), as a name for an activist group (Queer Nation) and in the “queer studies” programs offered in many college curricula.
From The Telegraph:
"It is a truth universally acknowledged that a single man in possession of a good fortune must be in want of a wife.” Thus begins Jane Austen’s Pride and Prejudice, one of the most famous opening lines of any novel ever written. It is a story that has touched hearts for exactly 200 years: girl meets boy, girl loses boy, girl gets boy.
...And when Austen wasn’t slicing up the men, she was defining women into tribes (long before the Spice Girls): the pretty, the funny, the clever, the bookish, the bold. Of course, I knew that in real life I was an Elizabeth – not the handsomest, not the fastest, but the “sparkiest” of girls. My true love would value me for my mind first and foremost, and that – like Elizabeth – is what I would want.
Some warn that Pride and Prejudice sets modern girls up to fail. At night, we dream of an honourable man like Darcy. By day, we learn that many modern men favour the pulchritudinous countenance of a Miss Jane Bennet, the rather relaxed morals sported by Lydia-a-likes, and especially the juicy inheritance behind an Anne de Bourgh. Against those temptations, which Elizabeth among us fancies her chances? Coraggio, whispers the author, be true to yourself. Thirty-five years later, living just eight miles from Chawton, Austen’s home, now a museum devoted to her, I find my love for the book endures (although I have long since found my Darcy). So what keeps me – and so many others – wedded to this novel? Especially when we could just whack on the Colin Firth box set instead? Certainly, I enjoy a hit of Georgian grace and fantasy: a dip into that world where problems could be solved by a new gown, an invitation to a ball, or some scrumptious item of gossip. And I appreciate more knowingly Austen’s descriptions of how money rules society. But it is Austen’s knack of describing the human heart that still sets my literary pulse racing, and makes me long for a quiet corner in which to curl up with the book. And now I read it with my daughter in mind; will she, too, find Pride and Prejudice, the gold standard of love stories, a primer for romantic life?
Saturday, January 26, 2013
For the first time in history we could end poverty while protecting the global environment. But do we have the will?
John Quiggin in Aeon:
Even to those who are thoroughly inured to warnings of impending catastrophe, the World Bank’s recent report on climate change, Turn Down the Heat (November, 2012), made for alarming reading. Looking at the consequences of four degrees of global warming, a likely outcome under current trajectories, the Bank concludes that the full scope of damage is almost impossible to project. Even so, it states: ‘The projected impacts on water availability, ecosystems, agriculture, and human health could lead to large-scale displacement of populations and have adverse consequences for human security and economic and trade systems.’ Among the calamities anticipated in the paper are large-scale dieback in the Amazon, the collapse of coral reef systems and the subsistence fishing communities that depend on them, and sharp declines in crop yields.
By contrast, most of us are already inured to the continuing catastrophe reported in the Bank’s annual World Development Report. Hundreds of millions of people go hungry every day. Tens of millions die every year from easily treatable or preventable diseases. Uncontrolled climate change could produce more crop failures and famines, and spread diseases and the pests that cause them even more widely.
Economic development and technological progress provide the only real hope of lifting billions of people out of poverty and destitution, just as it has done for the minority in the developed world. Yet the living standards of the developed world have been built on cheap energy from carbon-based fossil fuels. If everyone in the world used energy as Americans or even Europeans do, it would be impossible to restrict climate change to even four degrees of warming.
For those of us who seek a better life for everybody, the question of how much our environment can withstand is crucial. If current First World living standards can’t safely be extended to the rest of the world, the future holds either environmental catastrophe or an indefinite continuation of the age-old struggle between rich and poor. Of course, it might hold both.
[Thanks to Georg Hofer.]
Via Laura Agustín, William Kremer in the BBC:
[T]he intellectual movement that came to be known as the Enlightenment brought with it a new respect for the rational and useful and an emphasis on education rather than privilege. Men's fashion shifted towards more practical clothing. In England, aristocrats began to wear simplified clothes that were linked to their work managing country estates.
It was the beginning of what has been called the Great Male Renunciation, which would see men abandon the wearing of jewellery, bright colours and ostentatious fabrics in favour of a dark, more sober, and homogeneous look. Men's clothing no longer operated so clearly as a signifier of social class, but while these boundaries were being blurred, the differences between the sexes became more pronounced.
"There begins a discussion about how men, regardless of station, of birth, if educated could become citizens," says Semmelhack.
"Women, in contrast, were seen as emotional, sentimental and uneducatable. Female desirability begins to be constructed in terms of irrational fashion and the high heel - once separated from its original function of horseback riding - becomes a primary example of impractical dress."
High heels were seen as foolish and effeminate. By 1740 men had stopped wearing them altogether.
But it was only 50 years before they disappeared from women's feet too, falling out of favour after the French Revolution.
By the time the heel came back into fashion, in the mid-19th Century, photography was transforming the way that fashions - and the female self-image - were constructed.
Robert Trivers in Psychology Today:
I point out in The Folly of Fools that science is naturally self-correcting—it requires experiments, data gathering and modes of analysis to be fully explicit, the better to be replicated and thus verified or falsified—but where humans or social behavior are involved, the temptation for quick and illegitimate progress is accelerated by the apparent importance of the results and the difficulty of checking on their veracity. Recently cases of deliberate fraud have been uncovered in the study of primate cognition (Harvard), the health benefits of resveratrol (U Conn), and numerous social psychology findings (Tilburg U, Netherlands). I will devote some later blogs to other aspects of fraud in science but will begin here with a very clever analysis of statistical fraud and lack of data sharing in psychology papers published in the United States. This and related work suggest that the problem of fraud in science is much broader than the few cases of deliberate, large-scale fraud might suggest.
Wicherts and co-authors made use of a little noted feature of all papers published in the more than 50 journals of the American Psychological Association (APA)—the authors of these papers commit by contract to sharing their raw data with anyone who asks for it, in order to attempt replication. Yet earlier work by this same group showed that for 141 papers in four top APA journals, 73 percent of the scientists did not share data when asked to. Since, as they point out, statistical errors are known to be surprisingly common and accounts of statistical results sometimes inaccurate and scientists often motivated to make decisions during statistical analysis which are biased in their own preferred direction, they were naturally curious to see if there was any connection between failure to report data and evidence of statistical bias.
Here is where they got a dramatic result. They limited their research to two of the four journals whose scientists were slightly more likely to share data and most of whose studies were similar in having an experimental design. This gave them 49 papers. Again, the majority failed to share any data.
Caroline Freund and Mélise Jaud in Vox:
The Arab world is undergoing a major political transition. The final outcomes of the changes are far from certain in nations where they have occurred. The geographical spread of the changes is also far from clear at this point. Nevertheless, there have been and will continue to be economic consequences from the moves towards democracy (see Besley and Kudamatsu 2007).
In recent research (Freund and Jaud 2013), we have looked at historical experiences to get an idea of likely outcomes. Specifically, to get a sense of what to expect, we identified and examined 90 attempts at transition from autocracy to democracy that took place over the last half century. Our results offer a cautiously optimistic tale for the Arab countries: most transitions are successful politically and/or economically.
In particular, we find that about 45% succeeded, 40% failed, and 15% achieved democracy gradually. Success is defined as achieving a high level of democracy within three years and maintaining it; failure is when democracy is achieved temporarily or only at very low levels; and gradual is sustained and significant democratic change that takes 4-15 years to complete.
Importantly, we find that the majority of countries that underwent a transition experienced long-run gains in income growth following short-run declines (see Figure 1). Typically, countries face temporary challenges around the time of change, with growth declining by 7-11 percentage points in the year of transition, though in the case of gradual transitions the declines were much larger, around 21 percentage points, and lasted longer.