Thursday, July 24, 2014
Andersen was profoundly committed to the Romantic ideal of the extraordinary genius, singled out for distinction from birth. “It doesn’t matter if you’re born in a duck yard if you’ve lain in a swan’s egg”, as he explains in “The Ugly Duckling”, one of his most self-reflective tales. His desire to please his many acquaintances meant he could be irritatingly anxious and deferential, and his inclination to a fawning submissiveness in his relations with aristocratic patrons was vexing to those wishing to promote the professional dignity and independence of writers. But he never doubted his credentials as an artist. Despite his lifelong social uncertainties, he was convinced that his unique gifts meant he was perfectly entitled to special treatment at the hands of fate and his friends.
Binding is especially persuasive in tracing Andersen’s creative relations with Walter Scott, whose work had been translated into Danish in the 1820s. Andersen’s first published tale, “The Apparition at Palnatoke’s Grave” (1822), was influenced by the character of Madge Wildfire in Scott’s The Heart of Midlothian (1818), a novel that acquired cult status among its European readers, and made a lasting impression on Andersen. The story was published under the pseudonym of Villiam Christian Walter, in a volume that included one of the plays that won the approval of the Royal Theatre. Choosing the name “Walter” was an act of homage, but it was also a bold statement of intent. Binding suggests that Gerda’s journey to the icy palace of Kay’s glamorous captor in “The Snow Queen” reflects Jeanie Deans’s indomitable walk to London, undertaken so that she can plead with Queen Caroline for the pardon and release of her condemned sister.
Rock stars are the gods of the last century, avatars for the emotional and religious yearnings 1960s youth would have had nowhere else to place. Bob Dylan’s cryptic magnetism marked him as the Person With the Answers; Mick Jagger’s shaking hips stood for personal and sexual liberation. “Mick Jagger personified a penis,” wrote Pamela Des Barres, the famed groupie and author of several books, in her 1987 memoir, I’m With the Band: Confessions of a Groupie. As a teenager, she “rushed home from school every day to throb along with Mick while he sang: ‘I’m a king bee, baby, let me come inside.’”
Getting in close proximity to a rock star, then—for a night, or a string of tour dates, or the time it would take them to compose an album especially for you—would seem a service to a higher power. “Perhaps by embracing their cherished rock gods, groupies tap into their own divinity,” Des Barres wrote in 2007’s Let’s Spend the Night Together: Backstage Secrets of Rock Muses and Supergroupies. Unfortunately, rock stars are not gods but rather human beings whose emotions happen to resonate with millions—emotions that are inspired by other human beings, some of whom have written memoirs.
For people accustomed to the cooler precincts of modernist and postmodernist art, it is often a joy to reëncounter older, messier forms of theatre, with coincidences and murders and the like. Therefore, when I arrived at the Rose Theatre for “The Ghost Tale of the Wet Nurse Tree,” the Kabuki company Heisei Nakamura-za’s contribution to the Lincoln Center Festival, I was not surprised to find the lobby packed with people spending too much money at the snack bar and looking as though they were going to a soccer game.
Here, with considerable abridgment, is what happens in “The Ghost Tale of the Wet Nurse Tree.” The distinguished painter Shigenobu and his wife, Oseki, have a new baby boy. Hanging around the neighborhood is a self-styled samurai, Namie, wearing a hat the size of a washtub, with a nasty smirk on his face. Shigenobu announces that he’s leaving town to create a dragon painting for a famous temple. Incredibly, he entrusts the care of his wife and son to Namie.
Christopher de Bellaigue in The Guardian:
Along with the "Trojan Horse" controversy about the imposition of a strict Islamic ethos on a number of Birmingham schools, the disclosure that several hundred Britons have been to Syria and Iraq to fight with the jihadis has fired up those who believe that Islam represents an urgent threat to this country. Amid the hype – some of it justifiable, much not – nuance has inevitably been lost. It is significant that the British jihadis have chosen to realise their fantasies not here but in Mesopotamia. From a theological point of view, a caliphate can only be set up in Muslim lands. Britain would be a poor choice even for a pilot scheme – it has a substantial opposing majority and a competent intelligence service. As for the Birmingham "conspiracy", that, too, is more complicated than it seems: while there have undoubtedly been moves to Islamise the schools' curricula and atmosphere, much of the pressure in this direction has come from parents. It would be understandable if law-abiding families with children at schools such as the formerly "outstanding" Park View Academy – now deemed "inadequate" and placed in special measures – felt targeted by former education secretary Michael Gove's campaign to "drain" the fundamentalist "swamp". While in opposition, Gove authored a famously error-strewn and intemperate screed on Islamic fundamentalism, "Celsius 7/7". And before his unexpected sacking, he wanted to inculcate "British values" in people whose social attitudes suggest they have had a bellyful.
The deeper concern is that a significant number of British Muslims are getting more conservative while much of the rest of society – including, of course, very many other Muslims – liberalises apace. Exporting high-profile hate preachers such as Abu Qatada is no solution, for whether one likes it or not the values of conservative Muslims are "British", too. As the shortcomings of "Celsius 7/7" demonstrated, and as Innes Bowen confirms in her sober, meticulous and revelatory new book, the state's attitude towards British Muslims has been defined in part by ignorance.
In My Spare Time
During my long, boring hours of spare time
I sit to play with the earth’s sphere.
I establish countries without police or parties
and I scrap others that no longer attract consumers.
I run roaring rivers through barren deserts
and I create continents and oceans
that I save for the future just in case.
I draw a new colored map of the nations:
I roll Germany to the Pacific Ocean teeming with whales
and I let the poor refugees
sail pirates’ ships to her coasts
in the fog
dreaming of the promised garden in Bavaria.
I switch England with Afghanistan
so that its youth can smoke hashish for free
provided courtesy of Her Majesty’s government.
I smuggle Kuwait from its fenced and mined borders
to Comoro, the islands
of the moon in its eclipse,
keeping the oil fields intact, of course.
At the same time I transport Baghdad
in the midst of loud drumming
to the islands of Tahiti.
I let Saudi Arabia crouch in its eternal desert
to preserve the purity of her thoroughbred camels.
This is before I surrender America
back to the Indians
just to give history
the justice it has long lacked.
I know that changing the world is not easy
but it remains necessary nonetheless.
by Fadhil al-Azzawi
from Poetry International
translation: 2000, Khaled Mattawa
By 2050, the number of people over the age of 80 will triple globally. These demographics could come at great cost to individuals and economies. Two groups describe how research in animals and humans should be refocused to find ways to delay the onset of frailty.
The problems of old age come as a package. More than 70% of people over 65 have two or more chronic conditions such as arthritis, diabetes, cancer, heart disease and stroke. Studies of diet, genes and drugs indicate that delaying one age-related disease probably staves off others. At least a dozen molecular pathways seem to set the pace of physiological ageing. Researchers have tweaked these pathways to give rodents long and healthy lives. Restricting calorie intake in mice or introducing mutations in nutrient-sensing pathways can extend lifespans by as much as 50%. And these 'Methuselah mice' are more likely than controls to die without any apparent diseases. Post-mortems reveal that tumours, heart problems, neurodegeneration and metabolic disease are generally reduced or delayed in long-lived mice. In other words, extending lifespan also seems to increase 'healthspan', the time lived without chronic age-related conditions.
These insights have made hardly a dent in human medicine. Biomedicine takes on conditions one at a time — Alzheimer's disease, say, or heart failure. Instead, it should learn to stall the incremental cellular damage and changes that eventually yield several infirmities. The current tools for extending healthy life — better diets and regular exercise — are effective. But there is room for improvement, especially in personalizing treatments. Molecular insights from animals should be tested in humans to identify interventions to delay ageing and associated conditions. Together, preclinical and clinical researchers must develop meaningful endpoints for human trials.
Picture: Fauja Singh, here aged 100, prepares for Britain's Edinburgh marathon in 2011
Wednesday, July 23, 2014
William Dalrymple in The Guardian (Photograph: Karim Sahib/AFP/Getty Images):
According to tradition it was St Thomas and his cousin Addai who brought Christianity to Iraq in the first century. At the Council of Nicea, where the Christian creed was thrashed out in AD325, there were more bishops from Mesopotamia than western Europe. The region became a refuge for those persecuted by the Orthodox Byzantines, such as the Mandeans – the last Gnostics, who follow what they believe to be the teachings of John the Baptist. Then there was the Church of the East, which brought the philosophy of Aristotle and Plato, as well as Greek science and medicine, to the Islamic world – and hence, via Cordoba, to the new universities of medieval Europe.
Now almost everywhere Arab Christians are leaving. In the past decade maybe a quarter have made new lives in Europe, Australia and America. According to Professor Kamal Salibi, they are simply exhausted: "There is a feeling of fin de race among Christians all over the Middle East. Now they just want to go somewhere else, make some money and relax. Each time a Christian goes, no other Christian comes to fill his place and that is a very bad thing for the Arab world. It is Christian Arabs who keep the Arab world 'Arab' rather than 'Muslim'."
Certainly since the 19th century Christian Arabs have played a vital role in defining a secular Arab cultural identity. It is no coincidence that most of the founders of secular Arab nationalism were men like Michel Aflaq – the Greek Orthodox Christian from Damascus who, with other Syrian students freshly returned from the Sorbonne, founded the Ba'ath party in the 1940s – or Faris al-Khoury, Syria's only Christian prime minister. Then there were intellectuals like the Palestinian George Antonius, who in 1938 wrote in The Arab Awakening of the crucial role Christians played in reviving Arab literature and the arts after their long slumber under Ottoman rule.
If the Islamic state proclaimed by Isis turns into a permanent, Christian-free zone, it could signal the demise not just of an important part of the Arab Christian realm but also of the secular Arab nationalism Christians helped create.
Amanda Marcotte in The Daily Beast (NBC/Getty):
In the three decades since the scandal erupted, some things have changed and some haven’t. You still have plenty of people who want to shame young women for failing to meet the paradoxical demand to be sexy and not sexual, but there’s a growing chorus of people who see through that hypocrisy and have stopped punishing women for being, well, human beings who enjoy sex.
Beauty pageants, of course, are ground zero for the hypocritical demands on women to flaunt their bodies without actually acknowledging the existence of sex. The whole point of being a pageant queen is to trot around in your bikini to be ogled at while feigning sexual naiveté. But, of course, women—even young women—are actually sexual beings, as much as society denies that. Williams is hardly the only beauty queen who has been the subject of a scandal because she was caught dropping the “what is this sex you speak of?” act and found to be—gasp!—interested in actually enjoying her youth.
In 2006, Miss Nevada Katie Rees got a bunch of exploitative attention for “sexy” pictures of her showing off her breasts and underwear and kissing other women, an offense for which she lost her crown. A few years later, Miss California Carrie Prejean endured having a few semi-nude photos leaked. She was able to keep her crown, but only after Donald Trump did a big, pompous show of how magnanimous he was being by saying, “We have determined that the pictures taken were fine.” That it’s a subject that needs to be “determined” at all is ridiculous, suggesting that being in pageants still comes at the price of having outsiders—outsiders like Donald Trump—feel entitled to sit in judgment of your sexual behavior.
Women who want to go into politics find themselves under similar pressure to conceal that they have bodies under their clothes or that they know what sex is all about. Witness what happened to Krystal Ball, a young Democrat who wanted to run for Congress in 2010. A pair of conservative bloggers decided to shame her by running pictures they obtained from a party she attended many years prior, where she was seen posing for prank photos with a dildo.
Noah Smith over at his website (via Crooked Timber):
Consider Proposition H: "God is watching out for me, and has a special purpose for me and me alone. Therefore, God will not let me die. No matter how dangerous a threat seems, it cannot possibly kill me, because God is looking out for me - and only me - at all times."
So P(H|E) is greater than P(H) - every moment that you fail to die increases your subjective probability that you are an invincible superman, the chosen of God. This is totally and completely rational, at least by the Bayesian definition of rationality.
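The update the passage describes can be made concrete with a minimal sketch (the specific probabilities below are hypothetical, chosen only for illustration). If H is "I am invincible" and E is "I survived today", then P(E|H) = 1, while under not-H survival is merely very likely. Bayes' rule then pushes the posterior above the prior after every survived day, exactly as claimed:

```python
# H = "I am invincible, the chosen of God"; E = "I survived today".
# P(E|H) = 1 by hypothesis; P(E|not-H) is slightly less than 1.
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E), where
# P(E) = P(E|H) * P(H) + P(E|not-H) * (1 - P(H)).

def update(prior, p_e_given_h=1.0, p_e_given_not_h=0.9999):
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

p = 1e-9  # a tiny prior on being the chosen of God
for day in range(100_000):  # each survived day is one observation of E
    p = update(p)

# The posterior has grown with every survived day -- a perfectly
# "rational" inference by the Bayesian definition, as the passage says.
```

Since the denominator P(E) is always slightly below 1 while the numerator's P(E|H) equals 1, each update multiplies the prior by a factor greater than 1; the belief can only rise, never fall, which is the heart of the argument.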
Scott Spillman in The Point (image Kara Walker, savant, 2010):
As 12 Years a Slave repeatedly shows, the idea that black slaves were something less than human—although appealing for obvious reasons to masters— was subject to an inevitable tension, first at the abstract level of argument and then, more fatally, at the concrete level of daily life. The movie’s signal achievement is to bring out the various consequences of this tension, perhaps most powerfully in the relationship between the white master Edwin Epps (Michael Fassbender) and his slave girl Patsey (Lupita Nyong’o). Epps, who repeatedly refers to his slaves as his “property” and compares them to baboons, nevertheless warns his jealous wife that he would sooner send her away than lose Patsey. Later, in one of the film’s most memorable scenes, Epps is himself driven into a jealous rage by his suspicion that Patsey has escaped his control and cheated on him with a neighbor. Unable to whip Patsey himself, he compels Northup to do it. As Northup draws blood, Epps looks on with a blend of satisfaction, hatred and horror utterly belying his claim that Patsey means no more to him than a ball of yarn or a beast of prey.
For the historian David Brion Davis, this dynamic describes the basic “problem” of slavery. Ideally, as Aristotle noted long ago, a slave is like a tool or a domestic animal—something the master owns and over which he has complete control. Yet such a “natural slave” has never existed; and no system of slavery has ever successfully dehumanized its slaves to the point where they are indistinguishable from mere property. This inherent contradiction led, according to Davis, not only to complicated relationships between masters and their slaves, but to organized opposition, for which “the essential issue was how to recognize and establish the full and complete humanity of a ‘dehumanized people.’”
When and how the contradiction of treating a person as property became enough of a moral issue that people would demand an end to slavery is the question that has occupied the bulk of Davis’s career, especially in the three long works culminating with The Problem of Slavery in the Age of Emancipation, which he completed this year at the age of 86.
Matthew Jakubowski in Truce:
Before I ask more about your blogging—when you decided to pursue your interest in ethical criticism as a research topic, can you say more about that particular choice, especially how becoming a mother made you want to do work that mattered more in the world? It sounds like several things going on then affected the approach to your reading, writing, and research.
It was really a convergence of things that led me to change the kind of academic work I was doing. I published my first book and not long after I was awarded tenure: this meant I had a secure opportunity to reconsider my priorities, and I found that doing more of the same was not high on the list. Then, being a new mother made doing any research and writing more difficult: there’s nothing like being both very busy and very sleep-deprived to make you ask hard questions about the value of how you spend your time.
Anthony Trollope once observed that “(No) man … can work long at any trade without being brought to consider much whether that which he is doing daily tends to evil or to good.” I have rarely had this concern about the teaching part of my job. Sure, I worry plenty about what exactly I’m doing in the classroom and whether I’m doing it well, but that teaching students to be better readers (more attentive, more questioning, more informed) is a good thing to attempt has always seemed to me inarguable. I wanted to feel as urgent and committed to the other facets of my work.
Juan Cole in TruthDig:
The map is useful and accurate. It begins by showing the British Mandate of Palestine as of the mid-1920s. The British conquered the Ottoman districts that came to be the Mandate during World War I (the Ottoman sultan threw in with Austria and Germany against Britain, France and Russia, mainly out of fear of Russia).
But because of the rise of the League of Nations and the influence of President Woodrow Wilson’s ideas about self-determination, Britain and France could not decently simply make their new, previously Ottoman territories into mere colonies. The League of Nations awarded them “Mandates.” Britain got Palestine, France got Syria (which it made into Syria and Lebanon), Britain got Iraq.
The League of Nations Covenant spelled out what a Class A Mandate (i.e. territory that had been Ottoman) was:
“Article 22. Certain communities formerly belonging to the Turkish Empire have reached a stage of development where their existence as independent nations can be provisionally recognised subject to the rendering of administrative advice and assistance by a Mandatory [i.e., a Western power] until such time as they are able to stand alone. The wishes of these communities must be a principal consideration in the selection of the Mandatory.”
That is, the purpose of the later British Mandate of Palestine, of the French Mandate of Syria, of the British Mandate of Iraq, was to “render administrative advice and assistance” to these peoples in preparation for their becoming independent states, an achievement that they were recognized as not far from attaining. The Covenant was written before the actual Mandates were established, but Palestine was a Class A Mandate and so the language of the Covenant was applicable to it. The territory that formed the British Mandate of Iraq was the same territory that became independent Iraq, and the same could have been expected of the British Mandate of Palestine. (Even class B Mandates like Togo have become nation-states, but the poor Palestinians are just stateless prisoners in colonial cantons).
Scott Aaronson in Big Questions Online:
Picture, if you can, the following scene. It’s the year 2040. You wake up in the morning, and walk across your bedroom to your computer to check your email and some news websites. Your computer, your mail reader, and your web browser have some new bells and whistles, but all of them would be recognizable to a visitor from 2014: on casual inspection, not that much has changed. But one thing has changed: if, while browsing the web, you suddenly feel the urge to calculate the ground state energy of a complicated biomolecule, or to know the prime factors of a 5000-digit positive integer—and who among us doesn’t feel those urges, from time to time?—there are now online services that, for a fee, will use a quantum computer to give you the answer much faster than you could’ve obtained it classically. Scientists, you’re vaguely aware, are using the new quantum simulation capability to help them design drugs and high-efficiency solar cells, and to explore the properties of high-temperature superconductors. Does any of this affect your life? Sure, maybe it does—and if not, it might affect your children’s lives, or your grandchildren’s. At any rate, it’s certainly cool to know about.
Privacy and security are different as well in this brave new world. When you connect to a secure website—let’s say, to upload sensitive financial data—there’s still a padlock icon in your web browser; indeed, the user experience is pretty much the same as it was in 2014. But, you’ve heard, the previous mechanism that encrypted your data was broken by quantum computers, with their ability to factor large numbers.
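The link between factoring and broken encryption can be seen in a toy sketch (textbook RSA with deliberately tiny primes, purely illustrative; real keys use moduli thousands of bits long). The whole scheme rests on the modulus being hard to factor, so any machine that factors efficiently, such as a quantum computer running Shor's algorithm, recovers the private key outright:

```python
# Toy textbook RSA: security rests entirely on the difficulty of
# factoring n. Factor n, and the private key falls out immediately.

def factor(n):
    # Trial division -- feasible here only because n is tiny. Shor's
    # algorithm on a quantum computer makes this step efficient even
    # for the enormous moduli used in practice.
    d = 2
    while n % d:
        d += 1
    return d, n // d

e, n = 17, 61 * 53          # public key (tiny demo values)
c = pow(65, e, n)           # someone encrypts the message m = 65

p, q = factor(n)                    # attacker factors the modulus...
d = pow(e, -1, (p - 1) * (q - 1))   # ...derives the private exponent...
m = pow(c, d, n)                    # ...and decrypts the message
```

This is why the hypothetical 2040 browser no longer relies on factoring-based encryption: post-quantum schemes replace it precisely because this one step ceases to be hard.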
Fanuel Muindi in Science:
When I started out in Stanford University's biology doctoral program, I didn't feel ready. My feeling that I was poorly prepared was corroborated by a committee that told me often how underprepared and unqualified I was. I attempted to argue my case, but the committee held to its position: I was unworthy. Fast forward 2 years: Just before I walked into my qualifying exam, the committee convinced me that I was going to fail. I succeeded, but that didn't change its low opinion of me. Then, at the end of my third year, I was selected to receive a Diversifying Academia, Recruiting Excellence (DARE) fellowship, which targets students in their final 2 years of graduate school who are interested in pursuing academic careers. The committee now argued that I didn't belong in the company of the other fellows, who, it insisted, were way more accomplished than I was.
Later, when I interviewed for postdoc positions, committee members had nothing positive to say. “You are not ready,” they said. “You are not good enough.” When the time came to defend my Ph.D. thesis, they were at it again: “You don't deserve this,” they said, after I had succeeded. “Your work isn't that good. You won't get it published anywhere decent.” It was a continuous barrage of criticism aimed at undermining my self-confidence.
Arthur C. Brooks in The New York Times:
ABD AL-RAHMAN III was an emir and caliph of Córdoba in 10th-century Spain. He was an absolute ruler who lived in complete luxury. Here’s how he assessed his life: “I have now reigned above 50 years in victory or peace; beloved by my subjects, dreaded by my enemies, and respected by my allies. Riches and honors, power and pleasure, have waited on my call, nor does any earthly blessing appear to have been wanting to my felicity.” Fame, riches and pleasure beyond imagination. Sound great? He went on to write: “I have diligently numbered the days of pure and genuine happiness which have fallen to my lot: They amount to 14.”
Abd al-Rahman’s problem wasn’t happiness, as he believed — it was unhappiness. If that sounds like a distinction without a difference, you probably have the same problem as the great emir. But with a little knowledge, you can avoid the misery that befell him. What is unhappiness? Your intuition might be that it is simply the opposite of happiness, just as darkness is the absence of light. That is not correct. Happiness and unhappiness are certainly related, but they are not actually opposites. Images of the brain show that parts of the left cerebral cortex are more active than the right when we are experiencing happiness, while the right side becomes more active when we are unhappy.
Veil of Religion
If you are One
and your teachings are One
why did you inscribe our infancy in the Torah
and adorn our youth in the Gospels
only to erase all that in your final Book?
Why did you draw those of us who acknowledge your oneness into disagreement
Why did you multiply in us, when you are the one and only One
by Amal al-Jubouri
from Poetry International
translation by Seema Atalla
Tuesday, July 22, 2014
Some have called Kingsnorth a catastrophist, or fatalist, with something like a death wish for civilization (see John Gray in The New Statesman and George Monbiot in The Guardian). Others might call him a realist, a truthteller. If nothing else, I’d call him a pretty good provocateur.
Kingsnorth tossed a grenade in the January/February issue of Orion Magazine with his controversial essay “Confessions of a Recovering Environmentalist.” There, Kingsnorth gets to the heart of his case. “We are environmentalists now,” he writes, “in order to promote something called ‘sustainability.’ What does this curious, plastic word mean? … It means sustaining human civilization at the comfort level that the world’s rich people — us — feel is their right, without destroying the ‘natural capital’ or the ‘resource base’ that is needed to do so.”
Ouch. But he isn’t finished.
If “sustainability” is about anything, it is about carbon. Carbon and climate change. To listen to most environmentalists today, you would think that these were the only things in the world worth talking about. … Carbon emissions threaten a potentially massive downgrading of our prospects for material advancement as a species. … If we cannot sort this out quickly, we are going to end up darning our socks again and growing our own carrots and other such unthinkable things.
On a spring day in 1964, a boy walked into the Oldfield Hotel in the London suburb of Greenford—or perhaps the White Hart Hotel in Acton, or perhaps an unknown pub on London’s North Circular Road; fact slides so easily into myth—and demanded an audition from the band playing there that night.
The boy was 17 years old. He was the drummer for a surf band called the Beachcombers. He hit his drums so hard that six-inch nails had to be driven through the base of his kit into the stage to keep it from wandering off when he played. He had been playing drums for five years. He had first tried the bugle, but then he heard the American jazz drummers Gene Krupa and Philly Joe Jones, and he was enlightened, so he switched to the drums and practiced in a music store with a kindhearted and probably hard-of-hearing owner. At age 14 he quit school altogether and got a job repairing radios. Part of the reason he quit school was that his teachers thought he was a dolt: “Retarded artistically, idiotic in other respects,” wrote his art teacher.
I want to start with a luminously beautiful – and luminously profound – quotation from Vladimir Nabokov’s autobiography Speak, Memory. He writes: “The cradle rocks above an abyss, and common sense tells us that our existence is but a brief crack of light between two eternities of darkness.”
“Common sense”. I believe that the knowledge of this state of affairs is the fundamental truth about our human nature: the fact that our lives simply amount to our individual occupation of this “brief crack of light” between two eternities of darkness shapes everything that makes us human and is responsible for everything good – and everything bad – about us.
You might argue that if you believe in a religious faith, where life and an afterlife are ordained and somehow controlled by a supernatural being – a god or gods – then this awareness of our temporal, bounded existence in time doesn’t apply. In response, you might counter-argue that religious faith is created expressly to confound and disprove this primordial conviction: a faith created, as Philip Larkin put it, to “pretend we never die”.
But whatever the nature of a faith in a supernatural being, or beings, and whatever its unprovable postulates, I am convinced that what makes our species unique among the fauna of this small planet circling its insignificant star is that we know we are trapped in time, caught briefly between these two eternities of darkness, the prenatal darkness and the posthumous one.
In December of 1961, a high-ranking K.G.B. agent knocked on the door of the U.S. Embassy in Helsinki, asking for asylum. His name was Anatoliy Golitsyn, and he had a remarkable secret to share. There had existed within the British intelligence service, he said, a “ring of five”—all of whom knew one another and all of whom had been recruited by the Soviets in the nineteen-thirties. Burgess and Maclean, who had decamped to Moscow a decade earlier, were No. 1 and No. 2. The art historian Anthony Blunt had been under suspicion by M.I.5 for some time. He was No. 3. No. 4 sounded a lot like Philby: that was why M.I.5 rekindled its investigation of him shortly thereafter. But who was the fifth? When Philby managed to escape to Moscow, concern grew. Had the mysterious fifth man tipped him off?
Within the espionage world, Golitsyn was a deeply divisive figure. Some suspected that he was a fabulist, who embroidered his accounts of K.G.B. secrets in order to extend his usefulness to Western intelligence. Two people remained firmly convinced of Golitsyn’s bona fides, however. The first was Philby’s lunchmate at the C.I.A., James Angleton. The news about Philby convinced Angleton that the C.I.A. must be riven with moles as well, and he set off on a frenzied search for traitors which consumed the American intelligence community for the next decade.
Mohammed Suliman's tweets from Gaza (via Juan Cole):
Some were just tweeting and posting on Facebook about other people's death like I'm doing. Now people are tweeting about their death. — Mohammed Suliman (@imPalestine) July 20, 2014
Hani mourns the victims of Al Shijaiyya massacre. He wishes that his friends stay safe. He soon gets killed. His friends mourn his death. — Mohammed Suliman (@imPalestine) July 20, 2014
I look forward to surviving. If I don't, remember that I wasn't Hamas or a militant, nor was I used as a human shield. I was at home. — Mohammed Suliman (@imPalestine) July 20, 2014
We decide to escape the harrowing reality we're entrapped in by sleeping. Sleeping however has become an absurd wish. Death is easier. — Mohammed Suliman (@imPalestine) July 20, 2014
Sarah Viren in The Morning News:
It was sweltering the day I unmarried Marta, and we weren’t even together. I was with my little brother in a Penske truck, the flat haze of West Texas rising before us like the credits at the end of a movie. Marta was with our three-month-old daughter back in Iowa, where the weather was temperate. Highs were in the 70s, lows in the 50s, and Marta was still married to me. Don McLean was coming in concert that weekend and there were drink specials at our favorite vegan restaurant. Our three-month-old baby cried for milk and slept and cried some more. A couple of days later, the two of them flew out to West Texas to join me in our new home next to a university where Marta and I both had jobs, and where we were no longer married to each other.
It’s hard to define when the act of unmarrying takes place. Were we unmarried as soon as I drove out of Iowa in that Penske van and into Missouri, where same-sex marriage is not recognized? Or was it only official once Marta joined me in Texas, where marriages like ours are outright banned? Or perhaps the real unmarrying occurred when we changed our mailing address with the post office, which would mean we were unmarried for a week without even realizing it. Getting unmarried to someone is also quite different from divorcing them. There are no legal documents to sign. There are no lawyers or judges explaining the terms to you. There is just you and your once-wife and your still-legal baby in a one-story orange brick house under the beaming sun of a West Texas neighborhood where you feel the same as you did before. Almost the same—you are both aware a difference exists, and you can also feel that something small but significant has changed.
Justin Smith in Berfrois:
Does a Muslim Chechen migrant laborer in a provincial Siberian city (a ‘Caucasian’ if anyone ever was) enjoy ‘white privilege’? It seems offensive to suggest that he does. Of course, there is some scenario on which his children could be taken to the US and raised by Americans, and if this were to happen they would have a set of privileges denied to African adoptees. But that scenario is so remote from the actual range of advantages of which this Chechen can avail himself as he navigates his own social reality that one may as well not mention it. In his context, though racially ‘white’ by American standards, he is the object of suspicion, contempt, and exclusion. The thought that he is ‘white’ has almost certainly never crossed his mind.
Now of course there is nothing wrong in principle with focusing on our own parochial context—indeed it is our responsibility to be concerned with it, and to strive to improve it. When Kimberlé Crenshaw first introduced the intersectional approach, she had just such a focused and non-global concern, namely, to analyze the actors’ categories that come into play in government responses to domestic violence against women in the United States. But one serious problem with staying faithful to actors’ categories and thinking of local contexts in terms of ‘race’ is that this seems to imply a universal natural order in which the locally salient distinctions between different types of people are grounded. And there simply is no such order. What we find when we move to the global context, and to the longue durée, rather, is that the focus on supposedly racial physical attributes is generally an a posteriori rationalization of a prior unequal system of interaction between members of different ethnic groups. The more aggravated this inequality, typically, the more racially different the people on different sides of the ethnic divide will appear to one another.
William Deresiewicz in TNR [h/t: Simon During]:
Let’s not kid ourselves: The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself. In the affluent suburbs and well-heeled urban enclaves where this game is principally played, it is not about whether you go to an elite school. It’s about which one you go to. It is Penn versus Tufts, not Penn versus Penn State. It doesn’t matter that a bright young person can go to Ohio State, become a doctor, settle in Dayton, and make a very good living. Such an outcome is simply too horrible to contemplate.
This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent. As of 2006, only about 15 percent of students at the most competitive schools came from the bottom half. The more prestigious the school, the more unequal its student body is apt to be. And public institutions are not much better than private ones. As of 2004, 40 percent of first-year students at the most selective state campuses came from families with incomes of more than $100,000, up from 32 percent just five years earlier.
The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game. The more hurdles there are, the more expensive it is to catapult your kid across them. Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools. The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely. Today, fewer than half of high-scoring students from low-income families even enroll at four-year schools.