Saturday, March 28, 2015
Ilyana Kuziemko, Michael Norton, Emmanuel Saez, and Stefanie Stantcheva over at the Washington Center for Equitable Growth:
There are several novel findings that emerge from our survey. When respondents are given the actual data on the growing income gap in the United States, their concern about the problem increases by a staggering 35 percent—an effect equal in size to roughly 36 percent of the liberal-conservative gap on this question. Moreover, viewing information about inequality also significantly influences attitudes toward two redistributive policies: the estate tax and the minimum wage (See Figure 2).
When respondents in the treatment group learn the small share of estates subject to the estate tax (roughly one in 1,000), they support increasing it at three times the rate of the control group—akin to cutting the political gap in half (See Figure 3). This finding is mirrored in a recent study by political scientist John Sides of George Washington University, who finds that accurate information on the small number of families subject to the estate tax substantially reduces support for repealing the tax.
Lee Billings in Scientific American:
Every once in a great while, something almost unspeakable happens to Earth. Some terrible force reaches out and tears the tree of life limb from limb. In a geological instant, countless creatures perish and entire lineages simply cease to exist.
The most famous of these mass extinctions happened about 66 million years ago, when the dinosaurs died out in the planet-wide environmental disruption that followed a mountain-sized space rock walloping Earth. We can still see the scar from the impact today as a nearly 200-kilometer-wide crater in the Yucatan Peninsula.
But this is only one of the “Big Five” cataclysmic mass extinctions recognized by paleontologists, and not even the worst. Some 252 million years ago, the Permian-Triassic mass extinction wiped out an estimated nine of every ten species on the planet—scientists call this one “the Great Dying.” In addition to the Big Five, evidence exists for dozens of other mass extinction events that were smaller and less severe. Not all of these are conclusively related to giant impacts; some are linked instead to enormous upticks in volcanic activity worldwide that caused dramatic, disruptive climate change and habitat loss. Researchers suspect that many—perhaps most—mass extinctions come about through the stresses caused by overlapping events, such as a giant impact paired with an erupting supervolcano. Maybe the worst mass extinctions are simply matters of poor timing, cases of planetary bad luck.
Tom Bartlett in The Chronicle:
Alice Dreger is feverish. On a wet, chilly Wednesday evening, in a high-ceilinged, beige ballroom at the Marriott in downtown Philadelphia, she is taking to task — eviscerating, really — the American Anthropological Association for its ham-fisted handling of allegations made in Darkness in El Dorado: How Scientists and Journalists Devastated the Amazon, a much-heralded but ultimately discredited book by Patrick Tierney, a journalist whose tales tended toward the fanciful. That controversy needn’t be chewed over again here, and besides, Dreger isn’t talking about just one misguided book or one feckless group of scholars. She is casting a wider net, diagnosing a disorder that she fears pervades too much of what passes for reasonable intellectual discourse. "Forms of scholarship that deny evidence, that deny truth, that deny the importance of facts, even when performed in the name of good, are dangerous, not only to science and to ethics but to democracy," she tells the Philadelphia crowd.
You’re not just hurting yourselves, people. You’re hurting America. That was in December 2009. I happened to be in the room that night, scribbling in a steno pad, pleased to have something interesting to cover. The rebuttal to her rousing remarks seemed sniffy and weirdly muted, embarrassed almost. Perhaps Dreger had violated the bylaws by saying precisely what she meant. Dreger writes about that skirmish, and many others, in her new book, Galileo’s Middle Finger: Heretics, Activists, and the Search for Justice in Science (Penguin Press), and reveals in passing that she was suffering from whooping cough that night and running entirely on adrenaline and a highly developed sense of outrage. The book is not about Galileo, except glancingly, and it’s not about anthropology, except in the section discussing the El Dorado debacle. Much of it is about gender and genitalia. There is a chapter on the motivations of rapists. There is an account of Dreger’s difficult, years-long and still-active campaign against a steroid sometimes given to pregnant women, an effort that succeeded in "nearly crushing my reputation and my spirit."
There is swearing ("postmodernist horseshit") and drinking ("I ordered a gin and tonic for myself, and then another"). Insults are hurled. Enemies are made. Tears are shed.
Heidi Julavits once said that keeping a diary when she was young is what made her a writer. Julavits, the author of four novels, revisits that story in the opening pages of her latest work, “The Folded Clock.” She tells of returning to her childhood diaries after making that claim, looking for evidence of the writer she would become. “The actual diaries, however, fail to corroborate the myth I’d concocted for myself,” she admits. “They reveal me to possess the mind, not of a future writer, but of a future paranoid tax auditor. I exhibited no imagination, no trace of a style, no wit, no personality.” With “The Folded Clock,” she corrects the record. Keeping a diary may not have made her a writer, but becoming a writer has made it possible for her to produce, now, an exquisite diary.
This diary is a diary in the way that Thomas De Quincey’s “Confessions of an English Opium Eater” is a confession, or that Daniel Defoe’s “A Journal of the Plague Year” is a journal, or that Sei Shonagon’s “Pillow Book” is a pillow book. Meaning it is, and it isn’t. “The Folded Clock” refuses one of the primary conventions of the diary: chronology. The entry for July 16 is followed by Oct. 18, which is followed by June 18. Time moves loosely forward, so that the final entries occur a year or two after the initial entries, but time loops and circles forward.
New anthologies of African fiction seem to materialize virtually every year, if not more often in recent years. When presented with the physical fact of yet another new anthology of African fiction, the immediate question, one which I was asked when I pressed the warm, bound pages of the Africa39 anthology into the even warmer hands of a new acquaintance, was: why should I read this?
One might consider merit and credentials. For African writers and readers there are a clutch of big-ticket prizes, scholarships, and fellowships that are relevant (in order of increasing size of cash payout): The African Poetry Prize, The Commonwealth Short Story Prize, The Caine Prize (which The Guardian calls “the African Booker”), The Etisalat Prize, The Morland Scholarship. These endowments are, to coin a phrase, optimally relevant because they guarantee the authors (roughly in order of priority): international exposure (The New York Times recently hailed, as a trend, the “new wave of African writers with an internationalist bent,” some of whom are part of the Africa39, others who are friends or mentors to a number of the less well-known Africa39), Africa-wide recognition, reliable publishing opportunities, renowned mentors/editors, future awards, a sustainable life as a professional writer, and lots of travel. These award recipients are the authors who, in the decades of their ascendancy, will be read widely, will speak prodigiously, will be quoted and cited extensively, and whose names will come to characterize (if not define, and even represent) who African writers are and what African literature is on the world stage.
He wrote in exceptionally pure, cold Swedish without frills. His descriptions of nature were as sparse and alive as a Japanese painting. In fact, in later life, he attempted to write haiku in Swedish. Peter Englund, the secretary of the Swedish Academy, said: “One of the secrets of his success around the world is that he’s writing about everyday stuff. The economy of words that you can see in his poems is manifested in the economy of his output; you can get the core of his work in a pocket book of 220 pages. You can get through it in an evening.”
Björn Wiman, writing in the Stockholm paper Dagens Nyheter, praised him for his capacity to transform the everyday into astonishment. “His poem C Major is almost unique in the history of literature, since it both describes and summons up pure delight.”
The Guardian praised him when he won the prize as “unobtrusively unforgettable”, a writer “whose style is so simple as to make most words seem vain and superfluous. In translation, some of the slippery hard simplicities of his lyricism can melt like ice. But enough remains to show a poet who transforms the ordinary in apparently ordinary language. The world he sees is sometimes bleak or terrible, but it is always also full of promise no less real for being inexpressible: ‘The only thing I want to say glints out of reach, like silver in a pawnbroker’s’.”
But that's getting ahead of the story. Back on October 31, 1999, with the first news of the crash, it was hard to imagine any form of pilot error that could have condemned the airplane to such a sustained and precipitous dive. What switch could the crew have thrown, what lever? Nothing came to mind. And why had they perished so silently, without a single distress call on the radio? A total electrical failure was very unlikely, and would not explain the loss of control. A fire would have given them time to talk. One thing was certain: the pilots were either extremely busy or incapacitated from the start. Of course there was always the possibility of a terrorist attack—a simple if frightening solution. But otherwise something had gone terribly wrong with the airplane itself, and that could be just as bad. There are more than 800 Boeing 767s in the world's airline fleet, and they account for more transatlantic flights than all other airplanes combined. They are also very similar in design to the smaller and equally numerous Boeing 757s. So there was plenty of reason for alarm.

Read the rest here.
One of the world's really important divides lies between nations that react well to accidents and nations that do not. This is as true for a confined and technical event like the crash of a single flight as it is for political or military disasters. The first requirement is a matter of national will, and never a sure thing: it is the intention to get the story right, wherever the blame may lie. The second requirement follows immediately upon the first, and is probably easier to achieve: it is the need for people in the aftermath to maintain even tempers and open minds. The path they follow may not be simple, but it can provide for at least the possibility of effective resolutions.
Michael White in Pacific Standard (Photo: epsos/Flickr):
The idea that our DNA, rather than being an immutable fact of our biology, is actually responsive to changes in our health and our environment is what makes people so enthusiastic about epigenetics. According to a popular metaphor, our genes themselves may be written in ink, but they're marked up in pencil—which can be erased and re-done. By developing drugs or treatments that modify these pencil marks, so the thinking goes, we can escape the limits imposed by our genes, which can't be changed. Cancer, for example, is caused by genetic mutations that can't be undone, but it is also characterized by abnormal epigenetic marks, which can potentially be reversed. Researchers have struggled for years, with little success, to fix our genetic ink by repairing mutations with gene therapy. But in some cases we may not need to repair mutations if we can re-work the epigenetic pencil marks instead.
If this idea is right, the impact could be tremendous, because researchers have found epigenetic changes associated with almost everything. Distinct patterns of epigenetic marks are found not only in cancer, but most other common diseases as well, including psychiatric ones like depression and addiction. Differences in epigenetic marks are being linked with differences in socioeconomic status, and one study found epigenetic changes in suicide victims who had suffered childhood abuse. Even more worrying is the idea (still largely speculative) that epigenetic marks can be passed on from one generation to the next, meaning that parents may pass on the effects of their poor health choices, diet, or social environment to their children.
The potentially broad impact of epigenetics has drawn the attention of social scientists, who are not usually worried about the details of molecular biology. A team of bioethicists has called epigenetics one of the most "legally and ethically significant cutting-edge subjects of scientific discovery" because "a large range of environmental, dietary, behavioral, and medical experiences can significantly affect the future development and health of an individual and their offspring."
This sounds both liberating and terrifying at the same time: Our destinies are not fixed by our genes, and yet much of what we do and experience could have a profound effect on the biological make-up of ourselves and our children. But the hype has outrun the science.
Over at Radio Open Source:
In his latest book, Flash Boys: A Wall Street Revolt, Lewis sounds worried. After that last great crash, finance has gone digital. The action has moved off the downtown trading floors and into black-box servers stationed in New Jersey. Wall Street’s work has become so automatic, algorithmic and obscure that ordinary buyers and sellers have less understanding than ever of what’s happening with their savings.
In Flash Boys Michael Lewis focused on the practice of ‘high-frequency trading’ — a game of arbitrage conducted in the course of microseconds, well handled on Radiolab. But in a new afterword he says HFT is just a symptom of a larger problem. The market’s big players have once again abdicated their “clear responsibility to protect investors… and to create a fair marketplace,” meaning that the game may be more dangerous than ever.
So we’re asking the $64,000 question: can we build a more crash-proof, less leveraged, more equitable financial system? Our guest Jeremy Allaire would argue that the technology known as Bitcoin can do just that: bring back transparency and a simple standard of honest exchange. But we’re reminded that the American dream runs on credit — and we may just be too dependent on the boom-and-bust market we’ve made.
Friday, March 27, 2015
Carl Zimmer in his excellent blog, The Loom:
Malaria is caused by single-celled parasites called Plasmodium. A female mosquito carries them in its gut as it flies around in search of a victim to bite. After the parasites mature, they push through the insect’s gut wall, eventually making their way into its salivary glands. When the mosquito lands on a person and drills into the skin, it pushes some of its Plasmodium-laden saliva into the wound.
The parasites now begin their long journey through the human body. They get pushed by the surges of the bloodstream to the liver, where they invade cells and multiply inside them. The infected liver cells erupt with the next stage of Plasmodium’s life cycle, called merozoites. The merozoites end up back in the bloodstream, where they now invade red blood cells. They multiply yet again, rupturing the blood cells and invading new ones. Eventually the parasites achieve the next stage in their life cycle, when they’re ready at last to get sucked up by a hungry mosquito in a meal of blood.
If Plasmodium can’t get into a mosquito, all of this multiplication is for naught. So anything that the parasite can do to increase the odds of a successful exit can potentially be favored by natural selection. Last year, for example, a team of researchers found that mosquitoes were attracted to mice infected with Plasmodium parasites–but only when they were ready to leave their rodent host. The scientists found evidence that the parasites engineer this attraction by changing the odor of the mice. Infected mice give off odor molecules that draw mosquitoes to them.
Simon Radford in HIPPO Reads:
It’s not often that a working paper published on an academic website creates a stir, but it seems ours has! As the Guardian Observer reports, our paper shows that for the number of big donors being nominated for positions in the UK’s House of Lords to be coincidental, it would be the equivalent of entering the National Lottery five times in a row and winning the jackpot each time. 1 in 22 Tory big donors, 1 in 14 Labour big donors, and 1 in 7 Lib Dem big donors have been nominated for a peerage (a position in the House of Lords). Rumors have abounded in the vicinity of Westminster for some time that party leaders exchanged patronage for political financing, but denying that claim just got a whole lot harder. Calls for a wholly elected UK Second Chamber must now be deafening. However, while elections are less prone to corruption than a system of appointment, we can’t stop there. Developed democracies need to rethink the role of public financing in elections if they are going to eliminate private favors and serve the public good.
Critics might argue that a few shady characters shouldn’t be spun into a general lesson. However, by using a statistical analysis across time, across governments, and across different party leaders, we show that there is a structural problem—not a case of the corruption of a few individuals. This is the first time that academics have been able to show a direct link between donations and power over voting on specific laws, but, by adding to a literature that all points in the same direction, our work is also a window on a larger issue facing all developed democracies: inequality in economic power means that our political class responds to the wishes of the rich, not the average voter. This should worry voters in Los Angeles as much as in London, Brussels as much as in Bristol.
Of course, we’re not the first to tackle the subject: academics and activists have been doing their best to sound the alarm for the last few years.
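The lottery benchmark in the excerpt can be made concrete. As a rough illustration only (assuming the classic 6-from-49 UK National Lottery draw, about 1 in 14 million per jackpot; the paper's exact benchmark odds are not given in the excerpt), a short Python sketch:

```python
from math import comb

# A single 6/49 National Lottery jackpot means holding the one winning
# ticket out of every possible choice of 6 numbers from 49.
jackpot_tickets = comb(49, 6)   # 13,983,816 possible tickets
p_jackpot = 1 / jackpot_tickets

# The benchmark used in the excerpt: winning that jackpot
# five times in a row by pure chance.
p_five_in_a_row = p_jackpot ** 5

print(f"single jackpot: 1 in {jackpot_tickets:,}")
print(f"five in a row: about 1 in {1 / p_five_in_a_row:.2e}")
```

That works out to odds of roughly 1 in 5 × 10^35. The precise figure would shift with the lottery's current format, but the point of the comparison survives: the donor-to-peer pattern is astronomically unlikely to be coincidence.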
The position once held by the European Left – that solidarity is to be valued above the homo homini lupus, and that the concept of freedom doesn’t merely have a negative character – has been abandoned. The attitude which Mark Fisher defines as ‘capitalist realism’ appears to have engulfed most of the mainstream Left. Although the recent successes of Syriza in Greece and Podemos in Spain seem timidly to hint at a possible revival of a radical Left, all the major democratic/labour parties in the West appear to converge towards a neoliberal and bleakly anti-humanist consensus. Only in Latin America does the Left still enjoy a comfortable hegemonic status, while also being able to present the future as a land of opportunities rather than a hostile wasteland. Although it is unlikely that Franciscus has read Berardi’s remarks on ‘the end of the future’ and on the consequences of its demise, he has grasped the immense political potential of reopening – and monopolising – the very concept of the time to come.
Consistently with these considerations, Franciscus has placed his pontificate under the bright red star of what was once considered a revolutionary Leftist worldview. In doing so he has been able at the same time to reinforce his presence in the Latin American countries – partly through a revival of the rhetoric and politics of Liberation Theology – and to present himself as the only credible candidate to occupy the gaping hole vacated on the left of the Western political spectrum. He has founded his attack on spectacularly populist tactics, made even more universally appealing by his repeated (yet slyly ambiguous) claim that many call him a Communist, but that he is no Communist – only a true Christian, faithful to the call of Love.
My European friends in China had largely been agreed in their envy of my departure to the ‘civilized’ world. When I’d expressed any apprehensions about the move they had rushed to assure me. Things would be so much easier than in China, they’d stressed. Everything worked. You flushed the toilet and watched the toilet paper disappear instead of the water rising ominously out of the bowl. You might pay more for food and clothes but what you purchased was of assured quality. People in Europe were ethical. None of that lying and cheating that went on in China with its get-rich-quick culture. The air was clean, the neighbourhoods green. People queued at bus stops and didn’t spit up foaming gobs of phlegm on the roads.
Efficiency, quality, honesty: these words echoed in my head as our plane prepared for landing in Brussels on a late April day in 2009. An hour or so later I was desperately knocking at the door of the airport police station, wild-eyed and begging for help, having been robbed of my handbag and laptop case while expertly distracted by the thief’s accomplice. ‘Is this arrivals or departures?’ the partner in crime had asked, and when I’d turned to answer, his friend had quietly made off with my belongings.
The British Security Service, better known as MI5, released its file on Eric Hobsbawm last autumn. Hobsbawm, who had long desired to see it, had died two years earlier, at the age of 95. In his memoir, Interesting Times, he warned against autobiographical ‘post-mortem inquests in which the corpse pretends to be the coroner’, but whatever self-justifications he might have entered as evidence, the reading of his file is hampered by his absence. It is an unwritten rule of MI5 that Personal Files (PFs) are only released after their subjects have died. Another unwritten rule, among so many, is that it only releases such material after fifty years, which explains why the Hobsbawm file deposited at the National Archives in Kew ends in the mid-1960s. The rest is withheld, and researchers who ask for more will fare no better in their feeble supplications to the state than Hobsbawm, one of the pre-eminent British historians of the 20th century.
To this deficit must be added the blanks in the file left by the declassifiers (a posh word for ‘censors’), the silent deceptions by which deception is itself concealed. Many names are redacted, and some pages have been removed in toto and replaced with a white sheet on which is stamped this grammatically unappealing message: ‘THE ORIGINAL DOCUMENT RETAINED IN DEPARTMENT UNDER SECTION 3(4) OF THE PUBLIC RECORDS ACT 1958.’ Section 3(4) allows for the retention of a record for a ‘special reason’, which does not have to be given. No reason is given, either, for the absence of an entire folder of the Hobsbawm file. Retained? Lost in transit? Destroyed? Also withheld, as standard practice, is MI5’s intelligence assessment, the casework on the material collected (through surveillance, informers, plants etc) in a file.
Novelist Akhil Sharma on why his first response to winning the 2015 Folio Prize was not joy but shame
Gaby Wood in Telegraph:
Akhil Sharma’s deadpan autobiographical novel, Family Life, ends with a kind of beginning. The narrator has taken his beautiful new girlfriend to a resort hotel. As they lounge by the pool and she leans against him, he feels happier and happier. “The happiness,” Sharma writes, “was almost heavy.” And then comes the last line: “That was when I knew I had a problem.” When I ask Sharma how he’s feeling, the morning after he has won the £40,000 Folio Prize, he responds with a brief smile, a shrug and a flat-toned explanation of his tendency to pan the world for disappointment. “My mind is like a police scanner,” he says, “wondering what’s wrong.” The first thing he felt when he heard he’d won, he says, was shame.
If that sounds melodramatic, or inappropriately comic, the book itself goes some way towards explaining the background. Sharma’s novel (his second) tells the story of an Indian family who move to New Jersey to begin what they hope will be a better life. Just after the elder brother is granted a place at a distinguished high school in New York, he dives into a swimming pool, hits his head and remains underwater for long enough to provoke a coma and lifelong brain damage. All of this happened to Sharma’s family, and the story is told from the point of view of the younger sibling - Sharma’s alter ego, Ajay - with all the naive hope and pointed perception of a child. “I wondered if he was dead,” Ajay thinks when his aunt says she has to go to the hospital. “This last was thrilling. If he was dead, I would get to be the only son.” Ajay lives through the wreckage of his parents’ aspirations: his mother’s misery, his father’s alcoholism, the daily burden of caring for his brother. “Daddy, I am so sad,” he says at one point. “You’re sad?” comes the furious response. “I want to hang myself every day.” The book is so funny you almost feel guilty for laughing – some sort of alchemical transfer, one presumes, of Sharma’s shame into fictional gold. When his mother asks for a hearing aid, Ajay’s father replies: “Why? If by mistake some good news does come for you, I’ll write it down.”
Pierre-Alain Clavien and Joseph Deiss in Nature:
The academic world has changed greatly in recent decades, so demands on its leaders have too. Departmental chairs, deans, facility directors and other leaders are now expected to power research, attract funding, manage investments, engage with policy-makers, woo the media and train personnel. Finding people who can manage these demands simultaneously is difficult. Botched appointments are costly — intellectually, emotionally and financially — for universities, students, research and sometimes for hospitals and patients too. Surprisingly, there is little data on the selection processes of academic chairs [1, 2].
Here, we relate a recent exercise in selecting a chair for a position in clinical academic medicine that in our view holds lessons for the appointment of science leaders more generally. Through a formal consensus process involving leaders from industry, policy and academia, we have distilled a set of principles — telegraphed here (see ‘Checklist for high-level hiring’) — for making high-level hires. Although many seem unsurprising, they are too often ignored.
Seek strong emotional, personal and social skills. Leaders need to be highly intelligent in communication and relationship-building to support and motivate interdisciplinary teams, convey integrity, adapt to change [9] and to empathize with patients. This feature cannot be compensated for by other qualities. People succeed when they treat the individuals around them well.
Find someone with fire in their belly and stoke it. Chairs need to be ready to fight for their academic mission and to identify strategies to minimize the administrative burden imposed on them and their academic colleagues. The passion of a new chair should be maintained by academic freedom, good infrastructure and room for development. These factors are much more important than salary benefits in attracting — and keeping — highly qualified individuals.
Christa Gray in OUPblog:
The Renaissance vision of Jerome (c. 347-420 AD), as depicted by Albrecht Dürer in a world-famous engraving of 1514, seems to represent an ideal type of the scholar: secluded in the desert, far removed from the bustle of ordinary life (with a lion to prove it), well-established in his institution (as shown by the cardinal’s hat), and devoted to his studies. However, even a casual reader of Jerome’s letters and pamphlets can see that the reality was much more tumultuous. Jerome left Rome for Bethlehem in 384 AD not out of pious devotion but because of a feud with the Roman clergy, who resented his ascetic programme. Even his Hebrew biblical translations, which would later form the core of the authoritative Latin version of the Catholic Church, were frowned upon by contemporaries, including Augustine, who upheld the sacred status of the Greek Septuagint. Moreover, Jerome’s close attachment to a rich and noble Roman widow, Paula, had given rise to salacious gossip. What sort of model can such a man be?
John Norman Davidson Kelly’s classic biography, Jerome: His Life, Writings, and Controversies, depicts him as a quarrelsome man, rarely at peace with himself, whose writings were often produced in a rush and could be severely lacking in tact. A case in point is the attack in 393 AD against the priest Jovinian, who had dared to claim that Christian virgins were not automatically superior in holiness to Christian married women. Jerome’s exaggerated and aggressive response caused embarrassment even to his supporters, who had urged him to respond to Jovinian’s claim in the first place. To us, his text reads like a choice piece of misogyny, the sort which many still associate with the Catholic Church. Yet at the time, the official church failed to embrace his stance. More interestingly, many of Jerome’s arguments in favour of celibacy have their roots in the classical – that is to say, the ‘pagan’ – tradition, which abounds in misogynistic treatises raging against marriage. In short, anyone who was prepared to be offended could find something to offend in Jerome.
But is there a way to combine Dürer’s idealised picture of Jerome with the one outlined by Kelly? Andrew Cain’s monograph, The Letters of Jerome: Asceticism, Biblical Exegesis, and the Construction of Christian Authority in Late Antiquity, has taught us how to read Jerome’s often immodest and immoderate statements. They are in fact part of a deliberate strategy to advertise his abilities as a writer and his authority as an ascetic scholar as widely as possible. Cain shows that, for Jerome, attracting patrons and sponsors was essential if he was to continue his monastic life. He had little wealth of his own and even the vast resources of his friend Paula dried up in the process of supporting Jerome and maintaining the Bethlehem monastery they had founded together. Jerome’s outrageous provocations can be seen as part of a wider effort to draw attention to himself and his projects. It appears that there were just enough people at the time with an interest–political or otherwise–in feeding this particular type of troll.
Read the rest here.
In the morning
After the loaded trucks
That shattered the doors of sleep.
And the final ‘adieu’ of the day before
And the final steps on the damp tiles
And your last letter
In the arithmetic notebook from your childhood
Like the grill on the small window
Which slides up the parade of the morning’s
Joyous sun with perpendicular black lines.
by Manolis Anagnostakis
translation: Philip Ramp
Thursday, March 26, 2015
Dustin Illingworth in 3:AM Magazine:
Shrill and appalling, the words still hold something of their concussive effect: “God is dead.” A particular strain of modern agony, crystallized. But if Thus Spake Zarathustra heralded deicide, it was only in the context of a larger rebuttal of metaphysical tradition. Indeed, Nietzsche’s most quotable proclamation has the dubious distinction of also being his most vulgarly misunderstood. Popularly accepted as an incursion on religious belief as such, Zarathustra’s famous utterance has seen the broader implications of its meaning dissolved within a caricatured nihilism. For Nietzsche, God was dead – but so, too, was German Idealism, the polished systems of Hegel and Schelling, to say nothing of the Enlightenment project of an eminently rational progress. It was a disintegration of the reigning spiritual and intellectual frameworks as much as it was a rooting out of God from his many hiding places: morality, culture, grammar and art, to name but a few. Surmounting the God-shaped void, our lonely hero knew, was a task for a theorized posthumanity (or, at the very least, a hardier variety of late European).
The enormous difficulty of this challenge – of discovering a surrogate commensurate with the social, moral and political power of a departed Almighty – is the province of Terry Eagleton’s bracing intellectual history Culture and the Death of God. Its central argument – that genuine atheism is both difficult and rare – seems at first blush a bit of wishful apologism, the death rattle of a proud but exhausted cultural model. After all, the diminishment of the sacred is no longer merely the overbold conjecture of an intellectual fringe element. Withered by the profound secularization of capitalist culture, and bolstered by positivism’s new vogue beneath the banners of Dawkins and Harris, God seems, if not dead, then irreparably reduced – something approaching an antiquated curio, or the equivalent of a harmless knocking on dusty wood.
And yet, by way of an ironically Darwinian feat of cultural adaptation, He remains alive and well – if, admittedly, much transformed. His many secular guises constitute and complicate the last 300 years of European thought – from Enlightenment rationalism, to Romantic intuition, to the Modernist culture industry. Eagleton’s oeuvre, a formidable body of literary and cultural criticism deeply informed by his Marxist-Catholic convictions, can be taken as a hostile interrogation of this secularizing tradition. His lively 2009 book Reason, Faith, and Revolution: Reflections on the God Debate, adapted from his Yale lectures, was a polemical broadside against the liberal-humanist prejudices of New Atheism. Culture and the Death of God can be usefully read as a kind of companion volume to this previous work, as it guides the reader through a brisk circuit of recent European history to compile a damning index of secular failure.
Jo Littler interviews Nancy Fraser, in Eurozine (Photo: Scott Robinson. Source: Flickr):
JL: In your work you've often warned against trading in a truncated economism for a truncated culturalism, and stressed the importance of combining both approaches. How would you locate yourself in relation to that paradigm? How have you yourself been shaped by the politics of recognition and redistribution?
NF: I grew up in Baltimore, Maryland in the days when it was a Jim Crow segregated city. The formative experience of my life, in my early teenage years, was the struggle for racial desegregation – to dismantle Jim Crow. This was a struggle for recognition of the most compelling and obviously just kind. And like many people of my generation, I moved in quick sequence from there to anti-Vietnam War struggles. I encountered Marxism in unorthodox, democratic, New Left form. That gave me a way to try and think conceptually about the various battles against different forms of domination that were so intense in that period. And soon second-wave feminism erupted and came into the mix. Now, all of this was going on in a time of relative prosperity. I don't think we in the New Left and the early second-wave feminist movement worried very much about how we would support ourselves. Of course we were young, and we often didn't have children; but there was very much a sense – which proved to be an illusion, but was a felt sense nonetheless – that the first-world model of Keynesian capitalist prosperity would continue. We certainly had a perspective about class, and we understood very well that racism correlated with poverty and exploitation. But we thought, looking through a quasi-Marxian socialist-feminist analytical lens, that what seemed to be a secure social-democratic drift meant that redistribution was relatively unproblematic, and that what we had to do was to fight to introduce the importance of recognition into the forms of traditional Marxism and economistic thinking that dominated even social democracy at the time. That proved to be wrong. I soon found myself getting more and more nervous, as the 1980s wore on into the 1990s, that the critique of political economy was being lost amongst the new social movements, the successor movements to the New Left – including feminism. I felt we were getting a one-sided development of the politics of recognition.
To me, recognition always only made sense when it was connected to the political economic dimension of society. Otherwise – as with feminism – you get women put on a pedestal and lots of lip service about how important care work is, but it's a sentimentalized, almost Victorian ethos unless you connect it to political economy. That's when I started saying "We had a great critique of economism of a vulgar sort – let's not make the same mistake and end up ourselves with some kind of a vulgar culturalism".
Above all in the US, but also elsewhere throughout the world, there was a paradigm shift towards the dimension of recognition, and it arose exactly at the moment – it's quite ironic – when the Keynesian social-democratic formation was beginning to unravel. We got the astonishing resurrection of liberal free-market ideas that everyone had assumed were in the dustbin of history forever.
Sean Carroll in Preposterous Universe:
Don Page is one of the world’s leading experts on theoretical gravitational physics and cosmology, as well as a previous guest-blogger around these parts. (There are more world experts in theoretical physics than there are people who have guest-blogged for me, so the latter category is arguably a greater honor.) He is also, somewhat unusually among cosmologists, an Evangelical Christian, and interested in the relationship between cosmology and religious belief.
Longtime readers may have noticed that I’m not very religious myself. But I’m always willing to engage with people with whom I disagree, if the conversation is substantive and proceeds in good faith. I may disagree with Don, but I’m always interested in what he has to say.
Recently Don watched the debate I had with William Lane Craig on “God and Cosmology.” I think these remarks from a devoted Christian who understands the cosmology very well will be of interest to people on either side of the debate.
In 1902, Jack London lost himself in the East End of his urban namesake. This Klondike adventurer’s temporary disappearance from relatively polite society – of oyster pirates and tramps, among others – was the result of a fateful improvisation on his part. Deeply in debt and in love with the socialist writer Anna Strunsky, London had been due to undertake a journalistic commission in South Africa. The commission fell through, however, and, as Earle Labor relates in his biography, London negotiated with his publisher to write a book instead, about life in what was reputed to be one of the worst slums on earth. This, characteristically, the young author knew he had to see from the inside.
On August 9, 1902, in the guise of an American sailor down on his luck, London walked into Trafalgar Square and joined the crowds celebrating the coronation of Edward VII. Heading east, he quickly found himself immersed in a “human hellhole” – “a vast shambles”, “utterly unnatural”, “a huge killing-machine”. Malnutrition, cramped and unhygienic lodgings (or no lodgings at all), hopeless insobriety, even the thought that the slightly better-off are unlikely to bequeath any security in life to their children: all of this London notes in horror, while counting his own blessings and reminding the reader of the unspeakable affluence to which the city is also home.