Tuesday, June 25, 2013
These were her father’s last words: “I have a dread of chaos in my heart.”
Or, “I have a dread of the chaos in my heart.” The two others present—
her mother, her brother—and she later cannot agree. It was perhaps
a critique of the cryptic vehicles of concealment—symmetry and white noise,
city blocks and hinterlands—she thinks now, as she watches her son watch
a praying mantis watch a caterpillar. The caterpillar is famously playing
dead. Suddenly she wonders if her father is watching her
watching her son watching the praying mantis watching the caterpillar
playing dead. Windows within windows within something window-shaped.
“Kilroy was here” means he’s not anymore—a kind of geometry nobody
cannot configure. She imagines her father working, somewhere, in a factory
that churns out checkerboards, one after another, black and red,
ordinate and abscissa, drawing the axis between obsess and abyss.
Confess and confuse: there is a blind spot in her blind spot in the shape of
a heart in chaos, or chaos in a heart, red on black, or vice versa.
by Jessica Goodfellow
from Thrush Poetry Journal, March 2010
Monday, June 24, 2013
by Stephen T. Asma
How can we fix all those lunatics on the other side of the planet? This seemingly fresh and pressing question is actually one of the oldest. All cultures have relished their barbaric "other." Asking how we can civilize the foreign hordes is undoubtedly the wrong question, but it seems downright irresistible. Even liberal Western "doves" have magic-bullet theories that try to get at the heart of social violence and pathology.
Steven Pinker expresses a well-worn normative suggestion when he says that the world should move away from tribal or group thinking and feeling, and embrace the "rights tradition" of individualism. He argues, in The Better Angels of Our Nature, that violence recedes as individualism rises. The rest of the world could profit from the recognition, Pinker argues, that we are individuals, and individuals are the ones that "really count" (they actually feel the pleasure and pain). "Groups," he says, "are a kind of abstraction."
I'm going to disagree here and argue, somewhat counter-intuitively, that Pinker is the abstraction. I am the abstraction. You, gentle reader, are the abstraction.
The independent individual is a hero to WEIRD cultures (Western, Educated, Industrialized, Rich and Democratic), and it serves as the starting place for both pessimistic and romantic theories of the social contract. Whether you're a Hobbesian who thinks the selfish ego must be constrained by the community, or a Rousseauian who laments such constraint (or even a Rawlsian), you still start from a metaphysic of individualism. But what if the individual is actually an ecological, developmental, and political construct?
The primacy of the individual is what philosopher R. G. Collingwood (1889-1943) might call an "absolute presupposition" – an assumed principle that governs certain inquiries and ways of thinking. In fact, digging down to these deep presuppositions was the preferred way, according to Collingwood, to do metaphysics (without getting hung up on ontology). So, in the spirit of Collingwood's metaphysics, let me suggest an alternative, wherein the collective group is primordial and the individual is derivative.
by Shadab Zeest Hashmi
I felt in the pit of my stomach the proximity to my school as the car approached the Air Force base and the diminutive Air Force planes in (almost pretty) earth tones became visible on the runway through the large gates. The car would now turn into the school lane and another day, the stuff of nightmares, would begin for me with the tension stomachache known in Urdu as "twisted stomach."
The daily assembly at P.A.F school started with the music master leading an uninspired rendition of Iqbal's famous poem "lab peh ati," a powerful lyric utilizing the classical metaphor of the devoted moth desiring the candle of knowledge; Iqbal's passionate verses warped into the whiney trill of children interested only in live experiments of their own vocal range, utterly oblivious to the poetry. The national anthem was sung, which, being mostly in Farsi, was beyond us Junior School students. In class five I would understand the anthem and admire the beauty of the words, and wonder why it had to be written in the high Urdu that no one understood, not that I would ever want to change the song; the clipped monosyllabic "qom," "mulk" swelling into a crescendo with the lofty "sul-tan-at," and drowning into the high note of "Pa-inda ta-binda baad" and then the decrescendo, the softening into a prayer "shaad baad manzil-e-Murad," roughly translated as "may you happily find your noble destiny," a prayer like a broken thing, open in its cracks to let in endless sadness— the sadness of an endlessly breaking people.
I was in Prep A, the kindergarten room with the overwhelming aroma of French toast (Pakistani French toast is much "eggier" and sweeter), and Rooh Afza, the super sweet herb drink in little chubby flasks. The smell came from a mountain of lunch boxes in a corner that the ayah arranged and fussed over. Here, in this room I spent one whole year learning little other than the fact that I was too fat to be selected for the role of the coveted "Dolly" for the class play on the annual Sports Day, and I must come to terms with the fact that the role of Miss Polly was good in its own way.
by Scott F. Aikin and Robert B. Talisse
Not long ago, a few philosophers went out for lunch at a small café. As they ate, they argued about the morality of infanticide. Eventually another patron of the café approached the table of philosophers and asked indignantly, “What’s wrong with you people?”
Philosophers have always cultivated an antagonistic relationship with the society in which they work. But recently many philosophers, along with the American Philosophical Association (the principal professional organization for philosophers in the United States), have begun to clamor for philosophers to go public. Within the profession these days, the call for “public philosophy” is loud, but not clear. That is, it is difficult to discern precisely what is being called for, what it means for philosophy to be “public.” Here we want to identify a few possibilities.
First, the call for more public philosophy might be a call for philosophy in public. This would be the suggestion that philosophers should simply take themselves out of their offices and into more public settings. They should go about their usual business, but create and participate in forums where their academic work can be accessed by the general public. Our lunchers above were engaged in public philosophy in this sense. The result was not especially encouraging.
So it seems that the call for public philosophy is not simply a call for a change of scenery. “Public philosophy” must be a different kind of philosophy. Hence the idea that philosophers must go public is the idea that they must do something different from what they currently do. But there are many different kinds of thing that philosophers currently do. What must change in order for philosophy to be “public” in the requested sense?
Oil on paper.
by Jalees Rehman
"The most radical revolutionary will become a conservative the day after the revolution."
The recent revelations by the whistleblower Edward Snowden that the NSA (National Security Agency) is engaged in mass surveillance of private online communications between individuals by obtaining data from "internet corporations" such as Google, Facebook and Microsoft as part of a covert program called PRISM have resulted in widespread outrage and shock. The outrage is understandable, because such forms of surveillance constitute a major invasion of our privacy. The shock, on the other hand, is somewhat puzzling. In recent years, the Obama administration has repeatedly demonstrated that it is willing to continue or even expand the surveillance policies of the Bush government. The PATRIOT Act was renewed in 2011 under Obama and government intrusion into our personal lives is justified under the mantle of "national security". We chuckle at the absurdity of obediently removing our shoes at airport security checkpoints and at the irony of having to place Hobbit-size toothpaste tubes into transparent bags for a government that seems to have little respect for transparency. Non-US-citizens who reside in or travel to the United States know that they can be detained by US authorities, but even US citizens who are critical of their government, such as the MacArthur Genius grantee Laura Poitras, are hassled by American authorities. Did anyone really believe that the Obama administration with its devastating track record of murdering hundreds of civilians – including many children – in drone attacks would have moral qualms about using the NSA to spy on individual citizens?
The Stasi analogy
One of the obvious analogies drawn in the aftermath of Snowden's assertions is the comparison between the NSA and the "Stasi", the abbreviated nickname for the "Ministerium für Staatssicherheit" (Ministry for State Security) in the former German Democratic Republic (GDR or DDR). Articles referring to the "United Stasi of America" or the "Modern Day Stasi-State" make references to the massive surveillance apparatus of the East German Stasi, which monitored all forms of communication between citizens of East Germany, from wire-tapping apartments, offices and phones to secretly reading letters. The Stasi "perfected" the invasion of personal spaces – as exemplified in the Oscar-winning movie "The Lives of Others". It is tempting to think of today's NSA monitoring of emails, Facebook posts or other social media interactions as a high-tech version of the Stasi legacy. A movie director may already be working on a screenplay for a movie about Snowden and the NSA called "The Bytes of Others". However, there are some key differences between the surveillance conducted by the Stasi and the PRISM surveillance program of the NSA. The Stasi was a state-run organization which was responsible for amassing the data and creating profiles of the monitored citizens. It did not just rely on regular Stasi employees, but heavily relied on so-called IMs – "inoffizielle Mitarbeiter" or "informelle Mitarbeiter" – informal informants. These informal informants were East German citizens who met with designated Stasi officers, reporting on the opinions and actions of their friends, colleagues and relatives and at times aiding the Stasi in promoting state propaganda. In the case of the PRISM program, the amassing of data is conducted by private "internet corporations" such as Facebook, Google and Microsoft, who then share some of the data with the state.
Furthermore, instead of having to rely on informal informants like the Stasi, "internet corporations" simply rely on the users themselves who readily divulge their demographic information, opinions and interests to the corporations.
by Mara Jebsen
One of Claire Messud's interviews for "The Woman Upstairs" has got a lot of people talking about literature and likeability, and about whether a book’s protagonist ought to be warm, and about whether expectations about that warmth are gendered. Messud, in a tone and with a vividness that ultimately pleased even the interviewer, took exception to a question about the pleasing-ness of her character, and gave the following response:
“For heaven’s sake, what kind of question is that? Would you want to be friends with Humbert Humbert? Would you want to be friends with Mickey Sabbath? Saleem Sinai? Hamlet? Krapp? Oedipus? Oscar Wao? Antigone? Raskolnikov? Any of the characters in The Corrections? Any of the characters in Infinite Jest? Any of the characters in anything Pynchon has ever written? Or Martin Amis? Or Orhan Pamuk? Or Alice Munro, for that matter? If you’re reading to find friends, you’re in deep trouble. We read to find life, in all its possibilities. The relevant question isn’t ‘Is this a potential friend for me?’ but ‘Is this character alive?’ ”
So what shall we make of Jen Fein, the gossip columnist and protagonist of Renata Adler’s “Speedboat”? Jen is both likable and unlikable—but I keep reminding myself that that is not the question to ask. Jen, who considers the possibility that a rat she spotted in one part of New York City is likely the same rat she saw earlier, in another part of the city, seems to think in prose poems crammed with something between wit and wisdom. She rejects her mind’s own proposition about the rat, summarily, with this: “I think sanity, then, is the most profound moral option of our time.”
Sunday, June 23, 2013
Amid the outrage over the NSA's spying program, the jailing of journalist Barrett Brown points to a deeper and very troubling problem.
Peter Ludlow in The Nation:
In early 2010, journalist and satirist Barrett Brown was working on a book on political pundits, when the hacktivist collective Anonymous caught his attention. He soon began writing about its activities and potential. In a defense of the group’s anti-censorship operations in Australia published on February 10, Brown declared, “I am now certain that this phenomenon is among the most important and under-reported social developments to have occurred in decades, and that the development in question promises to threaten the institution of the nation-state and perhaps even someday replace it as the world’s most fundamental and relevant method of human organization.”
By then, Brown was already considered by his fans to be the Hunter S. Thompson of his generation. In point of fact he wasn’t like Hunter S. Thompson, but was more of a throwback—a sharp-witted, irreverent journalist and satirist in the mold of Ambrose Bierce or Dorothy Parker. His acid tongue was on display in his co-authored 2007 book, Flock of Dodos: Behind Modern Creationism, Intelligent Design and the Easter Bunny, in which he declared: “This will not be a polite book. Politeness is wasted on the dishonest, who will always take advantage of any well-intended concession.”
But it wasn’t Brown’s acid tongue so much as his love of minutiae (and ability to organize and explain minutiae) that would ultimately land him in trouble. Abandoning his book on pundits in favor of a book on Anonymous, he could not have known that delving into the territory of hackers and leaks would ultimately lead to his facing the prospect of spending the rest of his life in prison. In light of the bombshell revelations published by Glenn Greenwald and Barton Gellman about government and corporate spying, Brown’s case is a good—and underreported—reminder of the considerable risk faced by reporters who report on leaks.
Nicola Jones in Nature:
Absolutely not. There’s half a dozen really interesting questions where Fermilab can play a really interesting role. We’re looking to have a flagship programme where we can ‘own the podium’, as they said in Canada during the Olympics.
What will that flagship be?
This is determined by the international landscape. Europe is really focusing on the Large Hadron Collider (LHC). They say if you want to study neutrinos, talk to the US or Japan. So what Fermilab is pursuing is the Long Baseline Neutrino Experiment (LBNE). I personally find the science there very inviting. One issue is charge-parity violation, looking to see if neutrinos are different from anti-neutrinos. This could broach completely new ground. The second goal is to look at proton decay, which gets into Grand Unified Theory questions. The third thing is people are very interested in detecting neutrinos from supernovae.
What the US government has given a bit of a green light to is a detector on the surface that I would argue is too small. I’d like to make it twice as big and put it a kilometer underground. The challenge is to work with European and Japanese colleagues to see if we can do that.
Experiments have shown that people can't tell plonk from grand cru. Now one US winemaker claims that even experts can't judge wine accurately. What's the science behind the taste?
David Derbyshire in The Guardian:
And in most years, the results are surprisingly inconsistent: some whites rated as gold medallists in one contest do badly in another. Reds adored by some panels are dismissed by others. Over the decades Hodgson, a softly spoken retired oceanographer, became curious. Judging wines is by its nature subjective, but the awards appeared to be handed out at random.
So drawing on his background in statistics, Hodgson approached the organisers of the California State Fair wine competition, the oldest contest of its kind in North America, and proposed an experiment for their annual June tasting sessions.
Each panel of four judges would be presented with their usual "flight" of samples to sniff, sip and slurp. But some wines would be presented to the panel three times, poured from the same bottle each time. The results would be compiled and analysed to see whether wine tasting really is scientific.
The first experiment took place in 2005. The last was in Sacramento earlier this month. Hodgson's findings have stunned the wine industry. Over the years he has shown again and again that even trained, professional palates are terrible at judging wine.
From Scientific American:
Over at AudioVision, a project of Southern California Public Radio, Mae Ryan and others bring us the best in visual journalism. Mae contacted me about last month’s feature on David Scharf, electron microscoper extraordinaire. His images are simply stunning, and I had to share. AudioVision is not a science-specific project, so I’m especially thrilled to see science imagery there. I wish more news outlets would incorporate science into their everyday stories. It seems as if science news is always shoved into the corner by major media outlets, and the assumption becomes that science news has to be pursued all by itself, which means people have to take initiative to find it (by visiting Scientific American blogs for instance!), but it doesn’t often find its way to the average viewer who isn’t actively looking for it. Unless it’s coverage of a new study that shows chocolate is healthy and you can eat as much as you want, it stays within the science circle.
Picture: Various allergens by David Scharf.
Beth Kissileff in Tablet:
My husband isn’t the same man he used to be. But that’s OK: I’m not the same woman he married, either.
Twenty-three years later, he still has all his hair, which is barely flecked with gray. (I can’t say the same for my increasingly salt-and-pepper locks.) And though he has gained a bit of weight and rarely wears the jeans I found so attractive when I met him, he is still devastatingly handsome to me. He still loves Billy Joel and the Beatles and has some kind of satellite radio with all kinds of comedy to listen to as he drives to the homes of patients he sees as a hospice chaplain. He can still tell a joke extremely well to an appreciative audience; the late Grandpa Dave, of blessed memory, must be kvelling in absentia every time a good Jewish joke hits its mark. He knows even more about religions of all stripes, working regularly with a huge variety of patients of all religious backgrounds; whenever our kids have a question about other people’s religious practices they are referred to their Abba. We still both read and discuss current events and books we read, and helped each other prepare classes for the recent Tikkun Leil Shavuot. We generally have a discussion about some aspect of the week’s parsha, if only for me to suggest sermon topics or him to help me with a column I am writing. We spent a Shabbat together—sans offspring—to hear Avivah Zornberg speak in a nearby city last year and generally get to see a Shakespeare play, somewhere, every year.

But other things have changed unexpectedly. I never imagined the illnesses he’d face, and their gravity. He has a bad back and isn’t always able to do all kinds of physical things that were once simple tasks. This recent recession has hit us hard, and we’ve faced troubles over jobs and housing that now cause him insomnia, which is only exacerbated by the noisy CPAP machine he now needs to sleep. That medicine kit he used to have has grown larger, as has the number of physicians he consults regularly, to manage various medical issues.
He is certainly not the man I married. But unlike my neighbor from all those years ago, I don’t see this as a crisis. Because he’s not the only one who’s changed.
all I leave
is a flower-pot
on the window-sill
and watch twilight
fill in a corner
of the room
like a pencil
by Roland Jooris
from Bloemlezing uit de poëzie van Roland Jooris
publisher: Poëziecentrum, Gent, 1997
translation: 2005, Peter Nijmeijer
Peter Godfrey-Smith in Boston Review:
If octopuses did not exist, it would be necessary to invent them. I don’t know if we could manage this, so it’s as well that we don’t have to. As we explore the relations between mind, body, evolution, and experience, nothing stretches our thinking the way an octopus does.
In a famous 1974 paper, the philosopher Thomas Nagel asked: What is it like to be a bat? He asked this in part to challenge materialism, the view that everything that goes on in our universe comprises physical processes and nothing more. A materialist view of the mind, Nagel said, cannot even begin to give an explanation of the subjective side of our mental lives, an account of what it feels like to have thoughts and experiences. Nagel chose bats as his example because they are not so simple that we doubt they have experiences at all, but they are, he said, “a fundamentally alien form of life.”
Bats certainly live lives different from our own, but evolutionarily speaking they are our close cousins, fellow mammals with nervous systems built on a similar plan. If we want to think about something more truly alien, the octopus is ideal. Octopuses are distant from us in evolutionary terms, have a nervous system of very different design, and bodies with no bones and little fixed shape at all. What is it like to be an octopus? The question is intrinsically interesting and, beyond that, provides a good way to chip away at the problem Nagel raised for a materialist understanding of the mind.
Saturday, June 22, 2013
The Unwinding is the right title for George Packer's epic, sad and unsettling history of the last four decades in the US. His topic is the coming apart of something in the national fabric: the unravelling of unspoken agreements about the limits to Wall Street's greed; about what a congressman would or wouldn't do for the right price; about what a company owes its workers, or what the wealthy should contribute in tax. The result of all this unwinding is more personal freedom than ever before: "Freedom to change your story, get your facts, get hired, get fired, get high, marry, divorce, go broke, begin again, start a business, have it both ways, take it to the limit, walk away from the ruins, succeed beyond your dreams." But it is the loneliest sort of freedom. What Packer's disparate characters share – as his narrative moves up and down the spectrum of inequality, from inner-city Ohio to Silicon Valley, to the exurban McMansions of Florida, to Washington's corridors of power – is that each is fundamentally on his or her own.

More from Oliver Burkeman at The Guardian here.
For Fischl, the suppurating wound was his mother. Depressed, alcoholic, beautiful, creatively thwarted, subject to fits of epic rage for which she blamed her children and husband, she should have had her own chapter in “The Feminine Mystique.” Betty Friedan reported mordantly on suburban women who suddenly go berserk and run shrieking through the streets naked; Fischl’s mother actually was picked up by the police running through the streets of suburban Long Island naked. She walked around the house naked too, throwing her adolescent son off kilter. After threatening for years to kill herself, she finally succeeded, driving her car into a tree. His family’s secrecy and shame about these ordeals migrated into the anxious, discomfiting iconography of Fischl’s paintings. At first he wasn’t aware of it, embarking on a series of crude images about an imaginary near-eponymous family he called “the Fishers,” whose story grew increasingly miserable. As his process became more free-associational, what eventually emerged were the “psychosexual suburban paintings” he became famous for. “Bad Boy” doubles as the title of a potent early example, a vaguely incestuous scene of a young boy stealing something from the pocketbook of an inattentive naked woman, who lies spread-eagle on a bed.

More from Laura Kipnis at the NY Times here.
“Stories are compasses and architecture,” writes Rebecca Solnit in The Faraway Nearby, “we navigate by them, we build our sanctuaries and our prisons out of them, and to be without a story is to be lost in the vastness of a world that spreads in all directions like arctic tundra or sea ice.” Much of Solnit’s work is concerned to locate her, and consequently us, within the world by telling stories about it. At its best her writing is an exhilarating form of literary cartography, meandering through subjects as diverse as the development of photography, the philosophy of popular protest and the history of walking while always keeping us in touch with the people at the centre of those stories. The Faraway Nearby, her 14th book, is in some respects a consummation of her method. It is composed of a series of loosely connected essays – on love, trauma, family and fairy tales – which nestle within one another like matryoshka dolls. The loose structure is held together with threads of metaphor and allusion, enacting something of the aimless meanderings of grief itself.

More from Jon Day at the FT here.
Leo Damrosch in Humanities:
When I was finishing a biography of Jean-Jacques Rousseau some years ago, I was struck by the comment of someone who had known him: “the friends of Rousseau are as though related to each other through his soul, which has joined them across countries, ranks, fortune, and even centuries.” Many people who have barely heard of him are indeed friends of Rousseau, because his ideas have had a pervasive influence in our culture. Quite astoundingly, this Genevan watchmaker’s son, with no formal education at any level, arrived at profound insights that continue to challenge and inspire. And not just in one area or field, either, but in a whole range that might normally seem unconnected. I will briefly describe his legacy in three of them: in political thought, in psychology, and in the philosophy of education.
Rousseau’s first great work was a Discourse on the Origin and Foundations of Inequality among Men, written in 1749 as an entry in a prize competition (he didn’t win—the judges said his submission was too long). The expected answer in those days would have been that God created us to be unequal, or else that nature did. Either answer would confirm the rightness of social hierarchy and privilege. Rousseau, far more pessimistic than Marx would later be, accepted the truth that inequality is inseparable from human culture, but he wanted to know why. The answer was the idea that would underlie everything Rousseau ever wrote: man is naturally good, but society has made him wicked. That is to say, we are not corrupted by original sin as the churches taught, or driven by instinct to dominate each other as Thomas Hobbes taught. If we are indeed selfish and competitive and possessive, it is because we have been conditioned to be. Rousseau imagined a pre-civilized state of nature in which our ancestors, more like apes than like ourselves, had no need or opportunity to exploit and enslave each other. As hunter-gatherers they could be essentially self-sufficient. The irrevocable change came with the invention of metallurgy and agriculture, twin foundations of a developed civilization. (Interestingly, Jared Diamond says much the same thing in Guns, Germs, and Steel.) Each of these advances has contributed to our material well-being, but they are only possible in an organized society in which the many are controlled by the few. What then develops, accordingly, are bureaucracies, legal systems, and organized religions that teach people to accept their lot in this vale of tears.
From The Guardian:
From the age of 22 to that of about 39 I knew myself to be a failure. For many of those years I was not positively unhappy, because I was doing work I enjoyed, was fond of my friends and often had quite a good time; but if at any moment I stood back to look at my life and pass judgment on it, I saw that it was one of failure. That is not an exaggeration. I clearly remember specific moments when I did just that. They were bleak moments. But they did lead to a subdued kind of pride at having learned how to exist in this condition – indeed, at having become rather good at it. The reason for it was banal. Having fallen in love when I was 15, and become engaged to marry the man I loved three years later, I had known exactly what my future was to be. As soon as I finished my education at Oxford (not before, because I was enjoying it so much) we would be married. I would join him wherever he happened to be stationed (he was an officer in the RAF) and my life as a wife would begin. I didn't doubt for a moment that it would be happy. My childhood and teenage years had been very happy so I was a young woman who expected the answer "Yes". And then, not suddenly, but with excruciating slowness, I got the answer "No".
He was stationed in Egypt. After three months he stopped answering my letters. His silence endured for month after month, reducing me to a swamp of incredulous misery, until at last a letter came, asking me to release him from our engagement because he was marrying someone else. Like, I am sure, most young women at that time, I had seen giving my life over to a man, living his life, as "happiness". Doing that was what, as a woman, I was for. And this I had failed to do. I did, of course, see that the man had behaved badly, cruelly in fact, in leaving me in limbo without any explanation for so long, until (I guessed) being advised that he ought to guard against me "making trouble". But I was so thoroughly the victim of current romantic attitudes that, in spite of that recognition, I was unable to withstand a sickening feeling that a woman worth her salt would have been too powerfully attractive to allow this disaster to happen. And I was not that woman.
Ian Stewart in New Statesman:
[C.P.] Snow’s lecture [on the gulf between the two cultures of arts and sciences] was based in part on an article he had written for the New Statesman in 1956. He was continuing a tradition that goes right back to the magazine’s first editorial, which adopted a broad cultural stance: “We shall deal with all current political, social, religious, and intellectual questions . . . We shall strive to face and examine social and political issues in the same spirit in which the chemist or the biologist faces and examines her test-tubes or his specimens, ignoring none of the factors, seeking to demonstrate no preconceived proposition, but trying only to find out and spread abroad the truth whatever it may turn out to be.”
Perhaps not wishing to alarm potential readers too much, the editorial expanded on its scientific metaphor: “Social problems may not be – indeed, are not – susceptible of scientific analysis in the popular acceptation of that term, since human beings are not to be weighed in balances nor measured with micrometers . . .” It was a reasonable view then, but times have changed. Today very few social problems are not tackled by measuring aspects of human attitudes, behaviour or bodily form. Consider the current concerns about an obesity epidemic, backed up by extensive statistics in which people are literally weighed in – on balances.
The NS editor clearly had an inkling that such changes were imminent and continued: “. . . unless there can be applied to [social problems] something at least of the detachment of the scientific spirit, they will never be satisfactorily solved. The cultivation of such a spirit and its deliberate application to matters of current controversy is the task which the New Statesman has set for itself.” It was a worthy task, pursued with aplomb and considerable success; it is a task not yet finished, and if anything it is now even more vital than it was a century ago.
The cultural divide between art and science has narrowed perceptibly since Snow delivered his lecture and the issues have been thrashed out extensively, so we now have a better understanding of their nature. However, it might be more accurate to say that the divide has been spanned by a number of bridges, rather than made smaller.
Richard Marshall reviews Ian James's The New French Philosophy:
Ian James sets out to show that in the new French philosophy the idea of ‘new’ is its subject, where new is understood in terms of ‘rupture’ and ‘discontinuity’ and ‘novelty.’ The French philosophers wonder how the new is possible. Gilles Deleuze started this in the 1960s in his philosophy of ‘difference.’ Lyotard, Derrida and Foucault continued. Lyotard’s ‘event’ seeks to explain how discourses are contested and thinking is transformed. Jeff Malpas considers this ‘the founding moment of any postmodernism.’ Lyotard’s ‘differend’ is defined as an instability in language and discourse. It is supposed to create ‘new addressees, new addressors, new significations and new referents’ and ‘new phrase families and new genres of discourse.’ Derrida’s late ‘Spectres of Marx’ is about going beyond existing research programmes, ‘… beyond any possible programming, new knowledge, new techniques, new political givens.’ Foucault talks about epistemic breaks as an ‘event’ in ‘The Order of Things.’ He asks, ‘how is it that thought has a place in the space of the world, that it has its origin there, and that it never ceases to begin anew?’ He suggests a process that ‘… probably begins with an erosion from the outside, from a space which is, for thought, on the other side but in which it has never ceased to think from the very beginning.’
James discusses seven new French philosophers: Jean-Luc Marion, Jean-Luc Nancy, Bernard Stiegler, Catherine Malabou, Jacques Rancière, Alain Badiou and François Laruelle. This is intended to be neither exhaustive nor up to date but rather an indicative group in support of an argument about a paradigm shift. These seven all agree with Foucault that the new comes from ‘an erosion from the outside.’ Five of them established themselves in the 1970s. Two are younger and not yet as established.
In the 1970s the philosophers moved away from a linguistic paradigm which had dominated Derrida, Lyotard and Foucault. Signifiers, signifieds, the symbolic, discourse, text, writing, arche-writing were recast in terms of materiality, the concrete, ‘… worldliness, shared embodied existence and sensible-intelligible experience.’ The paradigm of structuralism and post-structuralism as a literary genre was subjected to its own ‘event’.