Saturday, September 05, 2015
Summary of a Conversation
What does it mean to be authentic,
to run down the middle of Dizengoff Street and shout in Judeo-Arabic
“Ana min el-Maghreb, ana min el-Maghreb”?
(I am from the Atlas mountains, I am from the Atlas mountains).
What does it mean to be authentic,
to sit in Café Roval in a colorful robe (an agal and a zarbiyah, kinds of clothing),
my name is not Zohar, I am Zaish I am Zaish (a Moroccan name).
Neither this nor that,
and nonetheless a different language strikes the mouth until gums crack,
and nonetheless spurned and beloved scents pounce
and I fall between the chairs
lost in the jumble of voices.
by Erez Biton
from Timbisert: A Moroccan bird
Publisher: Hakibbutz Hameuchad, Tel Aviv, 2005
Friday, September 04, 2015
Ursula K. Le Guin in The Guardian:
A “colossal fragmentation of reality” occurred in the 20th century, Salman Rushdie has said, and his novels enact and display that fragmentation with terror and glee. His new book assures us that reality has lately been crumbling more colossally than ever, and is about to come completely unglued. The climate destabilisation we are experiencing is only a foretaste of advancing chaos, which the author describes with considerable relish. Eschatological lightning strikes, oracular infants and local failures of gravity will become the norm, as the Dark Ifrits, the mischievous forces of disorder, begin to take advantage of the weakening of the fabric of the everyday.
The cumbrous title transcribes a certain number of days into years and months, but not the four weeks that would naturally complete it, because the word “Nights” is needed to suggest the original Thousand and One. Rushdie is our Scheherazade, inexhaustibly enfolding story within story and unfolding tale after tale with such irrepressible delight that it comes as a shock to remember that, like her, he has lived the life of a storyteller in immediate peril. Scheherazade told her 1,001 tales to put off a stupid, cruel threat of death; Rushdie found himself under similar threat for telling an unwelcome tale. So far, like her, he has succeeded in escaping. May he continue to do so.
At the idea of trying to summarise the plot, I shriek and fall back fainting on my seraglio couch. Rushdie has a fractal imagination: plot buds from plot, endlessly. There are at least 1,001 stories and substories, and nearly as many characters. All you need to know is that they’re mostly highly entertaining, amusing and ingenious.
R. R. Helm in Deep Sea News:
Here’s a mystery: below 8,400 meters there are no fish. There are other creatures: sea cucumbers, anemones, tiny worms, but no one has ever seen a fish. At 8,370 meters? There are fish. But not below 8,400 meters. At its deepest the ocean reaches roughly 11,000 meters, so there is plenty of space. And right below 8,400 meters it’s equally cold, equally dark, equally middle-of-nowhere as it is right above 8,400 meters. But there is some magic line at 8,400 meters, below which fish apparently cannot go. No one understands why this line exists. Or if perhaps one day we’ll find a fish that can, in fact, cross it. But for now, scientists do have some ideas.
Mohammed Hanif in the New York Times:
We are at it again. India and Pakistan are talking a lot these days, mostly about why they don’t want to talk to each other. Our national security advisers were supposed to meet last week. And they were supposed to talk about terrorism. Instead, they did what they do best: They hurled accusations at each other about how the other side doesn’t really know how to talk, and the meeting was canceled.
India accuses Pakistan of sponsoring terrorism in India. Pakistan accuses India of sponsoring terrorism in Pakistan, and of having bad manners. To India, it seems obvious that Pakistani militants were behind the 2008 attacks in Mumbai, and it is exasperated that the world won’t punish Pakistan for that. It is upset that the man accused in the attacks, Zaki-ur-Rehman Lakhvi, was released on bail after a leisurely trial in Pakistan, and was able to produce a baby while in prison. India is also upset that the plot’s alleged mastermind, Hafiz Muhammad Saeed, is allowed to roam freely, addressing rallies despite the bounty the American government has placed on his head.
In its own defense Pakistan points to all the hundreds of suspected terrorists it has killed in the last year and a half. It reminds India that some 60,000 Pakistanis have been killed by terrorists. India responds by saying: You are only killing the terrorists who kill Pakistanis while protecting the terrorists who kill Indians.
Lurking under this neighborly rage are stereotypes that refuse to fade.
[Thanks to Zain Sayed Alam.]
Gemma Fraser in The F Word:
“They are all innocent until proven guilty. But not me. I am a liar until I am proven honest.”
Eighteen-year-old Emma O’Donovan is beautiful, confident and seems to have the world — and all the boys — at her feet. One summer night in her small Irish hometown she heads off to a house party where she plans to drink and have fun with her friends, and hopefully catch the attention of one of the local football heroes. The next day she wakes slumped in her front porch, in pain, and with no memory of what took place the night before. But as explicit images start to appear on social media of Emma engaging in sexual acts with a group of boys, she, her family and the town of Ballinatoom are forced to confront some difficult questions. Can you really consent if you’re intoxicated? Was Emma “asking for it” by dressing and behaving provocatively? And does it even matter what happened to you if you can’t remember? Louise O’Neill doesn’t allow the reader to witness Emma’s assault (the sole small mercy she grants) so we are left just as unclear about what happened that night as Emma herself. The first half of the book details the events leading up to the party while the second leaps forward a year, when Emma is pursuing a legal case against the boys — and suffering the consequences of speaking out.
Her parents believe they have raised Emma to be “a good girl” but her active sex life and drinking habits affect how the people of Ballinatoom view her and her culpability. Many see her actions as a selfish attempt to ruin the lives of the town’s sporting heroes, and her inbox is constantly flooded with insults and threats. “No one forced the drink down her throat, or made her take shit,” says one local girl. “And what guy was going to say no if it was handed to him on a plate?” Even Emma’s own friends doubt her story. (“You know I’m on your side, right?… I was just asking if it was, like, rape rape.”)
In her memoir, Sally Mann cites a saying: when an asshole makes good art, he is remembered as an asshole who made good art, but when an asshole makes bad art, he’s just remembered as an asshole.
But when someone who made good art is accused of being a Bad Mother, can she ever be remembered as anything but a Bad Mother?
In 1992, Mann’s book Immediate Family tapped into collective anxiety about child pornography and made her the most notorious art photographer of her generation. The book has since become a classic in the history of photography, and a milestone in the history of childhood. Mann’s memoir proves, however, that there is more to her career, and more to a relationship between life and work, than maternity. Most of the gorgeously crafted black-and-white analogue photographs she has created since the 1970s have been about places haunted by death, not about children. And the question of what kind of mother could have produced Immediate Family belongs within a larger issue. Or that, at least, is what Mann’s memoir stretches our minds to consider. Extraordinary artists, even when they are mothers, Mann reveals, require personal sacrifice from those around them, as well as from themselves.
Today the classical idea that people merit solicitude simply in virtue of being human – or, more succinctly, that bare humanity is morally important – is on the defensive. It is not uncommon for contemporary thinkers to simply dismiss this idea. Many philosophers and popular writers maintain that a human or non-human creature’s moral standing is a direct function of its individual capacities of mind and hence that the sheer fact of being human (i.e., apart from the possession of any particular individual capacities) is morally indifferent. While some are motivated by the laudable aim of showing that certain animals (viz., those who possess whatever capacities are deemed morally relevant) should be treated better, these thinkers nevertheless wind up implying – shockingly – that human beings with severe cognitive disabilities have diminished claims to moral attention.
One good reason to defend the now embattled idea that merely being human matters is to challenge those who in this way suggest that we owe less to some of the most vulnerable members of society. But the interest of a plausible account concerning the importance of being human extends beyond its usefulness for contesting the repugnant suggestion that human beings with severe cognitive disabilities are somehow less fit objects of moral concern. A plausible account of how being human matters sheds light on what is involved in bringing any human being into focus in ethics, and thus helps us to understand the kind of work we need to do to combat not only biases related to cognitive disability but also other forms of bias that obstruct the kind of clear-sighted understanding we require if we are to respond to each other justly.
ON THE NIGHT of March 9, 1945, American B-29 bombers burned 15 square miles of Tokyo, killing 100,000 civilians and leaving more than one million homeless. It was the greatest of the incendiary air raids, but it was far from the last. On March 11, American B-29s bombed Nagoya; March 13, Osaka; March 16, Kobe; March 18, Nagoya again. Five raids in nine days, 32 square miles destroyed in Japan’s four most populous cities — 41 percent of the area the Army Air Forces destroyed in all of Germany during the entire war, and at a total cost of only 22 B-29s and their crews. General Curtis LeMay, who was in charge, quit, at least for a time. He had run out of napalm. Two months later, his stocks replenished, he systematically burned 62 smaller Japanese cities.
That same year, A. J. Liebling began writing his New Yorker “Wayward Press” column, which to this day is considered the gold standard in media criticism. Liebling’s first “Wayward Press” appeared on May 19 and criticized the attempted military embargo on immediate reporting of the German surrender. Two weeks after that, The New Yorker began its coverage of the firebombing of Japan. Had Liebling been aware of the bombing’s backstory, it might have prompted a second “Wayward Press.”
What Are You Doing with My DNA? “Informed Consent” explores deep ethical questions in genetics research
Diana Kwon in Scientific American:
Twelve years ago, members of the Havasupai Tribe entered into a legal battle with Arizona State University over the ways in which school researchers were using blood samples from tribe members without proper informed consent. The case halted the research and the university returned the blood to the tribe, along with financial compensation. The scuffle became a landmark case in bioethics. Deborah Zoe Laufer’s play, “Informed Consent,” running through September 13 at The Duke on 42nd Street theater in New York City, dramatizes the important case. Though not meant to be an exact retelling of the story, the play provides a springboard for discussion about the importance of informed consent in scientific research.
The script follows the journey of a scientist who, motivated by the desire to understand the gene for early onset Alzheimer’s that runs in her family, seeks out an isolated Native American tribe living in the Grand Canyon. The tribe presents an ideally uncorrupted gene pool for her research. The scientist initially struggles to convince tribe members to provide samples of blood, which they consider sacred, for her studies. They eventually agree, in hope that the research will reveal genetic clues to the devastating rates of diabetes destroying their own family and friends. The individuals sign a broad consent form that the scientist has deliberately written in simple language. The tribe later learns that the researcher used the blood to study ailments that it was unaware of, including mental illness and the tribe’s geographic origins. Feeling angered and betrayed, the tribe sues the university and demands that it return the blood.
the Lord your God is a consuming fire
The stories of the gods outshine the moon
your story is darkness outshining the sun
we hide our eyes because of your fire
at the moment of the mountain
let not God speak to us lest we die
no wonder history gives us
cities like widows
sitting in their menstrual blood
no wonder book of revelation surges up
four horsemen orgy of vengeance
after nonviolent gospels
no wonder swarms of Christian soldiers
no wonder chapel in Cuzco
sculpted conquistador striding upon
the prone body of an Indian
no wonder imams cut hands off sinners
no wonder the Jewish lunatic murders worshipers
in a place of reconciliation
everybody trying to look goes blind
by Alicia Ostriker
from The Book of Life: Selected Jewish Poems 1979-2011
University of Pittsburgh Press, 2012
Thursday, September 03, 2015
Political debate in Israel is vigorous, if not always elegant, often summoning the old Hebrew phrase that describes “a dialogue between deaf people.” But it has been dampened in recent years by a series of government-sponsored bills: one demanding that non-Jewish Israelis take loyalty oaths; another authorizing the finance ministry to withhold funds from organizations deemed—however vaguely—to be violating Israel’s foundational tenet of a “Jewish and democratic” state. Kashua, like other Arab Israelis in the public eye, was used to having his words scrutinized. But the summer’s events felt different. As the conflict in Gaza escalated into war, the première of a movie based on his memoir “Dancing Arabs” was hastily scrapped. Flag-draped extremists in Tel Aviv brandished metal rods at antiwar demonstrators. The atmosphere of intimidation became so intense that Ayman Odeh, the youthful leader of the Joint List, an alliance of Arab-backed parties that represent Palestinian aspirations in Israel, announced that an “age of ostracism” had taken hold.
Within the Green Line that separates Israel proper from Gaza and the West Bank, Arab Israelis make up twenty per cent of the population. For liberal Israelis, and for Arabs who hope to be accepted as equals, Kashua embodied the country’s stated ideal of coexistence—of Arab Israelis’ full legal and civil integration. For a decade, he had lived with his wife, Najat, in Ramat Denya, a Jewish neighborhood in Jerusalem, and their children attended the city’s only bilingual school.
No gravediggers. No funeral for Ophelia. No voyage to England. At the Theatre Royal, Drury Lane, on December 18, 1772, David Garrick did “the most imprudent thing I ever did in all my life”, and staged a new and much-altered version of Hamlet. At the age of fifty-five, he was to rejuvenate the prince he had first essayed in Dublin some thirty years earlier, at the outset of his career. Garrick’s was a lifelong experiment with the role: this latest alteration of Shakespeare was at least his third improvement on the decades-old acting text of Robert Wilks and John Hughes. It included almost 630 lines previously unheard in the eighteenth-century theatre. Yet it also ditched what Garrick was pleased to call the “rubbish of the fifth act”, in favour of some rubbish of Garrick’s own devising. “And now shall you feel my wrath – Guards!”, he has Claudius exclaim – to which Hamlet gamely retorts with a fatal stab and a cry of “First feel mine!” Gertrude exits, pursued by a fear (of her own son); imprudently, he impales himself on Laertes’s sword. Horatio and Laertes (not Fortinbras) are left to bury the dead.
Garrick’s Hamlet was a popular triumph. The Westminster Magazine’s reviewer was not alone in believing that “The tedious interruptions of this beautiful tale no longer disgrace it”. Most critics, however, then and now, have tended to howl about what he did to the play – tended, that is, to see only the squashed fifth act rather than the largely restored other four.
Walter B., an affable, outgoing man of forty-nine, came to see me in 2006. As a teenager, following a head injury, he had developed epileptic seizures—these first took the form of attacks of déjà vu that might occur dozens of times a day. Sometimes he would hear music that no one else could hear. He had no idea what was happening to him and, fearing ridicule or worse, kept his strange experiences to himself.
Finally he consulted a physician who made a diagnosis of temporal lobe epilepsy and started him on a succession of antiepileptic drugs. But his seizures—both grand mal and temporal lobe seizures—became more frequent. After a decade of trying different antiepileptic drugs, Walter consulted another neurologist, an expert in the treatment of “intractable” epilepsy, who suggested a more radical approach—surgery to remove the seizure focus in his right temporal lobe. This helped a little, but a few years later, a second, more extensive operation was needed. The second surgery, along with medication, controlled his seizures more effectively but almost immediately led to some singular problems.
Daniel W. Drezner in The Washington Post:
In Tuesday’s post, I argued that it was quite possible for political scientists to be both rigorous and relevant. But I closed by observing that economists generally don’t worry about the whole rigor vs. relevance debate. Their scholarly papers are impermeable black masses to lay readers, and yet policymakers and politicians defer to their expertise on a regular basis. Political scientists — particularly international relations scholars — look at that and think, “Why can’t we get us some of that?”
The response by much of political science to this state of affairs has been to try to mimic economic methodology as much as humanly possible. Now the more sophisticated modelling and statistical techniques might have some intrinsic value to studying political phenomena. But I think the belief that aping economists will lead political scientists to be treated with more respect fundamentally misinterprets why economists get more respect.
It’s worth stepping back here for a second to point out that what is particularly impressive about the prominence of economists in the marketplace of ideas is just how badly the profession has screwed up the past decade. With some important exceptions, few economists accurately warned about the severe dangers of the housing bubble before the 2008 financial crisis. Indeed, as John Quiggin and others have noted, ideas like the efficient markets hypothesis helped to spur the conditions that created the bubble in the first place.
Nor have economists shined during the post-2008 era. Forecasters of all stripes have failed badly. The Federal Reserve has persistently overestimated projected economic growth since the collapse of Lehman Brothers. Since the start of the Great Recession, the International Monetary Fund’s economic forecasters have had to continually revise downward their short-term projections for global economic growth. The failure rate has been so bad that the IMF has started to devote research to why so many revisions have been necessary.
J.M. Tyree in The Rumpus:
I suspect that everyone is always rewriting something or other, whether they are self-conscious about it or operating intuitively. It’s probably endemic to the literary impulse to wish to transform the works that gave us pleasure into something that brings someone else a similar sense of frisson. From Ulysses to Helen Oyeyemi’s latest book, Boy, Snow, Bird—a transplantation of the Snow White fairy tale to postwar New England—literature has always featured a share of deliberate rewriting projects.
In popular fiction, rewriting has become de rigueur: Patricia Park’s Re Jane features a contemporary Jane Eyre living in Flushing, Queens, while Jonathan Franzen’s forthcoming novel Purity, we’re told, will riff intriguingly on Dickens’s Great Expectations. Faced with this flood-tide of bestselling rewrites—Stephen King’s Finders Keepers, a sort of redo of Misery, and E. L. James’s Grey, her 50 Shades Take Two, the list goes on—it is tempting to rewrite the famous opening line of James Wood’s essay “Hysterical Realism”: “A genre is hardening.”
But maybe originality is not where it’s at. Perhaps the question isn’t whether authors should be rewriting but what they are rewriting, why, and how. If it is obvious by now that rewriting the classics has become a risk-averse niche-marketing strategy for an industry that is stale, flat, and unprofitable, that shouldn’t spoil the fun of our larger culture of remixing. TV and movies provide a useful analogy—just because Gotham feels like a listless prequel to the Batman saga doesn’t in any way nullify the sheer exuberance of filmmaking on display in the rebooted Mad Max: Fury Road.
I take another cue from the documentary filmmaker Adam Curtis’s comments on pop music. He is right to be perturbed that we’re living in what he describes as a “static culture” or “zombie culture” in which art not only “feeds off the past” but also merely “replicates” the effects of other works of art, and in which artists begin to see themselves like “archeologists” or tomb robbers. The larger problem, according to Curtis, is a cultural world of “stuck on beards” where “so many things just go back and dig up the bloody grave.” Curtis calls for more musicians to emulate Rihanna and fewer to copy the copies produced by Mumford and Sons. He’s hoping to encourage artists to create the new from the old and discouraging them from simply reproducing the effects of previous works or inhabiting a dead style.
Galen Strawson in Aeon:
‘Each of us constructs and lives a “narrative”,’ wrote the British neurologist Oliver Sacks, ‘this narrative is us’. Likewise the American cognitive psychologist Jerome Bruner: ‘Self is a perpetually rewritten story.’ And: ‘In the end, we become the autobiographical narratives by which we “tell about” our lives.’ Or a fellow American psychologist, Dan P McAdams: ‘We are all storytellers, and we are the stories we tell.’ And here’s the American moral philosopher J David Velleman: ‘We invent ourselves… but we really are the characters we invent.’ And, for good measure, another American philosopher, Daniel Dennett: ‘we are all virtuoso novelists, who find ourselves engaged in all sorts of behaviour… and we always put the best “faces” on it we can. We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character at the centre of that autobiography is one’s self.’
So say the narrativists. We story ourselves and we are our stories. There’s a remarkably robust consensus about this claim, not only in the humanities but also in psychotherapy. It’s standardly linked with the idea that self-narration is a good thing, necessary for a full human life.
I think it’s false – false that everyone stories themselves, and false that it’s always a good thing. These are not universal human truths – even when we confine our attention to human beings who count as psychologically normal, as I will here. They’re not universal human truths even if they’re true of some people, or even many, or most. The narrativists are, at best, generalising from their own case, in an all-too-human way. At best: I doubt that what they say is an accurate description even of themselves.
What exactly do they mean? It’s extremely unclear. Nevertheless, it does seem that there are some deeply Narrative types among us, where to be Narrative with a capital ‘N’ is (here I offer a definition) to be naturally disposed to experience or conceive of one’s life, one’s existence in time, oneself, in a narrative way, as having the form of a story, or perhaps a collection of stories, and – in some manner – to live in and through this conception.
Alvin E. Roth in Politico:
The Mediterranean isn’t an effective barrier between Europe and refugee crises in Africa, Asia and the Middle East. Europe could turn this challenge into a manageable opportunity to protect refugee lives as well as its own economy. But European countries must first agree on a strategy recognizing that refugees are not widgets to be distributed or warehoused. They are people trying to make choices in their best interest. Those decisions are often a matter of life and death.
August began with news of Abdul Rahman Haroun, the Sudanese man who, after having already risked his life to reach Europe by boat, put his life in peril again, coming within yards of successfully crossing the “Chunnel” on foot to reach England and claim asylum before being arrested. Then, on August 27, 71 men, women and children, at least some of whom were Syrian, were found dead in a truck near Vienna. These refugees also had already somehow safely reached Europe, but boarded a smuggler’s truck to make it to another European destination. Instead, they suffocated and perished.
These stories are shocking but not surprising. The developing world hosts over 80 percent of asylum seekers, but a growing number are making their way to industrialized countries. These refugees are trying to get to specific countries within Europe. Sweden, for example, received 81,325 asylum seekers in 2014, or 8,365 refugees per one million Swedes. In contrast, while Greece had 34,422 boat arrivals in 2014, only 9,435 applied for asylum in Greece. That’s only 859 per one million Greeks.
Vanessa C. Adriance in Ms Magazine:
LIVING IN THE CROSSHAIRS exposes the harrowing reality facing abortion providers in the U.S. To the uninitiated, that reality is shocking: a life of constant harassment and stalking, of hate mail and cyber-bullying and criminal trespass at their homes, of needing to don a disguise and bulletproof vest and do evasive maneuvers on the drive to work. Providing legal, safe abortions—or even working as a security guard or volunteer in a clinic that does so—means being the target of relentless and terrifying criminal acts.
Since 1993, eight doctors and clinic workers have been murdered; many others have been assaulted and maimed. Clinics have been burned down, providers’ children have been intimidated at school and doctors’ photos and names have appeared on WANTED posters distributed in their neighborhoods. It’s impossible to read this book without marveling at the courage and stamina these people exhibit in continuing to offer abortion care. Law professor David Cohen and attorney Krysten Connon vividly illustrate the impact of this nationwide campaign of terror on its victims. A physician who helps women needing abortions in the mid-Atlantic and one South Atlantic state remembers hearing of the 1998 murder of abortion provider Dr. Barnett Slepian, shot by a sniper through the window of his kitchen in Buffalo, New York: “I’m on the phone, and I’m probably starting to shake a little bit. Because we all have windows in our home…at that point I got on my belly and crawled around my home…Someone was out there, and we didn’t know who it was.” Kitchen, workplace—to extremists, no place is sacred. Providers’ elderly parents have been tormented in their nursing homes. In 2009, Dr. George Tiller, a prominent provider who had long been a target of the extremists, was shot in the head on a Sunday morning inside his Wichita church’s foyer, where he had just finished his duties as an usher. Even children trick-or-treating at a provider’s home on Halloween have been harassed.
Richard Holmes in Nature:
The bicentenary of Augusta Ada King, Countess of Lovelace, heralds the critical reassessment of a remarkable figure in the history of Victorian science. Ada Lovelace (as she is now known) was 27 years old and married with 3 children when she published the first account of a prototype computer and its possible applications in 1843. Her 20,000-word paper was appended as seven Notes to a translation of a descriptive article, Sketch of the Analytical Engine Invented by Charles Babbage, Esq. Lovelace's account was the fruit of one of the most intriguing collaborations in the annals of science: her friendship with Charles Babbage, Lucasian Professor of Mathematics at the University of Cambridge, UK, and inventor of the landmark analytical engine. The Notes eventually brought Lovelace both acclaim and notoriety. Babbage himself described her unforgettably to the physicist Michael Faraday as “that Enchantress who has thrown her magical spell around the most abstract of Sciences and has grasped it with a force that few masculine intellects (in our own country at least) could have exerted over it”. The exact nature of that force and enchantment continues to puzzle historians of science, not least because Lovelace's correspondence, largely archived at the Bodleian Library in Oxford, has not been fully published (see selections by Dorothy Stein in Ada (MIT Press, 1985) and Betty A. Toole in Ada, Enchantress of Numbers (Strawberry, 1992)). What has emerged is the hitherto unsuspected range of Lovelace's interests and contacts, which linked the worlds of Victorian science and literature.
Lovelace was the only legitimate child of the poet Lord Byron. She never met her father, self-exiled in Italy and Greece, but inherited much of his rebellious spirit and something of his unstable genius. She directed it towards science, declaring: “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst (& Metaphysician); for with me the two go together indissolubly”. She was brought up with pathological severity by her mother, the brilliant Lady Annabella Byron — dubbed “the Princess of Parallelograms” for her own fascination with mathematics — and a squadron of female advisers whom Lovelace christened the Furies. Forbidden to read her father's poetry, young Ada was encouraged to study mathematics, astronomy and music, and allowed to design flying machines, play the harp and commune with her cat, Puff. In her early twenties she began to study the new calculus under Augustus De Morgan, a proponent of Boolean algebra, who described her as potentially more promising than any 'senior wrangler', or first-class Cambridge maths student.
Wednesday, September 02, 2015
The social wasp Polybia paulista protects itself against predators by producing venom known to contain a powerful cancer-fighting ingredient. A Biophysical Journal study published September 1 reveals exactly how the venom's toxin—called MP1 (Polybia-MP1)—selectively kills cancer cells without harming normal cells. MP1 interacts with lipids that are abnormally distributed on the surface of cancer cells, creating gaping holes that allow molecules crucial for cell function to leak out.
"Cancer therapies that attack the lipid composition of the cell membrane would be an entirely new class of anticancer drugs," says co-senior study author Paul Beales, of the University of Leeds in the UK. "This could be useful in developing new combination therapies, where multiple drugs are used simultaneously to treat a cancer by attacking different parts of the cancer cells at the same time."
MP1 acts against microbial pathogens by disrupting the bacterial cell membrane. Serendipitously, the antimicrobial peptide shows promise for protecting humans from cancer; it can inhibit the growth of prostate and bladder cancer cells, as well as multi-drug resistant leukemic cells. However, until now, it was not clear how MP1 selectively destroys cancer cells without harming normal cells.
The Wild West has always been surreal. Even when it existed, it was being transformed into myth, often by the very figures associated with it. In 1883, Buffalo Bill retired from bison hunting and “Indian fighting” to dazzle crowds with a Wild West vaudeville show that featured the likes of Wild Bill Hickok and Annie Oakley. These performances helped define the frontier in the public’s mind. The real Wild West might have been a mostly dull place where banks were safe places for money and cowboys had to actually, you know, deal with cattle. But in the Wild West of our imagination, a bank is robbed every day at high noon and the lone gunslinger forever stalks his enemies through the wasteland with inhuman vengeance.
It is a small step from the mythic to the bizarre, and the Western has always been a ripe genre for artists with an eye for the uncanny. From films like Alejandro Jodorowsky’s El Topo and Jim Jarmusch’s Dead Man to novels like Robert Coover’s Ghost Town and Cormac McCarthy’s Blood Meridian, the American West has been frequently depicted as a strange land. Add to that weird Western tradition Colin Winnette’s Haints Stay.
A hundred years ago, Randolph Bourne was a hot property—an intellectual wunderkind who was taking the American intellectual scene by storm. Bourne was the complete package: brilliant, charismatic, filled with social energy, and exquisitely attuned to the moment. Bourne’s essays appeared in leading periodicals like The Atlantic, The Dial, and The New Republic back when magazines set the American political and cultural agenda. Admirers considered him a visionary, an exponent of a humane new cosmopolitanism. True freedom and real democracy, he believed and exemplified, implied a spirit of tolerance, generosity, and creativity consummated in what he called “the beloved community.”
Barely two years after writing these words, Bourne became persona non grata. His offense involved not personal scandal—no violence, fraud, embezzlement, or sexual shenanigans—but something much, much worse: when the climate of opinion abruptly shifted, he refused to follow. They zigged, he zagged. While other members of the New York intelligentsia were swooning at the prospect of waging a war to end all wars that would make the world safe for democracy, Bourne dared to dissent. For this, they shut him out of virtually all the journals in which he had been publishing, and all respectable outlets generally.
The periodic table was incredibly beautiful, the most beautiful thing I had ever seen. I could never adequately analyze what I meant here by beauty—simplicity? coherence? rhythm? inevitability? Or perhaps it was the symmetry, the comprehensiveness of every element firmly locked into its place, with no gaps, no exceptions, everything implying everything else.
I was disturbed when one enormously erudite chemist, J. W. Mellor, whose vast treatise on inorganic chemistry I had started dipping into, spoke of the periodic table as “superficial” and “illusory,” no truer, no more fundamental than any other ad hoc classification. This threw me into a brief panic, made it imperative for me to see if the idea of periodicity was supported in any ways beyond chemical character and valency.
Exploring this took me away from my lab, took me to a new book that immediately became my bible, the CRC Handbook of Physics and Chemistry, a thick, almost cubical book of nearly three thousand pages, containing tables of every imaginable physical and chemical property, many of which, obsessively, I learned by heart.
I learned the densities, melting points, boiling points, refractive indices, solubilities, and crystalline forms of all the elements and hundreds of their compounds. I became consumed with graphing these, plotting atomic weights against every physical property I could think of.