Thursday, April 03, 2014
Sonali Deraniyagala in the New York Times:
“Where is Ajay? What was the point of having raised him?” an elderly woman grumbles to her husband about their adult son in the opening pages of Akhil Sharma’s semi-autobiographical new novel, “Family Life.” This book, deeply unnerving and gorgeously tender at its core, charts the young life of Ajay Mishra as he struggles to grow within a family shattered by loss and disoriented by a recent move from India to America. “Family Life” is equally the story of Ajay’s parents, whose response to grief renders them unable to find the space in which to cherish and raise him.
Sharma’s previous novel, “An Obedient Father,” was a remorseless, forceful tale of a corrupt Indian civil servant who molests his daughter and ruins lives, including his own. “Family Life,” while also about domestic torment, is gentler and of an altogether different quality.
When we first meet the Mishras, they are a young, middle-class family living in Delhi in the mid-1970s. India is under emergency rule, a time of gloom and uncertainty, but for 8-year-old Ajay and his older brother, Birju, life is playful and secure. Their mother lights their world, while their father seems so superfluous that Ajay wonders if he’s been assigned to them by the government.
Goethe's 'The Apprenticeship of Wilhelm Meister', a neglected masterpiece if ever there was one, is known nowadays for a single line from a ballad sung by Mignon, the daughter of a wandering musician. 'Know'st thou the land where the lemon trees bloom?' begins her mysterious song, describing an imagined world of blue skies, marble statues and thunderous waterfalls, not without a lurking menace beneath its beauty. When Wilhelm asks her where she heard it, Mignon answers, 'Italy! If thou go to Italy, take me along with thee; for I am too cold here.'
Goethe's verses encapsulate the romantic hankering for what Browning hailed as 'the land of lands' and Forster identified as 'a place that's upset people since the beginning of the world'. Citrus fruit, as Helena Attlee clearly understands, is the ultimate metaphor for Italy as an object of desire among us shivering mortals on the wrong side of the Alps. In Goethe's day, northern Europeans with enough money created elegant orangeries, where the precious trees spent a coddled winter indoors before sweating gardeners trundled them onto the terrace for a few weeks of watery sunshine. Such buildings were a fantasy Hesperides. The real golden apples grew far to the south, where the ancestral wisdom of farmers, cooks, perfumers, engineers and entrepreneurs placed citrus fruits alongside the grape and the olive as an archetype of Mediterranean fertility.
American poets tend to want the benefits of song—its emotionality, its melodiousness—without its costs: its triviality, its obliviousness, its feyness. This conflict drives Michael Ruby’s American Songbook, whose title reminds us that we have no body of popular American poems to match the body of American songs, by the Gershwins and Irving Berlin and Cole Porter and many others, whose tunes and lyrics many people know by heart. Ruby’s book presents his own poems, some of them loosely connected with popular songs. What would “Love for Sale,” the Porter tune Ella Fitzgerald made famous, sound like as a difficult postmodern poem? Here is the opening of Ruby’s “Love for Sale,” dedicated by him to Ella Fitzgerald:
defeats sight force
pail (of milk
I peacock throne open shop to a small group
moon of gazing down
draughts the lit tunnel
wayward town of
I peacock throne go toys to work on vanishing
This is “composed,” more in William Carlos Williams’s sense than in Porter’s, out of noirish bits of city life, rank with desire.
On January 8, Museum of Modern Art director Glenn Lowry and the architects Diller Scofidio + Renfro made public their scheme to redesign and expand MoMA. Since then, virtually no artists or architects, or art, design, or architecture critics, have lauded the plan. Nearly all the reaction has been negative. Yet no one’s raised a finger to do much of anything about it. We live in a time when power structures are impervious to and imperious about protest. Yet the Lowry–DS+R plan so irretrievably dooms MoMA to being a business-driven carnival that it feels like something really worth fighting against. Actions like this aren’t pie-in-the-sky or far-fetched. If 40 well-known artists whose work is in the collection signed a petition protesting the plans, it might have a real effect. This is MoMA’s Robert Moses moment, and five decades ago, artists were key to stopping his Lower Manhattan Expressway from being built. By the end of May, the problematic American Folk Art Museum on the MoMA site will likely be torn down, to be replaced with an even worse building for art. Then construction will begin. If this scheme is not stopped immediately, it’s going to go ahead.
So far, the public has seen a couple of drawings of the gleaming glass squash-court galleries that will replace AFAM.
Robert Pondiscio in City Journal:
“Educators, policy makers and business leaders often fret about the state of math education,” the New York Times reported in May. “But reading comprehension may be a larger stumbling block.” Indeed, schools and teachers consistently have better luck improving student skills in math than in reading. A fresh reminder of the difficulty came in August, when New York released scores from its first round of tests aligned with the Common Core State Standards, now adopted by most states. Students in schools across the state fared poorly on the tests; some of the city’s most celebrated charter schools posted disappointing results as well. The silver lining is that by adopting reading curricula aligned with the Common Core and abandoning failed approaches to literacy instruction, New York City could be poised to lead a reading renaissance in the coming years—but only if city schools also make significant shifts in classroom instruction and exercise patience.
Math is relentlessly hierarchical—you can’t understand multiplication, for example, if you don’t understand addition. Reading is mercilessly cumulative. Virtually everything a child sees and hears, in and out of school, contributes to his vocabulary and language proficiency. A child growing up in a book-filled home with articulate, educated parents who fill his early years with reading, travel, museum visits, and other forms of enrichment arrives at school with enormous advantages in knowledge and vocabulary.
A large international consortium of researchers has produced the first comprehensive, detailed map of the way genes work across the major cells and tissues of the human body. The findings describe the complex networks that govern gene activity, and the new information could play a crucial role in identifying the genes involved with disease. “Now, for the first time, we are able to pinpoint the regions of the genome that can be active in a disease and in normal activity, whether it’s in a brain cell, the skin, in blood stem cells or in hair follicles,” said Winston Hide, associate professor of bioinformatics and computational biology at Harvard School of Public Health (HSPH) and one of the core authors of the main paper in Nature.
“This is a major advance that will greatly increase our ability to understand the causes of disease across the body.” The research is outlined in a series of papers published March 27, 2014, two in the journal Nature and 16 in other scholarly journals. The work is the result of years of concerted effort among 250 experts from more than 20 countries as part of FANTOM 5 (Functional Annotation of the Mammalian Genome). The FANTOM project, led by the Japanese institution RIKEN, is aimed at building a complete library of human genes.
When You Are Old
When you are old and grey and full of sleep,
And nodding by the fire, take down this book,
And slowly read, and dream of the soft look
Your eyes had once, and of their shadows deep;
How many loved your moments of glad grace,
And loved your beauty with love false or true,
But one man loved the pilgrim soul in you,
And loved the sorrows of your changing face;
And bending down beside the glowing bars,
Murmur, a little sadly, how Love fled
And paced upon the mountains overhead
And hid his face amid a crowd of stars.
by W.B. Yeats
Wednesday, April 02, 2014
Ta-Nehisi Coates in The Atlantic:
Over the past week or so, Jonathan Chait and I have enjoyed an ongoing debate over the rhetoric the president employs when addressing African Americans. Here is my initial installment, Chait's initial rebuttal, my subsequent reply, and Chait's latest riposte. Initially Chait argued that President Obama's habit of speaking about culture before black audiences was laudable because it would "urge positive habits and behavior" that are presumably found especially wanting in the black community.
Chait argued that this lack of sufficient "positive habits and behaviors" stemmed from cultural echoes of past harms, which now exist "independent" of white supremacy. Chait now concedes that this assertion is unsupportable and attempts to recast his original argument:
I attributed the enduring culture of poverty to the residue of slavery, terrorism, segregation, and continuing discrimination.
Not quite (my emphasis):
The argument is that structural conditions shape culture, and culture, in turn, can take on a life of its own independent of the forces that created it. It would be bizarre to imagine that centuries of slavery, followed by systematic terrorism, segregation, discrimination, a legacy wealth gap, and so on did not leave a cultural residue that itself became an impediment to success.
The phrase "culture of poverty" doesn't actually appear in Chait's original argument. Nor should it—the history he cites was experienced by all variety of African Americans, poor or not. Moreover, the majority of poor people in America have neither the experience of segregation nor slavery in their background. Chait is conflating two different things: black culture—which was shaped by, and requires, all the forces he named; and "a culture of poverty," which requires none of them.
That conflation undergirds his latest column. Chait paraphrases my argument that "there is no such thing as a culture of poverty." His evidence of this is quoting me attacking "the notion that black culture is part of the problem." This evidence only works if you believe "black culture" and "a culture of poverty" are somehow interchangeable.
Gary Gutting talks to Howard Wettstein, a professor of philosophy at the University of California, Riverside, and the author of “The Significance of Religious Experience”, over at the NYT's The Stone (image, portrait of Martin Buber, from Wikimedia Commons):
H.W.: I had a close friend in Jerusalem, the late Rabbi Mickey Rosen, whose relation to God was similarly intimate. To watch him pray was to have a glimpse of such intimacy. To pray with him was to taste it; God was almost tangible. As with Feynman, Mickey had no patience with the philosophers’ questions. God’s reality went without saying. God’s existence as a supernatural being was quite another thing. “Belief,” he once said to me, “is not a Jewish notion.” That was perhaps a touch of hyperbole. The point, I think, was to emphasize that the propositions we assent to are hardly definitive of where we stand. He asked of his congregants only that they sing with him, song being somewhat closer to the soul than assent.
This brings to mind Buber’s emphasis on the distinction between speaking to God, something that is readily available to all of us, and significant speech/thought about God, something that Buber took to be impossible.
G.G.: But you can’t in fact speak to someone who doesn’t exist — I can’t speak to Emma Bovary, although I can pretend to or think I can. Further, why would you even want to pray to someone you didn’t believe exists? On your account praying to God seems like playacting, not genuine religious commitment.
H.W.: Were I to suggest that God does not exist, that God fails to exist, then what you suggest would have real purchase. My thought is otherwise; it’s rather that “existence” is, pro or con, the wrong idea for God.
My relation to God has come to be a pillar of my life, in prayer, in experience of the wonders and the awfulness of our world. And concepts like the supernatural and transcendence have application here. But (speaking in a theoretical mode) I understand such terms as directing attention to the sublime rather than referring to some nonphysical domain. To see God as existing in such a domain is to speak as if he had substance, just not a natural or physical substance. As if he were composed of the stuff of spirit, as are, perhaps, human souls. Such talk is unintelligible to me. I don’t get it.
The theism-atheism-agnosticism trio presumes that the real question is whether God exists. I’m suggesting that the real question is otherwise and that I don’t see my outlook in terms of that trio.
Tom Chatfield in Aeon:
Back in August 2012, Google announced that it had achieved 300,000 accident-free miles testing its self-driving cars. The technology remains some distance from the marketplace, but the statistical case for automated vehicles is compelling. Even when they’re not causing injury, human-controlled cars are often driven inefficiently, ineptly, antisocially, or in other ways additive to the sum of human misery.
What, though, about more local contexts? If your vehicle encounters a busload of schoolchildren skidding across the road, do you want to live in a world where it automatically swerves, at a speed you could never have managed, saving them but putting your life at risk? Or would you prefer to live in a world where it doesn’t swerve but keeps you safe? Put like this, neither seems a tempting option. Yet designing self-sufficient systems demands that we resolve such questions. And these possibilities take us in turn towards one of the hoariest thought-experiments in modern philosophy: the trolley problem.
In its simplest form, coined in 1967 by the English philosopher Philippa Foot, the trolley problem imagines the driver of a runaway tram heading down a track. Five men are working on this track, and are all certain to die when the trolley reaches them. Fortunately, it’s possible for the driver to switch the trolley’s path to an alternative spur of track, saving all five. Unfortunately, one man is working on this spur, and will be killed if the switch is made.
In this original version, it’s not hard to say what should be done: the driver should make the switch and save five lives, even at the cost of one. If we were to replace the driver with a computer program, creating a fully automated trolley, we would also instruct it to pick the lesser evil: to kill fewer people in any similar situation. Indeed, we might actively prefer a program to be making such a decision, as it would always act according to this logic while a human might panic and do otherwise.
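Part of what makes the automated version seem attractive is how trivially the "lesser evil" rule can be stated in code. A minimal sketch of that rule (the function and track names are my own hypothetical illustration, not anything from the essay):

```python
# A minimal sketch of the "lesser evil" rule described above: given the
# casualties expected on each available course of action, an automated
# trolley simply picks the option that kills the fewest people.

def choose_track(casualties_by_track):
    """Return the track whose expected casualties are lowest."""
    return min(casualties_by_track, key=casualties_by_track.get)

# Foot's original case: five workers straight ahead, one on the spur.
options = {"straight": 5, "spur": 1}
print(choose_track(options))  # -> spur
```

The simplicity is the point: a program applies this logic unwaveringly where a human might panic, which is exactly why the harder cases (swerving into the driver's own death) feel so uncomfortable once the rule is fixed in advance.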
Tim Harford in the FT Magazine:
Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.
Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”
Found data underpin the new internet economy as companies such as Google, Facebook and Amazon seek new ways to understand our lives through our data exhaust. Since Edward Snowden’s leaks about the scale and scope of US electronic surveillance it has become apparent that security services are just as fascinated with what they might learn from our data exhaust, too.
Consultants urge the data-naive to wise up to the potential of big data. A recent report from the McKinsey Global Institute reckoned that the US healthcare system could save $300bn a year – $1,000 per American – through better integration and analysis of the data produced by everything from clinical trials to health insurance transactions to smart running shoes.
But while big data promise much to scientists, entrepreneurs and governments, they are doomed to disappoint us if we ignore some very familiar statistical lessons.
“There are a lot of small data problems that occur in big data,” says Spiegelhalter. “They don’t disappear because you’ve got lots of the stuff. They get worse.”
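Spiegelhalter's point — that the old statistical problems do not wash out with volume — is easy to demonstrate numerically. Here is a toy illustration of one such problem, sampling bias; the setup and numbers are invented for illustration and are not from Harford's article:

```python
# Toy illustration: a biased sample stays biased no matter how big it gets.
# Suppose a quantity's true population mean is 0, but our "found data"
# over-represent one tail (a crude stand-in for self-selected data exhaust).
# More data only shrinks the error bars around the *wrong* answer.
import random

random.seed(1)

def biased_sample_mean(n):
    # Draws are standard-normal, but positive values are twice as likely
    # to be recorded as negative ones.
    draws = []
    while len(draws) < n:
        x = random.gauss(0, 1)
        if x > 0 or random.random() < 0.5:
            draws.append(x)
    return sum(draws) / n

for n in (100, 10_000, 100_000):
    print(n, round(biased_sample_mean(n), 3))
# The estimate converges tightly -- but not to the true mean of 0.
```

This is the "small data problem" getting worse with scale: the variance vanishes while the bias stays put, so the big sample is more confidently wrong than the small one.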
Bret Easton Ellis is modern literature’s little rascal supreme. He seems to do things for no reason other than the fun of it. Take, for example, the many references in his books to his other books, references made in such a super-subtle yet obsessive way he could be doing it only to amuse himself. His minor characters are often recurring. Sean Bateman, for example, one of the protagonists in The Rules of Attraction, has, it is glancingly mentioned, an older brother, Patrick, the gifter of a brown Ralph Lauren tie about which Sean has ambivalent feelings. Patrick then lands the lead role as the Psycho who also happens to be an American in Ellis’s next work. Ellis did the same thing with Victor Johnson, Lauren Hynde’s mostly offstage boyfriend in The Rules of Attraction, moving him from the periphery of that novel (he’s backpacking through Europe for much of the narrative) to front-and-center in Glamorama. Ellis even gives him a stage name, Victor Ward—which is stronger, more macho-sounding, and, with fewer syllables, fits better on a marquee—as is commensurate with his change in status from bit player to star. What or whom, one wonders, did these characters have to do in order to secure their big breaks? If any writer would have a casting couch for his fictional creations, it would be Ellis.
In the 1940s, a curiously enigmatic figure haunted New York City’s great libraries, his mind afire with urgent questions whose resolution might reveal, once and for all, the most ancient secrets of the universe in their crystalline clarity. This scholar eschewed the traditional disciplinary boundaries that define the intellectual terrain of the specialist; instead, he read widely, skimming the surface of countless works of science, myth and history to craft an answer to an overwhelming question: Had our planet been altered repeatedly by cosmic catastrophes whose traces could be found in the earliest human records?
A fantastic theory began to emerge, redolent of the efforts of an earlier age to unify knowledge, yet speaking to the preoccupations of a world contemplating the chaos of another gruesome European war. The solar system, it was revealed, did not operate according to Newton’s universal laws of gravitation, nor did life on Earth evolve gradually and continuously, as Darwin had written. Instead, the cosmos was like a giant atom, periodically discharging photons whose energy disrupted and redirected the movements of celestial bodies, even causing the reversal of Earth’s magnetic poles. A planet was a kind of super-electron.
Soldiers who set out to write the story of their war also have to navigate a minefield of clichés: all of them more or less true but open to qualification; many sown long before the soldiers were ever deployed, because every war is like every other war. That’s one of them. War is hell is another. War begins in illusion and ends in blood and tears. Soldiers go to war for their country’s cause and wind up fighting for one another. Soldiers are dreamers (Sassoon said that). No one returns from war the same person who went. War opens an unbridgeable gap between soldiers and civilians. There’s no truth in war—just each soldier’s experience. “You can tell a true war story by its absolute and uncompromising allegiance to obscenity and evil” (from “How to Tell a True War Story,” in O’Brien’s story collection “The Things They Carried”).
Irony in modern American war literature takes many forms, and all risk the overfamiliarity that transforms style into cliché. They begin with Hemingway’s rejection, in “A Farewell to Arms,” of the high, old language, his insistence on concreteness: “I had seen nothing sacred, and the things that were glorious had no glory and the sacrifices were like the stockyards at Chicago if nothing was done with the meat except to bury it. There were many words that you could not stand to hear and finally only the names of places had dignity.”
Prashant Keshavmurthy in Chapati Mystery:
In 1892, Maulana Shibli Nu’māni, an internationally celebrated Indian Muslim historian, (Urdu-Persian) literary critic and theologian of his day, traveled by sea from Bombay to the Ottoman Empire, journeying through Cyprus, Istanbul, Syria and Egypt. Of this journey he kept a journal that he later published under the title of Safarnāma-i rūm va misr va shām (A Travel Account of Turkey, Egypt and Syria). He claims that he had not intended to write a travel account but that European prejudices with regard to the Turks had led him to do so. Even well-meaning Europeans, he observes, remain bound by the Islamophobic prejudices they are raised with. His aims in writing it are therefore corrective and pedagogical: to correct prejudiced European travel accounts of Turkey that form the basis for European histories, and to instruct Indian Muslims by documenting exemplary “progress” among Turkish Muslims. The Turkey or Ottoman state of Shibli’s time, we must remember, was the only one of the three great early modern Islamic states – the other two being Safavid Iran and Mughal India – still extant. Moreover, its emperor, Abdülḥamīd II (1876–1909), had only recently achieved radical advances in the movement to modernize or “reorganize” his state on European models – “reorganization” or tanzīmāt bespeaking the bureaucratic character of this modernity. Shibli intends therefore to focus on the “developments and reforms” of the Muslim world, especially Turkey.
The turn of the century preoccupation with lost Mughal sovereignty among North India’s Reformist Muslims – a sovereignty they understood as Muslim in the wake of the formal end of the Mughal state in 1857 – led them to regard the still regnant Ottoman empire with special attention: in it they saw a Muslim empire that was modeling itself through technological and institutional reforms on Europe, the very ambition of Sayyid Aḥmad Khān, the founder of what became Aligarh Muslim University, and his colleagues like Shibli Nu’māni. Shibli thus discusses formerly Ottoman Cyprus, when he passes through it, in terms of the history of its political sovereignty under Muslim and then British rule. Furthermore, everywhere in his travels he singles out educational syllabi, technology, and such empirical aspects of a society as clothing and food, treating them as indices of a polity’s development. Shibli desires and is at pains to discover signs of a continuous Muslim world. That he conflates all Arabs in the Ottoman territories with Muslims and vice versa signals this desire.
Joshua Hartshorne in Scientific American:
For years now, physicists and engineers have been building computer simulations of physics in order to understand the behavior of objects in the world. Want to see if a bridge would be stable during an earthquake? Enter it into the simulation, apply earthquake dynamics, and see what happens. Recently, the prestigious Proceedings of the National Academy of Sciences published work by MIT psychologists (and my labmates) Peter Battaglia, Jessica Hamrick, and Joshua Tenenbaum, arguing that all humans do roughly the same thing when trying to understand or make predictions about the physical world. The primary difference is that we run our simulations in our brains rather than in digital computers, but the basic algorithms are roughly equivalent. The analogy runs deep: To model human reasoning about the physical world, the researchers actually used an open-source computer game physics engine — the software that applies the laws of physics to objects in video games in order to make them interact realistically (think Angry Birds).
Battaglia and colleagues found that their video game-based computer model matches human physical reasoning far better than any previous theory. The authors asked people to make a number of predictions about the physical world: will a tower of blocks stand or fall, and if it falls, in what direction and how far will the blocks scatter; which object would most likely fall off a table if the table were bumped; and so on. In each case, human judgments closely matched the predictions of the computer simulation ... but not necessarily the actual world, which is where it gets interesting.
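To give a flavor of what "running a physics simulation to predict an outcome" means, here is a back-of-the-envelope sketch of the tower-of-blocks task. This is emphatically not the MIT model, which used a full game physics engine; it is just a simple static-stability check, with all names and the toppling criterion my own:

```python
# Back-of-the-envelope physical prediction for a stack of unit-width blocks:
# the tower topples if, at any level, the centre of mass of everything above
# that level overhangs the supporting block's edge (a standard statics rule).

def tower_falls(offsets):
    """offsets[i] is the horizontal position of block i's centre (bottom first).
    Blocks have unit width. Returns True if the tower topples."""
    for i in range(len(offsets) - 1):
        above = offsets[i + 1:]
        com = sum(above) / len(above)    # centre of mass of the blocks above level i
        if abs(com - offsets[i]) > 0.5:  # overhangs beyond the half-width edge
            return True
    return False

print(tower_falls([0.0, 0.1, 0.2]))  # -> False: nearly aligned, stable
print(tower_falls([0.0, 0.4, 0.9]))  # -> True: upper blocks overhang badly
```

A real engine would also simulate the dynamics — direction of fall, scatter distances — which is precisely the extra predictive detail the researchers compared against human judgments.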
Under our boot soles
In memory of Jim Thomas
Once you stepped out an open window onto nothing
we could see from our desks, and for a whole
long second you floated and didn't fall
through two floors of air to the earth's something.
You never fell. You were just going smoking
before class on the unseen roof.
All of us saw you make that roof when you didn't fall.
You took drags, looked down, looked up, thinking.
Then you stepped back through the open window
and read us the end of "Song of Myself"
where the spotted hawk swoops and grass grows
under a boot. You were all voice, we were all ears.
Up ahead words with hollow bones wait
once you step onto nothing. We could hear.
by Dennis Finnell
from Ruins Assembling
Shape and Nature Press, Greenfield, MA, 2014
Tuesday, April 01, 2014
Nicolas Claidière, Thomas C. Scott-Phillips, and Dan Sperber over at the Philosophical Transactions of the Royal Society (image via Wikimedia Commons):
Darwin-inspired population thinking suggests approaching culture as a population of items of different types, whose relative frequencies may change over time. Three nested subtypes of populational models can be distinguished: evolutionary, selectional and replicative. Substantial progress has been made in the study of cultural evolution by modelling it within the selectional frame. This progress has involved idealizing away from phenomena that may be critical to an adequate understanding of culture and cultural evolution, particularly the constructive aspect of the mechanisms of cultural transmission. Taking these aspects into account, we describe cultural evolution in terms of cultural attraction, which is populational and evolutionary, but only selectional under certain circumstances. As such, in order to model cultural evolution, we must not simply adjust existing replicative or selectional models but we should rather generalize them, so that, just as replicator-based selection is one form that Darwinian selection can take, selection itself is one of several different forms that attraction can take. We present an elementary formalization of the idea of cultural attraction.
1. Population thinking applied to culture
In the past 50 years, there have been major advances in the study of cultural evolution inspired by ideas and models from evolutionary biology. Modelling cultural evolution involves, as it would for any complex phenomenon, making simplifying assumptions; many factors have to be idealized away. Each particular idealization involves a distinct trade-off between gaining clarity and insight into hopefully major dimensions of the phenomenon and neglecting presumably less important dimensions. Should one look for the best possible idealization? There may not be one. Different sets of simplifying assumptions may each uniquely yield worthwhile insights. In this article, we briefly consider some of the simplifications that are made in current models of cultural evolution and then suggest how important dimensions of the phenomenon that have been idealized away might profitably be introduced in a novel approach that we see as complementary rather than as alternative to current approaches. All these approaches, including the one we are advocating, are Darwinian, but in different ways that are worth spelling out.
Much clarity has been gained by drawing on the analogy between cultural and biological evolution (an analogy suggested by Darwin himself: ‘The formation of different languages and of distinct species, and the proofs that both have been developed through a gradual process, are curiously parallel’). This has made it possible to draw inspiration from formal methods in population genetics with appropriate adjustments and innovations. Of course, the analogy with biological evolution is not perfect. For example, variations in human cultural evolution are often intentionally produced in the pursuit of specific goals and hence are much less random than in the biological case.
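The contrast the authors draw between replicative copying and cultural attraction can be sketched in a few lines: in a replicative model an item is copied with small unbiased error, while under attraction each transmission also nudges the item toward an attractor, so chains converge regardless of where they start. The following toy model is my own illustration of that idea; the parameters and the linear "pull" are invented, not the paper's formalization:

```python
# Toy contrast between replicative copying and cultural attraction.
# A cultural item is a number in [0, 1]; each generation it is transmitted
# with copying noise. Under attraction, transmission additionally pulls the
# item a fixed fraction of the way toward an attractor.
import random

random.seed(0)

def transmit(value, attractor=None, pull=0.3, noise=0.05):
    if attractor is not None:
        value += pull * (attractor - value)  # constructive bias toward the attractor
    value += random.gauss(0, noise)          # unbiased copying error
    return min(1.0, max(0.0, value))

def run_chain(start, generations=50, **kwargs):
    v = start
    for _ in range(generations):
        v = transmit(v, **kwargs)
    return v

# Pure replication drifts at random; with an attractor at 0.7,
# chains started at 0.1 and 0.9 both end up near 0.7.
print(round(run_chain(0.1), 2), round(run_chain(0.9), 2))
print(round(run_chain(0.1, attractor=0.7), 2), round(run_chain(0.9, attractor=0.7), 2))
```

The point of the sketch is the authors' hierarchy: setting `pull=0` recovers the purely replicative case, so replication appears as one limiting form of attraction, just as the abstract argues selection is one form attraction can take.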
Via Andrew Sullivan, Philip Schofield discusses Jeremy Bentham's writings on religion and sex, over at the Oxford University Press blog:
In 1814, just two hundred years ago, the radical philosopher Jeremy Bentham (1748–1832) began to write on the subject of religion and sex, and thereby produced the first systematic defence of sexual liberty in the history of modern European thought. Bentham’s manuscripts have now been published for the first time in authoritative form. He pointed out that ‘regular’ sexual activity consisted in intercourse between one male and one female, within the confines of marriage, for the procreation of children. He identified the source of the view that only ‘regular’ or ‘natural’ sexual activity was morally acceptable in the Mosaic Law and in the teachings of the self-styled Apostle Paul. ‘Irregular’ sexual activity, on the other hand, had many variations: intercourse between one man and one woman, when neither of them were married, or when one of them was married, or when both of them were married, but not to each other; between two women; between two men; between one man and one woman but using parts of the body that did not lead to procreation; between a human being and an animal of another species; between a human being and an inanimate object; and between a living human and a dead one. In addition, there was the ‘solitary mode of sexual gratification’, and innumerable modes that involved more than two people. Bentham’s point was that, given that sexual gratification was for most people the most intense and the purest of all pleasures and that pleasure was a good thing (the only good thing in his view), and assuming that the activity was consensual, a massive amount of human happiness was being suppressed by preventing people, whether from the sanction of the law, religion, or public opinion, from engaging in such ‘irregular’ activities as suited their taste.
Bentham was writing at a time when homosexuals, those guilty of ‘the crime against nature’, were subject to the death penalty in England, and were in fact being executed at about the rate of two per year, and were vilified and ridiculed in the press and in literature. If an activity did not cause harm, Bentham had argued as early as the 1770s and 1780s, then it should not be subject to legal punishment, and had called for the decriminalization of homosexuality. By the mid-1810s he was prepared to link the problem not only with law, but with religion. The destruction of Sodom and Gomorrah was taken by ‘religionists’, as Bentham called religious believers, to prove that God had issued a universal condemnation of homosexuality. Bentham pointed out that what the Bible story condemned was gang rape.
Nicholas Shakespeare in The Telegraph:
Slang’s first compilers were chippy individualists, routinely beset by financial worries and complex marital lives. They were never grandees like the 70-odd team beavering away still on the Oxford English Dictionary in Great Clarendon Street (less than 30 yards from where I live in Oxford). They numbered Francis Grose (1731-91), the son of a Swiss jeweller, who was so fat that his servant had to strap him into bed every night; Pierce Egan (1772-1849), a boxing journalist and editor of Real Life in London; and John William Hotten (1832-73), a workaholic pornographer (The Romance of Chastisement) who died from a surfeit of pork chops, and was remembered, unfairly, by the phrase: “Hotten: rotten, and forgotten”. Even so, they shared many characteristics of lexicographers like William Chester Minor (1834-1920), one of the OED’s founding fathers, who was, quite conclusively, bonkers. As one of Jonathon Green’s mentors, Anthony Burgess, cautions: “The study of language may beget madness.”
Super-geeks (from geek, meaning fool) to a man, slang’s lexicographers tend to be self-appointed guardians who, while cheerfully plagiarising each other in their project to demonstrate the importance and scope of slang, have yet to agree on a definition of what, precisely, slang is, or was – or even its origin. Hotten believed slang to be a gipsy term for the gipsies’ secret language; the Oxford philologist Walter Skeat attributed it to the Icelandic slunginn (cunning), while Eric Partridge (1894-1979), a New Zealand ex-soldier, ex-publisher and ex-bankrupt, believed it was the past participle of the Norwegian/Old Norse verb sling, so giving the concept of a “thrown” language. Into this tradition, Green (from greens, meaning sexual intercourse, b 1948) fits seamlessly. “What goes in a slang dictionary and what does not is often a matter of individual choice,” he writes. “Ultimately slang seems to be what you think it is.”
Natalie Angier in The New York Times:
The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me ... He can go to hell...”), turning down gifts, homage, even the return of his stolen consort Briseis just to prolong the king’s suffering. Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too.
Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play. The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked.
Monday, March 31, 2014
by Jonathan Kujawa
On March 26th it was announced that Yakov Sinai, a mathematician at Princeton University and the Landau Institute for Theoretical Physics, had won the 2014 Abel Prize. The Abel Prize was established in 2001 by the government of Norway and was first given in 2003. Unlike the more famous Fields Medal, which (in)famously can only be granted to those under the age of forty, the Abel Prize recognizes an individual for the breadth and depth of their entire career. It has quickly become the highest award one can earn in mathematics. Indeed, the list of prizewinners over the past ten years reads like a who's who of influential mathematicians.
Dr. Sinai won the prize "for his fundamental contributions to dynamical systems, ergodic theory, and mathematical physics". Fortunately, I'm completely unqualified to tell you about Dr. Sinai's work. I say fortunately because Jordan Ellenberg already does an excellent job explaining Dr. Sinai's work in layman's terms as part of the announcement of the winner. You can watch the video here. Dr. Ellenberg gives a very nice twenty-minute overview of Dr. Sinai's work starting at the nine-minute mark. Highly recommended!
I also say fortunately because it gives me the excuse to tell you about some cool math. A big part of Dr. Sinai's work is in the area of "Dynamical Systems." This is a rare case where the name of a mathematical discipline actually tells you what the field is all about. Simply put, researchers in dynamical systems are interested in studying how a given system changes over time. The artist Tristan Perich explores the same territory by examining the unpredictable dynamics of using computer code to draw in an unsheltered environment.
This is the sort of math you would be interested in if you want to model and predict the weather, the climate, the stock market, the reaction in the combustion chamber of an engine or in a nuclear explosion, etc. Of course these are all wildly difficult problems. Even with all our modern computing power it's hard to make progress. So here we'll instead think about much, much simpler examples which still exhibit some of the same interesting phenomena.
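One of the simplest such systems (my own illustration here, not an example from Kujawa's article) is the logistic map, x → r·x·(1 − x), a one-line model of population growth. For r = 4 it is chaotic: two orbits started a hair apart quickly become completely different, which is exactly why the weather and the stock market are so hard to predict.

```python
def logistic_orbit(x0, r=4.0, steps=20):
    """Iterate the logistic map x -> r*x*(1-x) and return the orbit."""
    orbit = [x0]
    for _ in range(steps):
        x0 = r * x0 * (1.0 - x0)
        orbit.append(x0)
    return orbit

# Two starting points differing by one part in ten million...
a = logistic_orbit(0.2)
b = logistic_orbit(0.2000001)

# ...end up far apart after only 20 steps: sensitive dependence
# on initial conditions, the hallmark of chaos.
print(abs(a[-1] - b[-1]))
```

Running this, the tiny initial gap of 10⁻⁷ roughly doubles each step, so by step 20 the two orbits disagree at the level of the whole unit interval rather than in the seventh decimal place.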
by Quinn O'Neill
It is a widely held view that women are more emotional than men, and some argue that this makes them unsuitable for positions that demand important, cool-headed decision making. The argument often rears its head in discussions about women in politics - particularly as prospective presidents - and I've heard it asserted by both males and females.
The claim that women are more emotional should immediately raise the question of what we mean by emotional. Perhaps we're referring to the intensity at which one experiences an emotion. It's quite possible that women do feel emotion more intensely, but this would be difficult to establish with certainty. Emotions are subjective in nature, as are individuals' ratings of their intensity. Would two people experiencing the same emotion at the same intensity necessarily rate it similarly? It's hard to say.
Alternatively, we might equate emotionality with emotional demonstrativeness. In this sense, a person crying at a sad movie would be deemed more emotional than his or her dry-eyed companion, even if both are feeling equally sad. In this context, one might guess that women are indeed more emotional than men. It seems to me, at least, that they are more likely to cry when watching a sad movie, and more likely to cry in public for other reasons as well. It's important to consider, however, that social norms and expectations differ for men and women when it comes to crying, with it generally being more acceptable for females. If crying were equally acceptable for both sexes, would women still cry more often? Maybe. Maybe not.
It may also be the case that media portrayals of men and women distort our views on gender and crying. In the political domain, Hillary Clinton's tears seemed to garner a lot more media attention - particularly of the negative variety - than those of George Bush junior or senior, Barack Obama, or Joe Biden. Jessica Wakeman, writing for FAIR, detailed the sexist media portrayal of Clinton's emotional display.
Whether we equate emotionality with the intensity of the experience or with demonstrativeness, there's a wide array of emotions to consider aside from sadness. What about anger? When angry, which sex is more likely to punch walls or other people? The vast majority of violent crime is committed by men, and while not every incident may result from emotions getting the upper hand, I'd guess that a large proportion does. Violent crime certainly isn't the result of the kind of rational, level-headed decision-making we expect of good leaders.
by Jalees Rehman
Geteiltes Leid ist halbes Leid ("Shared sorrow is half the sorrow") is a popular German proverb which refers to the importance of sharing bad news and troubling experiences with others. The therapeutic process of sharing takes on many different forms: we may take comfort in the fact that others have experienced similar forms of sorrow, we are often reassured by the empathy and encouragement we receive from friends, and even the mere process of narrating the details of what is troubling us can be beneficial. Finding an attentive audience that is willing to listen to our troubles is not always easy. In a highly mobile, globalized world, some of our best friends may be located thousands of kilometers away, unable to meet face-to-face. The omnipresence of social media networks may provide a solution. We are now able to stay in touch with hundreds of friends and family members, and commiserate with them. But are people as receptive to sorrow shared via Facebook as they are in face-to-face contacts?
A team of researchers headed by Dr. Andrew High at the University of Iowa recently investigated this question and published their findings in the article "Misery rarely gets company: The influence of emotional bandwidth on supportive communication on Facebook". The researchers created three distinct Facebook profiles of a fictitious person named Sara Thomas who had just experienced a break-up. The three profiles were identical in all respects except for how much information was conveyed about the recent (fictitious) break-up. In their article, High and colleagues use the expression "emotional bandwidth" to describe the extent of emotions conveyed in the Facebook profile.
In the low bandwidth scenario, the profile contained the following status update:
"sad and depressed:("
The medium bandwidth profile included a change in relationship status to "single" in the timeline, in addition to the low bandwidth profile update "sad and depressed:(".
Finally, the high emotional bandwidth profile not only contained the updates of the low and medium bandwidth profiles, but also included a picture of a crying woman (the other two profiles had no photo, just the standard Facebook shadow image).
The researchers then surveyed 84 undergraduate students (enrolled in communications courses, average age 20, 53% female) and presented them with screenshots of one of the three profiles.
Laying duality on the world,
cleaving philosophers’ minds,
inspiring theologians to settle scores,
he undoes the unity of chaos
splitting it to bits like chips
to feed the dogs of wars
Reaching down, this buff, man-like self
curiously in his prime
with old head coiffed white
raked by wind gusting furiously
through heaven’s open door,
Urizen bends to scribe a zero with his compass,
leaving nothing out, including all
From his plush but sanguinary perch
He loads the dark with That and This
There and Here, Was and Is, tendering to Man
the dubious consciousness of Bliss,
propping all its characters to fall
by Jim Culleny
Graphic: Ancient of Days, by William Blake
"I beheld till the thrones were cast down, and the Ancient of days did sit, whose garment was white as snow, and the hair of his head like the pure wool: his throne was like the fiery flame, and his wheels as burning fire." —Daniel 7:9