Saturday, February 22, 2014
Joshua Rothman in The New Yorker:
A few years ago, when I was a graduate student in English, I presented a paper at my department’s American Literature Colloquium. (A colloquium is a sort of writing workshop for graduate students.) The essay was about Thomas Kuhn, the historian of science. Kuhn had coined the term “paradigm shift,” and I described how this phrase had been used and abused, much to Kuhn’s dismay, by postmodern insurrectionists and nonsensical self-help gurus. People seemed to like the essay, but they were also uneasy about it. “I don’t think you’ll be able to publish this in an academic journal,” someone said. He thought it was more like something you’d read in a magazine.
Was that a compliment, a dismissal, or both? It’s hard to say. Academic writing is a fraught and mysterious thing. If you’re an academic in a writerly discipline, such as history, English, philosophy, or political science, the most important part of your work—practically and spiritually—is writing. Many academics think of themselves, correctly, as writers. And yet a successful piece of academic prose is rarely judged so by “ordinary” standards. Ordinary writing—the kind you read for fun—seeks to delight (and, sometimes, to delight and instruct). Academic writing has a more ambiguous mission. It’s supposed to be dry but also clever; faceless but also persuasive; clear but also completist. Its deepest ambiguity has to do with audience. Academic prose is, ideally, impersonal, written by one disinterested mind for other equally disinterested minds. But, because it’s intended for a very small audience of hyper-knowledgable, mutually acquainted specialists, it’s actually among the most personal writing there is. If journalists sound friendly, that’s because they’re writing for strangers. With academics, it’s the reverse.
“Writing poetry is an unnatural act,” Elizabeth Bishop once wrote. “It takes skill to make it seem natural.” The thought is kin to the one John Keats expressed in an 1818 letter to his friend John Taylor: “If Poetry comes not as naturally as the Leaves to a tree it had better not come at all.” Bishop and Keats both evoked a double sense of “natural”: that which is concerned with nature, with landscape, flora and fauna, and that which is unforced and fluent. In both senses, Derek Walcott is a natural poet.
Walcott, who turned 84 this year, began writing young. His first poem appeared in a local paper when he was 14, and his first volume, “25 Poems,” was self-published when he was 18. “Everyone wants a prodigy to fail,” Rita Dove wrote. “It makes our mediocrity more bearable.” Walcott did not fail. His early poems were expert, and even though they bore traces of his apprenticeship to the English tradition (in particular W. H. Auden and Dylan Thomas), they were to prove thematically characteristic. Right from the beginning, he was keen to use European poetic form to testify to the Caribbean experience. This commitment made him a part of the boom in 20th-century Caribbean literature, a gathering of talents that included Édouard Glissant, Patrick Chamoiseau, Aimé Césaire and Maryse Condé on the French-speaking side; and Samuel Selvon, George Lamming and C. L. R. James from the English-speaking islands, as well as the Trinidad-born V. S. Naipaul, with whom Walcott was one of the Caribbean’s two Nobel Prize winners for literature.
An astonishing 20 years have passed since Richard Dorment and Margaret MacDonald’s exhibition of James McNeill Whistler at London’s Tate Gallery, and yet the thrill of the work – the remarkable “Nocturnes” of the Thames, the portraits, the nude drawings, the lithographs and etchings – is as fresh in my mind as if it had been yesterday.
It is more than time for a new biography of this great artist, American by birth but for most of his life a European, who left behind such a range of beautiful artworks and who was also, in the judgment of the poet Algernon Charles Swinburne, “a little viper”. Daniel Sutherland, a professor of history at the University of Arkansas, has given us a warts-and-all portrait of Whistler, the man, the work and his times.
The father was an engineer, who was hired by the tsar of Russia to build the Moscow to St Petersburg railway. He died young and his widow Anna, destined to be the most famous artist’s model since the Mona Lisa, took the family back to America for a short spell. Anna’s tiny firstborn son James, after education at private school in England, was placed at the West Point military academy.
Daljit Nagra in The Guardian:
"Our people have many lice in their clothes, and they bite terribly. They are worse than a rifle bullet. But there are no mosquitoes or other creatures which bite mankind, and no snakes or scorpions at all." This extract is from a letter by an Indian soldier in 1915. He is in France and writing home to a friend. The letter comes from a collection of correspondence copied by British military censors, revealing the experiences of the many Indian soldiers who fought in the first world war, that has just been digitised by the British Library. The collection also contains the censors' summaries of the letters, revealing their concerns.
By the time of the Armistice, India had provided more than 1.27 million men. The Indian army at this time was drawn mainly from the middle peasantry, recruited from the north and north-west of India partly on account of the "martial races" theory of the British which suggested that some races or castes were inherently more warlike than others. Most Indian soldiers in France were Punjabi Muslims and Sikhs. My family are Sikhs from Punjab, and my maternal grandfather served in the Indian army in the 1930s. I don't know if my family had any further involvement in the first world war, but, for me, these letters provide a valuable link to the history of my ancestors and their positive involvement with the empire.
The swan will waddle in
from its easy shadow
Leave its tear-shape imprinted
on the crushed rushes.
The butterfly will souse
its bridal-price of colour
It will fold its wings
on a weighed-down stalk.
A crisp froth will break
from the face of the water
only wisps will be left
snagged on its border.
And your body will finish
its graceful dance
vigour will pass
from your satin limbs.
You’ll have no defence
except the warrior reeds
dark grey spear heads
a living shield.
by Aifric Mac Aodha
from Gabháil Syrinx
publisher: An Sagart, Dublin, 2010
translation by author
Aminatta Forna in The Independent:
One morning in December, Aaliyah Sobhi, a 72-year-old resident of Beirut, misreads the label on a shampoo bottle and dyes her hair bright blue. With this accidental act begins Rabih Alameddine’s gorgeous fourth novel, An Unnecessary Woman, the story of a life lived in a city at war. The story’s conceit is simple and unusual. Aaliyah, who has spent her life working in a moderately unsuccessful bookstore in Beirut, passes her retirement translating works of fiction into classical Arabic. On the first of January each year she chooses her book. Last year it was WG Sebald’s Austerlitz; this year she ponders tackling Chilean Roberto Bolaño’s mammoth 2666. For reasons of her own she never translates from English or French. When the translation is complete she does not send it to a publisher but stores it in the unused maid’s room at the back of her apartment, along with the translations of the books that went before. So far there are 57. Aaliyah is alone. There is an impotent ex-husband, who did not love her and whom she declined to love in turn. There is a father who died. There is an avaricious and ailing mother, who favoured the sons of her second marriage. Aaliyah’s step-brothers, equally avaricious, all want to get their hands on Aaliyah’s apartment. There are neighbours: husbandless women too, who meet for coffee each morning. From her flat below Aaliyah listens to their conversations and assiduously avoids the possibility of an encounter.
...An Unnecessary Woman is a story of innumerable things. It is a tale of blue hair and the war of attrition that comes with age, of loneliness and grief, most of all of resilience, of the courage it takes to survive, stay sane and continue to see beauty. Read it once, read it twice, read other books for a decade or so, and then pick it up and read it anew. This one’s a keeper.
Brown v. Board of Education (1954), now acknowledged as one of the greatest Supreme Court decisions of the 20th century, unanimously held that the racial segregation of children in public schools violated the Equal Protection Clause of the Fourteenth Amendment. Although the decision did not succeed in fully desegregating public education in the United States, it put the Constitution on the side of racial equality and galvanized the nascent civil rights movement into a full revolution. In 1954, large portions of the United States had racially segregated schools, made legal by Plessy v. Ferguson (1896), which held that segregated public facilities were constitutional so long as the black and white facilities were equal to each other. However, by the mid-twentieth century, civil rights groups were mounting legal and political challenges to racial segregation. In the early 1950s, NAACP lawyers brought class action lawsuits on behalf of black schoolchildren and their families in Kansas, South Carolina, Virginia, and Delaware, seeking court orders to compel school districts to let black students attend white public schools.
One of these class actions, Brown v. Board of Education, was filed against the Topeka, Kansas school board by representative-plaintiff Oliver Brown, parent of one of the children denied access to Topeka's white schools. Brown claimed that Topeka's racial segregation violated the Constitution's Equal Protection Clause because the city's black and white schools were not equal to each other and never could be. The federal district court dismissed his claim, ruling that the segregated public schools were "substantially" equal enough to be constitutional under the Plessy doctrine. Brown appealed to the Supreme Court, which consolidated and then reviewed all the school segregation actions together. Thurgood Marshall, who would in 1967 be appointed the first black justice of the Court, was chief counsel for the plaintiffs. Thanks to the astute leadership of Chief Justice Earl Warren, the Court spoke in a unanimous decision written by Warren himself. The decision held that racial segregation of children in public schools violated the Equal Protection Clause of the Fourteenth Amendment, which states that "no state shall make or enforce any law which shall ... deny to any person within its jurisdiction the equal protection of the laws." The Court noted that Congress, when drafting the Fourteenth Amendment in the 1860s, did not expressly intend to require integration of public schools. On the other hand, that Amendment did not prohibit integration. In any case, the Court asserted that the Fourteenth Amendment guarantees equal education today. Public education in the 20th century, said the Court, had become an essential component of a citizen's public life, forming the basis of democratic citizenship, normal socialization, and professional training. In this context, any child denied a good education would be unlikely to succeed in life.
Where a state, therefore, has undertaken to provide universal education, such education becomes a right that must be afforded equally to both blacks and whites.
More here. (Note: One post throughout February will be dedicated to Black History Month.)
Friday, February 21, 2014
Lawrence Krauss in The New Yorker:
Earlier this month, Ken Ham, the founder of the Creation Museum, in Petersburg, Kentucky, held a debate with Bill Nye at the museum. Within the creationist crowd, Ham represents the young-Earth wing, which believes that the planet is around six thousand years old. He also has other extreme interpretations of biblical claims: for example, he believes that the Tyrannosaurus rex and other dinosaurs were actually vegetarians that lived in the Garden of Eden before the fall of Adam and Eve.
Ham often stresses a line of argument made within the broader creationist community, which resonates, at least somewhat, with the public at large. “There’s experimental or observational science, as we call it. That’s using the scientific method, observation, measurement, experiment, testing,” he said during the debate. “When we’re talking about origins, we’re talking about the past. We’re talking about our origins. You weren’t there, you can’t observe that…. When you’re talking about the past, we like to call that origins or historical science.” In other words, Ham was saying that there is a fundamental difference between what creationists call the “historical sciences”—areas of study, like astronomy, geology, and evolutionary biology, that give us information about the early Earth and the evolution of life—and other sciences, like physics and chemistry, which appear to be based on experiments done in the laboratory today.
On the surface, this does not seem completely unreasonable. There is, after all, a difference between an observation and an experiment. In the laboratory, one can have much better control when attempting to establish cause-and-effect relationships. However, to suggest that somehow this qualitative difference between observation and experiment translates into any sort of deep qualitative difference between the different sciences mentioned above is to demonstrate a fundamental misunderstanding of the nature of science itself.
Anis Shivani in the Huffington Post:
I have no audience of actual contemporary readers in mind when I write anything, fiction, poetry, or criticism. I suppose if there is an abstract audience in mind it is an audience of the future, distant in time but not too distant, perhaps a hundred years away, when there might still be enough similarity with the present that they would be able to understand what I'm saying but not so distant that they can't comprehend what it's all about.
I'm not saying that this specific audience is what I always have in mind, but if pressed for an answer about audience, perhaps something like this would be my guess. I write inspired by writers I hold in high regard, primarily the high modernists, so you could say that they are my demanding audience: Am I keeping faith with the tradition? Can I hold my head high against the effort they've already made? I think if I wrote with any actual present audience in mind, whether one person or a larger number, I would be instantly lost, my writing would lose all value.
Writing for an audience really means writing for approval, and that is fine, everyone has that need, except the sources one is seeking approval from better be the highest authorities, who can also be contemporary writers. What if, say, Franz Wright, a poet I much admire, were reading my poetry, would he think it legitimate? Or if I write fiction, would it pass muster with Orhan Pamuk or J. M. Coetzee? If not, it's probably false.
Kevin Fong in Wired:
In our daily lives, gravity is that pedestrian physical force that keeps us glued to the ground. You have to go out of your way — climb a cliff face or jump out of a plane — before it starts demanding your attention.
But we are constantly sensing the effects of gravity and working against them, largely unconsciously.
Without the quadriceps, buttocks, calves, and erector spinae that surround the spinal column and keep it standing tall, the pull of gravity would collapse the human body into a fetal ball and leave it curled close to the floor. These muscle groups are sculpted by the force of gravity, in a state of constant exercise, perpetually loaded and unloaded as we go about our daily lives. That’s why the mass of flesh that constitutes the bulk of our thighs and works to extend and straighten the knee is the fastest-wasting muscle group in the body.
In experiments that charted the changes in the quadriceps of rats flown in space, more than a third of the total muscle bulk was lost within nine days.
Our bones, too, are shaped by the force of gravity. We tend to think of our skeleton as pretty inert — little more than a scaffold on which to hang the flesh or a system of biological armor. But at the microscopic level, it is far more dynamic: constantly altering its structure to contend with the gravitational forces it experiences, weaving itself an architecture that best protects the bone from strain. Deprived of gravitational load, bones fall prey to a kind of space-flight-induced osteoporosis. And because 99 percent of our body’s calcium is stored in the skeleton, as it wastes away, that calcium finds its way into the bloodstream, causing yet more problems from constipation to renal stones to psychotic depression.
Medical students remember this list as: “bones, stones, abdominal groans, and psychic moans”.
Doo Won Kang in the Bulletin of the Atomic Scientists:
Why ammonia? An ammonia molecule is composed of one nitrogen atom and three hydrogen atoms. Ammonia can be burned in internal combustion engines with minor modifications — emitting only nitrogen and water vapor from the tailpipe, even when only low-cost emissions controls are used. Unburned ammonia and nitrogen oxides in the engine’s exhaust would be removed by a selective catalyst reduction system. Ammonia can be produced, at an affordable cost, by a catalytic reaction between nitrogen (obtained from air, which is 78 percent nitrogen) and hydrogen (obtained by splitting water molecules into hydrogen and oxygen).
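The chemistry summarized above can be written out explicitly. Both reactions are standard textbook chemistry (the second is the Haber–Bosch synthesis); as a balanced sketch:

```latex
% Clean combustion of ammonia: only nitrogen and water vapor result.
4\,\mathrm{NH_3} + 3\,\mathrm{O_2} \longrightarrow 2\,\mathrm{N_2} + 6\,\mathrm{H_2O}

% Catalytic synthesis from air-derived nitrogen and hydrogen (Haber--Bosch).
\mathrm{N_2} + 3\,\mathrm{H_2} \longrightarrow 2\,\mathrm{NH_3}
```

In practice real engines also form some nitrogen oxides at high temperature, which is why the article mentions a selective catalytic reduction system for the exhaust.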
Ammonia-fueled vehicles operate in much the same way as gasoline-fueled vehicles: Liquid ammonia is burned with oxygen, producing energy that is harnessed to drive the vehicle’s wheels. This familiar technology means that ammonia-fueled vehicles can generally be built and maintained in the same way as the current vehicle fleet. But unlike conventionally fueled vehicles, ammonia-powered cars would not emit carbon dioxide.
Most cars on the road can run on a mixture of 90 percent gasoline and 10 percent liquid ammonia, and could be modified to run on a mixture of up to 80 percent ammonia—at a cost of $1,000 to $5,000 per vehicle. An engine that could run entirely on ammonia is currently under development.
Because I’m a skeptical person, I tend to grow suspicious of the things I love. Which is why I’m wondering whether a half-century after the Beatles landed at JFK, it might be time to give them a rest. The demographic cohorts following my own will never attain our heights of Fab Four worship. And indeed, their impatience with Boomer culture strikes me as completely reasonable. The giants of that era keep sucking the air out of the room, don’t they? Imagine being told that guys in their seventies are still better than anything your own generation can produce. Imagine being peddled, year in and year out, a rosy View-Master panorama of that departed age, like Periclean Athens plus paisley and blotter acid.
No wonder a kind of Beatles fatigue has set in. Let the contemporary idols — Kanye West, Lady Gaga, Radiohead, Taylor Swift — have their turn. Whether their music will still sound as fresh fifty years hence is anybody’s guess. But you could make the same argument about, say, Monteverdi, whose operatic masterpiece L’incoronazione di Poppea went into a three-century-long hibernation after his death, only to find popular success right around the time the Beatles were recording “I Want to Hold Your Hand.”
Meanwhile, I’ll admit that my hoard of memorabilia functions quite literally as a fetish object: a magical means of accomplishing an impossible end, which is stopping time.
The more I read, the more it seems a complete investment of one’s entire being is a necessity for greatness in the arts. Even to speak of greatness in our time invites derision. Who needs greatness when you can have tenure? Yet we’ve all seen it, haven’t we? Not in our contemporaries, the blur of smaller talents, but in the dead. Generalizations never stand up to scrutiny, but I will risk a few. Most contemporary poets I read seem too concerned with avoiding ridicule, trying to be the smartest kid in the workshop, rather than plumbing what Eliot called “the inexplicable mystery of sound”—bodying forth a whole charged expression of living. Much of our poetry seems denatured, flat. Intelligence abounds, cleverness is everywhere, but vitality is hard to find.
One experiment I frequently conduct is to open a contemporary journal and read only the first lines of poems. Usually the exercise proves soporific in the extreme. No novelist worth his salt would assume he deserved to be read without grabbing the reader by the throat, yet our poets are so often complacent, too comfortable in the expectation that someone will read them, even if only assigned to do so in a classroom.
If you’re like me, the Olympics have borne in you one mighty, overriding desire: to become a strapping world-class professional figure-skater. Well, we’re in luck, every one of us. Thanks to the glut of teaching materials available in the public domain, dazzling one’s peers in the rink and taking home the gold has never been easier.
As a starting point, consult an invaluable volume from 1897: T. Maxwell Witham’s A System of Figure-Skating: Being the Theory and Practice of the Art as Developed in England, with a Glance at its Origin and History. In sporting matters, Witham was no slouch—the title page notes that he was a “Member of The Skating Club.” Which skating club, you ask? Well, let me answer your question with a question: How many skating clubs do you belong to?
With verve and good humor, A System of Figure-Skating will teach you such cherished and essential maneuvers as “the Jagendorf dance,” “the Mercury scud,” “the spread-eagle grape vine,” “the sideways attitude of edges,” and—of course—the “United Shamrock.” Confused? You needn’t be. The System offers detailed instructions every step of the way.
A team of Bio-X researchers at Stanford has developed mice whose sensitivity to pain can be dialed up or down simply by shining light on their paws. The research could help scientists understand and eventually treat chronic pain in humans. The mice in Scott Delp’s lab, unlike their human counterparts, can get pain relief from the glow of a yellow light. “This is an entirely new approach to study a huge public health issue,” Delp said. “It’s a completely new tool that is now available to neuroscientists everywhere.” He is the senior author of a research paper published Feb. 16 in Nature Biotechnology. The mice are modified with gene therapy to have pain-sensing nerves that can be controlled by light. One color of light makes the mice more sensitive to pain. Another reduces pain. Increasing or decreasing the sensation of pain in these mice could help scientists understand why pain seems to continue in people after an injury has healed. Does persistent pain change those nerves in some way? And if so, how can they be changed back to a state where, in the absence of an injury, they stop sending searing messages of pain to the brain?
The researchers took advantage of a technique called optogenetics, which involves light-sensitive proteins called opsins that are inserted into the nerves. Optogenetics was developed by a colleague of Delp, Karl Deisseroth, a co-author of the journal article. He has used the technique as a way of activating precise regions of the brain to better understand how the brain functions. Deisseroth is a professor of bioengineering, psychiatry and behavioral sciences. Delp, who has an interest in muscles and movement, saw the potential for using optogenetics for studying the many nerves outside the brain. These are the nerves that control movement, pain, touch and other sensations throughout our body and that are involved in diseases like amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s Disease.
Ruby Bridges Hall in NPR:
There were barricades and people shouting and policemen everywhere. As we walked through the crowd, I didn't see any faces. I guess that's because I wasn't very tall and I was surrounded by the marshals. People yelled and threw things. I could see the school building, and it looked bigger and nicer than my old school. When we climbed the high steps to the front door, there were policemen in uniforms at the top. The policemen at the door and the crowd behind us made me think this was an important place. All day long, white parents rushed into the office. They were upset. They were arguing and pointing at us. When they took their children to school that morning, the parents hadn't been sure whether William Frantz would be integrated that day or not. After my mother and I arrived, they ran into classrooms and dragged their children out of school. From behind the windows in the office, all I saw was confusion. I told myself that this must be the way it is in a big school.
That whole first day, my mother and I just sat and waited. We didn't talk to anybody. I remember watching a big, round clock on the wall. When it was 3:00 and time to go home, I was glad. When we left school that first day, the crowd outside was even bigger and louder than it had been in the morning. There were reporters everywhere. I guess the police couldn't keep them behind the barricades. It seemed to take us a long time to get to the marshals' car. Later on I learned there had been protestors in front of the two integrated schools the whole day. They wanted to be sure white parents would boycott the school and not let their children attend. Groups of high school boys, joining the protestors, paraded up and down the street and sang new verses to old hymns. Their favorite was "Battle Hymn of the Republic," in which they changed the chorus to "Glory, glory, segregation, the South will rise again." Many of the boys carried signs and said awful things, but most of all I remember seeing a black doll in a coffin, which frightened me more than anything else.
MY FIRST WHITE TEACHER
On the second day, my mother and I drove to school with the marshals. The crowd outside the building was ready. Racists spat at us and shouted things like "Go home, nigger," and "No niggers allowed here." One woman screamed at me, "I'm going to poison you. I'll find a way." She made the same threat every morning. I tried not to pay attention. When we finally got into the building, my new teacher was there to meet us. Her name was Mrs. Henry. Mrs. Henry took us into a classroom and said to have a seat. When I looked around, the room was empty. There were rows of desks, but no children. I thought we were too early, but Mrs. Henry said we were right on time. My mother sat down at the back of the room. I took a seat up front, and Mrs. Henry began to teach. I spent the whole first day with Mrs. Henry in the classroom. I wasn't allowed to have lunch in the cafeteria or go outside for recess, so we just stayed in our room. The marshals sat outside. If I had to go to the bathroom, the marshals walked me down the hall.
Picture: On July 15, 2011, Bridges met with President Barack Obama at the White House, and while viewing the Norman Rockwell painting of her on display he told her, "I think it's fair to say that if it hadn't been for you guys, I might not be here and we wouldn't be looking at this together." On May 19, 2012, Bridges Hall received an Honorary Degree from Tulane University at the annual graduation ceremony at the Mercedes-Benz Superdome.
More here. (Note: One post throughout February will be dedicated to Black History Month.)
My father’s in my fingers, but my mother’s in my palms.
I lift them up and look at them with pleasure –
I know my parents made me by my hands.
They may have been repelled to separate lands,
to separate hemispheres, may sleep with other lovers,
but in me they touch where fingers link to palms.
With nothing left of their togetherness but friends
who quarry for their image by a river,
at least I know their marriage by my hands.
I shape a chapel where a steeple stands.
And when I turn it over,
my father’s by my fingers, my mother’s by my palms
demure before a priest reciting psalms.
My body is their marriage register.
I re-enact their wedding with my hands.
So take me with you, take up the skin’s demands
for mirroring in bodies of the future.
I’ll bequeath my fingers, if you bequeath your palms.
We know our parents make us by our hands.
by Sinead Morrissey
from The State of the Prisons
publisher: Carcanet, Manchester, 2005
Thursday, February 20, 2014
Tom Whipple in More Intelligent Life:
An algorithm, at its most basic, is not a mysterious sciencey bit at all; it is simply a decision-making process. It is a flow chart, a computer program that can stretch to pages of code or be as simple as "If x is greater than y, then choose z".
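The article's one-line example really is a complete algorithm. As an illustration only (the names and the fallback choice `w` are invented here, not part of the article's example), it can be written as a few lines of Python:

```python
def choose(x, y, z, w):
    """A toy algorithm in the article's sense: a decision procedure.

    If x is greater than y, choose z; otherwise choose w.
    (The "otherwise" branch is an assumption added to make the
    procedure total; the article only states the first rule.)
    """
    if x > y:
        return z
    return w

# 3 is greater than 2, so the procedure picks the first option.
print(choose(3, 2, "z", "w"))  # prints "z"
```

Everything from a loan-approval flow chart to a trading engine is, structurally, a tower of decisions shaped like this one.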
What has changed is what algorithms are doing. The first algorithm was created in the ninth century by the Arabic scholar al-Khwarizmi—from whose name the word is a corruption. Ever since, they have been mechanistic, rational procedures that interact with mechanistic, rational systems. Today, though, they are beginning to interact with humans. The advantage is obvious. Drawing in more data than any human ever could, they spot correlations that no human would. The drawbacks are only slowly becoming apparent.
Continue your journey into central London, and the estates give way to terraced houses divided into flats. Every year these streets inhale thousands of young professional singles. In the years to come, they will be gently exhaled: gaining partners and babies and dogs, they will migrate to the suburbs. But before that happens, they go to dinner parties and browse dating websites in search of that spark—the indefinable chemistry that tells them they have found The One.
And here again they run into an algorithm. The leading dating sites use mathematical formulae and computations to sort their users’ profiles into pairs, and let the magic take its probabilistically predicted course.
Not long after crossing the river, your train will pass the server farms of the Square Mile—banks of computers sited close to the fibre-optic cables, giving tiny headstarts on trades. Within are stored secret lines of code worth billions of pounds. A decade ago computer trading was an oddity; today a third of all deals in the City of London are executed automatically by algorithms, and in New York the figure is over half. Maybe, these codes tell you, if fewer people buy bananas at the same time as more buy gas, you should sell steel. No matter if you don’t know why; sell sell sell. In nanoseconds a trade is made, in milliseconds the market moves. And, when it all goes wrong, it goes wrong faster than it takes a human trader to turn his or her head to look at the unexpectedly red numbers on the screen.
Jonathan Berger in Nautilus (illustration by Sterling Hundley):
Neuroscience gives us insights into how music creates an alternate temporal universe. During periods of intense perceptual engagement, such as being enraptured by music, activity in the prefrontal cortex, which generally focuses on introspection, shuts down. The sensory cortex becomes the focal area of processing and the “self-related” cortex essentially switches off. As neuroscientist Ilan Goldberg describes, “the term ‘losing yourself’ receives here a clear neuronal correlate.” Rather than enabling perceptual awareness, the role of the self-related prefrontal cortex is reflective, evaluating the significance of the music to the self. However, during intense moments, when time seems to stop, or rather, not exist at all, a selfless, Zen-like state can occur.
While the sublime sense of being lost in time is relatively rare, the distortion of perceived time is commonplace and routine. Broadly speaking, the brain processes timespans in two ways, one in which an explicit estimate is made regarding the duration of a particular stimulus—perhaps a sound or an ephemeral image—and the second, involving the implicit timespan between stimuli. These processes involve both memory and attention, which modulate the perception of time passing, depending upon how occupied or stimulated we are. Hence time can “fly” when we are occupied, or seem to stand still when we are waiting for the water in the kettle to boil. Unlike the literal loss of “self” that occurs during intense perceptual engagement, the subjective perception of elongated or compressed time is related to self-referential processing. An object—whether image or sound—moving toward you is perceived as longer in duration than the same object that is not moving, or that is receding from you. A looming or receding object triggers increased activation in the anterior insula and anterior cingulate cortices—areas important for subjective awareness.
The directionality of musical melody and gesture evokes similar percepts of temporal dilation. The goal-oriented nature of music provides a framework in which a sense of motion is transposed to sonic structures, and the sensation of “looming” and “receding” can be simulated independently of relative spatial orientation. The subjectivity of time perception can be grounding and self-affirming—a source of great pleasure—or, conversely, able to create a state of dissociation from one’s self—a state of transcendence.
Susan Schneider interviewed by Richard Marshall in 3:AM Magazine:
3:AM: You’ve been bold in asserting that the Fodorian Language of Thought program and the related computational theory of mind have three major problems that, unless solved, render them obsolete. Before saying what these problems are, can you sketch out the theories and what they’re supposed to be explaining?
SS: The computational paradigm in cognitive science aims to provide a complete scientific account of our mental lives, from the mechanisms underlying our memory and attention to the computations of the single neuron. The Language of Thought program (LOT) is one of two leading positions on the computational nature of thought, the other being a neural network based approach advanced by (inter alia) the philosophers Paul and Patricia Churchland.
According to LOT, humans and even non-human animals think in a lingua mentis, an inner mental language that is not equivalent to any natural language. This mental language is computational in the sense that thinking is regarded as the algorithmic manipulation of mental symbols, where the ultimate algorithm is to be specified by research in the different fields of cognitive science. The “Computational Theory of Mind” holds that part or all of the brain is computational in this algorithmic sense. In my book on LOT, I urged that both approaches are insightful; the brain is probably a hybrid system — being both a symbol processing engine, and having neural networks. In particular, deliberative, conscious thought is symbolic, but it is implemented by neural networks.
3:AM: The problems are about computationality, symbols, and Frege, aren’t they? Can you say what’s wrong?
SS: Sure. Several problems have plagued the LOT approach for years: First, LOT’s chief philosophical architect, Jerry Fodor, has argued the cognitive mind is likely non-computational. Fodor calls the system responsible for our ability to integrate material across sensory divides and generate complex, creative thoughts “the central system.” Believe it or not, Fodor holds that the brain’s “central system” will likely defy computational explanation. One of his longstanding worries is that the computations in the central system are not feasibly computed within real time. For if the mind truly is computational in a classical sense, when one makes a decision one would never be able to determine what is relevant to what. For the central system would need to walk through every belief in its database, asking if each item was relevant. Fodor concludes from this that the central system is likely non-computational. Shockingly, he recommends that cognitive science stop working on cognition.
Hannah Proctor in The New Inquiry:
“She had suffered an acute attack of ‘love’—the name given to a disease of ancient times when sexual energy, which should be rationally distributed over one’s entire lifetime, is suddenly concentrated into one inflammation lasting a week, leading to absurd and incredible behavior.” —Vladimir Mayakovsky, The Bedbug
In summer 1956, six tons of books were thrown by court order into the public incinerator on 25th Street in New York City. Those smouldering pages were written by Wilhelm Reich, who died in jail shortly thereafter, infamously denounced as the fraudulent peddler of “orgone,” a mystical cosmic life force. As a young communist psychoanalyst in interwar Vienna, Reich had argued that capitalism unhealthily restrains primal sexual instincts, and that a genuine political revolution would shatter the constraints of bourgeois sexual morality, unleashing sexual energies through a kind of wild orgasmic release.
In 1929, Reich visited the Soviet Union, where psychoanalysis would soon be outlawed, and was rather scathing of the psychologists he met there, including one of his hosts, Aron Zalkind, a leading figure in the psychological community in Moscow. Zalkind was the author of the influential treatise “12 Commandments for the Sexual Revolution of the Proletariat,” first published in 1925, which argued that the capitalist free market was incompatible with what he somewhat confusingly called “free love,” given that he meant something like the opposite of what it meant in the 1960s. Unlike Reich, whose prurient embrace of unrestrained lovemaking was to be enthusiastically championed during the “sexual revolution” of the 1960s, Zalkind advocated sexual abstinence as the appropriate conduct for the revolutionary proletariat.
During the period of the New Economic Policy (1921–1928), which saw the reintroduction of certain forms of private enterprise into the Soviet economy, sexual relations were being renegotiated for both ideological and practical reasons. As the heroine of Feodor Gladkov’s 1925 novel Cement observes: “Everything is broken up and changed and become confused. Somehow love will have to be arranged differently.” But how exactly love was to be arranged was unclear.