Tuesday, May 24, 2016
Jacob Harris in The Atlantic:
One of the joys of modern technology is how easy it is to immerse yourself in the past. Every day, more libraries and archives are pushing pieces of their collections online in easily browsable interfaces.
The New York Public Library, for instance, has historic menus and interactive floor plans. Chronicling America is a searchable repository of newspapers published between 1836 and 1922 from the Library of Congress, which is also one of the many institutions in the Flickr Commons public image archive. Wikipedia has its own Wikimedia Commons, to which anybody can upload images and videos. Project Gutenberg continues to add new public-domain books to its collection every day, and New York’s Metropolitan Museum of Art has posted thousands of images online with metadata as part of its Open Access for Scholarly Content initiative.
My personal favorite, however, is TimesMachine, a site available to all New York Times subscribers that lets readers virtually flip through any historical issue of The New York Times all the way up through 2002. The site delivers the reader directly to the past, making you feel like a cross between a tourist and an archaeologist. You might start by visiting a historic event—say, coverage of the Titanic sinking—but the real fun is wandering off the beaten path and exploring all the other news of the day. On the same day the Titanic sank, there was also coverage of a gun battle in Greenwich Village, and a passenger lost in a runaway balloon. On any day, such vignettes sometimes become rabbit holes to the past.
This is the story of how I ended up captivated by a chance encounter with a 135-year-old newspaper advertisement—and how the random face staring back at me from the archives would reveal the surprising origins of ASCII art, a graphic design technique that’s usually associated with 20th-century computer art.
Alex Wellerstein in The New Yorker:
The demonstration began on the afternoon of May 21, 1946, at a secret laboratory tucked into a canyon some three miles from Los Alamos, New Mexico, the birthplace of the atom bomb. Louis Slotin, a Canadian physicist, was showing his colleagues how to bring the exposed core of a nuclear weapon nearly to the point of criticality, a tricky operation known as “tickling the dragon’s tail.” The core, sitting by itself on a squat table, looked unremarkable—a hemisphere of dull metal with a nub of plutonium sticking out of its center, the whole thing warm to the touch because of its radioactivity. It had been quickly molded into shape after the bombing of Nagasaki, to be used in another attack on Japan, then reallocated when it turned out not to be needed for the war effort. At that time, Slotin was perhaps the world’s foremost expert on handling dangerous quantities of plutonium. He had helped assemble the first atomic weapon, barely a year earlier, and a contemporary photograph shows him standing beside its innards with his shirt unbuttoned and sunglasses on, cool and collected. Back then, the bomb was a handmade, artisanal product.
Slotin’s procedure was simple. He would lower a half-shell of beryllium, called the tamper, over the core, stopping just before it was snugly seated. The tamper would reflect back the neutrons that were shooting off the plutonium, jump-starting a weak and short-lived nuclear chain reaction, on which the physicists could then gather data. Slotin held the tamper in his left hand. In his right hand, he held a long screwdriver, which he planned to wedge between the two components, keeping them apart. As he began the slow and painstaking process of lowering the tamper, one of his colleagues, Raemer Schreiber, turned away to focus on other work, expecting that the experiment would be uninteresting until several more moments had passed. But suddenly he heard a sound behind him: Slotin’s screwdriver had slipped, and the tamper had dropped fully over the core. When Schreiber turned around, he saw a flash of blue light and felt a wave of heat on his face.
Edward Docx in Prospect:
Dylan turns 75 on 24th May. For millions of devotees like myself—many of whom consider him the world’s greatest living artist—it is a moment of celebration tinged with apprehension. Joan Baez, his most significant early anointer-disciple (Joan the Baptist), best expresses what might be described as “the Dylan feeling” in Martin Scorsese’s excellent 2005 documentary when she says: “There are no veils, curtains, doors, walls, anything, between what pours out of Bob’s hand on to the page and what is somehow available to the core of people who are believers in him. Some people would say, ‘not interested,’ but if you are interested, he goes way, way deep.” I love this for lots of reasons but most of all because it captures not only the religious devotion that many who love him feel, but also the bemused indifference of the sane and secular who do not.
Of course, the first order of business when writing about Dylan is to urge readers to ignore writers who write about Dylan. We are like Jehovah’s Witnesses, forever tramping door to door with our clumsy bonhomie and earnest smudgy leaflets; in all honesty, you would be much better off seeking out the resonant majesty of the actual work. Indeed, you’ll be relieved—and possibly endeared—to hear that Dylan himself considers his disciples to be deranged. “Why is it when people talk about me they have to go crazy?” Dylan asked in a recent interview for Rolling Stone. “What the fuck is the matter with them?”
Ralph Jones in New Humanist:
Few atheists know the Bible as intimately as Dan Barker. Few, after all, can profess to have begun their careers as fundamentalist Christian preachers. Currently co-president of the Freedom from Religion Foundation, an American non-profit organisation, Barker was a self-proclaimed “extremist” for 19 years, until he renounced the faith. Given how vehemently the 66-year-old now defends a life free of any supernatural authority, I ask him if he regrets the consequences that his Christian ministry may have had on people he would now describe as vulnerable. “Yes, I do regret a lot of it,” he says with candour. “I would counsel people to pray for healing. That’s dangerous. That’s harmful. People die from that. And I acted irresponsibly with my health, because I knew that God was going to take care of me.” This is a window that, once opened, is difficult to close. Barker reels off multiple instances in which he believes that he seriously damaged the lives of his parishioners.
In Arizona, a woman approached him, looking for faith healing to cure her of an illness. The two prayed together and when, inevitably, it did nothing, he said, “Let it be unto you according to your faith” (a reference to a line originally found in Matthew 8:13). “In other words,” Barker says, “it was her fault. She walked out of that meeting not only not healed but feeling chastised. It’s not a kind way to treat another human being.” In his mid-twenties, he counselled a woman who was struggling with an abusive husband. Barker told her to persevere with him because, as the Bible says, he would eventually see the light. “So I counselled a woman to stay in an abusive relationship, because the Bible says that you are married for life.” What would he say if she approached him with the same problem now? “I would tell her to run for the nearest shelter and get out of there.” Barker may have left religion behind but he is still a preacher of sorts. His latest book, God: the Most Unpleasant Character in All Fiction, draws on his knowledge of scripture to attack the Bible’s claim to moral authority.
...I am interested in Barker’s views on Donald Trump, the man taking alarmingly large strides up the escalator of US politics. “It seems to me that there’s an awful lot of shallow support for people like Trump,” he says. The Republican candidate has appealed to the supposed “Christian” character of the US as a way to mobilise prejudice against Muslims. His followers seem to believe that he is a Christian but Barker sees this more as identity politics than evangelism. “He doesn’t know that much about the Bible. He doesn’t speak the Christian lingo.”
Jennifer Hackett in Scientific American:
For most people a single bee or wasp sting is one too many. But University of Arizona entomologist Justin Schmidt is a dramatic exception: By his own estimation he has been stung more than 1,000 times by at least 80 kinds of insects as part of his job. After unintentionally collecting a few different types of stings while conducting fieldwork to investigate the social behavior of stinging insects, Schmidt decided to take a cue from medical science and create a sting pain index that ranked each sting on a scale of 1 to 4 with eloquent, almost poetic descriptions of the pain (or lack thereof) they caused. The scale, Schmidt hoped, would help reveal how the ability to sting—and the type of sting delivered—serve different insects and enable their respective social structures.
In his new book The Sting of the Wild, which came out this week and was published by Johns Hopkins University Press, Schmidt explains the roles of stings in insect society in great detail. He devotes chapters to how different insects inflict their respective flavors of pain, covering creatures from fire ants to tarantula hawk wasps to honeybees. For the first time, Schmidt’s full sting pain index and his thoughts on each experience—including such comments as “like coffee, but oh so bitter” for a low-level sting or “like spilling a beaker of hydrochloric acid on a paper cut” for a higher one—are published at the end of the book. Even though the pain-laced topic might leave you wincing, Schmidt’s engaging and entertaining writing makes for a tale worth reading.
Afternoon in Siena
Soon I will know this room.
It will have become familiar.
Then sometime after I’ve left
they’ll rent it to another writer
or student, a couple on holiday
for a long weekend.
For now I’ll try to fix it in my mind,
this ordinary room with its cold
tile floor without a rug,
the low chair and ugly wardrobe
with its foxed glass,
the shuttered windows that open
onto the narrow street where
in the evening a small dog yaps
and yelps beneath the washing line,
the purple canopy of wisteria.
And in the corner, of course,
the messy bed, where in another life
we might have made love -
the afternoon sun
bathing us in liquid light -
if only I knew who you were.
by Sue Hubbard
Monday, May 23, 2016
by Scott F. Aikin and Robert B. Talisse
Argumentation is the term used to denote the activity of arguing with a real interlocutor, in real time, over claims that are actually in dispute. When argumentation is properly conducted, the parties involved exchange arguments, objections, criticisms, and rejoinders, all aimed at discerning the truth (or at least what one would be most justified in accepting to be true). To be sure, argumentation does not always result in a consensus among disputants; even when argumentation is impeccably conducted, disagreement often persists. But this is no strike against argumentation, for a few reasons. First, the open exchange of reasons, evidence, and criticism is, after all, the best means we have for rationally resolving disputes and pursuing the truth. Insofar as we want a rational resolution, this is not only our best means, it's our only means. Furthermore, even when argumentation does not dispel disagreement, it can provide disputants with a firmer grasp of precisely where they differ. So even if argument doesn't yield consensus, it does yield fecundity. And, as John Stuart Mill famously observed, understanding the views of one's critics is an essential element of understanding one's own views.
We have frequently claimed in this column that argumentation comes naturally to human beings. People aspire to form and maintain true beliefs and eschew false beliefs, and the central way in which they enact this aspiration is by arguing with each other. Of course, that people are naturally disposed to engage in argumentation does not entail that people are naturally adept at it. The pitfalls of human reasoning are abundant, and there is rightly a substantial academic industry devoted to identifying, studying, and cataloguing them.
Yet detecting argumentative pitfalls is itself part of the activity of argumentation. When we argue, to be sure, we argue about things, and so most argument has all the vocabulary of any other talk about the world. But when we argue, we aren't just looking at the things we are talking about; we are evaluating what we've said as reasons. And so we must have a vocabulary that doesn't merely track the things we are talking about, but also tracks how we've talked about them. That's what it is to assess whether someone's reasoning is acceptable. The issue isn't only whether you accept what an interlocutor says, but also how the things they say logically relate to each other.
by Daniel Ranard
Math works pretty well. We can count apples and oranges; we can scribble equations and then launch a rocket that lands gently upright. When an argument is indisputable, we colloquially say "do the math," and we speak of events that will happen with "mathematical certainty." Math works so well that you're forced to wonder: why, and what does it mean about our world? I won't fully answer these questions, but I'll offer a few perspectives.
You don't need to know much math to see it works. Say you go apple-picking with a friend; you count 12 as you pick them, and your friend counts 19 of her own. How many apples are in the basket? Maybe you crunch the numbers on a scrap of paper, just to be sure. You manipulate symbols on a page, and afterward you make a claim about reality: you know how many apples you would count if you pulled them out.
But was that math or just common sense? If you're not impressed by addition, let's try multiplication. I suspect many of us encounter our first real mathematical "theorem" when we learn that A times B is B times A. As Euclid wrote circa 300 BC, "If two numbers multiplied by one another [in different orders] make certain numbers, then the numbers so produced equal one another." This fact may be so familiar you forget its meaning: 4 x 6 = 6 x 4, or rather 6 + 6 + 6 + 6 = 4 + 4 + 4 + 4 + 4 + 4. It may be obvious, but a curious child would still ask, why? The equation demands proof, much like the Pythagorean Theorem. Euclid gave a proof in Book VII, Proposition 16 of the Elements. And though he proved an abstract fact using abstract symbols, the world seems to obey this arithmetic rule: if you have four groups of six apples, Euclid predicts you can always rearrange them into six groups of four.
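One way to see why the rule must hold for any pair of whole numbers, not just four and six, is the familiar grid-counting argument. The sketch below is the standard modern demonstration, offered only as an illustration; it is not Euclid's own proof, which proceeds through his theory of proportion. Arrange the apples in a rows of b apples each, and count the same grid two ways:

\[
\underbrace{b + b + \cdots + b}_{a\ \text{rows}} \;=\; \#\{(i,j) : 1 \le i \le a,\ 1 \le j \le b\} \;=\; \underbrace{a + a + \cdots + a}_{b\ \text{columns}}
\]

Counting row by row gives a copies of b; counting column by column gives b copies of a. Both counts enumerate the very same apples, so the totals must agree, and a x b = b x a.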
Maybe it's no surprise we can use arithmetic to make these predictions. But what about the success of more sophisticated math and physics?
"I got my own pure little bangtail mind and
the confines of its binding please me yet."
~ Neal Cassady, letter to Jack Kerouac
One of the curious phenomena that computing in general, and artificial intelligence in particular, has emphasized is our inevitable commitment to metaphor as a way of understanding the world. Actually, it is even more ingrained than that: one could argue that metaphor, quite literally, is our way of being in the world. A mountain may or may not be a mountain before we name it - it may not even be a mountain until we name it (for example, at what point, either temporally or spatially, does it become, or cease to be, a mountain?). But it will inhabit its ‘mountain-ness' whether or not we choose to name it as such. The same goes for microbes, or the mating dance of a bird of paradise. In this sense, the material world existed, in some way or other, prior to our linguistic entrance, and these same things will continue to exist following our exit.
But what of the things that we make? Wouldn't these things somehow be more amenable to a purely literal description? After all, we made them, so we should be able to say exactly what these things are or do, without having to resort to some external referents. Except we can't. And even more troubling (perhaps) is the fact that the more complex and representative these systems become, the more irrevocably entangled in metaphor do we find ourselves.
In a recent Aeon essay, Robert Epstein briefly guides us through a history of metaphors for how our brains allegedly work. The various models are rather diverse, ranging from hydraulics to mechanics to electricity to "information processing", whatever that is. However, there is a common theme, which I'll state with nearly the force and certainty of a theorem: the brain is really complicated, so take the most complicated thing that we can imagine, whether it is a product of our own ingenuity or not, and make that the model by which we explain the brain. For Epstein - and he is merely recording a fact here - this is why we have been laboring under the metaphor of brain-as-a-computer for the past half-century.
by Jonathan Kujawa
(this is the sequel to last month's 3QD essay on the Pancake Problems)
I frequently come across a rafter of wild turkeys on bike rides through the countryside near my home. This particular group is recognizable thanks to having a peahen as an honorary member. Just this morning I was treated to a startling surprise: the peahen was busily herding a brood of chicks! I would have thought peacocks and turkeys were too distantly related to successfully breed. Apparently nobody told the peahen. I haven't seen any other peacocks in the neighborhood, so it would seem that she is more than friends with one of her turkey buddies. According to the internet, peacock/turkey hybrids (turcocks? peakeys?) are a thing which can happen.
Going by looks and their natural geographic ranges, my wrong guess was that peacocks and turkeys should be pretty distant on the tree of life. In the not-too-distant past, classification of species depended on such observational data.
Nowadays we can dig directly into the DNA to look for answers about relatedness. In the past decade it became possible to sequence the entire DNA of an organism. Not only that, but it's become fast and cheap. In fifteen years we've gone from the Human Genome Project, which took thirteen years and $2.7 billion to sequence the human genome, to being able to do it in days for $1,000. The progress in this field puts Moore's Law to shame.
It's one thing to have the data, it's another to put it to use. To deal with the flood of information pouring out of DNA sequencers an entirely new field called computational molecular biology has sprung up. It's a wonderful combination of biology, mathematics, and computer science.
A good example of this is turnips. Looking at them in the garden you might guess that they are more closely related to radishes than to cabbage. In the 1980s Jeffrey Palmer and his collaborators looked at the mitochondrial genomes of turnips and cabbage and found that the genes they contained were nearly identical. What was different was the order of those genes. The random mutations which occurred over the years didn't change the genes themselves, only their position in the DNA.
Even better, Palmer and company saw that the rearrangements which occur are of a specific kind. When a mutation occurs, a segment of DNA consisting of some number of genes is snipped out, flipped around, and put back in, now in reverse order. For example, if the genes were the numbers one through five, a typical sequence of mutations might look like:

1 [2 3 4] 5 → 1 [4 3] 2 5 → 1 3 4 [2 5] → 1 3 4 5 2

Here at each step the segment of genes about to be snipped out and reversed is marked with brackets. Because each mutation reverses the order of some of the genes, folks call it a reversal.
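To make the operation concrete, here is a minimal sketch in Python of how a single reversal acts on a list of genes. The function name and the list encoding are illustrative assumptions for this essay, not anything drawn from Palmer's papers:

# One reversal: snip out the segment from index i to j (inclusive),
# flip it, and splice it back into the same place.
def reversal(genes, i, j):
    return genes[:i] + genes[i:j+1][::-1] + genes[j+1:]

# Replaying the example above, starting from genes one through five:
sequence = [1, 2, 3, 4, 5]
sequence = reversal(sequence, 1, 3)  # 1 [2 3 4] 5  ->  [1, 4, 3, 2, 5]
sequence = reversal(sequence, 1, 2)  # 1 [4 3] 2 5  ->  [1, 3, 4, 2, 5]
sequence = reversal(sequence, 3, 4)  # 1 3 4 [2 5]  ->  [1, 3, 4, 5, 2]
print(sequence)  # [1, 3, 4, 5, 2]

Computational biologists then ask how few reversals are needed to turn one gene order into another; that minimum, the reversal distance, is a natural measure of how far two genomes have drifted apart.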
S. Abbas Raza. Untitled, 2016.
by Brooks Riley
In the last year, two extraordinary events have indelibly changed the immediate course of history, for better or worse. In an utterly surprising move, Germany, led by Chancellor Angela Merkel, spontaneously accepted over a million refugees, most of them from the war in Syria, only slightly changing the demographic landscape of that rich, stable, mature and responsible democracy, but making a much bigger splash.
This year, for reasons that are still unclear, America's Top Wild Card has all but bagged the Republican nomination.
The two events are unrelated, and yet they serve to make one ponder the nature of nationhood and expectation. The two protagonists of these events could not be more different. So too their nations.
Trump has succeeded in the land of the free-for-all, a place where narcissism is rewarded with undivided attention. Trump has just about won the Republican nomination, not because he's the best man, not because he knows jack all about governing, government, foreign policy or any other policy, not because he's rich, not because he's got a new vision, not because he's promised the moon, not because he wants to help the poor, but because he's loud. He's so loud that we can hear him all the way over here on the other side of the Atlantic.
by Genese Sodikoff
There is the nightmare of fecundity and the nightmare of the multitude. There is the nightmare of uncontrolled bodies and the nightmare of inside our bodies and all over our bodies. There is the nightmare of unguarded orifices and the nightmare of vulnerable places. There is the nightmare of foreign bodies in our bloodstream and the nightmare of foreign bodies in our ears and our eyes and under the surface of our skin.
—Hugh Raffles, Insectopedia
I am writing anthropological stories of zoonosis, disease that spills over from animals to humans and then potentially spreads person-to-person. A zoonosis may erupt into an alarming epidemic (Ebola, HIV/AIDS), or may idle in a reservoir host as an ever-present threat (rabies, Lyme disease, hantavirus). Insects often vector these diseases by sucking up the tainted blood of an animal and injecting it into human skin. Zoonosis can encompass parasitic infections too, such as when larvae afloat in the drinking water or nestled in the litter box penetrate our bodies and mature into worms that make us sick. By some definitions, zoonosis and vector-borne diseases are distinct categories, even though viruses and bacteria introduced by insects into human populations may have originally been lifted from an animal.
Beyond the role of vector, there's another kind of insect that acts more as a disease server. It wears pathogens like foundation, coated with bacteria, viruses, fungi, and larval cysts, as it goes about its business. Chief among these is the cockroach, whose glossy cuticle teems with unwholesome microbes. Since the cockroach does not convey pathogens from vertebrate animals to humans, it does not transmit zoonotic disease, properly understood. Instead it traffics pathogens that are just out there, free floating in the dwellings and detritus of humanity, and deposits them on our food and our wounds. Cockroaches are responsible for introducing Staphylococcus into hospitals and spreading antibiotic-resistant bacteria. They sprinkle kitchen counters and cabinets with Salmonella, Shigella, and E. coli. They truck Hepatitis A from sewers into homes. If that isn't enough, their odiferous droppings and sloughed-off skins trigger asthma attacks. The list goes on.
by Olivia Zhu
When I first get to meet Johnny B.A.N.G. Reilly, he looks tired. Really tired, leaning back away from his computer screen with most of his head cocooned tight in a sweatshirt hood. The light is wan, his hood is grey, and his famous voice is at first raspy and subdued. As quiet as he is, though, Reilly speaks in punctuated, verse-like phrases. His responses to my questions seem to arrive as fully formed from his head as do the spoken-word “visual” poems he has become known for.
Chief among these is “Dear Brother,” a spec ad for Johnnie Walker created by two students, Daniel Titz and Dorian Lebherz. Since the video was uploaded half a year ago, it has amassed over four million views and plenty of praise—including some for the haunting poem and voiceover by Reilly.
“Dear Brother” was, in fact, how I learned about Reilly in the first place. He somehow has the ability to sound joyous and heartbroken in the same breath, with words timed so they roll out perfectly at the last possible second to still sound melodic. That perfect rhythm might be attributed to his time as a street dancer, or as a mixed martial artist. Yet “my rhythm comes from what’s actually beating in my chest,” says Reilly. After suffering a heart attack due to a former drug habit, he experienced irregular heartbeats that sped up and slowed down, informing the cadence of his poems. He rushes and pauses and sometimes drops single syllables, leaving them to float amidst longer phrases.
The timing, the gravel-in-the-sun voice—they make Reilly’s work distinct. However, the YouTube video makes it clear that the filmmakers who created “Dear Brother” credit themselves, along with Reilly, for the creation of the poem. In the comments, they note that “It was written by voice actor John ‘Bang’ Reilly in collaboration with us.” Reilly disagrees.
‘Home Had Come Here': Connective Dissonance and Split Selves in Leila Aboulela's "The Translator" and Elif Shafak's "Honour"
by Claire Chambers
Leila Aboulela's debut novel The Translator (1999) is about a love affair between a Sudanese translator, Sammar, and her employer, the Scottish lecturer Rae Isles. Turkish novelist Elif Shafak similarly handles various transcultural love affairs in her 2012 novel Honour, but is more concerned with their darker aspects of jealousy and disgrace. Both novels contain the repeated motif of a new migrant from a Muslim background finding it hard to adjust to her new life in Britain, living as though she were still in the home country.
In The Translator, Sammar sometimes observes a British object or phenomenon and is transported back imaginatively to Sudan. We see this connective dissonance when Scottish central heating pipe noises call to Sammar's mind the azan or Muslim call to prayer. Sammar also attempts to recapture the tropical weather she is accustomed to by spending time in Aberdeen's heated Winter Gardens.
In Honour, the fractured identity of the migrant is dramatized most vividly through the split selves of Kurdish twins Pembe (who moved to Britain) and Jamila (who stayed at home in Turkey). Even as children, each girl's subjectivity is inseparable from that of her twin. For example, Pembe's father takes her miles away from Jamila to get a rabies injection, but the sister cries out in pain at the same moment the shot is administered. As the narrator puts it, 'When one closed her eyes, the other one went blind. If one hurt, the other bled'. This is an idea of connection drawn from Islam, since in a hadith Mohammed describes the indivisible nature of the ummah or global community of believers as being like 'that of one body; when any limb of it aches, the whole body aches'.
To theorize the translocal disconnection that makes the UK veer off into Sudan, Turkey, or elsewhere for diasporic writers, I reach for Jahan Ramazani's A Transnational Poetics and for Derek Gregory's analysis of imagined geographies as 'doubled spaces of articulation' in The Colonial Present. As a geographer, Gregory is alert to both the linkages and the severances that are caused by globalization. He offers the term 'connective dissonance', which is helpful in allowing insight into the frequent moments in these novels at which characters experience the world swinging around and Britain becoming Sudan/Turkey or vice versa.
by Jalees Rehman
Lingulodinium polyedrum is a unicellular marine organism which belongs to the dinoflagellate group of algae. Its genome is among the largest found in any species on this planet, estimated to contain around 165 billion DNA base pairs – roughly fifty times larger than the human genome. Encased in magnificent polyhedral shells, these bioluminescent algae became important organisms for studying biological rhythms. Each Lingulodinium polyedrum cell contains not one but at least two internal clocks which keep track of time by oscillating with a period of approximately 24 hours. Algae maintained in continuous light for weeks continue to emit a bluish-green glow at what they perceive as night-time and swim up to the water surface during day-time hours – despite the absence of any external time cues. When I began studying how nutrients affect the circadian rhythms of these algae as a student at the University of Munich, I marveled at the intricacy and beauty of these complex time-keeping mechanisms that had evolved over hundreds of millions of years.
Over the course of a quarter of a century, I have worked in a variety of biological fields, from these initial experiments in marine algae to how stem cells help build human blood vessels and how mitochondria in a cell fragment and reconnect as cells divide. Each project required its own set of research methods and techniques, each project came with its own failures and successes. But with each project, my sense of awe for the beauty of nature has grown. Evolution has bestowed this planet with such an amazing diversity of life-forms and biological mechanisms, allowing organisms to cope with the unique challenges that they face in their respective habitats. But it is only recently that I have become aware of the fact that my sense of biological beauty was a post hoc phenomenon: Beauty was what I perceived after reviewing the experimental findings; I was not guided by a quest for beauty while designing experiments. In fact, I would have been worried that such an approach might bias the design and interpretation of experiments. Might a desire for seeing Beauty in cell biology lead one to consciously or subconsciously discard results that might seem too messy?
I was prompted to revisit the role of Beauty in biology while reading a masterpiece of scientific writing, "Dreams of a Final Theory" by the Nobel laureate Steven Weinberg, in which he describes how the search for Beauty has guided him and many fellow theoretical physicists in the search for an ultimate theory of the fundamental forces of nature. Weinberg explains that it is quite difficult to precisely define what constitutes Beauty in physics, but a physicist would nevertheless recognize it when she sees it.
by Muhammad Aurangzeb Ahmad
“Who controls the past controls the future: who controls the present controls the past.”
― George Orwell, 1984
These days every other person seems to be concerned about the future of Islamic Civilization. From the Islamists and the traditionalists to the Liberals and the Conservatives, almost everyone seems to have a stake in the future of Islam. While these different groups may have different visions of the future, they do have one thing in common – they almost always define the future in terms of the past: from the Salafis harkening back to a supposed era of purity, to the academics yearning for the Golden Age of Islam, to the more recent Ottoman nostalgia in Turkey and the wider Middle East. The study of history becomes paramount in such an encounter, since a distorted view of the past can become a potentially unrealizable view of the future.
As any historian will tell us, each group reads history in terms of its own aspirations and agenda. For the Muslim world in general, nostalgia for the past usually centers on reviving its glories. The danger here is that one may start living in a non-existent romanticized past and be condemned to repeat its mistakes. In the West, every other political pundit seems to be calling for an Islamic Reformation, even though parallel religious structures do not exist in Islam. What do these visions of the future-past look like, and what can be learned from them?
by Dwight Furrow
We who are absorbed in the philosophy of wine are usually preoccupied by questions about objectivity, meaning, the nature of taste, aesthetic properties, and other exotica that surround this mysterious beverage. But wine considered as an aesthetic object can never be wholly severed from the commercial aspects of wine, and no philosophy of wine is complete without taking into account the influence of commercial categories.
If you stand perplexed before the thousands of choices available on the wine aisles of your supermarket, or if it all tastes like fermented grape juice to you, here is a primer on distinguishing the good stuff from the ordinary.
Any discussion of wine quality must begin with a distinction between commodity wines and premium or fine wines. Commodity wines usually sell for under $15, although the “commercial premium” sector is growing rapidly and pricier wines will increasingly fall into this category. A quality commodity wine is reliable and familiar, with no obvious flaws, easy to drink and designed for immediate consumption. It will spring no surprises that would offend the casual drinker. Unlike the situation 20 years ago, when $10 might have bought you an attractively packaged bottle of battery acid, there are few bad wines on the market today. The technology of mass wine production has made extraordinary advances. Wine connoisseurs will think these wines uninteresting, but they may be full of flavor, food-friendly, and satisfying to drink.
by Shadab Zeest Hashmi
At an Australian cricket club in Manhattan so dimly lit that I lift the nearest tea light to see the menu, I feel oddly at home; it is likely the haze of cultural nostalgia in these surroundings. As a Pakistani I grew up with cricket, though I’m not much of a fan and have had little direct exposure to it apart from a single visit to Lord’s Cricket Ground in London, where I chose to talk to the gardener about the oldest trees on the premises instead of taking the famous cricket tour with my family. But here in New York, in the company of the Kashmiri-American poet Rafiq Kathwari, in the midst of cricket paraphernalia, framed action shots and lived-in colonial furniture, I’m on an unexpected bridge to a familiar time and place. I’m struck by the complexity of this nostalgia as I talk about life in our other countries, countries of birth, countries awaiting a rebirth, with the author of “In Another Country,” a collection of poems.
We talk about the personal-political in poetry. Cricket as a post-raj cultural idiom becomes even more poignant when I’m reminded of the traumas of Kashmir tied to the partition of India and Pakistan and of the intense political friction between the two countries that manifests itself every time the two countries are engaged in a cricket match against each other. Kashmir doesn’t play. It is played. And the political game is the ghost of the “great game” that the British began and that the Indian and Pakistani governments continue to play.
In South Asia, we know Kashmir as the land of immense natural beauty, mystics and poets, and a culture of great aesthetic delicacy and depth. In the West, Kashmir is synonymous with wool; not many know or care about the place or its long history of conflict.
As a voice of Kashmir and of the Kashmiri diaspora, Rafiq Kathwari’s most phenomenal gesture in this book of narrative poems that probe the psycho-social, the historical and the political is his “protagonist” of sorts—the most haunting, fierce and charming persona in the book: his mother.
Sunday, May 22, 2016
Melissa Holbrook Pierson in the Los Angeles Review of Books:
Of the thousand images stored in my mind’s archives, there is only one of me holding a book. The result of what they call a flashbulb memory, where a shock imprints every detail of a scene on the mind forever, it permits me to view a single moment in my dorm at high school: were it not for the book, I would have forgotten everything — the peculiar darkness that used to fall across only half the room, its twin closets, the honey color of their wood, the fact that my hair reached the bottom of my shoulder blades in 1974. The book I hold, frozen in mid-turn, is Pilgrim at Tinker Creek.
Pilgrim blew apart what I knew about writing at age 16: up until I read it, my notions were based on the usual pack of novels, poetry, philosophy, and exposition, all of which stayed neatly in their categories. This book, though, bled across lines (sometimes quite literally; it included plenty of death and injury): it refused to be held to one purpose. It coursed like a river swollen with snowmelt in spring from thing to thing, from inner life to outer. Or, rather, it found the edge where mind meets world. Annie Dillard sang this line, loud and imperative.
I’d thought the stuff I had spent my youth doing was something I’d come up with all on my own, and (to the mind of a self-doubting girl) must therefore be unimportant; but now I’d found someone who made a literature of wandering alone in the woods, watching, listening, poking at flora and fauna, describing views and pieces of nature, and trying to make a whole of her experience.
As he appears in new documentary The Divide, the great intellectual explains why Brexit is unimportant, why Trump’s climate change denial is catastrophic – and why revolution is easier than you think.
Leo Benedictus in The Guardian:
You talk about capitalism, politics and inequality a lot. Do you ever tire of it? Do you ever wish someone would ask you about something else? Well, from my point of view, there are two major categories of issues. There are the kind that are humanly important but intellectually pretty shallow. There are the kind that are intellectually quite deep and challenging, but don’t have the immediate human significance. If I had my choice, I’d rather stay on the second, but unfortunately the world won’t go away.
Do you not feel you’ve had enough sometimes? It’s like seeing a child in the street and a truck coming rapidly. Do you say, “Look, I’m too busy thinking about interesting questions, so I’ll let the truck kill the child”? Or do you go out into the street and pull the child back?
But if it was another child, every day, for decades? It doesn’t matter. I remember the philosopher Bertrand Russell was asked why he spent his time protesting against nuclear war and getting arrested on demonstrations. Why didn’t he continue to work on the serious philosophical and logical problems which have major intellectual significance? And his answer was pretty good. He said: “Look, if I and others like me only work on those problems, there won’t be anybody around to appreciate it or be interested.”
Brian Boutwell & J.C. Barnes in Nautilus:
For the past few years, social scientists have been buzzing over a particular topic in molecular biology—gene regulation. The hype has been building steam for some time, but recently, it rocketed to the forefront of public discussion due to a widely circulated piece in the New Yorker. Articles on the topic are almost always fascinating: They often give the impression that this particular area of biology stands poised to solve huge mysteries of human development. While that conclusion may be appropriate in fields like medicine and other related disciplines, a number of enthusiasts have openly speculated about its ability to also explain lingering social ills like poverty, crime, and obesity. The trouble is, this last bit isn’t really a feeling shared by many of the genetics experts.
Social scientists’ excitement surrounds what we can refer to broadly as transgenerational epigenetics. To understand why social scientists have become enamored with it, we must first consider basic genetics. Many metaphors exist for describing and understanding the genome; they all capture the reality that genes provide the information for building and running biological machinery like the human body.
From the moment sperm manages to infiltrate an egg cell, genes (segments of our DNA that ultimately produce proteins) are at work knitting together the necessary components to make life possible. This requires exquisite coordination. Even though every cell in your body (minus red blood cells) carries your complete genetic code, not every gene is “turned on” all at once all over the body.