Sunday, June 26, 2016
Jerry Brown in the New York Review of Books:
I know of no person who understands the science and politics of modern weaponry better than William J. Perry, the US Secretary of Defense from 1994 to 1997. When a man of such unquestioned experience and intelligence issues the stark nuclear warning that is central to his recent memoir, we should take heed. Perry is forthright when he says: “Today, the danger of some sort of a nuclear catastrophe is greater than it was during the Cold War and most people are blissfully unaware of this danger.” He also tells us that the nuclear danger is “growing greater every year” and that even a single nuclear detonation “could destroy our way of life.”
In clear, detailed but powerful prose, Perry’s new book, My Journey at the Nuclear Brink, tells the story of his seventy-year experience of the nuclear age. Beginning with his firsthand encounter with survivors living amid “vast wastes of fused rubble” in the aftermath of World War II, his account takes us up to today when Perry is on an urgent mission to alert us to the dangerous nuclear road we are traveling.
Reflecting upon the atomic bombing of Hiroshima and Nagasaki, Perry says it was then that he first understood that the end of all of civilization was now possible, not merely the ruin of cities.
Hisham Matar in The Guardian:
I don’t remember a time when words were not dangerous. But it was around this time, in the late 1970s, when I was a young schoolboy in Tripoli, that the risks had become more real than ever before. There were things I knew my brother and I shouldn’t say unless we were alone with our parents. I don’t remember my mother or father explicitly telling us what not to say. It was simply implied and quickly understood that certain words strung together in a particular order could have grave consequences. Men were locked up for saying the wrong thing or because they were innocently quoted by a child. “Really, your uncle said that? What’s his name?” It was as though a listening, bad-intentioned ghost was now present at every gathering. It brought with it a new silence – wary and suspicious – that was to remain in our lives for many years. Even when I was writing my first novel in a shed in Bedfordshire, beside the River Great Ouse, I could feel the disapproving hot breath of the dictator at my neck. It did not matter that I was writing in English and yet to have a publisher; I was nonetheless writing into and against that silence. But back when I was still a boy, when I only lived in one language, that silence, like black smoke from a new fire, was still growing. Lists, drafted by the authorities, were read on television. They contained the names of those to be questioned. That was how, one afternoon, I heard our name, by which I mean my father’s name, read out. He was abroad. He did not return to Tripoli. A year or so later, we left the country to be reunited with him in Cairo where a new life began: new schools and new teachers.
Aziz Ansari in The New York Times:
“DON’T go anywhere near a mosque,” I told my mother. “Do all your prayer at home. O.K.?” “We’re not going,” she replied. I am the son of Muslim immigrants. As I sent that text, in the aftermath of the horrible attack in Orlando, Fla., I realized how awful it was to tell an American citizen to be careful about how she worshiped. Being Muslim American already carries a decent amount of baggage. In our culture, when people think “Muslim,” the picture in their heads is not usually of the Nobel Peace Prize winner Malala Yousafzai, Kareem Abdul-Jabbar or the kid who left the boy band One Direction. It’s of a scary terrorist character from “Homeland” or some monster from the news. Today, with the presidential candidate Donald J. Trump and others like him spewing hate speech, prejudice is reaching new levels. It’s visceral, and scary, and it affects how people live, work and pray. It makes me afraid for my family. It also makes no sense.
There are approximately 3.3 million Muslim Americans. After the attack in Orlando, The Times reported that the F.B.I. is investigating 1,000 potential “homegrown violent extremists,” a majority of whom are most likely connected in some way to the Islamic State. If everyone on that list is Muslim American, that is 0.03 percent of the Muslim American population. If you round that number, it is 0 percent. The overwhelming number of Muslim Americans have as much in common with that monster in Orlando as any white person has with any of the white terrorists who shoot up movie theaters or schools or abortion clinics.
Saturday, June 25, 2016
Kashmir Hill in Fusion:
The plot has been owned by the Vogelman family for more than a hundred years, though the current owner, Joyce Taylor née Vogelman, 82, now rents it out. The acreage is quiet and remote: a farm, a pasture, an old orchard, two barns, some hog shacks and a two-story house. It’s the kind of place you move to if you want to get away from it all. The nearest neighbor is a mile away, and the closest big town has just 13,000 people. It is real, rural America; in fact, it’s a two-hour drive from the exact geographical center of the United States.
But instead of being a place of respite, the people who live on Joyce Taylor’s land find themselves in a technological horror story.
For the last decade, Taylor and her renters have been visited by all kinds of mysterious trouble. They’ve been accused of being identity thieves, spammers, scammers and fraudsters. They’ve gotten visited by FBI agents, federal marshals, IRS collectors, ambulances searching for suicidal veterans, and police officers searching for runaway children. They’ve found people scrounging around in their barn. The renters have been doxxed, their names and addresses posted on the internet by vigilantes. Once, someone left a broken toilet in the driveway as a strange, indefinite threat.
All in all, the residents of the Taylor property have been treated like criminals for a decade. And until I called them this week, they had no idea why.
Annie Sparrow in the Bulletin of the Atomic Scientists:
Business and politics have always influenced international efforts to solve public health problems. Unfortunately that remains as true in the era of Ebola, Zika, and bioweapons as it did in the 19th century, when cholera—a disease that spreads more quickly and kills faster than any other pathogen—began its deadly global march. Beginning in 1817, cholera spread relentlessly from the Ganges Delta across Asia, reaching Europe in 1830 and North America in 1832, taking millions of lives along the way. It ultimately precipitated the first of 14 International Sanitary Conferences in 1851. At the time, the typical response to cholera was to quarantine ships traveling from affected areas, but this practice, which slowed commerce, was expensive and unpopular. The World Health Organization (WHO), whose origins lie in those early cholera pandemics, says they “were catalysts for intensive infectious disease diplomacy and multilateral cooperation in public health.” But in fact, the first six International Sanitary Conferences were entirely unproductive due to conflicting interests: government fears about losing profits from trans-Atlantic trade took priority over the need to reduce the international death toll. Consensus was achieved only at the seventh conference in 1892, after the opening of the Suez Canal for use by all countries made standardized quarantine regulations necessary. The participating states then unanimously approved and ratified the first of four International Sanitary Conventions, the forerunner of today’s International Health Regulations, which commit all governments to work toward stopping the spread of infectious disease and other global health threats.
In his introduction, TS Eliot hailed In Parenthesis as “a work of genius”. Graham Greene placed it “among the great poems of the century”. WH Auden claimed “it does for the British and Germans what Homer did for the Greeks and Trojans”; he wrote to Jones to tell him “your work makes me feel very small and madly jealous”. On entering a party and seeing Jones sitting in the corner, WB Yeats bowed low to “salute the author of In Parenthesis”.
Perhaps the most considered response came from Herbert Read, an ex-soldier himself, whose reviews of In Parenthesis are shot through not just with admiration, but also a sense of gratitude. “For the first time,” he wrote, “all the realistic sensory experiences of infantrymen have been woven into a pattern which, while retaining all the authentic realism of the event, has the heroic ring which we associate with the old chansons de geste … a book which we can accept as a true record of our suffering and as a work of art in the romantic tradition of Malory and the Mabinogion.”
Read’s acknowledgment of In Parenthesis’s ability to simultaneously contain the contemporary and the ancient, the literary and the demotic, the realistic and the mythic, and of the “pattern” underpinning its whole, are key to understanding the power of Jones’s achievement.
In 1964, Trillin captured an exchange with King that speaks to our current political moment. King was flying to Mississippi when a young white man with “a thick drawl” and self-identifying as a Christian leaned across the aisle and questioned whether King’s movement was teaching Christian love or inciting violence. King explained that “love with justice” was a basic tenet of the nonviolent civil rights movement, and asked him what he thought of the new civil rights law. The inquisitor said he hadn’t read it.
“I think parts of it just carry on the trend toward federal dictatorship,” the man said. King later asked him if he was going to vote for Goldwater, the Republican nominee. “Yes, I expect I will,” the man answered. “I’ve voted for losers before.” King shook his head as the white man exited the plane. “His mind has been cold so long, there’s nothing that can get to him.”
In today’s hostile political climate, when an air of fatalist resentment seems to emanate from supporters of Donald Trump, that conversation, with a change of names, could easily occur.
Demonstrating that racism extended beyond the South, Trillin wrote about the successful battle whites waged against integration in Denver schools in 1969. In “Doing the Right Thing Isn’t Always Easy,” he patiently debunks the coded language of white supremacy the segregationists used to warn of “forced mandatory crosstown busing on a massive scale.” By 2015, Trillin writes in an update, most of the city’s white residents have fled to the suburbs, and “only 29 of Denver’s 188 schools could be considered integrated.”
Though László Krasznahorkai’s early fictions were set in his native Hungary, over the past two decades he has turned to settings that cover the globe across much of historical time. He is suited to this wide range by his erudition, by the air of conviction in his long, oscillating sentences; above all because he is a writer temperamentally nowhere at home. His protagonists are wanderers, sometimes easily distinguished from their author, sometimes less so. Whether in Renaissance Florence, Muromachi Japan, New York or Berlin, they meet their surroundings with the foreigner’s mixture of curiosity and fear, and can count no homeland but the symbolic one of art.
Destruction and Sorrow beneath the Heavens is the most recent Krasznahorkai volume to appear in English; though it carries the subtitle “Reportage,” it differs from the fiction only in that its confusion and longing are not joined to outright peril. An authorial double—“Mr. László” or “Comrade László” in the Hungarian original, in translation called (at the author’s direction) László Stein—travels to China, together with a long-suffering interpreter, to seek out remnants of classical Chinese culture: “this last ancient civilization, this exquisite manifestation of the creative spirit of mankind.” The phrase “beneath the heavens” is a rendering of tian xia, the Confucian concept of an ordered universe in which earth is brought in accord with heaven. Structurally, Krasznahorkai is a religious writer, and his quests after aesthetic revelation take the form of pilgrimages; but in this case, the pilgrimage goes quickly and spectacularly awry. The scenic town of Zhouzhuang is horribly transformed to a flea market the moment the tourist buses arrive; the fabled “First Spring Under Heaven” in Zhenjiang has become a filthy, stagnant pond; the adjoining Jiangtian monastery, while keeping its outward form, has inwardly ceased to exist.
Ian Leslie in New Statesman:
Angela Lee Duckworth begins her book with a story that frames her life’s work as an act of retribution against her father. When Duckworth was a child, her dad would tell her, repeatedly, “You know, you’re no genius.” He was, she says, expressing the worry that she wasn’t intelligent enough to succeed in life. In 2013, aged 43, Duckworth felt able to show her father how wrong he had been. She was awarded a prestigious fellowship for her work on the relationship between character and success – specifically her identification of “grit” as a critical component, perhaps the critical component, of educational achievement. The unofficial name of the award: the MacArthur Genius Grant. Or maybe she proved him right. Duckworth’s work casts doubt on the very idea of genius. Her aim is to knock talent off its pedestal and replace it with strategically applied effort. Successful people, she argues, display a blend of passion and perseverance. They are motivated primarily by a love of what they do, as opposed to money or fame. They set long-term goals and seek to get better at what they do every day. They never give up, no matter what setbacks they suffer. Because grit is a practice, and not a gift, it can be learned.
Duckworth’s own success has been dazzling. Grit is already one of the best-known and most widely influential ideas to emerge from psychology in the past decade. Duckworth’s TED talk has been viewed well over eight million times. She has advised the White House, the World Bank, the National Basketball Association and Fortune 500 chief executives. In the US, universities and schools are implementing programmes to raise grit levels among their students. In the UK, the Education Secretary, Nicky Morgan, has announced measures to instil grit in disadvantaged pupils.
Pauls Toutonghi in The New York Times:
Enter 2016 — the election year of our discontent — which threatens to topple the country into a social chaos unseen since the late 1960s. Nearly two-thirds of Republican voters approve of a temporary ban on Islamic immigration. A mainstream presidential candidate has made xenophobia a central tenet of his campaign. In the first three months after terrorists attacked Paris in November, the rate of hate crimes against Muslims tripled in the United States. This is not an America with a robust and nuanced public discourse. And so the question must be asked: How much is our cultural marketplace to blame — where the narratives that sell most widely are ones that, arguably, do little to advance understanding, or even dialogue, across difference?
Into this maelstrom comes Ali Eteraz’s debut novel, “Native Believer.” Eteraz is the author of a memoir, “Children of Dust” (2009), that chronicled his journey from boyhood in a small town in central Pakistan to sex-obsessed adolescence in the American South to pious Islamic young adulthood to the broadly humanist activism that has marked his past 10 years. “Children of Dust” is, essentially, a description of the birth of “Ali Eteraz” — a pen name that translates to “Noble Protest,” which the author adopted several years after Sept. 11. Eteraz’s publisher has taken an admirable risk with “Native Believer.” I found myself wondering — as I sped through its pages with alternating interest, awe and queasiness — whether Eteraz had set out purposefully to challenge his imagined readership, to engage in a kind of “noble protest” against the demands of literary commerce. I believe this novel will offend as many readers as it captivates. It is unflinching in its willingness to transgress taboos, whether those taboos are religious, sexual or both. And in the end, “Native Believer” stands as an important contribution to American literary culture: a book quite unlike any I’ve read in recent memory, which uses its characters to explore questions vital to our continuing national discourse around Islam. This is a novel that says (to borrow a line from Aimé Césaire’s “Discourse on Colonialism”), “Any civilization that chooses to close its eyes to its most crucial problems is a stricken civilization.”
Friday, June 24, 2016
Ken Chen in The New Republic:
Once, in my youth, I took a graduate philosophy seminar I thought would be about law and justice: Instead we discussed the semantic implications of punctuation marks. After class, I found myself venting to a friend who’d been a literature professor. I told her I was unsatiated by the course—it felt like when I had discovered poetry and found, in practice, this most lyric of arts often meant writing about flowers or describing an epiphany in the grocery store checkout line. My friend laughed. “You know your problem?” she said. “You thought that philosophy would be Truth and poetry would be Beauty.”
Apparently, this is Ben Lerner’s problem too. In his new book, The Hatred of Poetry, the poet, novelist, and MacArthur “genius” argues that if you love poetry’s promise of transcendence, you must also hate poems for their failure to keep up their end of the bargain. “Poetry,” Lerner writes, “arises from the desire to get beyond the finite and the historical—the human world of violence and difference—and to reach the transcendent or divine.” The only problem? Poems are ultimately human rather than divine in character. “As soon as you move from that impulse to the actual poem,” he continues, “the song of the infinite is compromised by the finitude of its terms. In a dream your verses can defeat time… but when you wake… you’re back in the human world with its inflexible laws and logic.” In other words, if you’re a poet, you may declare yourself the unacknowledged legislator of the world, but you’re really just a hobbyist in the verse game.
Bruce Bower in Science News:
Fierce combat erupted in February 2016 at the northern Iraqi village of Kudilah. A Western-backed coalition of Arab Sunni tribesmen, Kurds in the Iraqi army and Kurdish government forces advanced on Islamic State fighters who had taken over the dusty outpost.
Islamic State combatants, led by young men wearing explosive vests, fought back. The well-trained warriors scurried through battle lines until they reached their enemy. Then they blew themselves up along with a few coalition soldiers, setting the stage for an Islamic State victory. These suicide bombers are called inghamasi, meaning “those who dive in deep.”
The inghamasi’s determination and self-sacrifice inspire their comrades to fight to the death, says anthropologist Scott Atran of the University of Michigan in Ann Arbor. Outnumbered about 6-to-1, Islamic State fighters still retained control of Kudilah after two days of heavy fighting. Coalition forces retreated, unwilling to lose more soldiers.
Atran and colleagues arrived in northern Iraq a couple of weeks later. Their plan: study “the will to fight” among soldiers on both sides of the Kudilah clash, even as fighting in the area continued. Their goals: try to understand what motivates people to join brutal organizations such as the Islamic State, and describe the personal transformations that push people leading comfortable, peaceable lives to commit acts of incredible violence and self-destruction.
Bomani Jones in Playboy:
When did you realize you had become somebody?
When I came to The Atlantic I’d been writing for 12 years. The Atlantic is seen as this arbiter of sophisticated ideas, well ensconced in the mainstream consensus, and then they bring in this dude. I wasn’t making the case for reparations back then, but I was saying that sort of shit. I could see the reaction, and it built a little bit, and then when “The Case for Reparations” came out—holy shit. But even then it was like, “This is one story, and I’ll go back to my life.” I thought Between the World and Me would hit people who read shit. When we did BookExpo America, the book-trade joint, there was a line of people to get the galleys. I was like, “What the fuck?” And I knew it was some shit when somebody said to me on Twitter, “Oh, you’ve got to be a celebrity to get this book?” [laughs] Who the fuck wants a galley? And then when you’ve gotten love from Toni Morrison—it still didn’t hit me. When I started seeing the reaction to it I thought, Oh, this is different.
King’s childhood in Connecticut and Maine was something of a blend of the lives he created for Lachance and Chambers in The Body. Like Lachance, King had a talent for storytelling. Like Chambers, he grew up without much money. King’s mother raised her two sons alone in the 1950s by taking on a series of low-paying jobs: shifts in a bakery and an industrial laundry, and housekeeping at a facility for the mentally ill. The strong women who populate King’s work—Wendy Torrance in The Shining (1977), a far cry from the trembling Shelley Duvall in the movie, and hardworking housekeeper Dolores Claiborne—reflect King’s admiration for his own mother’s efforts to get her boys a college education.
The stakes were high. In 1966, when King was in his last year of high school, the Vietnam war machine was at full throttle. Not being admitted into college would have meant getting drafted. To help with tuition, King got a job at a mill, a place he later described as “a dingy fuckhole overhanging the polluted Androscoggin River like a workhouse in a Charles Dickens novel.” Every day after school, he punched in for an eight-hour shift, went home to sleep for several hours, attended classes, then punched in again. His first notable story sale, in 1970, to a men’s magazine called Cavalier, was about the enormous rats under the mill. The grimy horror tale, “Graveyard Shift,” landed him the equivalent of a few weeks’ pay.
What will happen now? Precise predictions at this stage would be rash. The immediate upshot has already been position-staking by interest groups, notably from Scotland and Northern Ireland, both of which backed Remain in the poll. Sinn Fein has already called for a referendum on sovereignty. It’s unlikely that Nicola Sturgeon will be too quick to follow suit on Scotland’s behalf, first because in the short term the oil price collapse undermines an independent Scotland’s viability, and second because a Scexit from the UK won’t quickly lead to Scotland’s reabsorption into the EU – existing members can veto accession, and Spain (and the Commission) will be loath to bless a precedent for secession, specifically of Catalonia.
If Scotland or Northern Ireland or both do peel off, the immediate prospects are fairly grim for people in what – the term is obsolete – used to be called Labour’s ‘heartlands’ in Rump UK. The kingdom of England and Wales would become, still more than it already is, Londonia, the capital a city-state as dominant over the rest as ancient Athens was over the surrounding demes. National politics is likely to be steered by the political wing of the Faragist falange, almost certainly with Johnson as premier. Its payroll vote skewed the Tory parliamentary party’s public stance in the referendum towards Remain; now it’s free to become what it is, an English nationalist party figureheaded by Johnson. Europhile Tories will be isolated. It’s not impossible that a major reconfiguration will occur, as happened with the Peelite Tories after Corn Law Repeal in 1846 or with anti-coupon liberals after the 1918 election, which eventually put paid to the Liberals as a single party of government.
- First step in returning Britain to its pre-1970s glory as an economically languishing failed colonial empire
- In the face of a resurgent Russia and increased threats from ISIS, leaving E.U. would be the best strategy for letting someone else deal with that shit
- Britons could once again refocus their hatred on internal class divisions
- One less goddamn flag everyone has to hang up
- Won’t have to take part in awkward process of denying Bosnia and Herzegovina’s E.U. membership request
- Throughout its history, Britain has always been a valiant defender of the right of smaller territories to separate from larger, oppressive governments
- Pretty airtight way for citizens to mask racism as concern for national autonomy
Brian Handwerk in Smithsonian:
Slammed. Swamped. Flat out. Buried. No matter how it's said, the refrain is all too familiar—people are just too busy. But there's good news for the harried and hectic: new research shows that busy lifestyles may be good for your brain. “There hasn't been much scientific research on busyness itself, although it's something that we talk about so often,” explains Sara Festini, a cognitive neuroscientist at the University of Texas at Dallas Center for Vital Longevity, a co-author of the new research published this week in Frontiers in Aging Neuroscience. “So we wanted to look at the relationship of a generally very busy lifestyle to cognition.” Festini and colleagues found that middle-aged and older Americans who keep themselves busy test better across a whole range of different cognitive functions like brain processing speeds, reasoning and vocabulary. The memory of specific events from the past, or episodic memory, is especially enhanced among busy people, they report. Psychologist Brent Small, director of the University of South Florida's School of Aging Studies, said the results are “in line with a large body of research suggesting that older adults who are actively engaged in cognitive stimulating activities are more likely to perform better on standard cognitive tasks.”
“This paper extends that work by examining the concept of busyness,” adds Small, who wasn't involved in the new research. But the strong correlation shown between busyness and brain function also raises an intriguing chicken-and-egg question: Does busyness boost the brain, or might people with better cognitive powers be more likely to keep themselves busy?
A team of scientists at The Scripps Research Institute (TSRI), University of California, San Diego (UC San Diego) and Illumina, Inc., has completed the first large-scale assessment of single neuronal "transcriptomes." Their research reveals a surprising diversity in the molecules that human brain cells use in transcribing genetic information from DNA to RNA and producing proteins. The researchers accomplished this feat by isolating and analyzing single-neuronal nuclei from the human brain, allowing classification of 16 neuronal subtypes in the brain's cerebral cortex, the "gray matter" involved in thought, cognition and many other functions. "Through a wonderful scientific collaboration, we found an enormous amount of transcriptomic diversity from cell to cell that will be relevant to understanding the normal brain and its diseases such as Alzheimer's, Parkinson's, ALS and depression," said TSRI Professor and neuroscientist Jerold Chun, who co-led the study with bioengineers Kun Zhang and Wei Wang of UC San Diego and Jian-Bing Fan of Illumina. The study was published on June 24 in the journal Science.
All the Same
While parts of the cerebral cortex look different under a microscope—with different cell shapes and densities that form cortical layers and larger regions having functional roles called "Brodmann Areas"—most researchers treat neurons as a fairly uniform group in their studies. "From a tiny brain sample, researchers often make assumptions that obtained information is true for the entire brain," said Chun. But the brain isn't like other organs, Chun explained. There's a growing understanding that individual brain cells are unique, and a possibility has been that the microscopic differences among cerebral cortical areas may also reflect unique transcriptomic differences—i.e., differences in the expressed genes, or messenger RNAs (mRNAs), which carry copies of the DNA code outside the nucleus and determine which proteins the cell makes. To better understand this diversity, the researchers in the new study analyzed more than 3,200 single human neurons—more than 10-fold greater than prior publications—in six Brodmann Areas of one human cerebral cortex. With the help of newly developed tools to isolate and sequence individual cell nuclei (where genetic material is housed in a cell), the researchers deciphered the minute quantities of mRNA within each nucleus, revealing that various combinations of the 16 subtypes tended to cluster in cortical layers and Brodmann Areas, helping explain why these regions look and function differently.
Thursday, June 23, 2016
Ariel Sabar in The Atlantic:
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
King called the business-card-size papyrus “The Gospel of Jesus’s Wife.” But even without that provocative title, it would have shaken the world of biblical scholarship. Centuries of Christian tradition are bound up in whether the scrap is authentic or, as a growing group of scholars contends, an outrageous modern fake: Jesus’s bachelorhood helps form the basis for priestly celibacy, and his all-male cast of apostles has long been cited to justify limits on women’s religious leadership.