Monday, June 27, 2016
by Michael Liss
On September 19, 1796, less than two months prior to the meeting of the Electors to choose the next President of the United States, George Washington stunned the country by publishing “The Address of General Washington To The People of The United States on his declining of the Presidency of the United States”—what came to be known as Washington’s Farewell Address.
Washington was tired. The office had made him old before his time—compare the ubiquitous Gilbert Stuart “dollar-bill” paintings done in his second term to the immensely vigorous figure you see in Charles Willson Peale’s full-length portrait after the Battle of Trenton. Still, the Presidency would have been his to keep, probably for life, had he wanted it. His prestige was immense, his character considered unimpeachable, and his words carried enormous weight.
“Weight” also described the text. In an era when there were no page limits, the Farewell Address just keeps on going—32 densely handwritten pages and well over 6,000 words when set in type. And as for the prose, there is just no lift, no color, no poetry. People think they remember “beware of foreign entanglements,” but even that is incorrect; the exact quote is “steer clear of permanent alliances with any portion of the foreign world.”
Therein lies the paradox: the famous speech that no one can accurately recall because no one can get through it. My daughter gave me a collection of 40 great American speeches. The Farewell Address is included, but with at least 80% of it “abridged.” Seems as if the editor couldn’t get through it either.
There is something oddly appropriate about this. Washington wasn’t eloquent. Monuments rarely are. At times, it seems he was barely human—he was Flexner’s Indispensable Man, transitioning from warrior-chief to an immovable stone obelisk to which the ship of state could be lashed in any storm. What people get out of the Farewell, after wading through the prolixity, is his strength and steadfastness—his primary bequest to the country. Here, he voluntarily gives up power; there, he reassures us that great things have been accomplished by forming a Union; and, finally, he warns of dangers and advises on how to reduce them.
Can we stop with that—is that enough? Do we really need more from Washington, beyond seeing him as a colossus?
by Paul North
In the third installment of "Current Genres of Fate" I want to think about a mode of fate that has been all the rage for the last 20 years or so. Let's call it "the persistence of the past." For some time before that, as is well known, it was the rage to remark on the speed with which we were leaving the past behind. Rages come and go. It was oddly pleasurable to discover, in the midst of our progress, that the past had kept right up with us. Now we happily talk about how little has changed. But however cutting edge it has recently seemed, the idea that the past persists within or behind the newness of things is at least as old as our ideas of progress. Darwin tells about a species driven toward innovation that at the same time keeps intimate ties with the deep past. Freud says a new psychic attachment is a guise for a primal ur-attachment.
You will never be rid of the past. This is surely a fateful way of understanding the past's persistence. But this fate does not have to be bad. Just because we are shadowed by the old does not mean we are its puppets or have no freedom at all. What's more, the idea that the past persists can have a salutary effect. It may soften our fetish for change, turn our fever for forward movement to reticence, relax the continual, tortured desire to "move on." On the other hand, if we admit that the past persists, it does seem unlikely that we will ever achieve total freedom. Accepting this mode of fate ruins the fantasy that we could have no constraints whatever.
An artist named Friese Undine has made it his responsibility to cast shadows on the idea of progress in life as well as in art. Undine proposes to stain putatively current images with blotches of the past. In art this is particularly hard to do, since art, visual art—"contemporary" art—seems over the last 150 years or so to have signed a pact with progress-lovers in other walks of life, like politics and economics. Art wants to consign the past to the past just like they do. We associate this gesture with "modernism"—waving away tradition, refusing conventional subjects and traditional techniques. With its dismissive wave, modern art kept up with capitalism. "Make it new" was the aesthetic rallying cry of a century, until, at a certain point, the sheen on the plastic packaging rubbed off. Newness got old. The only novelty left to plunder was the past. Yet even the return to past forms—to quote styles, ridicule out-of-date wishes, consciously recycle images, debase conventions, and all the rest—even this way of doing art, which saw the past as a storehouse of gestures to be repurposed, denied that the past simply persists. Artists could not go on plundering the past unless it was dead. They could not innovate and renovate and at the same time admit that the past had never actually passed.
Taking its cue from French politics, French experimental writing has always been a clubby affair. Unlike in Britain or America, where economic and political liberalism have encouraged writers to view themselves as individual talents engaged in private agons with tradition, in France, with a few notable exceptions, avant-garde writers have presented themselves as members of an organization, complete with founding documents, by-laws, regular meetings, and a leadership structure, in short, as citoyens of a mini-republic.
Founded in 1960 by Raymond Queneau and François Le Lionnais, the Ouvroir de littérature potentielle or Workshop of Potential Literature, known by its acronym, Oulipo, is the longest-lasting experimental writing group in history. Oulipians marry two strange bedfellows, literature and mathematics, adopting and inventing rigorous formal constraints—most famously, the lipogram, in which the use of a certain letter is proscribed, and the N+7 rule, in which every noun is replaced by the noun that appears seven entries after it in a dictionary—to generate poems, novels, essays, memoirs and "texts that defy all classification." From its ten original members, all but one of whom are now dead, the group has nearly tripled in size, "co-opting" (to use the group's official term) writers from Italy, Germany, the UK, and America. Although it has by no means achieved anything close to gender parity, five of its new co-optees have been women.
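Both constraints are mechanical enough to sketch in a few lines of code. Here is a minimal illustration; the toy noun list and sample sentences are invented for the example, and a genuine N+7 would use a full dictionary and replace only the words tagged as nouns:

```python
def lipogram_ok(text, banned="e"):
    """Check the lipogram constraint: the banned letter may not appear."""
    return banned.lower() not in text.lower()

def n_plus_7(sentence, noun_list, offset=7):
    """Replace every word found in the noun list with the noun that
    appears `offset` entries later in the alphabetized list (wrapping around)."""
    lexicon = sorted(set(noun_list))
    position = {word: i for i, word in enumerate(lexicon)}
    def shift(word):
        i = position.get(word.lower())
        return word if i is None else lexicon[(i + offset) % len(lexicon)]
    return " ".join(shift(word) for word in sentence.split())

# Toy stand-in for a dictionary's noun entries.
NOUNS = ["ant", "bird", "cat", "dog", "eel", "fox", "goat", "hen",
         "ibis", "jay", "koala", "lark", "mole", "newt", "owl"]

print(lipogram_ok("A Void translates La Disparition"))  # False: 'translates' contains an 'e'
print(n_plus_7("the cat saw the owl", NOUNS))           # "the jay saw the goat"
```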
The Oulipo owes its longevity, in part, to its refusal as a collective to entertain any kind of political line, despite the avowed leftism of many of its members. In so doing, it managed to avoid the power struggles, excommunications, and splintering characteristic of the avant-garde movements that were fatally drawn into the orbit of French Marxism and Maoism. But its survival can also be attributed to the fruitfulness of constrained writing itself. The widespread availability of constrained writing techniques has enabled Oulipians to identify those who are working along parallel lines and co-opt them.
Geoffrey Farmer. The Surgeon and The Photographer. From Stage Presence, SFMOMA 2012.
Installation: paper, textiles, wood, & metal.
"... Farmer collaged photographic reproductions from books into 365 puppet-like sculptures, each approximately the size of a hand, thirty of which are included in this exhibition. The puppets bristle with multiple identities; each angle presents a new figuration as disproportionate and layered appendages cohere into forms. They are totemic but not possessed of any spirit. Rather, they are waiting for occupation and activation."
by Akim Reinhardt
Hello. My name is Akim Reinhardt, I was very, very wrong, and now it's time for me to pay for my mistakes.
The good news is, when I pay, you just might be the one to collect. My loss can be your windfall.
The catch? You'll have to publicly debase yourself almost as much as I am about to do right now.
How did it come to this? You and I publicly shaming ourselves on the internet, each of us desperately hoping to salvage a little bit of joy as the world burns around us?
It's all because of that goddamned Donald Trump.
Trump is about to claim the Republican presidential nomination, and a whole lotta pundits got that one wrong. Legions of professional gabbers, from every corner of the political spectrum, badly missed the mark, assuring you that he'd never be the GOP candidate.
Despite their wishful thinking dressed up in high falutin' gibberish, it's happening anyway; Trump is poised to become leader of the pachyderm pack. And so a lot of the yakkers had to make amends.
Dana Milbank of the Washington Post literally ate his words. Pass the salt and pepper.
Nate Cohn of the New York Times and David Byler of Real Clear Politics each created a laundry list of everything they got wrong, which, as with most analysts, was quite a lot.
Perhaps the oddest mea culpa came from polling wunderkind Nate Silver, who explained away his spectacular failure by saying that he had acted like a barbaric "pundit" instead of staying true to the "scientific method." Rather than relying on statistical modeling to figure out if Trump would win, Silver says he just made "educated guesses."
Since Silver never really explains why he traded in true reason for such wild tomfoolery, I'm just gonna assume he went on a months-long bender.
Normally, it would be very easy for me to look down my nose at these losers. After all, I'm not a statistician or a professional talking head. I'm a historian. And if there's one thing studying history has taught me, it's that trying to predict the future is pure folly.
What were these dullards thinking? Guess the future? Good luck with those crystal ball shenanigans. Studying history has shown me, time and time again, that the future is unknowable. The past is a mystery and the future is an illusion. So allow me to haughtily point a sanctimonious finger at these morons.
Except for one thing. It turns out that I'm one of those morons. I, too, am a loser.
I spouted off like all the others, publicly assuring people that Trump would not win the nomination, offering up historically informed ramblings as evidence. And just like the rest of them, I was wrong, wrong, wrong.
It was a fool's errand, of course. So why did I do it?
by Humera Afridi
I want to hear her: bold, questioning, insistent, refusing to compromise her ideals. I want to understand, to see, her: this woman of deep faith, with a distinctive laugh, who "had no equal among either the women or the men of her century." Possessed of a brilliant mind and an exceptional memory, she was controversial—beloved, reviled, envied, not averse to taking risks in the service of truth and justice. Falsely accused of adultery, she was publicly defended by her husband, Seal of the Prophets and a political leader, who took to the minbar and challenged the men bent on sullying her name and that of his household. At 42, she led an army against the fourth Caliph—the infamous Battle of the Camel in the mid-seventh century—in which she suffered devastating losses. Mother of the Believers, yet herself childless. Youngest wife of Prophet Muhammad. Transmitter of two-thirds of his sayings, the Hadith or traditions, which are treasured keys to a deeper understanding of the Quran and the commentaries written on its divinely revealed verses.
But: where is Aisha today?
When we speak of Muslim women, or the status of women in Islam, harking back always to that distant past—seventh century Arabia—which through a prismatic lens continues to determine our present, why are the Mothers of the Believers silent, invisible, absent? Asked whom he loved the most, Prophet Muhammad, magnificent warrior against misogyny in egregiously patriarchal Arabia, unhesitatingly declared, "Aisha!" Aisha in whose lap he breathed his last breath before he passed into the Realm of Beauty.
All this to say, Aisha was far from flat. She was refreshingly complex, multi-dimensional, a "round character"—to borrow a literary term from E. M. Forster—filled with the breath of God. And she wasn't the only one. Well before her, there was Khadijah, the Prophet's first wife—with whom he had a monogamous relationship for twenty-five years until her death—a savvy businesswoman, older than him by over a decade, a widow, who, discerning his gentle and upright character, qualities she deemed attractive in a man, proposed marriage to him when he was a lad of 25 and in her employ.
by Dave Maier
As someone who lived through the surreal drug-war dystopia of the 1980s, I have always assumed that the collected forces behind it (right-wing authoritarianism, progressive nanny-statism, the law enforcement, private-prison, and Big Pharma lobbies, general aversion to other races and/or dirty f’ing hippies, inertia and lack of imagination, etc.) would render it a permanent fixture of our political landscape, at least in the USA. So even after two states re-legalized marijuana in 2012 (and two more since), I didn’t pay much attention. It simply remained inconceivable to me that it would go beyond that.
Nowadays, however, one hears frequently that re-legalization of marijuana and perhaps even all “illicit” drugs is inevitable and in fact will happen sooner rather than later. The thought is that young people (i.e. new voters) are strongly in favor of re-legalization and only older people (i.e. those preparing to shuffle off this mortal coil and thus off the voting rolls) are strongly against – and even the latter are discovering, perhaps to their surprise, the apparently wondrous utility (if anecdote be any guide) of medical cannabis. The latest nationwide polls on the issue show Americans favoring the end of marijuana prohibition by wide margins (58-39, 56-36, numbers like that), suggesting a cultural shift as momentous and sudden (at least to those not paying attention, such as myself) as that which has led to today’s widespread acceptance of same-sex marriage.
So I thought I'd better get up to speed and hit the books. I don't have a tightly argued, persuasive essay for you, and I am still only halfway through a fairly tall stack of relevant literature, but I can at least pass on some recommendations and share some speculation over the next couple of columns.
I'd start with Dan Baum’s authoritative study Smoke and Mirrors: The War on Drugs and the Politics of Failure (1996). This will fire your outrage and keep you going through some of the more pedestrian public-policy issues, as well as dauntingly complex psychopharmacology, on offer later on. Baum insists that the book is not a manifesto for legalization, but rather an examination of the genesis of the war, which he traces to the election of 1968, and its escalation into “a policy as expensive, ineffective, delusional, and destructive as government gets.”
A recurrent theme in Baum’s story, as he notes in his introduction, is that “[t]he War on Drugs is about a lot of things, but only rarely is it really about drugs.” Notoriously high on President Nixon’s paranoid list of enemies were “the blacks” and “the hippies”, and by fomenting a drug war he saw a way to attack both at once. When his hand-picked Presidential Commission on Marijuana (a.k.a. the Shafer Commission) failed to provide the desired denunciations of drug use (Nixon had demanded “a goddamn strong statement about marijuana … one that just tears the ass out of them”), Nixon simply ignored it. In any case Congress had already passed the 1970 Controlled Substances Act, which determines government policy in this area to this day. The drug war – or at least its modern phase – had begun.
by Brooks Riley
by Leanne Ogasawara
When it comes to private art collections, not many places have the richness and diversity of Italy. Of course, Italy has a few great national museums too. But that is not where one usually heads to find the cream of the crop of the country's fine art. For in Italy, the famed pictures and sculptures are mainly to be seen in the once legendary private collections of long-dead dukes and princes, as well as in those of Renaissance mercenaries and bankers, not to mention the art still found, miraculously, in the churches for which it was originally created. Beautiful gems, these private collections are in part why going to Italy to see art somehow feels more like an act of pilgrimage than of travel.
In the Uffizi last summer, I wondered how the collection of a banker, like, say, that of a Medici, differs from that of a prince or a duke. Indeed, to my untrained eyes, the collecting styles and practices didn't seem so different at all. I wondered why that was. And as luck would have it, the museum shop had Tim Parks' book Medici Money quite prominently displayed by the cashier. So I grabbed it!
What a great read! And the more I read, the more I understood why the great private collections of mercenaries and bankers so closely resembled the princely collections. As in Japan, in Italy too, men of business and men of war, once they had gained power, typically began to crave social acceptance. And so they often turned to art. In those days, art collecting and aesthetic sensibility were seen as marks of character and virtue, and therefore of status. Along these lines, there is an absolutely brilliant (but out-of-print) book by Christine Guth, Art, Tea, and Industry: Masuda Takashi and the Mitsui Circle, about how this practice functioned in Japan down into modern times, where connoisseurship and taste were viewed as the necessary signs of a noble character. Unlike today, when money "trumps" everything, in days past, in Japan and in Europe at least, one would never be taken seriously without noble pursuits and enlightened hobbies.
by Evan Edwards
Last month was the 197th anniversary of the birth of the American poet Walt Whitman. While one hundred and ninety-seven isn’t as clean as a good, solid two hundred years of the grandfather of free verse, I reckon we’ll just have to make do with it until 2019. Still, it has been a very good year for Whitman, and for those impassioned by his work. In February, one of the hundreds if not thousands of letters that he wrote for dying soldiers during the Civil War turned up in a Washington archive. But even more significantly, last summer a 13-part column series on “manly health,” written by Whitman, was discovered and verified, then published in April of this year. Since Whitman was a prolific writer, newly discovered texts of his crop up every year or so, but this series of columns is another beast entirely. Weighing in at over one hundred and twenty pages, it is not just a small fragment or a bit of marginalia added to the oeuvre, nor even just a new article from his years as a journalist, but an entirely new text.
When Leaves of Grass was first published in 1855, the book of untitled, authorless, largely unorganized verse was just ninety pages. Over the course of the next thirty-one years, Whitman would add, organize, reorganize, subtract, and alter the poems, so that the text ended up at around four hundred leaves. This is important to note only when we consider that the columns on health were written in the years just after the publication of the first edition. Like the series of lectures written around and after the first edition, which were supposed eventually to replace the original introductory essay, this series on manly health seems to have been conceived as a sibling project to the poems.
The essay that precedes the first edition, written in the days just before its publication; the lectures written to replace that essay; and the columns on manly health that sought to replace those lectures all share a common theme: they are Whitman’s admitted attempts to “explain” or “fulfill” the poetry for which he was so famous.
The best lack all conviction, while the worst
Are full of passionate intensity.
One morning, as Gregor Samsa was waking up from anxious dreams, he discovered that in bed he had been changed into a monstrous verminous bug.
It was night and I was walking along the street outside an aged building where I occupied an apartment on the first floor. To my surprise and dismay the external wall along one side of my apartment was completely gone. It had been in bad shape and needed repair, but why hadn’t my landlord given me notice about the repairs? I’d been thinking of inviting my father to visit me, but I couldn’t do that now. And how could I protect my things from thieves? So I went to the rear of the building, clambered up an external wall and over the roof of a rear-facing porch and through a window into a second floor apartment. I looked around, went past the occupants, who were sitting on mattresses on the floor and paid me no mind, and entered the front apartment on the second floor where I did the same thing. So did they. I walked down the stairs to the first floor where I entered my apartment through the door. Now I was in my apartment and looking into the street through the wall that was no longer there.
That’s when I awoke. There was of course more to the dream than that, but I don’t recall it very well and, in any event, dreams and prose are such very different things that any account I give will be as much invention as recollection. So it is with that first paragraph.
The thing is, I recognized that apartment. It seemed that I’d been there in other dreams, dreams years ago. But that apartment also seemed like a diffuse and distracted amalgam of apartments I’d occupied in Baltimore, Buffalo, Troy NY, and even Jersey City.
I remembered that before I’d fallen asleep I’d been thinking of writing this piece. Not so much about Brexit as about a diffuse wandering state of mind. Start with Brexit and see where it takes me.
For you see, I’m not an expert in any of the various things that would allow me to lay claim to serious insight into the referendum that just took place in Britain. It’s just one of the things that flows through my mind these days, along with Trump, Sanders, and Hillary; territorial disputes in the South China Sea; drone warfare; and just when are we going to see self-driving vehicles on the open road? I claim no particular expertise in these matters either, but they enter my mind where I entertain them. In the one case, I’m going to have to vote.
Strained Analogies Between Recently Released Films and Current Events: Central Intelligence and Why Gun Laws Don't Change
by Matt McKenna
More than any of the thousands of articles laboriously describing the apocalyptic state of American politics in 2016, the low-brow Kevin Hart comedy Central Intelligence efficiently and accurately portrays the circus we’ve created out of our Presidential election process. In hindsight, it seems odd to expect long-read think-pieces in periodicals like the New Yorker to shed light on what is less of a democratic election and more of a reality show called “Who Wants to Be the President”. Indeed, a run-of-the-mill summer comedy with the crass tagline “Saving the world takes a little Hart and a big Johnson” seems the more appropriate medium to comment on our equally crass election. So perhaps it shouldn't be surprising that director Rawson Thurber’s Central Intelligence isn’t just reasonably funny but also provides a legitimate critique of American politics.
Central Intelligence co-stars Kevin Hart and Dwayne “The Rock” Johnson. Hart plays Calvin Joyner, an accountant who is bummed out because he used to be cool in high school, but now he’s boring. Johnson plays Bob Stone, a CIA operative who was bullied mercilessly in high school, but now he’s super jacked. Because Calvin was nice one time to Bob in high school, Bob recruits Calvin to help him on some cockamamie save-the-world mission involving satellites, access codes, and Aaron Paul implausibly portraying a CIA agent. The story, of course, doesn’t make sense, nor was it designed to make sense, which is the first clue the film is actually commenting on American politics.
The humor in Central Intelligence stems from the conflict between the diminutive Calvin and the gargantuan Bob. Calvin is a stuck-up white-collar jerk, and Bob is a naive, violence-loving semi-idiot. The film has therefore patterned its leads after the stereotypes of the two major political parties in America: Calvin represents Democrats with their politically correct, holier-than-thou elitism, and Bob represents Republicans with their inability to solve problems in a way that doesn’t involve applying violence to something. Neither party gets a pass in the film: Calvin is frequently the butt of the joke as he sheepishly runs from conflict and is unable to take care of himself, and though Bob can beat people up, what motivated him to develop his physically dominating stature was feeling emasculated as an adolescent.
Sunday, June 26, 2016
Barry Reay in Aeon:
The anonymous author of the pamphlet Onania (1716) was very worried about masturbation. The ‘shameful vice’, the ‘solitary act of pleasure’, was something too terrible even to be described. The writer agreed with those ‘who are of the opinion, that… it never ought to be spoken of, or hinted at, because the bare mentioning of it may be dangerous to some’. There was, however, little reticence in cataloguing ‘the frightful consequences of self-pollution’. Gonorrhoea, fits, epilepsy, consumption, impotence, headaches, weakness of intellect, backache, pimples, blisters, glandular swelling, trembling, dizziness, heart palpitations, urinary discharge, ‘wandering pains’, and incontinence were all attributed to the scourge of onanism.
The fear was not confined to men. The full title of the pamphlet was Onania: Or the Heinous Sin of Self-Pollution, and all its Frightful Consequences (in Both Sexes). Its author was aware that the sin of Onan referred to the spilling of male seed (and divine retribution for the act) but reiterated that he treated ‘of this crime in relation to women as well as men’. ‘[W]hilst the offence is Self-Pollution in both, I could not think of any other word which would so well put the reader in mind both of the sin and its punishment’. Women who indulged could expect disease of the womb, hysteria, infertility and deflowering (the loss of ‘that valuable badge of their chastity and innocence’).
Another bestselling pamphlet was published later in the century: L’onanisme (1760) by Samuel Auguste Tissot. He was critical of Onania, ‘a real chaos … all the author’s reflections are nothing but theological and moral puerilities’, but nevertheless listed ‘the ills of which the English patients complain’.
Modern Western culture is slowly acknowledging gender fluidity, but "third genders" and other classifications have existed throughout history.
Jessie Guy-Ryan in Atlas Obscura:
This week, an Oregon judge ruled to allow Jamie Shupe, a 52-year-old former Army mechanic, to list themselves as non-binary—that is, neither male nor female—on their driver’s license. The ruling is likely the first time that an individual has been allowed to legally identify as non-binary in the United States, and represents part of a growing effort around the world to extend legal recognition to those whose identities fall outside the masculine/feminine gender binary.
Some might assume that the shift towards viewing gender as fluid or encompassing identities beyond the binary is a novel cultural change; in fact, several non-Western cultures—both historically and today—have non-binary understandings of gender. In Indonesia, one ethnic group shows us that the idea that gender identity is expressed in more ways than two is actually hundreds of years old.
The Bugis are the largest ethnic group in South Sulawesi, Indonesia, and are unique in their conception of five distinct gender identities. Aside from the cisgender masculinity and femininity that Westerners are broadly familiar with, the Bugis interpretation of gender includes calabai (feminine men), calalai (masculine women) and bissu, which anthropologist Sharyn Graham describes as a “meta-gender” considered to be “a combination of all genders.”
Steven Wheeler in Inference:
As the 1960s drew to a close, Rainer Weiss was working as an associate professor at MIT’s Department of Physics. Asked by the department chairman to teach an undergraduate course on general relativity, Weiss found himself in the unenviable position of teaching an unfamiliar subject. “I had a terrible time with the mathematics,” Weiss recalled, “[a]nd I tried to do everything by making a Gedankenexperiment out of it.”
Weiss’s students were curious about the work of physicist Joseph Weber and, in particular, his attempts to detect gravitational waves. Predicted by Albert Einstein’s theory of general relativity, gravitational waves were a much-debated phenomenon for which no experimental evidence had been found. Weber’s efforts were centered on resonant mass detectors of his own design: suspended aluminum cylinders, two meters in length and a meter in diameter, fitted with a ring of piezoelectric crystals. Weber believed that the cylinders would, in effect, act like giant tuning forks; a passing gravitational wave would ring the cylinders at their resonant frequency. In a 1969 paper published in the Physical Review, Weber claimed to have found evidence for gravitational waves.
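As a rough, back-of-the-envelope check (the figures below are generic textbook values for aluminum, not numbers taken from this article), the fundamental longitudinal resonance of such a bar lands in the kilohertz range:

```latex
% Fundamental longitudinal mode of a free bar of length L,
% with v the speed of sound in aluminum (assumed ~5000-6400 m/s).
f \approx \frac{v}{2L}
  \approx \frac{5000\text{--}6400\ \mathrm{m/s}}{2 \times 2\ \mathrm{m}}
  \approx 1.3\text{--}1.6\ \mathrm{kHz}
```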
From the MIT Technology Review:
One of the curious features of network science is that the same networks underlie entirely different phenomena. As a result, these phenomena have deep similarities that are far from obvious at first glance. Good examples include the spread of disease, the size of forest fires, and even the distribution of earthquake magnitude, which all follow a similar pattern. This is a direct result of their sharing the same network structure.
So it’s usually no surprise that the same “laws” emerge when physicists find the same networks underlying other phenomena. Exactly this has happened repeatedly in the social sciences. Network science now allows social scientists to model societies, to study the way ideas, gossip, fashions, and so on flow through society—and even to study how this influences opinion.
To do this they’ve used the tools developed to study other disciplines. That’s why the new field of computational social science has become so powerful so quickly.
But there’s another field of endeavor that also stands to benefit: the study of history. Throughout history, humans have formed networks that have played a profound role in the way events have unfolded. Historians have recently begun to reconstruct these networks using historical sources such as correspondence and contemporary records.
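As a minimal sketch of what such a reconstruction can look like in practice, here is a toy example; the correspondence data below are invented placeholders, and the networkx library is assumed to be available:

```python
import networkx as nx

# Invented sample of archival correspondence: (sender, recipient, year).
letters = [
    ("Voltaire", "d'Alembert", 1760),
    ("Voltaire", "Catherine II", 1769),
    ("d'Alembert", "Frederick II", 1765),
    ("Catherine II", "Frederick II", 1770),
    ("Voltaire", "Frederick II", 1751),
]

# People become nodes; an exchange of letters becomes an edge.
G = nx.Graph()
for sender, recipient, year in letters:
    G.add_edge(sender, recipient, year=year)

# One question a historian might ask: who sits at the hub of the network?
centrality = nx.degree_centrality(G)
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person}: degree centrality {score:.2f}")
```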
Jerry Brown in the New York Review of Books:
I know of no person who understands the science and politics of modern weaponry better than William J. Perry, the US Secretary of Defense from 1994 to 1997. When a man of such unquestioned experience and intelligence issues the stark nuclear warning that is central to his recent memoir, we should take heed. Perry is forthright when he says: “Today, the danger of some sort of a nuclear catastrophe is greater than it was during the Cold War and most people are blissfully unaware of this danger.” He also tells us that the nuclear danger is “growing greater every year” and that even a single nuclear detonation “could destroy our way of life.”
In clear, detailed but powerful prose, Perry’s new book, My Journey at the Nuclear Brink, tells the story of his seventy-year experience of the nuclear age. Beginning with his firsthand encounter with survivors living amid “vast wastes of fused rubble” in the aftermath of World War II, his account takes us up to today when Perry is on an urgent mission to alert us to the dangerous nuclear road we are traveling.
Reflecting upon the atomic bombing of Hiroshima and Nagasaki, Perry says it was then that he first understood that the end of all of civilization was now possible, not merely the ruin of cities.
Hisham Matar in The Guardian:
I don’t remember a time when words were not dangerous. But it was around this time, in the late 1970s, when I was a young schoolboy in Tripoli, that the risks had become more real than ever before. There were things I knew my brother and I shouldn’t say unless we were alone with our parents. I don’t remember my mother or father explicitly telling us what not to say. It was simply implied and quickly understood that certain words strung together in a particular order could have grave consequences. Men were locked up for saying the wrong thing or because they were innocently quoted by a child. “Really, your uncle said that? What’s his name?” It was as though a listening, bad-intentioned ghost was now present at every gathering. It brought with it a new silence – wary and suspicious – that was to remain in our lives for many years. Even when I was writing my first novel in a shed in Bedfordshire, beside the River Great Ouse, I could feel the disapproving hot breath of the dictator at my neck. It did not matter that I was writing in English and yet to have a publisher; I was nonetheless writing into and against that silence. But back when I was still a boy, when I only lived in one language, that silence, like black smoke from a new fire, was still growing. Lists, drafted by the authorities, were read on television. They contained the names of those to be questioned. That was how, one afternoon, I heard our name, by which I mean my father’s name, read out. He was abroad. He did not return to Tripoli. A year or so later, we left the country to be reunited with him in Cairo where a new life began: new schools and new teachers.
Aziz Ansari in The New York Times:
“DON’T go anywhere near a mosque,” I told my mother. “Do all your prayer at home. O.K.?” “We’re not going,” she replied. I am the son of Muslim immigrants. As I sent that text, in the aftermath of the horrible attack in Orlando, Fla., I realized how awful it was to tell an American citizen to be careful about how she worshiped. Being Muslim American already carries a decent amount of baggage. In our culture, when people think “Muslim,” the picture in their heads is not usually of the Nobel Peace Prize winner Malala Yousafzai, Kareem Abdul-Jabbar or the kid who left the boy band One Direction. It’s of a scary terrorist character from “Homeland” or some monster from the news. Today, with the presidential candidate Donald J. Trump and others like him spewing hate speech, prejudice is reaching new levels. It’s visceral, and scary, and it affects how people live, work and pray. It makes me afraid for my family. It also makes no sense.
There are approximately 3.3 million Muslim Americans. After the attack in Orlando, The Times reported that the F.B.I. is investigating 1,000 potential “homegrown violent extremists,” a majority of whom are most likely connected in some way to the Islamic State. If everyone on that list is Muslim American, that is 0.03 percent of the Muslim American population. If you round that number, it is 0 percent. The overwhelming number of Muslim Americans have as much in common with that monster in Orlando as any white person has with any of the white terrorists who shoot up movie theaters or schools or abortion clinics.
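A quick check of the arithmetic behind that quoted figure:

```latex
\frac{1{,}000}{3{,}300{,}000} \approx 0.0003 = 0.03\%
```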
Saturday, June 25, 2016
Kashmir Hill in Fusion:
The plot has been owned by the Vogelman family for more than a hundred years, though the current owner, Joyce Taylor née Vogelman, 82, now rents it out. The acreage is quiet and remote: a farm, a pasture, an old orchard, two barns, some hog shacks and a two-story house. It’s the kind of place you move to if you want to get away from it all. The nearest neighbor is a mile away, and the closest big town has just 13,000 people. It is real, rural America; in fact, it’s a two-hour drive from the exact geographical center of the United States.
But instead of finding a place of respite, the people who live on Joyce Taylor’s land find themselves in a technological horror story.
For the last decade, Taylor and her renters have been visited by all kinds of mysterious trouble. They’ve been accused of being identity thieves, spammers, scammers and fraudsters. They’ve gotten visited by FBI agents, federal marshals, IRS collectors, ambulances searching for suicidal veterans, and police officers searching for runaway children. They’ve found people scrounging around in their barn. The renters have been doxxed, their names and addresses posted on the internet by vigilantes. Once, someone left a broken toilet in the driveway as a strange, indefinite threat.
All in all, the residents of the Taylor property have been treated like criminals for a decade. And until I called them this week, they had no idea why.
Annie Sparrow in the Bulletin of the Atomic Scientists:
Business and politics have always influenced international efforts to solve public health problems. Unfortunately, that remains as true in the era of Ebola, Zika, and bioweapons as it was in the 19th century, when cholera—a disease that spreads more quickly and kills faster than any other pathogen—began its deadly global march. Beginning in 1817, cholera spread relentlessly from the Ganges Delta across Asia, reaching Europe in 1830 and North America in 1832, taking millions of lives along the way. It ultimately precipitated the first of 14 International Sanitary Conferences in 1851. At the time, the typical response to cholera was to quarantine ships traveling from affected areas, but this practice, which slowed commerce, was expensive and unpopular. The World Health Organization (WHO), whose origins lie in those early cholera pandemics, says they “were catalysts for intensive infectious disease diplomacy and multilateral cooperation in public health.” But in fact, the first six International Sanitary Conferences were entirely unproductive due to conflicting interests: government fears about losing profits from trans-Atlantic trade took priority over the need to reduce the international death toll. Consensus was achieved only at the seventh conference in 1892, after the opening of the Suez Canal for use by all countries made standardized quarantine regulations necessary. The participating states then unanimously approved and ratified the first of four International Sanitary Conventions, the forerunner of today’s International Health Regulations, which commit all governments to work toward stopping the spread of infectious disease and other global health threats.