Saturday, October 25, 2014
To what extent is the price of immortality humanity, as you could put it? Must the revolutionary artist ignore — even flout — the basic laws of decency that govern our world in order to transform that world? “Perfection of the life, or of the work,” as Yeats had it. “And if it take the second,” he went on, the intellect of man “must refuse a heavenly mansion, raging in the dark.”
It was an ancient question even then, but somehow every other book I’ve been reading of late comes back to it. Walter Isaacson’s unbiddable 2011 biography of Steve Jobs presents his subject as a kind of Lee Kuan Yew of the tech industry, demanding we give up our ideas of democracy and control in exchange for a gorgeously designed new operating system. Innovation doesn’t have to be so dictatorial: Albert Einstein, the subject of Isaacson’s previous biography, is revered in part for his readiness to defer to what he didn’t understand. Yet the more we read about Jobs publicly humiliating colleagues and refusing to acknowledge responsibility for the birth of his first child, the more we see that his genius could seem inextricable from his indifference to social norms.
Assaf Gavron's 2010 novel "Almost Dead" does something I would have thought impossible — it makes satire out of terrorism. The story of a man who becomes an Israeli national hero after surviving three attacks in a single week, the book offers a sharply ironic look at the intersection of image and reality.
This character is no role model; he's a guy in the wrong place at the right time. Gavron, who was born near Jerusalem and lives in Tel Aviv, is suggesting that we are all of us (citizens, nations, even, to some extent, terrorists) making it up as we go along.
A similar sensibility centers "The Hilltop," Gavron's seventh book, although only the second (after "Almost Dead") to appear in the United States. A sprawling novel that revolves around a small settlement in the occupied territories, its focus is less satirical than absurdist, offering a middle vision between the ridiculous and the sublime.
Perhaps of greater importance to Thomas’s poetry, however, was the wider cultural landscape of 1930s Wales, and Thomas’s geographical and familial location within it. Thomas’s parents were the personification of the intellectual and industrial movement from rural to urban that characterised Wales in the early 20th century. Both originated from Welsh-speaking families of agricultural and religious occupation. The young Thomas, a listener “in love with words”, found himself at the centre of a linguistic and cultural maelstrom. The languages of both Welsh and English informed his ear, just as both the streets of Swansea and the fertile fields of the Llanstephan peninsula informed his eye.
In his now famous notebooks, Thomas’s search for a poetic voice can be traced as if following his route on a map. In these notebooks, he passes through a period of derivative free verse before evolving his poems into the grander, more visceral and patterned work the world met just a few years later when he published 18 Poems. Although the book itself was modest, even retiring – no jacket copy and no author portrait, both at Thomas’s request – inside, the poems themselves were the polar opposite. Bold, physical and sonorous, they have been described by some critics as “biomorphic”. Thomas once wrote: “Every idea, intuitive or intellectual, can be imaged and translated in terms of the body, its flesh, skin, blood, sinews, veins, glands, organs, cells and senses.”
Priscilla Gilman in The New York Times:
On Dec. 28, 1817, Benjamin Robert Haydon, then England’s pre-eminent history painter, hosted a dinner party to celebrate his progress on his latest work, “Christ’s Entry Into Jerusalem.” He invited, among others, three men anachronistically pictured in that painting: John Keats, William Wordsworth and the essayist Charles Lamb. In “The Immortal Evening” (the phrase is from Haydon’s letters and diaries), the poet and biographer Stanley Plumly offers an idiosyncratic, heartfelt, at once sinuous and expansive exploration of the dinner, its “aesthetic context and the larger worlds of the individual guests, particularly the three ‘immortal’ writers, Keats, Wordsworth and Lamb.”
Plumly begins the story strikingly, elliptically, in the present tense: “Keats has the most ground to cover.” What the walk to the dinner was like for each of its major participants, the look and feel of Regency London, what kind of food they would have eaten, all come to vivid life in Plumly’s evocative rendering. But if it presents a historical place and moment with immediacy and “present-tense personal intensity” (as Plumly says of Romantic art), “The Immortal Evening” also tackles timeless questions: “How does a living moment in time become ‘immortal’? What are a painting’s terms of immortality?” (Or, for that matter, a poem’s?) Why are some artists remembered and some forgotten?
Neal Hartman in Nautilus:
To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012, we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”
She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the standard model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.
By any standards, it is an epochal, genius achievement.
What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.
Farahnaz Ispahani and Nina Shea in The Weekly Standard:
Pakistan’s blasphemy law, which turns 30 this year, has become only more deadly with age. Since blasphemy was made a capital crime under the nation’s secular penal code, the effect has been to suppress moderate influences, pushing “Pakistani society further out on the slippery slope of extremism,” said Mujeeb-ur-Rahman, senior advocate at the Supreme Court of Pakistan, in Washington last week. With its large population and sensitive location, Pakistan is a place where any societal shift in the direction of the Taliban deserves the attention of all concerned about Islamic extremism. Instead, this is one more foreign threat that the Obama administration underestimates.
On October 16, for the first time, an appeals court affirmed a death sentence for blasphemy meted out to a woman. A Christian mother of five, Asia Bibi was arrested in 2009 after fellow field hands complained that, during a dispute, she had insulted the prophet of Islam. No evidence was produced, because to repeat blasphemy is blasphemous. Similarly, anyone who defends an accused blasphemer risks being labeled a blasphemer; two officials who made appeals on Bibi’s behalf—Salman Taseer, governor of Punjab, and Shahbaz Bhatti, federal minister for minorities affairs—were assassinated in 2011. Bibi has one last legal recourse, an appeal to the federal Supreme Court, but now no public official dares speak up for her—or for any other blasphemy defendant.
Accusations of blasphemy are brought disproportionately against Pakistan’s Christians, some 2 percent of the population. Intent is not an element of the crime, and recent years have seen cases brought against illiterate, mentally disabled, and teenage Christians. Each case seems to heighten the sensitivities of the extremists and further fracture society. The flimsiest rumor of a Koran burning can spark hysteria ending in riots against entire Christian communities. Lahore’s St. Joseph Colony was torched last year in such a pogrom.
Moustafa Bayoumi, Kayla Epstein, Alan Yuhas, and Eli Valley in The Guardian:
As this panel’s Orthodox Jewish participant, I’m aware that I’ve been asked to participate for a very specific purpose: to bring the bearing of my religious and cultural upbringing to the question of whether The Death of Klinghoffer is antisemitic.
So let’s just get that out of the way: the answer is no.
I can understand why some would jump to that conclusion, especially if they haven’t seen the opera. Klinghoffer forces Jewish audiences to confront some uncomfortable aspects of Israel’s history, and to relive a tragic chapter in the history of the Israeli-Palestinian conflict. There are some antisemitic lines delivered by one of the hijackers: “America is one big Jew,” he sneers at his cowering captives. But his brutal actions and the shrill, frenzied music that accompanies his words so clearly prove him a villain that it’s ridiculous to say composer John Adams and his librettist Alice Goodman are promoting that view.
Though two opposing sides are given the opportunity, over three hours, to present their narratives, it’s crucial to remember who gets the last word. Klinghoffer ends with a beautiful, heartbreaking aria by his widow Marilyn, who has just learned of the death of her husband. “I wanted to die,” she cries out. The finale lays bare the suffering and anguish that terrorism and antisemitism have wrought.
So why the outrage? Because there’s a lot of ignorance out there drowning out the facts about this opera.
Sitting up with a yawn,
Rolling up the tattered mat,
Tucking up the torn mundu,
Walking along the hedges.
Not for a lark.
The muddy fields grimace,
The cows wag their tails.
Where is that long night –
The one they sang their fervent hymns about,
The one they said spring thunder
Would light up with brilliant flashes
Before the great new dawn arrived?
Hate, anger –
On racing pulses.
They stood leaning against the good old walls,
The graying firebrands.
Out of the dry, cracked, poetry-less soil they had sprung.
Drained by the waters of compassion
They had grown dreams on their bodies.
They now watch
As texts are served on a platter.
by Raghavan Atholi
from Poetry International Web
Over at the Columbia University Press Blog, an interview with Herve This on his new book:
Question: How does note-by-note cooking differ from molecular gastronomy?
Herve This: Molecular gastronomy is a scientific activity, not to be confused with molecular cooking. Indeed, molecular gastronomy, being science, has nothing to do with cooking. In other words, science is not about making dishes. Science looks for the mechanism of phenomena. That’s all. And technology uses the results of science to improve technique. So, note-by-note cooking is a technique.
Another question could be, how is note-by-note cooking different from molecular cooking? And here the answer would be that the definition of molecular cooking is “to cook using modern tools” (such as siphons, liquid nitrogen, etc.). But you still use meat, vegetables, etc. However, with note-by-note cooking, the instruments are not important, and the big revolution is to cook with pure compounds, instead of meat, vegetables, fruits, eggs, etc.
Q: Where does the name Note-by-Note Cooking come from?
HT: In 1999, when I introduced the name “molecular cooking,” I was upset, because it was a bad choice, which had to be made for many complex reasons. Unfortunately, people now confuse molecular gastronomy and molecular cooking. So, for note-by-note cooking, I wanted a name that could appeal to artists, and it’s fair to say that note-by-note cooking is comparable to a term such as electro-acoustic music.
Q: Won’t note-by-note cooking produce artificial forms of food?
HT: Yes, but all food is “artificial”! Do you think that barbecue meat hangs “naturally” on the trees of the wild forest? Or that French fries appear suddenly from potatoes? No, you need a cook to make them. In ordinary language, “natural” means “what was not transformed by human beings”, and “artificial” means that it was transformed, that it was the result of human “art”.
Friday, October 24, 2014
Garry Wills in the New York Review of Books:
People are amazed or disgusted, or both, at today’s “power of the media.” The punch is in that plural, “media”—the twenty-four-hour flow of intermingled news and opinion not only from print but also from TV channels, radio stations, Twitter, e-mails, and other electronic “feeds.” This storm of information from many sources may make us underestimate the power of the press in the nineteenth century when it had just one medium—the newspaper. That also came at people from many directions—in multiple editions from multiple papers in every big city, from “extras” hawked constantly in the streets, from telegraphed reprints in other papers, from articles put out as pamphlets.
Every bit of that information was blatantly biased in ways that would make today’s Fox News blush. Editors ran their own candidates—in fact they ran for office themselves, and often continued in their post at the paper while holding office. Politicians, knowing this, cultivated their own party’s papers, both the owners and the editors, shared staff with them, released news to them early or exclusively to keep them loyal, rewarded them with state or federal appointments when they won.
It was a dirty game by later standards, and no one played it better than Abraham Lincoln. He developed new stratagems as he rose from citizen to candidate to officeholder. Without abandoning his old methods, he developed new ones, more effective if no more scrupulous, as he got better himself (and better situated), for controlling what was written about him, his policies, and his adversaries.
Rebecca Morelle at the BBC:
Now, the rest of the dinosaur's body has been unearthed, and researchers say that the creature is even more bizarre than they had thought.
They say it was huge, with a beak, a humped back and giant, hoofed feet.
The study is published in the journal Nature.
Lead researcher Yuong-Nam Lee, from South Korea's Institute of Geoscience and Mineral Resources (Kigam), said: "It turned out to be one of the weirdest dinosaurs, it's weird beyond our imagination."
Brendan Fitzgerald in The Morning News:
A man dies, leaving behind, among other things, a combination lock. Opening it may just prove the existence of the afterlife.
I first learned about Stevenson through his obituary, which ran in the New York Times in 2007. My wife and I have a habit of sharing interesting obituaries with each other. I sent Stevenson’s to her, with a few sentences highlighted: “Tucked away in a file cabinet in the Division of Perceptual Studies is an ordinary combination lock, which Dr. Stevenson bought and locked nearly 40 years ago. He had set the combination himself.” A colleague told the Times that Stevenson hoped to communicate the combination to a friend or loved one, who could open the lock and prove that some part of Stevenson had survived death. Ian Stevenson, who was born in 1918, spent about half his life studying reincarnation and past-life memories. He documented thousands of stories from young men and women who claimed to recall previous lives, whose birthmarks resembled fatal wounds sustained by others. Perhaps Stevenson’s death provided him with the answer he sought for most of his life. For me, it simply rekindled the question I’d struggled with for most of mine: What happens when we die?
...The Combination Lock Test for Survival began the same year Stevenson founded the Division of Perceptual Studies. Stevenson had heard a number of stories from colleagues about men and women opening combination locks set by deceased loved ones. In one instance, a widow opened a locked box, though she claimed her husband was the only person who knew the combination. In another, a widower asked a medium to help him open a lock set by his deceased wife, and the medium recounted the correct combination. Perhaps, after the death of its keeper, a combination might be communicated, its lock unclasped, its mouth opened. Stevenson purchased a Sargent & Greenleaf model combination lock from Brown’s Lock & Safe, a Charlottesville locksmith located a few miles from his home. He set its password then placed the lock in a drawer inside his office. “In setting my own locks, I started not with meaningless random numbers, but with a word or phrase that is extremely meaningful to me,” Stevenson said at the time. “I have no fear whatever of forgetting it on this side of the grave and, if I remember anything on the other side, I shall surely remember it.” Though, on one occasion, Stevenson opened his own lock—in 1975, to check either the lock’s reliability or his own—then closed it again. The Division of Perceptual Studies collected 10 more locks—a duplicate for Stevenson, and nine others. Upon news of the death of a lock’s owner, Stevenson awaited contact from family or friends who might have received some suggestion of a combination, or might have visited a medium for help contacting the dead. When a colleague and fellow combination lock owner died in 1979, Stevenson received more than 40 letters that suggested combinations, and consulted two mediums himself, his colleague’s lock in his pocket. All combinations failed.
Sophia Nguyen in Harvard Magazine:
In the early 2000s, a riptide of business scandals toppled Enron, Arthur Andersen, and WorldCom. In the aftermath, says Straus professor of business administration Max Bazerman, “society turned to professional schools” to ask why their graduates were misbehaving. Behavioral ethics—combining aspects of moral philosophy, cognitive science, psychology, and economics—was born: “a creation of the new millennium.” As a teacher in this field, Bazerman explains, “My job is not about what ethics you follow, but how to bring you up to your own ethical standards.”
...In his book, Bazerman highlights promising directions in behavioral decision research that have the potential to promote more effective and more ethical noticing. One involves “choice architecture,” a term coined by Richard H. Thaler and Walmsley University Professor Cass R. Sunstein in their 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness. Choice architecture taps knowledge of psychology to identify better ways to present options—and Bazerman asserts that organizations can use it to create systems that increase the likelihood of their staff noticing key data. In a study he conducted with Kennedy School colleagues Alexandra van Geen and professor of public policy Iris Bohnet, supervisors were asked to assess a pool of job candidates. When judging applicants one at a time, they tended to favor men on quantitative tasks and women on verbal tasks. But when judging male and female candidates side-by-side, they relied on performance-related data; gender biases no longer factored into the decision. Changing the structure of the hiring process encouraged people to pay attention to the important information.
Whether we like it or not, the big idea behind American democracy is to make us like each other more. It’s a faintly embarrassing dimension of our social experiment, carved out of the crack-up of the original British colonies, that the great theorists and practitioners of new world order in America were looking for something more than political independence. They sought to create a basis for the small-r republican ideal of fraternity: a territorially limited, widely participatory, and socially equitable economy made up principally of small producers—home manufacturers, merchants, and farmers. Only on such a basis, the theory went, could America be prevented from regressing into anarchy, despotism, or worse.
But things didn’t exactly go as planned. Come the Jacksonian age, the legal interpreters of the U.S. Constitution, spurred on by the directives of a fast-consolidating national and corporate economy, ratcheted the whole enterprise upward into something that many of the founders would have seen as a blatant contradiction in terms: a “commercial republic,” as the jurisprudence of the Federalist-on-the-make John Marshall (echoing the political rhetoric of his close political ally Daniel Webster) had it.
Few institutions have offered themselves as less promising for the novelist than the modern office. Work of any kind is a tricky subject for representation; office work—gray, gnomic, and unknowable—even more so. After all, what is it that people do in offices? Herman Melville’s “Bartleby the Scrivener: A Story of Wall Street,” the locus classicus for discussions of early clerical work, begins by depicting strategies for avoiding work at what is nominally a law office. Few of the unnamed narrator’s employees seem to do much lawyering: Turkey works through the morning, but gets drunk at lunch; Nippers never finds an appropriate position to sit at his desk. And then there’s Bartleby, who, unlike his colleagues, works—and does so without fanfare, “silently, palely, mechanically.” But rather than producing things, he seems to consume them. “As if long famishing for something to copy,” the narrator observes, “he seemed to gorge himself on my documents.” And then—famously—Bartleby suddenly loses interest in his work. The tedium of office life offers a brief moment of satisfaction for Bartleby, which just as quickly vanishes; eventually deprived of his paperwork sustenance, Bartleby starves to death.
While “Bartleby” has remained unmatched as a parable of white-collar alienation (it was adapted to the contemporary, cubicular, and computerized workplace as a film in 2001), its casual treatment of the actual substance of work makes it unexceptional in the history of the literature of the office. Like many office novels that have followed, it is primarily one of manners—or in Bartleby’s case, a lack thereof.
Reynolds dominated British art for some three decades before his death in 1792, by which time the British portrait was firmly established. Jonathan Richardson, in his influential Essay on the Theory of Painting (1715), remarked of contemporary portraitists that they had “prostituted a Noble Art, chusing to exchange the honorable Character of good painters for that sordid one of profess’d, mercenary flatterers”. Richardson’s essay was a powerful influence on the young Reynolds. Reynolds went on to cultivate an excellent character as a painter, becoming the first president of the Royal Academy of Arts on its foundation in 1768 (a position he kept for the rest of his life) and acquiring a knighthood; meanwhile, understanding how important it was to distinguish himself from those whom William Hogarth labelled a “nest of Phizmongers”, he worked tirelessly to combat sordid associations. His success is all the more remarkable given his equally tireless attention to the business side of his art. Reynolds quickly became very famous and rich as a portraitist without attracting the opprobrium of being a mercenary flatterer. How did he do it?
Avoiding insipidity was a good start, and to leaf through Hallett’s sumptuous volume is to feel the vibrancy. In the early portraits especially, something is generally going on: Commodore Augustus Keppel is striding towards us (a boiling sea behind); David Garrick is being pulled one way by the muse of comedy and the other by tragedy; Colonel Acland and Lord Sydney are flying through the forest in “The Archers”, a gloriously silly masquerade of heroic masculinity which draws from Hallett one of his rare acknowledgements that Reynolds, in his determination to make the picture move, might sometimes “teeter on the brink of absurdity”.
Bruce Bartlett in The American Conservative:
A Republican stimulus would undoubtedly have had more tax cuts and less spending, even though every serious study has shown that tax cuts are the least effective method of economic stimulus in a recession. Even so, tax cuts made up 35 percent of the budgetary cost of the stimulus bill—$291 billion—despite an estimate from Obama’s Council of Economic Advisers that tax cuts barely raised the gross domestic product $1 for every $1 of tax cut. By contrast, $1 of government purchases raised GDP $1.55 for every $1 spent. Obama also extended the Bush tax cuts for two years in 2010.
It’s worth remembering as well that Bush did not exactly bequeath Obama a good fiscal hand. Fiscal year 2009 began on October 1, 2008, and one third of it was baked in the cake the day Obama took the oath of office. On January 7, 2009, the Congressional Budget Office projected significant deficits without considering any Obama initiatives. It estimated a deficit of $1.186 trillion for 2009 with no change in policy. The Office of Management and Budget estimated in November of that year that Bush-era policies, such as Medicare Part D, were responsible for more than half of projected deficits over the next decade.
Republicans give no credit to Obama for the significant deficit reduction that has occurred on his watch—just as they ignore the fact that Bush inherited a projected budget surplus of $5.6 trillion over the following decade, which he turned into an actual deficit of $6.1 trillion, according to a CBO study—but the improvement is real.
Republicans would have us believe that their tight-fisted approach to spending is what brought down the deficit. But in fact, Obama has been very conservative, fiscally, since day one, to the consternation of his own party. According to reporting by the Washington Post and New York Times, Obama actually endorsed much deeper cuts in spending and the deficit than did the Republicans during the 2011 budget negotiations, but Republicans walked away.
Obama’s economic conservatism extends to monetary policy as well. His Federal Reserve appointments have all been moderate to conservative, well within the economic mainstream.
An excerpt from Tom Shachtman's Gentleman Scientists and Revolutionaries in Scientific American:
During the Revolutionary War, while American laboratory and field research was much reduced, science did not grind to a halt. Scientific thought helped frame America’s initiating rhetoric of the war, and throughout the conflict innovations in medicine and disease control and in arms and armaments were integral to the American effort. This and the next two chapters deal with science-related aspects of the war, the present one with the initiating rhetoric, the next with the medical aspects, and the following chapter with technology in armament.
In the seventeenth and eighteenth centuries, Jürgen Habermas writes, the “light of reason” entered the public sphere in stages, cropping up first among the elite and in a semiprivate way before being adopted by ever wider groups. Broad public participation in debate did not take place until the “problemization of areas that had until then not been questioned,” and when “the issues discussed became ‘general’ not merely in their significance but also in their accessibility: everyone had to be able to participate.” Those stages had characterized the path of natural philosophy in the American colonies, from the initial debate in the public sphere of 1721–1722 about smallpox prevention in the Boston epidemic, increasing through the middle decades of the century and cresting in the broad participation in the recording of the 1769 transit of Venus and in the growing audience for the efforts of the renewed American Philosophical Society. The same path to acceptance was being hewed in the consideration of non-monarchical and non-church governance: the debate was moving steadily from the elite’s private colloquies to publicly available written materials and thence to open assemblies. Habermas insists that the “communicative” aspects of this path were absolutely vital; in his view, the availability of newspapers able to operate beyond the day-to-day control of governing powers geometrically increased a populace’s ability to engage in public argument. From 1765 on, there were increasingly sophisticated discussions in colonial newspapers of direct and indirect taxes and of an accused person’s right to habeas corpus, as well as of such scientific matters as the parallax to be computed from observations of the transit of Venus.
At stake in all these sorts of discussions were Enlightenment ideals, particularly those of liberty, justice, and equality, enabling ordinary citizens to openly consider wresting their collective freedom from what Habermas labels the “restrictive particularism” of fealty to kings, lords, and church hierarchies.
Thursday, October 23, 2014
Today would've been Matthew Power's 40th birthday. The GQ contributor and friend of GQ staffers past and present died this March while reporting a story along the Nile River in Uganda. Matt was curious, adventurous, and always empathetic—he worked tirelessly to understand his subjects and bring them alive on the page. You can read Matt's stories about drone pilots, urban explorers, and the art world here on GQ.com.
In the days after Matt died, dozens of tributes to him sprang up online, coming in from all around the world. Even for those of us who knew Matt well, who knew what a constant and generous friend he was, what was especially remarkable was the number of young journalists Matt had taken the time to coach and mentor: beginning writers who'd gotten a fan letter from Matt at a moment when they were considering giving up, established journalists who owed at least part of their success to a chance Matthew Power had encouraged them to take on themselves.
John Quiggin in Crooked Timber:
Gough Whitlam, Prime Minister of Australia from 1972 to 1975, died on Tuesday. More than any other Australian political leader, and as much as any political figure anywhere, Gough Whitlam embodied social democracy in its ascendancy after World War II, its high water mark around 1970 and its defeat by what became known as neoliberalism in the wake of the crises of the 1970s.
Whitlam entered Parliament in 1952, having served in the Royal Australian Air Force during the War, and following a brief but distinguished legal career. Although Labor had already chosen a distinguished lawyer (HV Evatt) as leader, Whitlam’s middle-class professional background was unusual for Labor politicians.
Whitlam marked a clear break with the older generation of Labor politicians in many other respects. He was largely indifferent to the party’s socialist objective (regarding the failure of the Chifley government’s bank nationalisation referendum as having put the issue off the agenda) and actively hostile to the White Australia policy and protectionism, issues with which Labor had long been associated.
On the other hand, he was keen to expand the provision of public services like health and education, complete the welfare state for which previous Labor governments had laid the foundations, and make Australia a fully independent nation rather than being, in Robert Menzies’s words, ‘British to the bootstraps’.
Adam Shatz in the London Review of Books:
The Death of Klinghoffer, John Adams’s 1991 opera about the hijacking of the Achille Lauro by the Palestine Liberation Front in 1985, has achieved a rare distinction in contemporary classical music: it’s considered so dangerous by its critics that they’d like to have it banned. For its opponents – the Klinghoffer family, Daniel Pearl’s father, conservative Jewish organisations, and now the former New York mayor Rudy Giuliani and former New York governor George Pataki, who took part in a noisy demonstration outside the Met last night – Klinghoffer is no less a sacrilege than The Satanic Verses was to Khomeini and his followers. They haven’t issued a fatwa, but they have done their best to sabotage the production ever since the Met announced it.
Peter Gelb, the Met’s general manager, capitulated in the summer to pressure from the Anti-Defamation League (and, according to the New York Times, from ‘three or four’ major Jewish donors), cancelling a live broadcast to cinemas around the world. The rationale for the decision, made against the backdrop of the Gaza offensive, was that the opera might be exploited by anti-semites. How, they didn’t say. For some reason the opera’s enemies don’t seem concerned that its unflinching portrayal of the murder of an elderly Jew in a wheelchair might be ‘used’ to foment anti-Muslim sentiment.
The notion that Adams and his librettist, Alice Goodman, are justifying terrorism is absurd. The hijacking is depicted in all its horror, chaos and fear. The scene that raised accusations of anti-semitism, a dinner table conversation among ‘the Rumors’, an American-Jewish family, was excised from the libretto long ago.
David Remnick in The New Yorker:
Benjamin Crowninshield Bradlee, the most charismatic and consequential newspaper editor of postwar America, died at the age of ninety-three on Tuesday. Among his many bequests to the Republic was a catalogue of swaggering anecdotes rich enough to float a week of testimonial dinners. Bradlee stories almost always relate to his glittering surface qualities, which combined the Brahmin and the profane. Let’s get at least one good one out of the way:
During his reign, from 1968 to 1991, as the executive editor of the Washington Post, Bradlee took time periodically to dictate correspondence into a recorder. His letters in no way resembled those of Emily Dickinson. He was given neither to self-doubt nor to self-restraint. In his era, there may have been demands by isolated readers for greater transparency, for correction or explanation, but there was no Internet, no Twitter, to amplify them. Bradlee was, by today’s standards, unchallengeable, and he was expert in the art of florid dismissal. His secretary, Debbie Regan, was, in turn, careful to reflect precisely his language when transcribing his dictation. One day, Regan approached the house grammarian, an editor named Tom Lippman, and admitted that she was perplexed. “Look, I have to ask you something,” she said. “Is ‘dickhead’ one word or two?”
This sort of stuff was especially entertaining when you remembered that Bradlee’s family was a concoction of seventeenth-century Yankees and semi-comic Vanity Fair-like European royalty.
Rory Stewart in the New York Review of Books:
There is a consensus in Afghan society: violence…must end. National reconciliation and respect for fundamental human rights will form the path to lasting peace and stability across the country. The people’s aspirations must be represented in an accountable, broad-based, gender-sensitive, multi-ethnic, representative government that delivers daily value.
That was twelve years ago. No one speaks like that now—not even the new president. The best case now is presented as political accommodation with the Taliban, the worst as civil war.
Western policymakers still argue, however, that something has been achieved: counterterrorist operations succeeded in destroying al-Qaeda in Afghanistan, there has been progress in health care and education, and even the Afghan government has its strengths at the most local level. This is not much, given that the US-led coalition spent $1 trillion and deployed one million soldiers and civilians over thirteen years. But it is better than nothing; and it is tempting to think that everything has now been said: after all, such conclusions are now reflected in thousands of studies by aid agencies, multilateral organizations, foreign ministries, intelligence agencies, universities, and departments of defense.
But Anand Gopal’s No Good Men Among the Living shows that everything has not been said. His new and shocking indictment demonstrates that the failures of the intervention were worse than even the most cynical believed.