Monday, July 25, 2016
As you know, we are able to run the site only because our regular readers support us through subscriptions or one-time payments.
Whichever you'd like to do, please take a couple of minutes and use the appropriate button near the top of the left-hand column to make a contribution.
Please do it now!
Also, if you missed it, read this great profile of 3QD by Thomas Manuel in The Wire.
New posts below.
Friday, July 29, 2016
Martha Nussbaum: Anger is the emotion that has come to saturate our politics and culture. Philosophy can help us out of this dark vortex
Martha Nussbaum in Aeon:
There’s no emotion we ought to think harder and more clearly about than anger. Anger greets most of us every day – in our personal relationships, in the workplace, on the highway, on airline trips – and, often, in our political lives as well. Anger is both poisonous and popular. Even when people acknowledge its destructive tendencies, they still so often cling to it, seeing it as a strong emotion, connected to self-respect and manliness (or, for women, to the vindication of equality). If you react to insults and wrongs without anger you’ll be seen as spineless and downtrodden. When people wrong you, says conventional wisdom, you should use justified rage to put them in their place, exact a penalty. We could call this football politics, but we’d have to acknowledge right away that athletes, whatever their rhetoric, have to be disciplined people who know how to transcend anger in pursuit of a team goal.
If we think closely about anger, we can begin to see why it is a stupid way to run one’s life. A good place to begin is Aristotle’s definition: not perfect, but useful, and a starting point for a long Western tradition of reflection. Aristotle says that anger is a response to a significant damage to something or someone one cares about, and a damage that the angry person believes to have been wrongfully inflicted. He adds that although anger is painful, it also contains within itself a hope for payback. So: significant damage, pertaining to one’s own values or circle of cares, and wrongfulness. All this seems both true and uncontroversial. More controversial, perhaps, is his idea (in which, however, all Western philosophers who write about anger concur) that the angry person wants some type of payback, and that this is a conceptual part of what anger is. In other words, if you don’t want some type of payback, your emotion is something else (grief, perhaps), but not really anger.
Is this really right? I think so.
Rachel Wong in The Point:
After a landslide victory in Myanmar’s national elections last year, Aung San Suu Kyi and the National League for Democracy came to power this February. Among those who took their seats in parliament were eleven poets, many of whom were active during the democracy protests of 1988 and are former political prisoners. Myanmar’s new president is the son of the renowned poet Min Thu Wun. And in a highly publicized trial this year, Maung Saungkha was arrested for defaming the former president in his verses. Circles of poets in traditionalist Mandalay, socialist-realist Pyinmana, cosmopolitan Yangon and elsewhere are debating what it means to write poetry in a time of transition from dictatorship to democracy. Under a state that has abolished censorship, what is the function of a dissident? When the opposition of many years, led by Aung San Suu Kyi, has finally become the ruling power, who forms the new civil society?
Recorded in April at a dinner gathering in the apartment of Point editor Rachel Wong, what follows is a conversation with four writers, publishers and translators from Yangon. Also present were American journalist Maddy Crowell and Point founding editor Jon Baskin.
Among the intriguing issues in plasma physics are those surrounding X-ray pulsars—collapsed stars that orbit around a cosmic companion and beam light at regular intervals, like lighthouses in the sky. Physicists want to know the strength of the magnetic field and density of the plasma that surrounds these pulsars, which can be millions of times greater than the density of plasma in stars like the sun.
Researchers at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) have developed a theory of plasma waves that can infer these properties in greater detail than in standard approaches. The new research analyzes the plasma surrounding the pulsar by coupling Einstein's theory of relativity with quantum mechanics, which describes the motion of subatomic particles such as the atomic nuclei—or ions—and electrons in plasma. Supporting this work is the DOE Office of Science.
The key insight comes from quantum field theory, which describes charged particles that are relativistic, meaning that they travel at near the speed of light. "Quantum theory can describe certain details of the propagation of waves in plasma," said Yuan Shi, a graduate student in the Princeton Program in Plasma Physics and lead author of a paper published July 29 in the journal Physical Review A. Understanding the interactions behind the propagation can then reveal the composition of the plasma.
I have it on the highest authority that summer will never end. It might get cooler, intermittently, but it will never stop being summer. Which is of course wonderful, because summer is a bubble during which life’s ordinary rules are suspended. Summer is when we don’t have to get up in the morning, or even the afternoon. Summer is when we insist on ice-cold beer to chill our body cavity, especially the spleen. Summer is when we go see particularly stupid movies because it would be unseasonal to have to think. Summer is when we get into fights with the neighbors over noise or property lines or because we should live next door as well as at our house. Summer is when nobody ever has to make eye contact. Summer is when nothing ever happened before this moment right now. Summer is when we trash the joint because whatever. Summer is when we fire guns into the air and the bullet never comes down.
All the very best countries celebrate their national holidays in the summer. Summer is the season best adapted to modern commerce. When you think Hollywood, you think summer. Arson perpetuates summer; it makes it an action and not just a moment. In summer, we think of rain as a calamity, and so does the weatherman, who pulls a long face at the announcement of a sunless day. Summer craves fresh bodies.
What Charlotte Brontë: A Life does with conviction is remember what many biographies forget: that this is a terrific story. Brimming with indomitable personalities, trials and ordeals, passions and disappointments, it has all the elements of a traditional romance. At the same time, its protagonist, a restless, dissatisfied heroine struggling to make and remake the world in her quest for growth and recognition, is the quintessential modern subject: the subject of the modern novel.
Harman places the propulsive force of story-telling at the heart of her narrative, and Charlotte Brontë steps forth newly minted, as if we’ve never met her before. Harman’s portraits are always three-dimensional and humane, even when they treat of energetic, creative, sometimes monstrously difficult people. Her account of Charlotte’s father, the Revd Patrick Brontë, is characteristic. As a young man, Patrick wrote a poem for his fiancée Mary Burder praising her eyes of sparkling blue. When the Burder option fizzled out (Patrick seems to have got cold feet because she was a Dissenter), the poem found its way into his slim volume of Cottage Poems. He then presented the volume, with the poem, to another young woman, pointedly inking in a correction to “sparkling hazle eye”. As Harman observes, this efficient repurposing might at a push have seemed engagingly “familiar and jocular” if the message of the poem had been less dour. “But hark, fair maid! whate’er they say / You’re but a breathing mass of clay / Fast ripening for the grave” were lines “unlikely to delight a twenty-year-old girl” even if he had got the colour of her eyes right. “[A] reckless and rather chilly lover” is Harman’s crisp verdict.
He is an old man from a coastal town. He’s uneducated by modern standards, and worked for an industry that is now defunct. He spends his retirement shooting suspicious looks at anyone who looks “forrun” and wincing at the sound of Polish voices. He voted to quit the EU. He’s Mr Leave.
In the aftermath of Brexit, this caricature has haunted the imagination of many a Remain voter. But a new report from the think tank British Future shows it is a false one. Just as a quarter of Remain voters also backed the Tories in 2015 (sorry, progressive alliancers), Leave voters have different views on immigration, sovereignty and the economy.
Here are some of the most surprising insights from the polling, which was carried out with pollsters ICM:
While a quarter of Leave voters cited immigration as their number one reason, more than half said they were motivated by “taking power back from Brussels”.
In contrast to the caricature of the ancient xenophobe, the older a Leave voter, the more likely sovereignty was their motivation.
A son and a daughter.
The mother prefers the son to the daughter.
The son will stand by his mother through the vicissitudes of life.
The daughter will produce another son to stand by her side.
To be in love is not to be a bird in the hand of the one you love,
better for them than ten in the bush.
A bird in the bush is better than ten in the hand,
from the birds’ point of view.
Sometimes love is like a meal to someone who’s been fasting
at other times it’s like a new pair of sports shoes given
to a disabled child.
Love, in general, is a deal that brings much loss
to all parties.
The old door applauds the wind by clapping
for the dance it has performed, accompanied by the trees.
The old door doesn’t have hands
and the trees haven’t been to dancing school.
And the wind is an invisible creature,
even when it’s dancing with the trees.
by Ashraf Fayadh
from Instructions Within
translated by Jonathan Wright

Palestinian poet Ashraf Fayadh is serving eight years and 800 lashes in Saudi Arabia
for alleged apostasy in his collection Instructions Within
—“I am fearful of being forgotten,” he has said.
Jill Abramson in The Guardian:
In her convention speech, Hillary repeated a story she’s often told about her mother. Dorothy Rodham insisted that her daughter stand up to bullies, saying “Cowards don’t live in this house.” Her mantra was: hit back when someone hits you. Her daughter clearly took the lesson to heart and enjoyed every punch she delivered on Thursday night in Philadelphia. It was payback time for all of those “Crooked Hillary” jabs from Trump and the Republican convention refrain, “Lock her up.” It was an effective performance, spoken not in anger, but in a tone of humorous sarcasm. Clinton often and proudly talks about how she gets under Trump’s skin, and judging from his tweets after the speech, she surely did. Besides getting a kick out of kicking Trump, she’s quite good at ridiculing him. Her script was scrupulously factual, making her case against her opponent all the more devastating. Running against Bernie Sanders in the Democratic primaries, as against Barack Obama before him, she never seemed comfortable going negative. Politically, it was risky for a female candidate, especially her, to seem mean, as she’s already viewed as unlikeable by a significant portion of voters.
But Trump has been so vulgar and mean himself, the political risk seems minimal. His temperament and crazed policy proposals, which have only become more preposterous lately, make him an easy target. Hillary’s best speeches in the campaign have been the ones in which she tears apart Trump’s proposals. Her speech in San Diego before the California primary was a triumph with its tight focus on Trump’s dangerous international and national security proposals, including banning Muslims from the country and reviving torture. Her convention speech was an artful retort to Trump, contrasting President Reagan’s “Morning in America” with Trump’s “Midnight in America.” She portrayed Trump’s boasts of being able to fix the country’s problems himself as un-American. The American way, she stressed repeatedly, is working together to fix the ills of society. Her performance in Philadelphia also showed that she’s become more media-savvy. A witty put-down is sure to receive more coverage than a dry policy lecture. Maybe Clinton has finally learned that she can’t let Trump own every news cycle. One of her funniest lines was: “There is no other Donald Trump. This is it.” The delegates roared.
John Harris in Nature:
In Redesigning Life, molecular pharmacologist John Parrington has produced a veritable compendium of games that scientists like him can play with life itself. He invites us to imagine the potential of life forms “whose very genetic recipe was manufactured in a chemistry lab using new components never seen before on Earth”. What larks! What follows is a thorough and comprehensive account of the methodologies for altering life that have been or are being developed, and the directions that they may take in future. Those methodologies include the insertion or deletion of genes, the engineering of synthetic genes and the design of creatures unprecedented in nature. As Parrington shows, many of the technologies are familiar: for example, designing immunity to disease through vaccination, or animal and plant breeding. He ends with the concept of a “redesigned planet”, replete with new types of people, as well as designer babies, pets, plants and drugs. Invoking the catchphrase of comic-book superhero Spiderman, “with great power comes great responsibility”, he touches on the challenges that such a possibility would entail.
...Why might it be better to aim to increase cognitive powers and perhaps even intelligence by education, diet or exercise than through gene editing or drugs? One would also need to identify elements that would clearly be unethical to design into a person, such as an increased propensity to disease or premature death. Most importantly, one would need to consider why attempts at design are morally worse (if they are) than simply leaving things to the genetic lottery of sexual reproduction. There is a story that in the early twentieth century, the pioneering modern dancer Isadora Duncan suggested to writer George Bernard Shaw that they should have a child, surmising that with her looks and his brains any progeny would have huge advantages. The ever-rational Shaw responded, “But what if it had your brains and my looks?” Was Duncan's proposal unethical or just misconceived? What would or should have made such a proposal ethically problematic? And if it was not ethically problematic, why might more 'techie' attempts become unethical?
Thursday, July 28, 2016
Helen De Cruz in The Prosblogion:
Can you tell me something about your academic position, and about your current religious affiliation/self-identification – please feel free to say something about your religious upbringing or history, or anything else that might be relevant to your current religious affiliation.
I am James B. Duke University Professor of Philosophy at Duke University in Durham NC, where I am Co-director of the Center for Comparative Philosophy. I was raised as a Roman Catholic and still have that Catholic boy inside me. I received a fantastic education from nuns, most of whom had never been to anything that we would call college. I get Catholicism. It is in my blood and bones. It is familiar. In Rome last year, my wife and I visited Saint Peter’s, many other churches, went to vespers at a convent, and I was consistently moved, engaged. But I haven’t practiced since I was a young teenager. I was bothered by hell, specifically the idea that a good God would have such a place, by the emphasis on sexual sins, and by a sincere worry that although Jesus might be understood as a prophet, as he is in the Koran, he was simply nowhere near good enough to be God.
So, I am a certain kind of atheist, a philosophical one, who has never heard a substantive conception of God, of the sort presented in creedal religions (“I believe in God the Father almighty…”), that I thought the weight of reasons supported belief in. The reasons always seem to weigh against actually believing in THAT God. This philosophical orientation goes well with a certain resistance to the epistemic over-confidence that is needed to speak confidently about the existence or nature of one’s God or gods.
In part, I have been too impressed, in a good way I think, by my interest in and study of other great world religions to be confident about the creedal parts of the Catholicism I was raised in, which I was told was the one true religion.
Dan Falk in The Atlantic:
Einstein once described his friend Michele Besso as “the best sounding board in Europe” for scientific ideas. They attended university together in Zurich; later they were colleagues at the patent office in Bern. When Besso died in the spring of 1955, Einstein—knowing that his own time was also running out—wrote a now-famous letter to Besso’s family. “Now he has departed this strange world a little ahead of me,” Einstein wrote of his friend’s passing. “That signifies nothing. For us believing physicists, the distinction between past, present, and future is only a stubbornly persistent illusion.”
Einstein’s statement was not merely an attempt at consolation. Many physicists argue that Einstein’s position is implied by the two pillars of modern physics: Einstein’s masterpiece, the general theory of relativity, and the Standard Model of particle physics. The laws that underlie these theories are time-symmetric—that is, the physics they describe is the same, regardless of whether the variable called “time” increases or decreases. Moreover, they say nothing at all about the point we call “now”—a special moment (or so it appears) for us, but seemingly undefined when we talk about the universe at large. The resulting timeless cosmos is sometimes called a “block universe”—a static block of space-time in which any flow of time, or passage through it, must presumably be a mental construct or other illusion.
More here. [Thanks to Sean Carroll.]
My sister and fellow 3QD editor Azra Raza in Stat:
It’s a long way from where I grew up in Karachi, Pakistan, to the dining room in Vice President Joe Biden’s home at the Naval Observatory in Washington, D.C. Yet that’s where I found myself one day last December, along with a handful of other cancer specialists. We had been invited to offer our perspectives on the current cancer landscape, which contributed to shaping the “cancer moonshot.” I’m convinced that my perspective on medicine as an immigrant is what ultimately got me to the table.
Early in my career as an oncologist at Roswell Park Cancer Institute in Buffalo, N.Y., I treated a woman who was terminally ill with acute myeloid leukemia (AML). As her disease progressed, I watched her struggle to write letters for her 2-year-old twin daughters. She wanted them to read a letter from her on each of their birthdays until they turned 21. She died before she got to the ones for their 13th birthday.
That experience nearly broke my heart. It also suddenly clarified the purpose of my career. I realized that we needed a more comprehensive understanding of her disease. I needed to learn how pre-leukemia develops into leukemia, how it continues to evolve, and how it can be treated.
Had I received my scientific training in the United States, my immediate instinct probably would have been to develop a sophisticated mouse model to work on each of those steps. But because I was educated in Pakistan, I thought about taking a simpler approach — examining the cells of patients with myelodysplastic syndromes (MDS), an early-stage version of leukemia.
Marilynne Robinson is a Christian in a country that increasingly isn’t. She belongs to the American “mainline,” a collection of Protestant denominations with deep roots in European history, reliably liberal politics and, if current demographic and attendance trends continue, just a few decades to live. Why should the mainline be disappearing? And why would anybody care if it did? In her most recent books, a collection of essays, The Givenness of Things, and a novel, Lila, Robinson poses these questions but only partially answers them. Her reply to the first question is never fully satisfying, perhaps because she has much in common with the movement that is largely responsible for the mainline’s decline. The second question is even more difficult, but Robinson the novelist gives a better answer.
Liberal mainline denominations—like the Episcopalians, Congregationalists, and certain strains of Lutheranism—never represented the whole of American Christianity, but as recently as the middle of the last century their adherents numbered in the scores of millions and included almost all of America’s political and cultural elites. Since then their fall has been rapid and steep. Between 2000 and 2010 the United Church of Christ (Robinson’s denomination) lost nearly seven hundred congregations and over three hundred thousand members, bringing its total membership to less than half what it was in 1957.
The legend of O’Keeffe is so monumental that her art sometimes seems secondary. With some artists – Picasso, for instance – the work lives up to the man; with others, such as Frida Kahlo (coincidentally a friend of O’Keeffe’s), not so much. There is, remarkably, not a single painting by O’Keeffe in a British public collection, which makes the retrospective of her work at Tate Modern, the largest ever held outside America, a unique opportunity to see just how much she deserves her hallowed reputation.
If O’Keeffe’s personality was all about control so, too, was her art. Born on a farm in Wisconsin, she was initially drawn to music but when she turned to painting she gave herself a thorough theoretical grounding before she ever touched a canvas. She studied in both Chicago and New York and learned about modernism from the painter and educator Arthur Wesley Dow. Believing she wouldn’t make it as an artist, she took a job as a commercial designer in Chicago. What changed things for her was when she sent some drawings to a friend who, without her knowledge, showed them to Stieglitz, who then, again without O’Keeffe’s knowledge, exhibited them at his 291 gallery in New York (where he had been the first person to show Cézanne’s work in America). “Finally,” he wrote, “a woman on paper.” A correspondence between the pair followed, then a meeting, then a solo show, and finally marriage.
Our fairy tale is almost ended, and we’re going to marry and live happily ever after just like the princess in her tower who worried you so much—and made me so very cross by her constant recurrence. I’m so sorry for all the times I’ve been mean and hateful—for all the miserable moments I’ve caused you when we could have been so happy. You deserve so much—so very much— … And I do want to marry you—even if you do think I “dread” it. I wish you hadn’t said that—I’m not afraid of anything. To be afraid a person has either to be a coward or very great and big. I am neither. Besides, I know you can take much better care of me than I can, and I’ll always be very, very happy with you—except sometimes when we engage in our weekly debates—and even then I rather enjoy myself. I like being very calm and masterful, while you become emotional and sulky. I don’t care whether you think so or not—I do.
Darling, I nearly sat it off in the Strand today and all because T.E. Lawrence of the Movies is your physical counter-part. So I was informed by half a dozen girls before I could slam on a hat and see for myself. He made me so homesick. I thought at first waiting would grow easier later—but every day I need you more. All these soft, warm nights going to waste when I ought to be lying in your arms under the moon—the dearest arms in all the world—darling arms that I love so to feel around me. How much longer—before they’ll be there to stay? ***
The onion, now that’s something else.
Its innards don’t exist.
Nothing but pure onionhood
fills this devout onionist.
Oniony on the inside,
onionesque it appears.
It follows its own daimonion
without our human tears.
Our skin is just a coverup
for the land where none dare go,
an internal inferno,
the anathema of anatomy.
In an onion there’s only onion
from its top to its toe,
At peace, of a piece,
internally at rest.
Inside it, there’s a smaller one
of undiminished worth.
The second holds a third one,
the third contains a fourth.
A centripetal fugue.
Nature’s rotundest tummy,
its greatest success story,
the onion drapes itself in its
own aureoles of glory.
We hold veins, nerves, and fat,
secretions’ secret sections.
Not for us such idiotic
onion perfections.
by Wisława Szymborska
from Poems New and Collected, 1957-1997
Mark Changizi in Seed:
Where are we humans going, as a species? If science fiction is any guide, we will genetically evolve like in X-Men, become genetically engineered as in Gattaca, or become cybernetically enhanced like General Grievous in Star Wars. All of these may well be part of the story of our future, but I’m not holding my breath. The first of these—natural selection—is impracticably slow, and there’s a plausible case to be made that natural selection has all but stopped acting on us. Genetic engineering could engender marked changes in us, but it requires a scientific bridge between genotypes—an organism’s genetic blueprints—and phenotypes, which are the organisms themselves and their suite of abilities. A sufficiently sophisticated bridge between these extremes is nowhere in sight. And machine-enhancement is part of our world even today, manifesting in the smartphones and desktop computers most of us rely on each day. Such devices will continue to further empower us in the future, but serious hardware additions to our brains will not be forthcoming until we figure out how to build human-level artificial intelligences (and meld them to our neurons), something that will require cracking the mind’s deepest mysteries. I have argued that we’re centuries or more away from that.
Simply put, none of these scenarios are plausible for the immediate future. If there is something next, some imminently arriving transformative development for human capabilities, then the key will not be improved genes or cortical plug-ins. But what other way forward could humans possibly have? With genetic and cyborg enhancement off the table for many years, it would seem we are presently stuck as-is, sans upgrades.

There is, however, another avenue for human evolution, one mostly unappreciated in both science and fiction. It is this unheralded mechanism that will usher in the next stage of human, giving future people exquisite powers we do not currently possess, powers worthy of natural selection itself. And, importantly, it doesn’t require us to transform into cyborgs or bio-engineered lab rats. It merely relies on our natural bodies and brains functioning as they have for millions of years. This mystery mechanism of human transformation is neuronal recycling, coined by neuroscientist Stanislas Dehaene, wherein the brain’s innate capabilities are harnessed for altogether novel functions.

This view of the future of humankind is grounded in an appreciation of the biologically innate powers bestowed upon us by hundreds of millions of years of evolution. This deep respect for our powers is sometimes lacking in the sciences, where many are taught to believe that our brains and bodies are taped-together, far-from-optimal kluges. In this view, natural selection is so riddled by accidents and saddled with developmental constraints that the resultant biological hardware and software should be described as a “just good enough” solution rather than as a “fine-tuned machine.”
Wednesday, July 27, 2016
Simon T. Meiners in the Courier-Journal:
A lot of moderate pundits trot out this plea for patience every time the media spotlights another black life stolen by the police. In this case, it's National Review editor Charles C.W. Cooke offering "A Few Thoughts on the Killing of Philando Castile" the morning after Castile was gunned down by Officer Jeronimo Yanez during a routine traffic stop in Falcon Heights, Minnesota. His death has become another rallying cry and another proxy for discussing state-on-black violence.
Using one tragedy as a stand-in for systemic racism is common, but it gives a lot of moderates, especially white ones, anxiety. Here's how they see it play out: police kill a black man under sketchy circumstances, and before the dust can clear, racial demagogues swoop in and shoehorn the story into some reductive narrative pitting evil racist cops against noble black victims — facts be damned; that's why fair-minded people of all races need to step up, they say, and defend the officer's right to due process.
As a recovering white moderate, I get it: you should never scapegoat one cop for the collective sins of the criminal justice system. It's illiberal. But here's the problem with that mindset: you can't referee injustices one by one, either. You'll miss the big picture. That is why the shortsighted white moderates who try to be cool-headed neutrals ought to spend a little more time meditating on big picture facts.
Jerry Alper in Existential Cosmology:
On May 8, in Manhattan, I met with Sean Carroll for a prearranged interview on his forthcoming The Big Picture. The three-hour conversation that ensued was an exhilarating, if somewhat overwhelming, experience. Before I could write about it, however, I needed to process it. Sean Carroll, for his part, wanted to do “some writing” in preparation for his scheduled May 10th talk at the Bell House in Brooklyn. This was the event that would kick off the grand book tour of his much-anticipated new book, the wildly ambitious, magisterial synthesis called simply The Big Picture. The publication date (May 10th) was two days away, but already the reviews from some of the brightest names in science were starting to arrive. Here, for example, is what the brilliant quantum physicist Sabine Hossenfelder says about The Big Picture:
“The Big Picture is, above everything else, a courageous book — and an overdue one. So, I am super happy about the book. The Big Picture should make clear that physicists aren’t just arrogant when they say their work reveals insights that reach far beyond the boundaries of their disciplines. Physics indeed has an exceptional status among the sciences.”
Be that as it may, I was of two minds about actually travelling to the Bell House, which I had never heard of and to which I had never been (but to which I’ve now been invited). On the one hand, with the event scheduled from 8 p.m. to 11 p.m., I would be lucky to arrive home before midnight. On the other hand, this was a rare opportunity, perhaps a once-in-a-lifetime opportunity, to see Sean Carroll lecture in person, and I did not want to miss out on it.
Franklin Foer in Slate:
A foreign government has hacked a political party’s computers—and possibly an election. It has stolen documents and timed their release to explode with maximum damage. It is a strike against our civic infrastructure. And though nobody died—and there was no economic toll exacted—the Russians were aiming for a tender spot, a central node of our democracy.
It was hard to see the perniciousness of this attack at first, especially given how news media initially covered the story. The Russians, after all, didn’t knock out a power grid. And when the stolen information arrived, it was dressed in the ideology of WikiLeaks, which presents its exploits as possessing a kind of journalistic bravery the traditional media lacks.
But this document dump wasn’t a high-minded act of transparency. To state the obvious, only one political party has been exposed. (Selectively exposed: Many emails were culled from the abridged dump.) And it’s not really even the inner workings of the Democrats that have been revealed; the documents don’t suggest new layers of corruption or detail any new conspiracies. They’re something closer to the embarrassing emails that fly across every office in America—griping, the testing of stupid ideas, the banal musings that take place in private correspondence.
If you’ve ever taken I-81 north through Virginia, you’ve passed the town of Natural Bridge, in Rockbridge County—home to a ninety-foot limestone arch that extends over a gorge, a geological anomaly probably formed by an ancient underground river. Natural Bridge is an anachronism from the Route 66 era of highway travel, a place where you can pay twenty dollars to look at a rock, eat a rock-themed lunch, and then buy a shot glass illustrated with a picture of that same rock. As any respectable tourist trap must, the town hosts a constellation of other attractions: a petting zoo, a dinosaur/Civil War theme park, and the Natural Bridge Wax Museum (now closed, and former home to a ghoulish Obama tribute). Best of all is the featherlight, faux prehistoric monument known as Foamhenge.
As its name suggests, Foamhenge is a one-to-one scale replica of Stonehenge, made of foam. It is identical to the original, save the flecked gray paint, the accompanying statue of a deadhead-ish Merlin, and the fact that it was erected several millennia later. For the past twelve years, the henge has been the most public of Natural Bridge’s draws, garnering a steady stream of visitors and enough press to be mentioned in the same breath as the area’s actual ancient rocks. Its creator, an artist named Mark Cline, calls it his “foam-nomenon”: the unlikely culmination of his career as a sculptor of roadside attractions.
When Samuel Beckett was a young man, his parents wanted him to work in the family’s accountancy business and assume his place in Dublin’s Protestant merchant class. As Tim Parks writes in his new book, Life and Work: Writers, Readers, and the Conversations between Them, “a battle of wills ensued between mother and son…As the impasse intensified, [Beckett] developed a number of physical symptoms — boils, anal cysts, pelvic pains, tachycardia, panic attacks…” The panic attacks would plague Beckett for years, and his biographer Anthony Cronin tells us, in Samuel Beckett: The Last Modernist, that he didn’t reflect on his maladies in a conventional manner. In 1935 he attended a lecture by Swiss psychiatrist and former Freud protégé C.G. Jung. Beckett was 29 years old, in analysis, and believed he suffered from a neurotic disorder that “had its origins in infancy, in a time he could not remember,” Cronin writes.
In the lecture, Jung described the case of a young girl whose difficulties baffled him until he fell upon a simple, though rather esoteric diagnosis: “The girl had never really been born.” The idea immediately fired Beckett’s imagination. Cronin claims it triggered something crucial in Beckett and would become central to his self-understanding, and a recurring motif in his works. Beckett, he writes, “thought the diagnosis was a profoundly suggestive illumination of his own case, his sense of alienation from the world and of not being ready or fitted to cope with it, to join in its activities as others did, or even to understand the reasons for them.”