Saturday, May 21, 2016
Lewis H. Lapham in Lapham's Quarterly:
The champions of Western civilization make a bad mistake by deploring the mind and method of jihad as medieval and barbaric. The techniques and the objectives are modern. From whom do we suppose that jihadists learn to appreciate the value of high explosive as vivid speech if not from the example of the U.S. Air Force overhead in Vietnam, Serbia, and Iraq? The organizers of the 9/11 attacks on Manhattan clearly understood not only the ethos of globalized finance capitalism but also the idiom of the American news and entertainment media. Their production values were akin to those of Independence Day; the spectacle of the World Trade Center collapsing in ruins was rated by the New York film and social critics as “awe inspiring,” “never to be forgotten,” “shatteringly emotional.”
The sense of living in the prophetic end time has been running around in the American consciousness for the past twenty-five years, on the disheartened political left as on the ferocious political right. The final battle of Armageddon furnished the climax for the Left Behind series of sixteen neo-Christian fables that have sold more than 65 million copies to date, presumably to Rush Limbaugh’s dittoheads and future members of the Tea Party. The coauthors of the books, Tim LaHaye and Jerry B. Jenkins, offer their hatred of man as testimony to their love of God, and devote many fondly worded pages to the wholesale slaughter of intellectuals in New York, politicians in Washington, and homosexuals in Los Angeles. Their language is of a piece with the film footage in Mel Gibson’s Passion of the Christ or the videos just in of an ISIS beheading.
From The New York Times:
Randy Shilts’s “And the Band Played On,” about the early days of the AIDS epidemic, and Atul Gawande’s “Being Mortal,” about how systems of care can affect the way we die. And Ian McEwan’s “Enduring Love,” a novel spun out of an obsessive psychiatric syndrome.
Was there any book that influenced your decision to become a writer?
Without a doubt: Primo Levi’s “Survival in Auschwitz.” Levi, notably, defined himself first as a chemist and then as a writer. He has a particularly charming essay about why scientists can be good writers because they distill and clarify, because they boil questions down to their tar, because they understand the Silly Putty-ness of language. If chemists can write like Levi, then God help the writers.
What was the most interesting book you read while researching “The Gene”? And what was the best book you read for “The Emperor of All Maladies”?
I read a wide and bizarre collection of books for “The Gene,” including comics from the 1950s that fantasized about future human mutants, and a popular genre from the 1930s — I guess we might call it Eugenics Lite — that advocated the measurement and breeding of the best babies (blue-eyed, white) to improve the national gene pool. Perhaps the most interesting was Eugen Bleuler’s first case description of schizophrenia from 1911 that reads like the most incredible novel. For “Emperor of All Maladies,” the one book that I particularly scoured for inspiration was Richard Rhodes’s “The Making of the Atomic Bomb” — an epic account of the Manhattan Project. I cannot think of another book that makes scientific history more riveting.
Friday, May 20, 2016
Justin E. H. Smith in his blog:
Sometime in the summer of 1987 I walked out to our rural-route mailbox and found my membership card for the Young Socialist Alliance, accompanied by a typewritten letter filled with both practical information as well as elevated rhetoric about the youth being the future. I had heard that talk before at Catholic Youth Organization meetings, and was annoyed that I was made to join the mere youth auxiliary of the Socialist Workers' Party. But I was 15 and those were the rules, and I was happy enough to now be officially linked to the largest association of Trotskyists in the United States, whose publishing wing, Pathfinder Press, had already taught me so much about the larger world beyond the Sacramento Valley.
By the following year I had obtained another official document with my name on it, from the Department of Motor Vehicles, which enabled me to drive to the national convention of the SWP at Oberlin College in Cleveland. It enabled me, while my mother, for some mysterious reason, permitted me. In what would have been my junior year I had stopped attending high school for some months, out of sheer stubbornness, and didn't seem to have any other concrete plans, so driving off to do something at a university might have been hoped to hold open the possibility of what was known, even then, as a 'positive influence'. A 'positive influence on the youth'.
So I made it through the high desert of Nevada, through the salt flats of Utah, through the locust plagues of Nebraska, through Illinois, Indiana, and, finally, the state in which I would much later reside for two years and where I am still registered to vote: bleak pseudopalindromic Ohio, microcosm of all that is worst of 'these United States', the state Whitman had the most trouble rhapsodising about. But it was all new and fresh to me in 1988 and I was happy to go to some artsy café in the little town next to the campus and meet some dude named Harold who wore the best thrift-shop sweaters and knew more trivia about The Residents and Negativland than I did. This was the larger world too.
A new report estimates that by 2050, drug-resistant infections will kill one person every three seconds, unless the world’s governments take drastic steps now.
Ed Yong in The Atlantic:
The report’s language is sober but its numbers are apocalyptic. If antibiotics continue to lose their sting, resistant infections will sap $100 trillion from the world economy between now and 2050, equivalent to $10,000 for every person alive today. Ten million people will die every year, roughly one every three seconds, and more than currently die from cancer. These are conservative estimates: They don’t account for procedures that are only safe or possible because of antibiotics, like hip and joint replacements, gut surgeries, C-sections, cancer chemotherapy, and organ transplants.
And yet, resistance is not futile. O’Neill’s report includes ten steps to avert the crisis. Notably, only two address the problem of supply—the lack of new antibiotics. “When I first agreed to do this, the advisors presented it to me as a challenge of getting new drugs,” says O’Neill. “But it dawned on me very quickly that there were just as many, if not more, important issues on the demand side.” Indeed, seven of his recommendations focus on reducing the wanton and wasteful use of our existing arsenal. It’s inevitable that microbes will evolve resistance, but we can delay that process by using drugs more sparingly.
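The report's headline figures are easy to sanity-check with a little arithmetic. A minimal sketch (the implied-population step is my own inference, not something stated in the report or the article):

```python
# Rough arithmetic check on the figures quoted above.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

deaths_per_year = 10_000_000           # projected annual deaths from resistant infections
seconds_per_death = SECONDS_PER_YEAR / deaths_per_year
print(f"one death every {seconds_per_death:.2f} seconds")  # ~3.15, i.e. "one every three seconds"

cumulative_cost = 100e12               # $100 trillion through 2050
per_person = 10_000                    # "$10,000 for every person"
implied_population = cumulative_cost / per_person
print(f"implied population base: {implied_population:.1e}")  # ~10 billion people
```

The last line suggests the per-person figure is spread over roughly ten billion people, closer to a projected mid-century population than to 2016's.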
Tom Blunt in Signature:
There are two versions of the Blanche Knopf story. The first is one of triumph, documenting the calculated risks taken by the publishing maven to carve out paths for otherwise-neglected authors who would ultimately shape 20th-century culture and change the book business forever. America’s Harlem Renaissance, hard-boiled detective genre, and fascination with Europe’s sexual freedom can all be traced back to Mrs. Alfred A. Knopf’s business gambits, which in most cases sprang directly from her personal interests, or those of her close friends.
The second version is a tale of what might have been. How differently would Mrs. Knopf’s life and career have turned out if her husband had truly made her an equal partner in their business, as he promised when they were young newlyweds? To what greater heights might the company have flown if Mr. Knopf hadn’t vetoed some of her more risqué choices? Might Blanche have eventually summoned enough independence to go her own way if the couple’s gradual estrangement hadn’t nudged her toward a diet pill habit that slowly destroyed her health and eyesight? And perhaps most regrettably: how many more women might have felt called to work in the publishing world if Alfred hadn’t relentlessly downplayed Blanche’s involvement at every turn, only begrudgingly admitting his wife’s contributions long after her death in 1966?
These questions arise several decades too late to make any difference to Mrs. Knopf, and if it wasn’t for Laura Claridge’s new biography The Lady With the Borzoi, they might never have been posed at all.
The name Albert Murray was never household familiar. Yet he was one of the truly original minds of 20th-century American letters. Murray, who died in 2013 at the age of 97, was an accomplished novelist, a kind of modern-day oral philosopher, a founder of Jazz at Lincoln Center, and the writer of a sprawling, idiosyncratic, and consistently astonishing body of literary criticism, first-rate music exposition, and cunning autobiography. In our current moment of identity politics and multicultural balkanization, the publication of any new Murray text would serve as a powerful reminder that his complex analysis of art and life remains as timely as ever—probably more so.
It’s 2016, and another management guru is revealing the secrets of the creative mind.
It’s not really a very original thing to do. The literature on encouraging corporate nonconformity is already enormous; it goes back many years, to at least 1960, when someone wrote a book called How to Be a More Creative Executive. What was once called “the creative revolution” in advertising got going at around the same time. I myself wrote a book about that subject—a history book!—nearly twenty years ago.
There have been slight variations in the creativity genre over the half-century of its ascendancy, of course. The cast of geniuses on whom it obsessively focuses has changed, for example. And while the study of creativity has always been surrounded with a quasi-scientific aura, today that science is more micro than macro, urging us to enhance our originality by studying the functioning of the human brain.
In the larger literary sense, however, it is now clear that the capitalist’s tribute to creativity and rebellion is an indestructible form. There is something about the merging of bossery and nonconformity that beguiles the American mind. The genre marches irresistibly from triumph to triumph. Books pondering the way creative minds work dominate business-best-seller lists. Airport newsstands seem to have been converted wholly to the propagation of the faith. Travel writers and speechwriters alike have seen the light and now busy themselves revealing the brain’s secrets to aspiring professionals.
Realist historical fictions, with the rustling demands of their costumes and their period-appropriate speech, often depend on painstakingly described physical veracity, sensory believability, to steep a reader in the past. While not necessarily factual, such works say: This really occurred, and now you, too, may experience it. As the literary historian Stephen Greenblatt enthused in a review of “Wolf Hall,” Hilary Mantel’s novel about the rise of Thomas Cromwell—perhaps the paradigmatic contemporary example of such fiction—great historical novels “provide a powerful hallucination of presence, the vivid sensation of lived life.”
But a handful of recent works of fiction have taken up history on radically different terms. Rather than presenting a single, definitive story—an ostensibly objective chronicle of events—these books offer a past of competing perspectives, of multiple voices. They are not so much historical as archival: instead of giving us the imagined experience of an event, they offer the ambiguous traces that such events leave behind. These fictions do not focus on fact but on fact’s record, the media by which we have any historical knowledge at all. In so doing, such books call the reader’s attention to both the problems and the pleasures of history’s linguistic remains.
The book that made this distinction clear to me is a new novel by Danielle Dutton, called “Margaret the First.” Dutton’s Margaret is Margaret Cavendish, Duchess of Newcastle-upon-Tyne, who lived from 1623 to 1673 and was one of the first British women to publish in print under her own name.
Gary Saul Morson in The New Criterion:
One hundred and fifty years ago, when Dostoevsky published Crime and Punishment, Russia was seething with reform, idealism, and hatred. Four years earlier, the “tsar-liberator” Alexander II (reigned 1855–1881) had at last abolished serfdom, a form of bondage making 90 percent of the population saleable property. New charters granted considerable autonomy to the universities as press censorship was relaxed. The court system, which even a famous Slavophile said made his hair stand on end and his skin frost over, was remodeled along Western lines. More was to come, including the beginnings of economic modernization. According to conventional wisdom, Russian history alternates between absolute stasis—“Russia should be frozen so it doesn’t rot,” one reactionary writer urged—and revolutionary change. Between Peter the Great (died 1725) and the revolutions of 1917, nothing compared with the reign of Alexander II. And yet it was the tsar-liberator, not his rigid predecessor or successor, who was assassinated by revolutionary terrorists. The decade after he ascended the throne witnessed the birth of the “intelligentsia,” a word we get from Russian, where it meant not well-educated people but a group sharing a set of radical beliefs, including atheism, materialism, revolutionism, and some form of socialism. Intelligents (members of the intelligentsia) were expected to identify not as members of a profession or social class but with each other. They expressed disdain for everyday virtues and placed their faith entirely in one or another theory. Lenin, Trotsky, and Stalin were typical intelligents, and the terrorists who killed the tsar were their predecessors.
The intelligentsia prided itself on ideas discrediting all traditional morality. Utilitarianism suggested that people do, and should do, nothing but maximize pleasure. Darwin’s Origin of Species, which took Russia by storm, seemed to reduce people to biological specimens. In 1862 the Russian neurologist Ivan Sechenov published his Reflexes of the Brain, which argued that all so-called free choice is merely “reflex movements in the strict sense of the word.” And it was common to quote the physiologist Jacob Moleschott’s remark that the mind secretes thought the way the liver secretes bile. These ideas all seemed to converge on revolutionary violence.
Abigail Tucker in Smithsonian:
Humans tend to dismiss Neanderthals as dimwits, yet the brains of our doomed cousins were actually larger than our own. “If you go to a site from 150,000 years ago,” says Miki Ben-Dor, a Tel Aviv University archaeologist, “you won’t be able to tell whether Neanderthals or Homo sapiens lived there, because they had all the same tools.” Which helps explain why, to fathom how our fates diverged, he recently scrutinized Neanderthals’ bodies instead of their skulls. While humans have barrel-shaped chests and narrow pelvises, Neanderthals had bell-shaped torsos with wide pelvises. The prevailing explanation has been that Neanderthals, often living in colder and drier environments than their human contemporaries, needed more energy and therefore more oxygen, so their torsos swelled to hold a bigger respiratory system. But Ben-Dor had a gut feeling this was wrong. What if the difference was what they ate? Living in Eurasia 300,000 to 30,000 years ago, Neanderthals settled in places like the Polar Urals and southern Siberia—not bountiful in the best of times, and certainly not during ice ages. In the heart of a tundra winter, with no fruits and veggies to be found, animal meat—made of fat and protein—was likely the only energy source.
Alas, though fat is easier to digest, it’s scarce in cold conditions, as prey animals themselves burn up their fat stores and grow lean. So Neanderthals must have eaten a great deal of protein, which is tough to metabolize and puts heavy demands on the liver and kidneys to remove toxic byproducts. In fact, we humans have a “protein ceiling” of between 35 and 50 percent of our diet; eating too much more can be dangerous. Ben-Dor thinks that Neanderthals’ bodies found a way to utilize more protein, developing enlarged livers and kidneys, and chests and pelvises that widened over the millennia to accommodate these beefed-up organs.
Thursday, May 19, 2016
Christian Lorentzen in Vulture:
I’m cursed with a mind that looks at a sentence and sees grammar before it sees meaning. It might be that I’m doing math by other means, that I overdid it with diagramming sentences as a boy, or that my grasp of English was warped by learning Latin. Translating Horace felt like solving math problems. Reading Emily Dickinson began to feel like solving math problems. You might think this is a cold way of reading, but it’s the opposite. You develop feelings. Pronoun, verb, noun — I like sentences that proceed in that way, in a forward march. Or those tricked out with a preposition, another noun, and a couple of adjectives. Conjunctions and articles leave me unfazed. If these combinations result in elaborate syntactical tangles, it thrills me. It’s cheap words I hate, and I hate adverbs.
I’m unembarrassed to admit that my taste in literary style owes a lot to my adolescent reading of The Sun Also Rises — Hemingway was no friend of adverbs. He’s not alone. “Use as few adverbs as possible” is among V. S. Naipaul’s rules for beginning writers. When William Strunk and E. B. White admonish us to omit unnecessary words, I know they’re talking about adverbs without their having to say it.
Karan Jani in The Wire:
What was your first reaction when you saw the gravitational-wave event on September 14, 2015 and the whole process which followed until the historic announcement?
I think it was just one of deep satisfaction, that a dream that Rai Weiss, Ron Drever and Joseph Weber and Vladimir Braginsky and Stan Whitcomb and others had developed and shared so many decades ago was finally reaching fruition.
In fact, nature turned out to be giving us just what I had expected – I’d expected since the early 1980s that the first thing we would see would be merging black holes because the distance you can see goes up roughly proportionally with the mass of the binary, and so the volumes are cubed, and that factor would overwhelm the lower absolute event rate for black hole binaries compared to neutron star binaries. It seemed very likely to me so that’s just what I thought would happen. It’s a big part of how I hoped to sell this project.
To have that come out right was pleasing, to have the strength of the waves be 10⁻²¹ – that’s a number we started targeting in 1978. So it all came to pass the way we expected it to, thanks to enormous work by your generation of experimenters. You were the ones who really pulled it off. The way I like to say it is that it’s your generation of experimenters that makes me look good!
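The scaling argument in the excerpt – detectable distance grows roughly with binary mass, so the surveyed volume grows as its cube, which can outweigh a rarer intrinsic merger rate – can be sketched with illustrative numbers (the masses and rates below are hypothetical, not figures from the interview):

```python
# Sketch of the selection effect described above: if the distance d out to
# which a binary is detectable scales roughly with its mass M, the surveyed
# volume scales as d**3, so a factor-of-10 mass advantage buys a factor of
# 1000 in volume -- enough to beat a much lower per-volume merger rate.

def relative_detections(mass_ratio: float, rate_ratio: float) -> float:
    """Expected ratio of detections (heavy binaries / light binaries).

    mass_ratio: detectable-distance advantage of the heavier binaries
                (taken, per the rough argument above, equal to the mass ratio)
    rate_ratio: their merger rate per unit volume relative to light binaries
    """
    volume_ratio = mass_ratio ** 3
    return volume_ratio * rate_ratio

# Hypothetical numbers: black-hole binaries 10x the mass of neutron-star
# binaries, but merging 100x less often per unit volume.
print(f"{relative_detections(10, 1 / 100):.1f}")  # 10.0: black holes still dominate
```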
There is something uncanny about staying in another person’s house — the stark differences and the small convergences of sameness. We all like to snoop a bit. Now, public historian Ruth Goodman gives us the chance to snoop on the lives of people who died 500 years ago. When you’re watching The Tudors or Wolf Hall, Goodman is the woman behind the scenes ensuring that the clothes look right, the home interiors are accurate, and the sumptuous feasts are as true to life as possible. In How to Be a Tudor: A Dawn-to-Dusk Guide to Tudor Life, she makes her almost preternatural knowledge about life during the 16th century available to the reading public.
You wouldn’t expect the intricacies of Tudor baking, brewing, ploughing, cooking, needlework, painting, dancing, and card-playing to hold an audience rapt, and yet Goodman makes the minutiae of everyday life half a millennium ago tremendously interesting. Indeed, her voluminous knowledge makes Goodman seem not so much a specialist on period authenticity as an actual time traveler. Ingeniously structuring the book around the hourly rhythms of daily life (with chapters going from “At Cock’s Crow” to “And so to bed”), Goodman transmits information about food, work, medicine, education, leisure, lodging, sleep, and even sexuality. How to Be a Tudor, with its grounding in physical detail and avoidance of theoretical analysis, is true to the guide book genre, but one featuring recipes for veal meatballs (exceedingly expensive at the time) and Galenic medical advice.
Inside the monastery of S. Trinità dei Monti, which stands at the top of the Spanish Steps in Rome, is a room decorated in glorious trompe l’oeil as a ruin. Created in 1766 by Charles-Louis Clérisseau, and originally intended to be the cell of the monastery’s resident mathematician Fr Thomas Le Sueur, it imitates a decaying classical temple, with tumbled columns, a roof open to the sky, encroaching vegetation and a large parrot perched on one of the apparently surviving crossbeams. The irony of the design worked on several levels. It allowed the famous scholar to enjoy the pleasure of ruins without the discomfort. But it was also a wry comment on the life cycle of buildings. Ruins are one stage on their inevitable journey to destruction. As we know from some of the most ambitious modern attempts at conservation on archaeological sites all over the world, from Pompeii to Machu Picchu, collapse can be delayed – but not prevented. Here Clérisseau offered dilapidation frozen in time, a ruin built to last.
That life cycle of buildings, from conception to death, with an occasional lucky, or unlucky, resurrection, is the theme of James Crawford’s Fallen Glory – twenty chapters telling the biography of twenty structures, from across the world, ancient and modern, real and imaginary (the first chapter is on the Tower of Babel, the last on the virtual world of the web hosting service GeoCities). Some of these life stories work better than others. The Roman Forum, the subject of Chapter Six, needs so much background that we tend to lose sight of the main character as it rises out of the marshes, becomes the monumental centre of the empire, and slips back into pasture, only to be revived again in the service of Mussolini’s grandiose ambitions.
Hiat was born in Kletsk, a town south of Minsk, in Belarus. As a child, he began to doubt the possibility of God. “I’ve seen children die, small children, and the doubt of a merciful God really drove me” away from religious belief, he said to Roth during the first interview session, describing the crucible of his political consciousness and suggesting the rigor of his autodidactic mind. But at the same time, at the cheder in Kletsk, Hiat was introduced to the Jewish teaching that opened him intellectually to a “revolutionary instinctive upbringing.” “Socialism,” he said, “is part of philosophical Judaism.” There is, he explained to Roth, who never received, or pursued, a full Jewish education, “a certain Hebrew word, ein kemach, ein Torah: If you have no bread, you have no Torah.”
Bernie Sanders, who perhaps embodies this connection as thoroughly as any American public figure in history, rarely draws that line. In a speech last year to the students of the Evangelical Christian Liberty University, he quoted the Book of Matthew, not Torah or Talmud, in citing a religious influence in his political ideology. (Hillary Clinton, for her part, draws a connection between the Christianity she experienced growing up and her instinct to volunteer in poor neighborhoods of Chicago.) Sanders sometimes directs the question of how his Jewish self-identity inspired his political beliefs to the specter of the Holocaust, from which his father escaped but many of his relatives in Poland did not; more often, he simply identifies his parents as “Polish.”
Tom Mullaney in Foreign Policy:
Even in the age of China’s social media boom, with billion-dollar valuations for Beijing-based IT start-ups, prejudice against the Chinese language is alive and well. One would be forgiven for thinking that by 2016, the 20th century’s widespread critiques of racism, colonialism, and Social Darwinism would have sounded the death knell of 19th-century Orientalism, which viewed China and the Chinese language through a condescending, colonialist lens. At the least, one might hope that if notions of Chinese Otherness were still with us, those who carry on the tradition of these threadbare ideas would generally be seen as archaically Eurocentric and gauche — the dross of airport bookshop paperbacks, unworthy of serious engagement. If only. Nineteenth-century understandings of China persist, not only surviving the decline of Social Darwinism and race science, but flourishing in this new century, driven primarily by arguments about China’s unfitness for modern technology and media.
Call it Orientalism 2.0.
Note: For my sister Ga who is the real Qawwali aficionado in the family.
Jamie Metzl in KurzweilAI:
After 4 billion years of evolution by one set of rules, our species is about to begin evolving by another. Overlapping and mutually reinforcing revolutions in genetics, information technology, artificial intelligence, big data analytics, and other fields are providing the tools that will make it possible to genetically alter our future offspring should we choose to do so. For some very good reasons, we will. Nearly everybody wants to have cancers cured and terrible diseases eliminated. Most of us want to live longer, healthier and more robust lives. Genetic technologies will make that possible. But the very tools we will use to achieve these goals will also open the door to the selection for and ultimately manipulation of non-disease-related genetic traits — and with them a new set of evolutionary possibilities. As the genetic revolution plays out, it will raise fundamental questions about what it means to be human, unleash deep divisions within and between groups, and could even lead to destabilizing international conflict.
And the revolution has already begun. Today’s genetic moment is not the stuff of science fiction. It’s not Jules Verne’s fanciful 1865 prediction of a moon landing a century before it occurred. It’s more equivalent to President Kennedy’s 1962 announcement that America would send men to the moon within a decade. All of the science was in place when Kennedy gave his Houston speech. The realization was inevitable; only the timing was at issue. Neil Armstrong climbed down the Apollo 11 ladder seven years later. We have all the tools we need to alter the genetic makeup of our species. The science is here. The realization is inevitable. Timing is the only variable.
Daphne in Mourning
Palm fronds have woven out the sky.
Fog has infiltrated every vein.
My hair has interlaced with vines.
Cobwebs lash their gauze across my eyes.
I’ve stood so since the world began,
and turned almost to stone some years ago.
Who passes by perceives a lichened post,
my girlish figures, ghostly, nearly gone.
My bark is warmer than the dead’s.
Human blood still lulls the underside of leaves.
My fingers hold the very dress I loved
to dance in, when dancing mattered — and it did.
by Melissa Green
from Daphne in the Morning
Pen & Anvil Press, 2010
Wednesday, May 18, 2016
Stuart Elden in Berfrois:
Foucault’s Last Decade is a study of Foucault’s work between 1974 and his death in 1984. In 1974, Foucault began writing the first volume of his History of Sexuality, developing work he had already begun to present in his Collège de France lecture courses. In that first volume, published in late 1976, Foucault promised five further volumes, and indicated some other studies he intended to write. But none of those books actually appeared, and Foucault’s work went in very different directions. At the very end of his life, two further volumes of the History of Sexuality were published, and a fourth was close to completion. In contrast to the originally planned thematic treatment, the final version was a much more historical study, returning to antiquity and early Christianity. In this book, I trace these developments, and try to explain why the transition happened.
Foucault’s Last Decade has its roots as far back as the late 1990s. I had just finished a PhD thesis on Nietzsche, Heidegger and Foucault. Right at the end of that process Foucault’s courses from the Collège de France began to be published – the first in 1997, the second in 1999. I already knew how much Heidegger scholarship had been changed by the insights of his lecture courses and thought that the same would be true for Foucault. (Of course, with Heidegger, much more and much worse was to come with his notebooks.) I wrote a review essay on the second published Foucault course – The Abnormals – for the journal boundary 2, on the invitation of Paul Bové, and then Paul invited me to the University of Pittsburgh when I spoke about ‘Society Must Be Defended’, a text which was also published in boundary 2. I thought then that if I wrote something about each course as they came out, then in time there might be the raw materials for a book.
And so, on and off, in and around other projects, I read, spoke and sometimes wrote about most of Foucault’s courses as they appeared. Some of these were published here at Berfrois. Foucault taught at the Collège de France from late 1970 until his death in 1984. There were thirteen courses in total, but they were published in non-chronological order – the earliest courses presented the greatest editorial difficulties, and so were among the last to appear. The last of the Collège de France ones was published in 2015. Some courses from elsewhere and other material have also been published in the intervening years, and we now have far more material published since Foucault’s death than appeared in his lifetime. This, despite his wish for ‘no posthumous publications’ – a request that was once followed scrupulously, then generously interpreted and is now largely ignored.
Dipesh Chakrabarty in Eurozine:
The current planetary crisis of climate change or global warming elicits a variety of responses in individuals, groups, and governments, ranging from denial, disconnect, and indifference to a spirit of engagement and activism of varying kinds and degrees. These responses saturate our sense of the now. Alan Weisman's best-selling book The World without Us suggests a thought experiment as a way of experiencing our present: "Suppose that the worst has happened. Human extinction is a fait accompli. [...] Picture a world from which we all suddenly vanished. [...] Might we have left some faint, enduring mark on the universe? [...] Is it possible that, instead of heaving a huge biological sigh of relief, the world without us would miss us?" I am drawn to Weisman's experiment as it tellingly demonstrates how the current crisis can precipitate a sense of the present that disconnects the future from the past by putting such a future beyond the grasp of historical sensibility. The discipline of history exists on the assumption that our past, present, and future are connected by a certain continuity of human experience. We normally envisage the future with the help of the same faculty that allows us to picture the past. Weisman's thought experiment illustrates the historicist paradox that inhabits contemporary moods of anxiety and concern about the finitude of humanity. To go along with Weisman's experiment, we have to insert ourselves into a future "without us" in order to be able to visualize it. Thus, our usual historical practices for visualizing times, past and future, times inaccessible to us personally – the exercise of historical understanding – are thrown into a deep contradiction and confusion. Weisman's experiment indicates how such confusion follows from our contemporary sense of the present insofar as that present gives rise to concerns about our future. 
Our historical sense of the present, in Weisman's version, has thus become deeply destructive of our general sense of history.
I will return to Weisman's experiment in the last part of this essay. There is much in the debate on climate change that should be of interest to those involved in contemporary discussions about history. For as the idea gains ground that the grave environmental risks of global warming have to do with excessive accumulation in the atmosphere of greenhouse gases produced mainly through the burning of fossil fuel and the industrialized use of animal stock by human beings, certain scientific propositions have come into circulation in the public domain that have profound, even transformative, implications for how we think about human history or about what the historian C. A. Bayly recently called "the birth of the modern world".
Nate Silver over at FiveThirtyEight:
Trump is one of the most astonishing stories in American political history. If you really expected the Republican front-runner to be bragging about the size of his anatomy in a debate, or to be spending his first week as the presumptive nominee feuding with the Republican speaker of the House and embroiled in a controversy over a tweet about a taco salad, then more power to you. Since relatively few people predicted Trump’s rise, however, I want to think through his nomination while trying to avoid the seduction of hindsight bias. What should we have known about Trump and when should we have known it?
It’s tempting to make a defense along the following lines:
Almost nobody expected Trump’s nomination, and there were good reasons to think it was unlikely. Sometimes unlikely events occur, but data journalists shouldn’t be blamed every time an upset happens, particularly if they have a track record of getting most things right and doing a good job of quantifying uncertainty.
We could emphasize that track record; the methods of data journalism have been highly successful at forecasting elections. That includes quite a bit of success this year. The FiveThirtyEight “polls-only” model has correctly predicted the winner in 52 of 57 (91 percent) primaries and caucuses so far in 2016, and our related “polls-plus” model has gone 51-for-57 (89 percent). Furthermore, the forecasts have been well-calibrated, meaning that upsets have occurred about as often as they’re supposed to but not more often.
But I don’t think this defense is complete — at least if we’re talking about FiveThirtyEight’s Trump forecasts. We didn’t just get unlucky: We made a big mistake, along with a couple of marginal ones.
The big mistake is a curious one for a website that focuses on statistics. Unlike virtually every other forecast we publish at FiveThirtyEight — including the primary and caucus projections I just mentioned — our early estimates of Trump’s chances weren’t based on a statistical model. Instead, they were what we sometimes called “subjective odds” — which is to say, educated guesses. In other words, we were basically acting like pundits, but attaching numbers to our estimates. And we succumbed to some of the same biases that pundits often suffer, such as not changing our minds quickly enough in the face of new evidence. Without a model as a fortification, we found ourselves rambling around the countryside like all the other pundit-barbarians, randomly setting fire to things.
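Silver’s notion of a “well-calibrated” forecast — upsets happening about as often as the probabilities say they should — can be made concrete with a small sketch. This is not FiveThirtyEight’s actual methodology, just a generic illustration: bin a set of predicted probabilities, then compare each bin’s average forecast with the observed win rate.

```python
# A minimal sketch (not FiveThirtyEight's method) of checking forecast
# calibration: group predicted probabilities into bins and compare each
# bin's average prediction against how often the event actually occurred.

def calibration_table(forecasts, outcomes, n_bins=5):
    """forecasts: predicted probabilities in [0, 1];
    outcomes: 1 if the predicted event happened, else 0.
    Returns (avg_forecast, observed_rate, count) per non-empty bin."""
    bins = [[] for _ in range(n_bins)]
    for p, won in zip(forecasts, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into the last bin
        bins[idx].append((p, won))
    table = []
    for entries in bins:
        if entries:
            avg_p = sum(p for p, _ in entries) / len(entries)
            hit_rate = sum(w for _, w in entries) / len(entries)
            table.append((round(avg_p, 2), round(hit_rate, 2), len(entries)))
    return table

# Toy data: a calibrated forecaster's 90% calls should win about 90% of the time.
probs = [0.9] * 9 + [0.1]
results = [1] * 9 + [0]
print(calibration_table(probs, results))
```

For a well-calibrated forecaster, the first two numbers in each row track each other; a forecaster whose 90 percent calls win only 60 percent of the time is overconfident, even if most individual calls were “right.”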
Robert Epstein in Aeon:
No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.
Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.
To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.
A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.
Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.