Sunday, May 22, 2016
Tonight I've watched
the moon and then
the Pleiades
go down

The night is now
half-gone; youth
goes; I am
in bed alone
Saturday, May 21, 2016
Tom Bartlett in the Chronicle of Higher Education:
Weeks of planning can evaporate in an instant, forcing the researchers to improvise. Beyond the logistical aggravation, there’s the matter of personal safety. Where there are fighters, there is often fighting, and while the semi-autonomous Kurdish region of northern Iraq remains relatively sheltered compared with Syria or large swaths of southern Iraq, the proximity to bloodshed prompts understandable unease.
The least jittery member of the team is its leader, Scott Atran, an anthropologist who floats among several institutions, including the University of Michigan and the John Jay College of Criminal Justice, part of the City University of New York. He’s also a founder of the Centre for the Resolution of Intractable Conflict, at the University of Oxford. He’s normally the one arguing to go a little farther afield, to challenge the group’s comfort zone, perhaps to cross over into Syria. While sitting around the hotel he appears restless and testy; headed toward ISIS territory, he is in his element, enlivened and unfazed. "We don’t want to drive off the road, because it’s probably mined on both sides," he warns casually from the passenger seat, the way you might note a change in speed limit or a forthcoming rest stop.
Atran is known as an expert on terrorism, a title he doesn’t particularly want and a word he doesn’t find useful. He views his work, broadly, as examining what motivates people to do things beyond themselves, for good or ill. These days he focuses on the ill, specifically ISIS.
Carl Zimmer in the New York Times:
As the director of the Center for GeoGenetics at the University of Copenhagen, Dr. Willerslev uses ancient DNA to reconstruct the past 50,000 years of human history. The findings have enriched our understanding of prehistory, shedding light on human development with evidence that can’t be found in pottery shards or studies of living cultures.
Dr. Willerslev led the first successful sequencing of an ancient human genome, that of a 4,000-year-old Greenlander. His research on a 24,000-year-old Siberian skeleton revealed an unexpected connection between Europeans and Native Americans.
Dr. Willerslev was one of the early pioneers of the study of ancient DNA, and today he remains at the forefront of an increasingly competitive field. His colleagues credit his success to his relentless work and to his skill at building international networks of collaborators.
“His role is that of catalyst, choreographer, conductor and cajoler — and sometimes all at once,” said David J. Meltzer, an archaeologist at Southern Methodist University.
The scientific enterprise that Dr. Willerslev helped invent now sometimes crosses into culturally sensitive terrain.
Thomas Pogge, one of the world’s most prominent ethicists, stands accused of manipulating students to gain sexual advantage
Katie J.M. Baker in BuzzFeed:
When Thomas Pogge travels around the world, he finds eager young fans waiting for him in every lecture hall. The 62-year-old German-born professor, a protégé of the philosopher John Rawls, is bespectacled and slight of stature. But he’s a giant in the field of global ethics, and one of only a small handful of philosophers who have managed to translate prominence within the academy to an influential place in debates about policy.
A self-identified “thought leader,” Pogge directs international health and anti-poverty initiatives, publishes papers in leading journals, and gives TED Talks. His provocative argument that wealthy countries, and their citizens, are morally responsible for correcting the global economic order that keeps other countries poor revolutionized debates about global justice. He’s also a dedicated professor and mentor, at Yale University — where he founded and directs the Global Justice Program, a policy and public health research group — as well as at other prestigious institutions worldwide. By Pogge’s own count, he’s taught 34 graduate seminars, given 1,218 lectures in 46 countries, and supervised 66 doctoral dissertations.
But a recent federal civil rights complaint describes a distinction unlikely to appear on any curriculum vitae: It claims Pogge uses his fame and influence to manipulate much younger women in his field into sexual relationships. One former student said she was punished professionally after resisting his advances.
Pogge did not respond to more than a dozen emails and phone calls from BuzzFeed News, nor to a detailed letter laying out all the claims that were likely to appear in this article.
When it was first published in Istanbul in 1943, it made no impression whatsoever. Decades later, when Madonna in a Fur Coat became the sort of book that passed from friend to friend, the literary establishment continued to ignore it. Even those who greatly admired the other works of Sabahattin Ali viewed this one as a puzzling aberration. It was just a love story, they said – the sort that schoolgirls fawned over. And yet, for the past three years, it has topped the bestseller lists in Turkey, outselling Orhan Pamuk. It is read, loved and wept over by men and women of all ages, but most of all by young adults. And no one seems able to explain quite why.
The story begins in 1930s Ankara, the Turkish Republic’s newly appointed capital. The narrator has fallen on hard times, and it is only with the help of a crass and belittling former classmate that he is able to find work as a clerk at a firm trading in lumber. Here he meets the sickly, affectless Raif Bey, who is, we’re told, “the sort of man who causes us to ask ourselves: ‘What do they live for? What do they find in life? What logic compels them to keep breathing?’” When at last they make friends, it becomes clear that Raif’s reason for living cannot be his family. The relatives assembled under his roof treat him with the utmost contempt. And yet he welcomes their derision. Even on his deathbed, he seems to accept it as his due. But there is also a notebook, hidden in his desk drawer at work, which he asks his friend to destroy.
Death, for most people, is a rumour; something that happens to others, far away. But it is the last thing you will 'do' - or which will happen to you - and the likelihood is that it will take place in an acute hospital or a care home, orchestrated by strangers. You will have little say in its pace or its manner. There is a risk that, during the course of your dying, you will be subjected to procedures and treatments that are painful, degrading and ultimately futile. If you are old, your children may make all the major decisions for you. Death may creep up on you without warning, without a chance for you to prepare yourself and settle your affairs.
Few books, as R. D. Laing remarked, are forgivable. Most of what I read about death and dying bears little relation to what I see every day in my work on the hospital wards. Doctors and nurses rarely write about death; those who do are generally palliative care (hospice) specialists, and have a particular perspective on the subject, one that I do not completely share. The language used about death and dying tends to have a quality of cloying earnestness: nobody 'dies' anymore; they 'pass over', they 'pass on', or they simply 'pass'. The book I wanted to read about death and dying didn't exist.
Doctors who work in large, acute-care hospitals see death differently to doctors working in the hushed and serene environs of a hospice. And yet most dying still takes place in this kind of hospital, rather than in the hospices. Half a million people die every year in England. A study of deaths in England between 2005 and 2007 found that 58 per cent of all deaths occurred in hospital, 16 per cent in nursing homes, 19 per cent at home and only 5 per cent in hospices.
Al Alvarez called suicide “a dubious immortality”: “All that anguish, the slow tensing of the self to that final, irreversible act, and for what? In order to become a statistic.” In 1971, Alvarez published The Savage God: A Study of Suicide, a literary and philosophical expedition into the places where creativity and suicide overlap; the book was also a tribute to his friend Sylvia Plath, who had taken her life less than a decade earlier. He had published several of her poems in the Observer, where he worked as poetry critic, and they grew closer after she split from Ted Hughes. In Plath’s final months, she and Alvarez would often sit in his London flat, talking about poetry, creativity, and sometimes suicide, though “with a wry detachment,” as he describes it. “It was obviously a matter of self-respect that her first attempt had been serious and nearly successful,” he writes. “It was an act she felt she had a right to as a grown woman and a free agent.”
The Savage God is not a memoir, although the prologue and epilogue that bookend it are intensely personal. “I want the book to start, as it ends, with a detailed case-history, so that whatever theories and abstractions follow can somehow be rooted in the human particular,” Alvarez writes; the haunting prologue presents his brief personal account of Plath’s last months, in which he carefully dissects her depression and the ways it contributed to her eventual death, an outcome that (he suggests) might have been a mistake. Alvarez ultimately believes Plath intended only to act out the allegory of death into which she had written herself, expecting, or perhaps gambling, that she would be saved at the last minute—a miscalculation from which the Myth of Sylvia Plath has grown. “The pity is not that there is a myth…” writes Alvarez, “but that the myth is not simply that of an enormously gifted poet whose death came carelessly, by mistake, and too soon.”
The News of Flowers
Spring. Everything’s liberated.
The news of flowers
eases the poverty of this world.
Throughout this fractured country
(some say it’s a pity,
others not so)
spring has come full force.
An azalea blooming at Cheju Island
in the very south,
after a few days
across the sea
in Southern Cheolla
& Southern Kyeongsang.
A few days later
& it reaches the shore of the Han, mid-country,
& all along the Soyang River.
About a month later
on the upper reaches of the Yalu, North Korea: blossoms.
At the end of May
about 2700 meters up
by a cold spring at the treeline
azaleas bloom in many colors.
This is enough.
One cannot wish for more.
Where could things be better than among the flowers of a spring day?
So with South & North: gradually, evenly.
By Ko Un
from Abiding Places: Korea South & North
Lewis H. Lapham in Lapham's Quarterly:
The champions of Western civilization make a bad mistake by deploring the mind and method of jihad as medieval and barbaric. The techniques and the objectives are modern. From whom do we suppose that jihadists learn to appreciate the value of high explosive as vivid speech if not from the example of the U.S. Air Force over Vietnam, Serbia, and Iraq? The organizers of the 9/11 attacks on Manhattan clearly understood not only the ethos of globalized finance capitalism but also the idiom of the American news and entertainment media. Their production values were akin to those of Independence Day; the spectacle of the World Trade Center collapsing in ruins was rated by the New York film and social critics as “awe inspiring,” “never to be forgotten,” “shatteringly emotional.”
The sense of living in the prophetic end time has been running around in the American consciousness for the past twenty-five years, on the disheartened political left as on the ferocious political right. The final battle of Armageddon furnished the climax for the Left Behind series of sixteen neo-Christian fables that have sold more than 65 million copies to date, presumably to Rush Limbaugh’s dittoheads and future members of the Tea Party. The coauthors of the books, Tim LaHaye and Jerry B. Jenkins, offer their hatred of man as testimony to their love of God, and devote many fondly worded pages to the wholesale slaughter of intellectuals in New York, politicians in Washington, and homosexuals in Los Angeles. Their language is of a piece with the film footage in Mel Gibson’s Passion of the Christ or the videos just in of an ISIS beheading.
From The New York Times:
Randy Shilts’s “And the Band Played On,” about the early days of the AIDS epidemic, and Atul Gawande’s “Being Mortal,” about how systems of care can affect the way we die. And Ian McEwan’s “Enduring Love,” a novel spun out of an obsessive psychiatric syndrome.
Was there any book that influenced your decision to become a writer?
Without a doubt: Primo Levi’s “Survival in Auschwitz.” Levi, notably, defined himself first as a chemist and then as a writer. He has a particularly charming essay about why scientists can be good writers because they distill and clarify, because they boil questions down to their tar, because they understand the Silly Putty-ness of language. If chemists can write like Levi, then God help the writers.
What was the most interesting book you read while researching “The Gene”? And what was the best book you read for “The Emperor of All Maladies”?
I read a wide and bizarre collection of books for “The Gene,” including comics from the 1950s that fantasized about future human mutants, and a popular genre from the 1930s — I guess we might call it Eugenics Lite — that advocated the measurement and breeding of the best babies (blue-eyed, white) to improve the national gene pool. Perhaps the most interesting was Eugen Bleuler’s first case description of schizophrenia from 1911 that reads like the most incredible novel. For “Emperor of All Maladies,” the one book that I particularly scoured for inspiration was Richard Rhodes’s “The Making of the Atomic Bomb” — an epic account of the Manhattan Project. I cannot think of another book that makes scientific history more riveting.
Friday, May 20, 2016
Justin E. H. Smith in his blog:
Sometime in the summer of 1987 I walked out to our rural-route mailbox and found my membership card for the Young Socialist Alliance, accompanied by a typewritten letter filled with both practical information as well as elevated rhetoric about the youth being the future. I had heard that talk before at Catholic Youth Organization meetings, and was annoyed that I was made to join the mere youth auxiliary of the Socialist Workers' Party. But I was 15 and those were the rules, and I was happy enough to now be officially linked to the largest association of Trotskyists in the United States, whose publishing wing, Pathfinder Press, had already taught me so much about the larger world beyond the Sacramento Valley.
By the following year I had obtained another official document with my name on it, from the Department of Motor Vehicles, which enabled me to drive to the national convention of the SWP at Oberlin College in Cleveland. It enabled me, while my mother, for some mysterious reason, permitted me. In what would have been my junior year I had stopped attending high school for some months, out of sheer stubbornness, and didn't seem to have any other concrete plans, so driving off to do something at a university might have been hoped to hold open the possibility of what was known, even then, as a 'positive influence'. A 'positive influence on the youth'.
So I made it through the high desert of Nevada, through the salt flats of Utah, through the locust plagues of Nebraska, through Illinois, Indiana, and, finally, the state in which I would much later reside for two years and where I am still registered to vote: bleak pseudopalindromic Ohio, microcosm of all that is worst of 'these United States', the state Whitman had the most trouble rhapsodising about. But it was all new and fresh to me in 1988 and I was happy to go to some artsy café in the little town next to the campus and meet some dude named Harold who wore the best thrift-shop sweaters and knew more trivia about The Residents and Negativland than I did. This was the larger world too.
A new report estimates that by 2050, drug-resistant infections will kill one person every three seconds, unless the world’s governments take drastic steps now.
Ed Yong in The Atlantic:
The report’s language is sober but its numbers are apocalyptic. If antibiotics continue to lose their sting, resistant infections will sap $100 trillion from the world economy between now and 2050, equivalent to $10,000 for every person alive today. Ten million people will die every year, roughly one every three seconds, more than currently die from cancer. These are conservative estimates: They don’t account for procedures that are only safe or possible because of antibiotics, like hip and joint replacements, gut surgeries, C-sections, cancer chemotherapy, and organ transplants.
And yet, resistance is not futile. O’Neill’s report includes ten steps to avert the crisis. Notably, only two address the problem of supply—the lack of new antibiotics. “When I first agreed to do this, the advisors presented it to me as a challenge of getting new drugs,” says O’Neill. “But it dawned on me very quickly that there were just as many, if not more, important issues on the demand side.” Indeed, seven of his recommendations focus on reducing the wanton and wasteful use of our existing arsenal. It’s inevitable that microbes will evolve resistance, but we can delay that process by using drugs more sparingly.
Tom Blunt in Signature:
There are two versions of the Blanche Knopf story. The first is one of triumph, documenting the calculated risks taken by the publishing maven to carve out paths for otherwise-neglected authors who would ultimately shape 20th-century culture and change the book business forever. America’s Harlem Renaissance, hard-boiled detective genre, and fascination with Europe’s sexual freedom can all be traced back to Mrs. Alfred A. Knopf’s business gambits, which in most cases sprang directly from her personal interests, or those of her close friends.
The second version is a tale of what might have been. How differently would Mrs. Knopf’s life and career have turned out if her husband had truly made her an equal partner in their business, as he promised when they were young newlyweds? To what greater heights might the company have flown if Mr. Knopf hadn’t vetoed some of her more risqué choices? Might Blanche have eventually summoned enough independence to go her own way if the couple’s gradual estrangement hadn’t nudged her toward a diet pill habit that slowly destroyed her health and eyesight? And perhaps most regretfully: how many more women might have felt called to work in the publishing world if Alfred hadn’t relentlessly downplayed Blanche’s involvement at every turn, only begrudgingly admitting his wife’s contributions long after her death in 1966?
These questions arise several decades too late to make any difference to Mrs. Knopf, and if it wasn’t for Laura Claridge’s new biography The Lady With the Borzoi, they might never have been posed at all.
The name Albert Murray was never household familiar. Yet he was one of the truly original minds of 20th-century American letters. Murray, who died in 2013 at the age of 97, was an accomplished novelist, a kind of modern-day oral philosopher, a founder of Jazz at Lincoln Center, and the writer of a sprawling, idiosyncratic, and consistently astonishing body of literary criticism, first-rate music exposition, and cunning autobiography. In our current moment of identity politics and multicultural balkanization, the publication of any new Murray text would serve as a powerful reminder that his complex analysis of art and life remains as timely as ever—probably more so.
It’s 2016, and another management guru is revealing the secrets of the creative mind.
It’s not really a very original thing to do. The literature on encouraging corporate nonconformity is already enormous; it goes back many years, to at least 1960, when someone wrote a book called How to Be a More Creative Executive. What was once called “the creative revolution” in advertising got going at around the same time. I myself wrote a book about that subject—a history book!—nearly twenty years ago.
There have been slight variations in the creativity genre over the half-century of its ascendancy, of course. The cast of geniuses on whom it obsessively focuses has changed, for example. And while the study of creativity has always been surrounded with a quasi-scientific aura, today that science is more micro than macro, urging us to enhance our originality by studying the functioning of the human brain.
In the larger literary sense, however, it is now clear that the capitalist’s tribute to creativity and rebellion is an indestructible form. There is something about the merging of bossery and nonconformity that beguiles the American mind. The genre marches irresistibly from triumph to triumph. Books pondering the way creative minds work dominate business-best-seller lists. Airport newsstands seem to have been converted wholly to the propagation of the faith. Travel writers and speechwriters alike have seen the light and now busy themselves revealing the brain’s secrets to aspiring professionals.
Realist historical fictions, with the rustling demands of their costumes and their period-appropriate speech, often depend on painstakingly described physical veracity, sensory believability, to steep a reader in the past. While not necessarily factual, such works say: This really occurred, and now you, too, may experience it. As the literary historian Stephen Greenblatt enthused in a review of “Wolf Hall,” Hilary Mantel’s novel about the rise of Thomas Cromwell—perhaps the paradigmatic contemporary example of such fiction—great historical novels “provide a powerful hallucination of presence, the vivid sensation of lived life.”
But a handful of recent works of fiction have taken up history on radically different terms. Rather than presenting a single, definitive story—an ostensibly objective chronicle of events—these books offer a past of competing perspectives, of multiple voices. They are not so much historical as archival: instead of giving us the imagined experience of an event, they offer the ambiguous traces that such events leave behind. These fictions do not focus on fact but on fact’s record, the media by which we have any historical knowledge at all. In so doing, such books call the reader’s attention to both the problems and the pleasures of history’s linguistic remains.
The book that made this distinction clear to me is a new novel by Danielle Dutton, called “Margaret the First.” Dutton’s Margaret is Margaret Cavendish, Duchess of Newcastle-upon-Tyne, who lived from 1623 to 1673 and was one of the first British women to publish in print under her own name.
Gary Saul Morson in The New Criterion:
One hundred and fifty years ago, when Dostoevsky published Crime and Punishment, Russia was seething with reform, idealism, and hatred. Four years earlier, the “tsar-liberator” Alexander II (reigned 1855–1881) had at last abolished serfdom, a form of bondage making 90 percent of the population saleable property. New charters granted considerable autonomy to the universities as press censorship was relaxed. The court system, which even a famous Slavophile said made his hair stand on end and his skin frost over, was remodeled along Western lines. More was to come, including the beginnings of economic modernization. According to conventional wisdom, Russian history alternates between absolute stasis—“Russia should be frozen so it doesn’t rot,” one reactionary writer urged—and revolutionary change. Between Peter the Great (died 1725) and the revolutions of 1917, nothing compared with the reign of Alexander II. And yet it was the tsar-liberator, not his rigid predecessor or successor, who was assassinated by revolutionary terrorists. The decade after he ascended the throne witnessed the birth of the “intelligentsia,” a word we get from Russian, where it meant not well-educated people but a group sharing a set of radical beliefs, including atheism, materialism, revolutionism, and some form of socialism. Intelligents (members of the intelligentsia) were expected to identify not as members of a profession or social class but with each other. They expressed disdain for everyday virtues and placed their faith entirely in one or another theory. Lenin, Trotsky, and Stalin were typical intelligents, and the terrorists who killed the tsar were their predecessors.
The intelligentsia prided itself on ideas discrediting all traditional morality. Utilitarianism suggested that people do, and should do, nothing but maximize pleasure. Darwin’s Origin of Species, which took Russia by storm, seemed to reduce people to biological specimens. In 1862 the Russian neurologist Ivan Sechenov published his Reflexes of the Brain, which argued that all so-called free choice is merely “reflex movements in the strict sense of the word.” And it was common to quote the physiologist Jacob Moleschott’s remark that the mind secretes thought the way the liver secretes bile. These ideas all seemed to converge on revolutionary violence.
Abigail Tucker in Smithsonian:
Humans tend to dismiss Neanderthals as dimwits, yet the brains of our doomed cousins were actually larger than our own. “If you go to a site from 150,000 years ago,” says Miki Ben-Dor, a Tel Aviv University archaeologist, “you won’t be able to tell whether Neanderthals or Homo sapiens lived there, because they had all the same tools.” Which helps explain why, to fathom how our fates diverged, he recently scrutinized Neanderthals’ bodies instead of their skulls. While humans have barrel-shaped chests and narrow pelvises, Neanderthals had bell-shaped torsos with wide pelvises. The prevailing explanation has been that Neanderthals, often living in colder and drier environments than their human contemporaries, needed more energy and therefore more oxygen, so their torsos swelled to hold a bigger respiratory system. But Ben-Dor had a gut feeling this was wrong. What if the difference was what they ate? Living in Eurasia 300,000 to 30,000 years ago, Neanderthals settled in places like the Polar Urals and southern Siberia—not bountiful in the best of times, and certainly not during ice ages. In the heart of a tundra winter, with no fruits and veggies to be found, animal meat—made of fat and protein—was likely the only energy source.
Alas, though fat is easier to digest, it’s scarce in cold conditions, as prey animals themselves burn up their fat stores and grow lean. So Neanderthals must have eaten a great deal of protein, which is tough to metabolize and puts heavy demands on the liver and kidneys to remove toxic byproducts. In fact, we humans have a “protein ceiling” of between 35 and 50 percent of our diet; eating too much more can be dangerous. Ben-Dor thinks that Neanderthals’ bodies found a way to utilize more protein, developing enlarged livers and kidneys, and chests and pelvises that widened over the millennia to accommodate these beefed-up organs.
Thursday, May 19, 2016
Christian Lorentzen in Vulture:
I’m cursed with a mind that looks at a sentence and sees grammar before it sees meaning. It might be that I’m doing math by other means, that I overdid it with diagramming sentences as a boy, or that my grasp of English was warped by learning Latin. Translating Horace felt like solving math problems. Reading Emily Dickinson began to feel like solving math problems. You might think this is a cold way of reading, but it’s the opposite. You develop feelings. Pronoun, verb, noun — I like sentences that proceed in that way, in a forward march. Or those tricked out with a preposition, another noun, and a couple of adjectives. Conjunctions and articles leave me unfazed. If these combinations result in elaborate syntactical tangles, it thrills me. It’s cheap words I hate, and I hate adverbs.
I’m unembarrassed to admit that my taste in literary style owes a lot to my adolescent reading of The Sun Also Rises — Hemingway was no friend of adverbs. He’s not alone. “Use as few adverbs as possible” is among V. S. Naipaul’s rules for beginning writers. When William Strunk and E. B. White admonish us to omit unnecessary words, I know they’re talking about adverbs without their having to say it.
Karan Jani in The Wire:
What was your first reaction when you saw the gravitational-wave event on September 14, 2015 and the whole process which followed until the historic announcement?
I think it was just one of deep satisfaction, that a dream that Rai Weiss, Ron Drever and Joseph Weber and Vladimir Braginsky and Stan Whitcomb and others had developed and shared so many decades ago was finally reaching fruition.
In fact, nature turned out to be giving us just what I had expected – I’d expected since the early 1980s that the first thing we would see would be merging black holes because the distance you can see goes up roughly proportionally with the mass of the binary, and so the volumes are cubed, and that factor would overwhelm the absolute lower event rate for black-hole binaries compared to neutron star binaries. It seemed very likely to me, so that’s just what I thought would happen. It’s a big part of how I hoped to sell this project.
To have that come out right was pleasing, to have the strength of the waves be 10⁻²¹ – that’s a number we started targeting in 1978. So it all came to pass the way we expected it to, thanks to enormous work by your generation of experimenters. You were the ones who really pulled it off. The way I like to say it is that it’s your generation of experimenters that makes me look good!
There is something uncanny about staying in another person’s house — the stark differences and the small convergences of sameness. We all like to snoop a bit. Now, public historian Ruth Goodman gives us the chance to snoop on the lives of people who died 500 years ago. When you’re watching The Tudors or Wolf Hall, Goodman is the woman behind the scenes ensuring that the clothes look right, the home interiors are accurate, and the sumptuous feasts are as true to life as possible. In How to Be a Tudor: A Dawn-to-Dusk Guide to Tudor Life, she makes her almost preternatural knowledge about life during the 16th century available to the reading public.
You wouldn’t expect the intricacies of Tudor baking, brewing, ploughing, cooking, needlework, painting, dancing, and card-playing to hold an audience rapt, and yet Goodman makes the minutiae of everyday life half a millennium ago tremendously interesting. Indeed, her voluminous knowledge makes Goodman seem not so much a specialist on period authenticity as an actual time traveler. Ingeniously structuring the book around the hourly rhythms of daily life (with chapters going from “At Cock’s Crow” to “And so to bed”), Goodman transmits information about food, work, medicine, education, leisure, lodging, sleep, and even sexuality. How to Be a Tudor, with its grounding in physical detail and avoidance of theoretical analysis, is true to the guidebook genre, but one featuring recipes for veal meatballs (exceedingly expensive at the time) and Galenic medical advice.
Inside the monastery of S. Trinità dei Monti, which stands at the top of the Spanish Steps in Rome, is a room decorated in glorious trompe l’oeil as a ruin. Created in 1766 by Charles-Louis Clérisseau, and originally intended to be the cell of the monastery’s resident mathematician Fr Thomas Le Sueur, it imitates a decaying classical temple, with tumbled columns, a roof open to the sky, encroaching vegetation and a large parrot perched on one of the apparently surviving crossbeams. The irony of the design worked on several levels. It allowed the famous scholar to enjoy the pleasure of ruins without the discomfort. But it was also a wry comment on the life cycle of buildings. Ruins are one stage on their inevitable journey to destruction. As we know from some of the most ambitious modern attempts at conservation on archaeological sites all over the world, from Pompeii to Machu Picchu, collapse can be delayed – but not prevented. Here Clérisseau offered dilapidation frozen in time, a ruin built to last.
That life cycle of buildings, from conception to death, with an occasional lucky, or unlucky, resurrection, is the theme of James Crawford’s Fallen Glory – twenty chapters telling the biography of twenty structures, from across the world, ancient and modern, real and imaginary (the first chapter is on the Tower of Babel, the last on the virtual world of the web hosting service GeoCities). Some of these life stories work better than others. The Roman Forum, the subject of Chapter Six, needs so much background that we tend to lose sight of the main character as it rises out of the marshes, becomes the monumental centre of the empire, and slips back into pasture, only to be revived again in the service of Mussolini’s grandiose ambitions.
Hiat was born in Kletsk, a town south of Minsk, in Belarus. As a child, he began to doubt the possibility of God. “I’ve seen children die, small children, and the doubt of a merciful God really drove me” away from religious belief, he said to Roth during the first interview session, describing the crucible of his political consciousness and suggesting the rigor of his autodidactic mind. But at the same time, at the cheder in Kletsk, Hiat was introduced to the Jewish teaching that opened him intellectually to a “revolutionary instinctive upbringing.” “Socialism,” he said, “is part of philosophical Judaism.” There is, he explained to Roth, who never received, or pursued, a full Jewish education, “a certain Hebrew word, ein kemach, ein Torah: If you have no bread, you have no Torah.”
Bernie Sanders, who perhaps embodies this connection as thoroughly as any American public figure in history, rarely draws that line. In a speech last year to the students of the Evangelical Christian Liberty University, he quoted the Book of Matthew, not Torah or Talmud, in citing a religious influence in his political ideology. (Hillary Clinton, for her part, draws a connection between the Christianity she experienced growing up and her instinct to volunteer in poor neighborhoods of Chicago.) Sanders sometimes directs the question of how his Jewish self-identity inspired his political beliefs to the specter of the Holocaust, from which his father escaped but many of his relatives in Poland did not; more often, he simply identifies his parents as “Polish.”