Monday, May 02, 2016
by Paul North
In these monthly posts I will survey the landscape of "fateful thinking," as we glimpse it on the moons orbiting old Europe today. The premise will be that in politics, culture, academia, medicine, economics, and private life, among other regions of experience, we—those in charge and those charged up and those under the thumb of others in this orbit—tend to express ourselves, on the most important matters, in fateful terms. "It has to be like this or that." Whether we are correct or not when we say "it is" and mean "it must be," "it has always been," we regularly call on such statements to support our most critical decisions. Let us assume provisionally that, despite so much hurried change, with all our freedom of imagination and all our progress, we still tend to base our decisions on what must be the case, what could not be otherwise, what comes out of a finished past or certain future and determines the core of our being. In our times these sound like old-fashioned, even ancient sentiments. For the purposes of this survey, I shall assume that "fateful thinking" is as at home in the new as it was in the old. Fate ideas operate equally in science and religion, although "fate" certainly takes distinct forms in each. What remains then is to describe and analyze those forms, the current genres of fate, in hopes of discovering by chance a way of living in which the idea of life has not already been settled in advance.
Current Genres of Fate 1: Kafka's Innocents
When did the idea of fate arise, the one in which every tiny detail of life, every twist in life's way, is a sign that says: "no way out"? Classical labyrinths have exits, though they are hard to find. When did the intuition of a labyrinth whose doors open back into itself take over the imagination? When did we enter into zones of experience in which the exit brings us back to square one? Some think it was the work of the Protestant Reformation. Iris Murdoch attributed it to the rise of science: "The idea of life as self-enclosed and purposeless is of course not simply a product of the despair of our own age. It is the natural product of the advance of science and has developed over a long period." Whatever its origins, when certainty about the destiny of any single human life is taken away, every tiny event becomes a possible portal to destiny. When fate toppled from its throne at the end of history, fateful thinking seeped back into everyday life, filling its crevices. Institutions like law and bureaucracy grew exponentially alongside the rise of science, and this only intensified the seepage of fate into the crevices of life. Institutional protocols took on the offices of destiny and made destiny into a matter of finding the right office.
Life's suffusion with fate had a peculiar consequence: we became innocent again. It is a new Eden, except that, under this version of fate, whereas in the Garden we could do nothing wrong because there was not yet any wrong in the world, now we can do nothing wrong because our actions are so severely limited by the strictures that surround us. We can do nothing really wrong because we really can do so little. Kafka wrote about this constricting context and its new innocence.
by Michael Liss
If you love classical music, there is a place in your imagination that takes you back 192 years, to May 7, 1824, and puts you at one of the most extraordinary moments in musical history—the first public performance of Beethoven's Ninth Symphony.
You want to be there. You want to see Beethoven himself raise his hands for the first downbeat, that odd whoosh that then unfolds almost like an orchestra tuning up. You want to hear those crisp, slashing sounds as it moves through the second movement, and the swirling cloud of notes, floating above you, that is the third. But the payoff comes in the fourth, when Beethoven surpasses himself, first trying, and then rejecting, the themes of the first three to resolve by joining instrumental beauty to vocal, fused in the pure elation of Schiller's "Ode to Joy."
If you were there, you would do as every other person in attendance did—leap to your feet and roar your approval. And you would be witness to the most dramatic, even shattering, moment in music history—when one of the soloists, Caroline Unger, gently turns the ailing, unhearing Beethoven to receive their adoration.
As the historian (and musicologist) Edmund Morris recounts in "Beethoven, the Universal Composer," if there was any silence in the house, it could only have come from the Imperial Box, which was empty. Beethoven, a man underwritten for decades by the aristocratic and wealthy, had begun to edge away from them, and they from him. The Ninth is not only revolutionary in its form, it is perhaps the first large-scale truly democratic work. With one 74-minute effort, Beethoven created an entirely new vocabulary, one that not only spoke of a stateless universal brotherhood but, in form and delivery, freed the individual to participate to the extent of his abilities.
To put Beethoven in better context, it's useful to place him, and two of the great composers before him, Bach and Mozart, in "political-musical" time, or perhaps more accurately, "political-musical-economic" time.
by Ryan Ruby
"Ordinary men commonly condemn what is beyond them." —François de La Rochefoucauld, Maxims
For the American reader Dan Fox is an ideal guide to the murky space where class overlaps with taste. His position in the art world—he is a co-editor of the renowned contemporary art magazine frieze—has furnished him with ringside seats to some of the "nastiest brawls over pretentiousness." Moreover, he is British. The class education the English receive as a matter of their cultural heritage enables them to view the matter more clearly than their American counterparts, whose understanding of class has been systematically retarded by taboo, ideology, and denialism, resulting in a deeply classed society that has no idea how to talk about this aspect of itself.
Class is not "just a question of money and how you spend it," Fox helpfully reminds us in his book-length essay Pretentiousness: Why It Matters (Coffee House Press, 2016). It's also "about how your identity is constructed in relationship to the world around you." When we divide classes solely on the basis of wealth—into upper, middle, and lower—as we tend to do in America, it becomes easy to forget that the division is not only arbitrary, but also a gross simplification. In fact, the more generally we talk about class, the more easily we fall into confusion. The so-called upper, middle, and lower classes are by no means unified groups whose members view themselves as bound by the same interests. Every member of the "upper class," for example, may be considered an elite, but this elite group is composed of a number of class segments, whose members may in turn be ranked on the basis of their access to various kinds of capital (financial, educational, social, cultural, geographical, symbolic, etc.), whose relative importance is in a permanent state of flux.
by Evan Edwards
In September of 1851, a word enters the journal of Henry David Thoreau: perambulation. Since Thoreau was an inveterate enthusiast of walking as well as a voracious collector of words, the sudden introduction of this peculiar peripatetic term, an antiquated relative of the more familiar ‘ambulation’ or ‘to amble’ (from the Latin ambulare, ‘to walk’), should stand out to us as readers. He writes that "[o]n Monday, the 15th, I am going to perambulate the bounds of the town," and later, "Sept. 17. Perambulated the Lincoln Line," and "Sept. 18. Perambulated Bedford line." This word crosses Thoreau's mind more and more steadily for the better part of a month until, in October, he gives up ‘perambulating’ and instead uses a near synonym, ‘surveying’ (which, like perambulating, has to do, at least on the surface, with the work he was doing at the time), to describe his activities. He then rarely returns to ‘perambulation’ for the rest of his life. At least in word.
Instead, in October, he begins to speak exclusively of ‘surveying,’ ‘walking,’ or, elsewhere, ‘skating to,’ and then, as he enters the late 1850s, in the last half-decade of his life, he all but ceases to open journal entries with a description of his own activity, perambulation or otherwise, referring instead to the conditions of the environment and then, occasionally, drifting into descriptions of his own mind and body. Although the term does not seem to return, its brief presence tells us worlds about Thoreau's philosophical position.
In order to understand the significance of the brief intrusion of this term, we should keep two things in mind: first, the time at which he was writing these entries; and second, the difference between ambulating and perambulating. Attending to these two points should help us not only understand Thoreau, but also something about our own relationship to nature.
Sughra Raza. Black Ducks, Winter 2016.
by Dave Maier
Math is pretty easy when you’re just starting out. You’re just adding and subtracting and multiplying and dividing. They might even let you use calculators, but even if they don’t, you’re just dealing with whole numbers, the kind you use when you’re counting on your fingers. (Sometimes they spring some newfangled versions of the multiplication algorithm on you, but it’s still just multiplication.)
Some students first run into trouble when they get to fractions, usually in sixth grade or so. Now we are writing the same number in rather different ways (1/2 = 2/4 = 0.5, and so on), and we can’t really count on our fingers either. All of a sudden there are a whole bunch of numbers between 2 and 3. In fact, as it turns out, there are an infinite number of such numbers. Infinity was okay when it was the biggest number of all, all the way on the end (or ends) of the number line and thus safely out of the way, but now we’re using it to count things, and those things are themselves not only the things we count with, but the numbers between what we seem now to be calling the “counting” numbers. (It even turns out – although they don’t make a big deal of this in sixth grade, thank goodness – that there are more numbers between 2 and 3 than there are “counting” numbers on the whole number line, even though both numbers are infinite. Yikes!)
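For the curious, the claims in that paragraph can be checked mechanically. Here is a small illustrative sketch of my own (not part of the essay) using Python’s standard-library fractions module: it confirms that 1/2, 2/4, and 0.5 are the same number written three ways, and shows why the numbers between 2 and 3 never run out.

```python
from fractions import Fraction

# "The same number in rather different ways": 1/2 = 2/4 = 0.5
half = Fraction(1, 2)
assert half == Fraction(2, 4)    # fractions reduce to lowest terms automatically
assert half == Fraction("0.5")   # the decimal string parses to the same value
assert float(half) == 0.5

# "A whole bunch of numbers between 2 and 3": between any two distinct
# fractions there is always another one (their midpoint), so this halving
# process could continue forever without repeating.
lo, hi = Fraction(2), Fraction(3)
between = []
for _ in range(5):
    mid = (lo + hi) / 2
    between.append(mid)
    hi = mid  # zoom in toward 2; a fresh fraction appears every time

print(between)
# [Fraction(5, 2), Fraction(9, 4), Fraction(17, 8), Fraction(33, 16), Fraction(65, 32)]
```

The halving loop only ever produces fractions, which are countable; Cantor’s diagonal argument, alluded to in the parenthetical above, concerns the full set of real numbers between 2 and 3, which is strictly larger.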
Again, though, in arithmetic at least we’re just talking about numbers. Every problem has a single right answer, even if we now get to write that answer in different ways. But then, all of a sudden, straight up ahead: algebra.
by Paul Braterman
Toilet etiquette is where prudery meets absurdity. Your chance of being embarrassed, let alone molested, by a transgender person in a US public toilet is probably zero, and certainly less than your chance of being shot dead at home by a toddler playing with a gun; after all, the only public display of genitalia is at the men's urinal, and you can always use a booth if you prefer.
It is said that an undergrad once asked Sir John Pentland Mahaffy, Provost of Trinity College Dublin, where he might find a lavatory. "At the end of the corridor," Mahaffy grandly gestured, "you will find a door marked GENTLEMEN; but don't let that stop you." In the UK, of which Dublin was still part at the time, class trumps gender. Incidentally, Trinity had been admitting female undergraduates since 1903, 74 years before Harvard; I assume that sanitary arrangements were instituted to cope with this.
It is established law in the US that the teaching of creationism serves a religious, rather than scientific or educational, purpose. It follows (Edwards v. Aguillard) that such teaching is unconstitutional in US public schools, since it violates the First Amendment separation of Church and State. There is no prospect of this ruling being overturned, unless we end up with a Supreme Court nominated by President Ted Cruz.
It has also been repeatedly established that display of the Ten Commandments on Government property violates the US Constitution, for much the same reasons.
So why do we have States bringing in scientifically baseless transgender bathroom laws (as discussed here by my friend Faye Flam), whose only effect would be to inconvenience and offend one particular small minority? Why has this monumental non-issue even spilled over into the moronic drivelfest that is now the Republican Party's nomination debate?
Prince, Bowie, and Glenn Frey: 21st Century Public Mourning as a Rejection of Cold War Culture, or, Why Nobody Really Gives a Shit About that Guy from the Eagles
by Akim Reinhardt
David Bowie was a white Englishman. Prince was a black American. Bowie was deeply rooted in the riffs, major/minor chords, and melody of rock-n-roll. Prince was grounded in the syncopated rhythms and arrangements of funk and R&B.
Prince's and Bowie's careers did overlap to a degree. Their biggest selling albums, Bowie's Let's Dance and Prince's Purple Rain, were released within a year of each other. But of course Let's Dance was Bowie's capstone in many ways, his big pop breakthrough after nearly 15 years of churning out music, whereas Purple Rain came fairly early in Prince's career, establishing him as an international pop icon for decades to come. So despite the kissin' cousin chronology of their biggest albums, the respective heydays of David Bowie and Prince were, in many ways, separated by about a decade. That makes sense since Prince was ten years younger than Bowie.
Despite all these differences, however, their deaths, coming three months apart from each other, produced similar strains of public mourning. In particular, many people confessed how one or the other artist had profoundly affected them during their formative years. And this heartfelt influence, many said, came not just from Bowie's and Prince's music, but especially from their artistic personae.
In between Bowie's and Prince's passing came the death of Glenn Frey, one of the two lead singer/songwriters of the Eagles, one of the most successful bands in the history of recorded music.
I have yet to see anyone write an essay, post a facebook comment, tweet, or make any other public expression of their deep gratitude for the vital role Glenn Frey played in helping them cope during their formative years.
Why? I suspect the answer is the Cold War.
by Brooks Riley
by Leanne Ogasawara
Last summer, marooned with a large group of astronomers in a remote 11th-century abbey in the Tuscan countryside, I found myself growing increasingly antsy. Hatching a plan to break out, I dragged my astronomer off on what should have been one of the great pilgrimages of our lifetime--for as luck would have it, just down the road lay what Aldous Huxley considered to be the greatest picture in the world.
I am referring to one of the paintings on the famous Piero della Francesca trail. To see those masterpieces in situ is astonishing, and I consider the Piero Pilgrimage to be one of the great art historical experiences in the world.
Like all pilgrimages, however, this one was not without its mishaps.... Flushing my phone accidentally down the toilet after seeing the astonishingly beautiful and transportive fresco cycles in Arezzo was bad enough; but then to finally arrive at the climax of the pilgrimage where Aldous' "best picture on earth" stood, only to find it unavailable for viewing (and not just that but veiled in such a way as to tantalize us about what glorious beauty we were missing)-- was close to unbearable.
Our biggest blunder, however, came when we willfully decided to skip driving an extra half hour to go see the Madonna del Parto. Yes, I want to kick myself! Located in Monterchi, the Madonna del Parto is an extremely rare (perhaps the only?) treatment in Christian art of the Virgin pregnant. "Del Parto" can mean labor or childbirth--and in the picture, Piero depicts a very pregnant Mary.
by Matt McKenna
Zootopia’s target audience may be a tad younger than Bernie Sanders’ target audience, but youthful Sanders supporters should nonetheless consider watching the film in order to see a dark vision of their potential future. Like many animated Disney films, Zootopia includes talking animals working together to solve a problem. Also as in many animated Disney films, the audience is bludgeoned with allusions comparing the cartoon animals’ society to our own (in Zootopia, institutions are specist like real world institutions are racist). There’s nothing wrong with talking animals or ham-fisted moralizing--after all, the film is for kids. What differentiates this Disney film from previous Disney films is that a young voter--pro-Sanders or not--may well see their dreary, hopeless future in Officer Judy Hopps’ transition from plucky bunny to establishment stooge.
The hero of Zootopia is Judy Hopps who, like young voters in reality, starts out as an ardent advocate for the downtrodden. Though she is but a humble rabbit, the child of carrot farmers, Judy dreams of becoming a police officer in the big city of Zootopia, which is an interesting choice for the name of a city built by animals since (at least for me) the name conjures up images of caged creatures on display for human amusement. Anyway, young and full of hope, Hopps enrolls in the police academy, lands a job as the city’s first rabbit cop, and quickly thereafter becomes disillusioned by her role in the force. You can probably guess the challenges she faces: the chief is a jerk, the sleazy Mayor Lionheart (he’s a lion) cares about his image and not about the city’s crime wave, and the people Officer Hopps attempts to protect eventually take advantage of her naïveté. At the film’s emotional nadir, Hopps falls into a depression and heads home to farm carrots with her parents. It's the classic tale of a kid rebelling at twenty only to go mainstream at thirty. Admittedly, Hopps speeds through this transition much faster than a decade, but that shortened time period may be narratively justified by converting the film’s timeline into rabbit-years or something.
Duke Ellington was one of the great composers and bandleaders of the last century, and his band was one of the great bands. Touring, however, is unforgiving. Long hours sitting in a bus, meals if and when you can grab them, and gigs every night. And when you’ve played the same tunes with the same cats for decades, well, it can be rough to get up for a gig. Fact is there were times when Ellington’s musicians looked like they were asleep on the stage.
That’s how they appeared the one time I saw Ellington live. It was at one of those sessions held by the Left Bank Jazz Society in Baltimore’s Famous Ballroom on Sunday afternoons. This was probably in 1970, ’71, or ’72, long after Ellington’s prime years in the second quarter of the century. The Famous Ballroom was on North Charles Street, not too far from the train station, and up three flights of fairly wide stairs. It too was past its prime years, but the patrons of the Left Bank, they were always primed for good music. Some were dressed to the nines in their church Sunday best, the men in sharp suits, the women in elaborate hats; and some were dressed casually in jeans and sneakers.
That’s generally how it was, but I only specifically remember three things from that concert. Ellington dressed well and had a line of patter smooth as silk and brittle as glass. He’d been doing this a long time. That’s one. The guys slumped in their chairs like they’d just gotten off an all-night flight from Timbuktu. Perhaps they had. That’s two.
And three: Paul Gonsalves burned the place down with his tenor sax. I forget what the number was. All I remember is that Gonsalves strode out on stage to play a solo, but he didn’t position himself in front of the microphone. He stood to one side. A helpful member of the audience moved the mike directly in front of him as he started to blow. He stopped playing for a second, grabbed the mike angrily and shoved it aside. Not for him the brittle reverberations of amplified sound. Then he started blowing again. The pure juice of the natural human essence flowed from his sax to embrace us in its majesty and urgency.
Sunday, May 01, 2016
Patrick Sauer in Signature:
In his works, Vonnegut’s fondness for the Bard can be traced from Kurt Sr.’s 1949 woodworking through a 2005 essay in A Man Without A Country, the last work published in his lifetime. In that piece, Vonnegut compares Hamlet to Cinderella and Kafka’s cockroach, expounds on how apparitions are not to be trusted, compares Polonius to Rush Limbaugh, and commends Shakespeare for doing what so few people do: Telling the truth, admitting we know so little about life. It’s a theme mirrored throughout Vonnegut’s career, even if the Bard’s technique didn’t require as many authorial surrogates. Tomato, Tomahto, so it goes…
The paths of Shakespeare and Vonnegut crossed multiple times, once through dimensions only known by the Tralfamadorians. In God Bless You, Dr. Kevorkian, which originated as a series of 90-second public radio pieces, Vonnegut interviews people about the afterlife. The “tongue-tied, humiliated, self-loathing semi-literate Hoosier hack” is fisked by a feisty William Shakespeare who starts out by mocking Vonnegut’s dialect, calling it the “ugliest English he had ever heard, ‘fit to split the ears of groundlings.’” The Bard is salty throughout, responding to Kurt’s congratulations on all the Oscars Shakespeare in Love won by retorting the movie is “a tale told by an idiot, full of sound and fury, signifying nothing.” Both novelists and playwrights love a good callback.
David P. Barash at the History News Network:
For now, I want to focus on why monogamy has become so popular, at least in the modern Western world, and at least in theory, if not always in practice. Although monogamy is exceedingly rare in the animal world, it is found in a few cases, and nearly always, the payoff seems to be associated with the adaptive benefit of biparental childcare, something that Homo sapiens finds especially beneficial, given that we are unusual in that our offspring are profoundly helpless at birth, remaining needy for an extraordinarily long time. Nor is it absolutely necessary that the cooperating adults be man and woman; we know from abundant sociological data that two women or two men can do an excellent job, and that when it comes to child rearing, two – of any sex – are better than one. But we also know that prior to the cultural homogenization that followed European colonialism, more than 83% of human societies were preferentially polygamous, and that polygamy was also prominent in the ancient Near East from which that presumed Western move to monogamy originated.
So my question for now is: why did such a large segment of human society switch from polygamy to monogamy? And my first answer is: at present, we don’t know. My second answer is a guess, which goes as follows. (I propose it simply as a hypothesis, in the hope that readers will not only find it interesting but also useful in generating informed discussion and, if possible, meaningful research.)
Imagine a polygynous society with an average harem size of, say, ten. This means that for every male harem-keeper, there are nine unsuccessful, sexually and reproductively frustrated, resentful bachelors. The simple reality is that polygyny is disadvantageous not only for women – for complex reasons – but even more so for men, since with a 50/50 sex ratio, men go unmated in direct proportion to the degree of polygyny. This, btw, runs counter to the lascivious imaginings of many men, who, when I describe the evidence for primitive human polygyny, often express regret that they weren’t alive in those days, imagining that they would be a happy harem-holder.
Francine Prose in The Guardian:
Late in Alain de Botton’s engaging novel, a married couple, Rabih and Kirsten, find that the demands and stresses of ordinary life – work, domestic chores, financial worries, the harrowing expenditure of energy required to raise their two adored children – have made them irritable and contentious. In part, the narrator concludes, they are at odds “because they have so seldom seen their struggles sympathetically reflected in the art they know … Were Rabih and Kirsten able to read about themselves as characters in a novel, they might ... experience a brief but helpful burst of pity at their not at all unworthy plight, and thereby perhaps learn to dissolve some of the tension that arises on those evenings when, once the children are in bed, the apparently demoralising and yet in truth deeply grand and significant topic of ironing comes up.”
Presumably, the novel that Rabih and Kirsten need to read is the one De Botton has written: a sympathetic account of the relationship that begins only after the besotted courtship has ended. Having fallen deeply in love, the couple “will marry, they will suffer, they will frequently worry about money, they will have a girl first, then a boy, one of them will have an affair, there will be passages of boredom, they’ll sometimes want to murder one another and on a few occasions to kill themselves. This will be the real love story.”
Rabih and Kirsten are well-drawn, individualised characters, with distinct and separate backgrounds (he’s half-Lebanese, half-German; she’s Scottish), careers (he’s an architect working in an urban design studio; she’s a surveyor employed by Edinburgh City Council) and personalities (she’s confident and feisty; he’s dreamy and insecure). But what’s interesting is De Botton’s decision to make their experience so thoroughly ordinary that their lives seem emblematic, their stories interchangeable with those of countless couples.
Jefferson Morley in The Intercept:
Last summer I paid a visit to Georgetown University’s Lauinger Library as part of my research on legendary CIA counterspy James Jesus Angleton. I went there to investigate Angleton’s famous mole hunt, one of the least flattering episodes of his eventful career. By the early 1960s, Angleton was convinced the KGB had managed to insert a penetration agent high in the ranks of the CIA.
In researching and writing a biography of Angleton, I constantly confront a conundrum: Was the man utterly brilliant? Or completely nuts?
Angleton is one of America’s archetypal spies. He was the model for Harlot in Harlot’s Ghost, Norman Mailer’s epic of the CIA, a brooding Cold War spirit hovering over a story of corrupted idealism. In Robert De Niro’s cinematic telling of the tale, The Good Shepherd, the Angletonian character was a promising product of the system who loses his way in the moral labyrinth of secret intelligence operations.
In real life, Jim Angleton was a formidable intellectual and canny bureaucrat who helped shape the ethos of the Central Intelligence Agency we have today.
Jeff Guo in the Washington Post:
Scholars have long puzzled over the different fates of the world’s peoples. Why, on the eve of the modern world, were some societies so technologically and politically complex? For centuries, leading intellectuals from Adam Smith to Karl Marx believed that agricultural abundance had propelled the rise of advanced civilizations. The Assyrians and Babylonians of ancient Mesopotamia, for instance, flourished thanks to their fertile farms, which fed an upper class that devoted itself to religion and empire.
In his 1997 bestseller “Guns, Germs and Steel,” historian Jared Diamond argued that the availability of nutritious and easily domesticated plants and animals gave some societies a head start. In the Middle East there was barley and wheat; in Asia there was millet and rice. “People around the world who had access to the most productive crops became the most productive farmers,” Diamond later said on his PBS show. And more productivity led to more advanced civilizations.
But the staple crops associated with less-advanced peoples — like manioc, the white potato, the sweet potato and taro — weren’t necessarily less productive. In fact, manioc and the potato are superstar crops, less demanding of the soil and less thirsty for water. These plants still feed billions of people today.
Kathleen Downes in Women's Media Center:
When I was a little girl, I held hands with my friends. It was a sign of companionship and togetherness, one that wordlessly affirmed the strong force that is female friendship. As I grew from a girl into a woman, I started to get a lot of cultural messages, implicit and explicit, that holding hands was no longer acceptable between friends because it was now assumed to be romantic, reserved for those who are “more than friends.” Suddenly, this way to be close to those I love was sexualized. Hand holding between any two people is beautiful when used as a romantic gesture. But it grieves me, as it should grieve us all, that our culture is so hypersexualized that just about anything we do stands the possibility of being perceived as sexual. This is especially true for women. A simple gesture that in my childhood served as a means of human connection is now treated as sexual, and all its other meanings—like unity, strength, and togetherness—seem to fade away in the eyes of the world.
Not wanting my friends or those around us to misinterpret a gesture of friendship as something more, I stopped holding their hands. I more or less stopped connecting with my friends through touch altogether after childhood because I didn’t want to “give the wrong idea.” When we lose social permission to hold hands as an expression of sisterhood, all women lose something. As a disabled woman, I have felt this loss uniquely and profoundly. I was born with cerebral palsy, and I spend most of my time in a power wheelchair. I view my wheelchair as a tool of freedom, as natural to me as a leg or an arm. I do not resent my wheelchair or see it as confining. Any metaphors likening my chair to a metal prison will be swiftly rejected. However, it cannot be denied that being seated on an electronic throne of metal, plastic, and overpriced foam affects my relationship with physical touch. I live in a world that does not even know how to look at me, much less touch me.
Patrick Hennessey in The Telegraph:
You wouldn’t think that Rudyard Kipling would be particularly esteemed in modern India. He is now notorious, rather than celebrated, as the “Bard of Empire,” and you might imagine that, if Kipling were remembered in India at all, it would be with understandable awkwardness at best and, at worst, disdain. Memories are long in the Punjab, and few have forgiven Kipling for his public support of General Dyer, the Butcher of Amritsar.

Yet as I followed in the young writer’s footsteps through modern Pakistan and India to make the documentary Kipling’s Indian Adventure, from Lahore across the hot Punjabi plain and up into the fresh foothills of the Himalayas to the Raj’s summer capital of Shimla, I discovered not only that Kipling was well known, but that many of his works are well regarded and even taught in schools — more so, I dare say, than in Britain. And no matter where I went or to whom I spoke, one particular set of stories was loved above all others: the Jungle Books. Of course, their modern reach owes a lot to the 1967 Disney animation, the very last production overseen by Walt Disney himself.

On the Mall in Shimla, in the shadow of the arch-Gothic Gaiety Theatre — surely the symbolic apotheosis of the Raj — I discussed Kipling’s legacy in India with a group of young students. As soon as the Jungle Books were mentioned, someone started humming The Bare Necessities.
It was probably for the best that Kipling did not live to see the liberties Disney took with his work. But despite straying far from the original texts, the film gleefully and stubbornly kept one of Kipling’s finest creations in the hearts of successive generations — for that alone it should be applauded. No matter what one thinks of Kipling’s politics, The Jungle Book (1894) and The Second Jungle Book (1895) represent one of the great pieces of imaginative writing in English, allegorical tales as timeless as Aesop’s fables and flights of masterfully realized fancy on a par with Lewis Carroll’s Alice stories and Kenneth Grahame’s The Wind in the Willows (1908).
Saturday, April 30, 2016
Steven Nadler in Aeon:
In July 1656, the 23-year-old Bento de Spinoza was excommunicated from the Portuguese-Jewish congregation of Amsterdam. It was the harshest punishment of herem (ban) ever issued by that community. The extant document, a lengthy and vitriolic diatribe, refers to the young man’s ‘abominable heresies’ and ‘monstrous deeds’. The leaders of the community, having consulted with the rabbis and using Spinoza’s Hebrew name, proclaim that they hereby ‘expel, excommunicate, curse, and damn Baruch de Spinoza’. He is to be ‘cast out from all the tribes of Israel’ and his name is to be ‘blotted out from under heaven’.
Over the centuries, there have been periodic calls for the herem against Spinoza to be lifted. Even David Ben-Gurion, when he was prime minister of Israel, issued a public plea for ‘amending the injustice’ done to Spinoza by the Amsterdam Portuguese community. It was not until early 2012, however, that the Amsterdam congregation, at the insistence of one of its members, formally took up the question of whether it was time to rehabilitate Spinoza and welcome him back into the congregation that had expelled him with such prejudice. There was, though, one thing that they needed to know: should we still regard Spinoza as a heretic?
Unfortunately, the herem document fails to mention specifically what Spinoza’s offences were – at the time he had not yet written anything – and so there is a mystery surrounding this seminal event in the future philosopher’s life. And yet, for anyone who is familiar with Spinoza’s mature philosophical ideas, which he began putting in writing a few years after the excommunication, there really is no such mystery. By the standards of early modern rabbinic Judaism – and especially among the Sephardic Jews of Amsterdam, many of whom were descendants of converso refugees from the Iberian Inquisitions and who were still struggling to build a proper Jewish community on the banks of the Amstel River – Spinoza was a heretic, and a dangerous one at that.
The American public sphere is blessed with many religious experts. In the midst of the Syrian refugee crisis, pundits reminded us that Christianity enjoins the welcoming of refugees. Many of the same people, it turns out, are also deeply familiar with Islam, allowing them to piously intone that it is a “religion of peace.” These claims often come from people who are not themselves affiliated with those faiths or any other: they are political interventions masquerading, sometimes insultingly, as exegesis. They serve an important function, however, as a form of wish fulfillment. If these pat, nervous descriptions of long and complex religious traditions were true, the age-old problem of religion in the public square would vanish in a puff of banalities. Peace and refugee assistance are perfectly good secular, progressive goals, and it would be convenient if Christianity and Islam, which long antedate secular progressivism, happened to enjoin the same things. Alas, the world is not so simple. But what, then, are we to do? What should we expect from religion in a secular society?
The conservative position on religiosity has the virtue of coherence: America, from this perspective, is a Christian nation. Even if other religions should be tolerated in the name of Christian charity, they should cede pride of place to America’s exceptional Christian heritage. Progressives have a much more difficult time, and we ricochet between contradictory and unsustainable positions. On the one hand, religion is transparently absurd; on the other, the triumphant atheism of Richard Dawkins is embarrassing, too. When someone such as Kim Davis forces us to confront difficult issues of law and faith, we often have recourse to uncomfortable mockery, unsure why it is wrong to disobey political authority in the name of individual conscience. The old Marxist account of religion as an “opiate of the people” survives, too, in the conventional wisdom that evangelical voters cling to guns and religion because they are distracted from their true economic interests. These attempts to sidestep the question of religion’s role are dangerous but understandable. The great philosopher Richard Rorty once sighed that religion was a conversation-stopper: if someone claims to be acting for religious reasons, what is there to say? If he were alive today, he would know that when we cease talking about religion, we start shouting about it.