Friday, January 03, 2014
The poet Kobayashi Issa suffered greatly in his life — suffered as we all, in time, suffer — and like us, Issa’s suffering informed his opinions about New Year’s. Beginning with his mother, who died when he was three, Issa’s loved ones seemed always to die — his grandmother, his children, his wife. No one Issa loved was immune. Issa’s body of work is a chronicle of loneliness and loss. It is not easy to laugh when everything in life goes wrong. Which is what makes Issa’s reputation as a funny poet even more significant. For example:
the moonflowers …
Issa was also a Buddhist and so had a Buddhist perspective on New Year’s. Meaning, he was inclined to view the big through the lens of the microscopic. (Issa wrote no less than 200 poems about frogs, around 230 on the firefly, over 150 about mosquitoes, 90 on flies, and over 100 on fleas, not to mention his gentle meditations on excrement and flatulence.) Issa began his autobiographical work The Spring of My Life with a New Year’s story. It went like this. Long ago, in Fuko Temple, there was a devout priest who was determined to celebrate New Year’s to the fullest. So on New Year’s Eve he wrote a letter to himself and asked a novice to deliver the letter back to himself — the priest — in the morning. On New Year’s Day, the novice entered the priest’s room and handed him the letter. The priest quickly opened the letter and read aloud. “Give up the world of suffering! Come to the Pure Land. I will meet you along the way with a host of bodhisattvas!” And then the priest began weeping so hard the tears soaked his sleeves.
Gandhi's years in South Africa were the making of him and could be said to mark the beginning of the unmaking of the British Empire. Ramachandra Guha's fine new book examining this time is the first of two projected volumes that will eventually cover the whole of the Mahatma's life. Usually this period is treated by biographers simply, and relatively sketchily, as a prelude to the much more important events that happened after his return to India in 1914. Guha thinks this does 'injustice to both man and place'. If Gandhi had been killed then, rather than in 1948, he would still have left a huge mark. Gandhi himself, when informed of an assassination plot against him in Johannesburg in March 1914, told a nephew that, if it succeeded, it would 'be welcome and a fit end to my work'.
We forget how celebrated he already was, in India and Britain as well as in South Africa, before the great events of the 1920s and 1930s. He was in South Africa for twenty years, after all. That was time enough not only for some great achievements on behalf of 'Asiatics' living and working there, but also for him to hone his broader ideas and strategies into the distinctive forms we are familiar with from his later career. Guha ends this book with some speculation ('counter-factual history') on what difference it might have made if he had never gone abroad. (If his father had still been alive, he probably wouldn't have done.) There can be no doubt that the experience was crucial - to him and, consequently, the world.
The Emperor Augustus had a long life, dying at the age of seventy-five, in 14 AD – appropriately enough in August, the month that had already been renamed in his honour by the grateful (or sycophantic) senate. This 2,000th anniversary is now being commemorated in Rome by another exhibition just a few hundred yards away from the Palazzo delle Esposizioni: simply titled Augusto, it is on show in the Scuderie del Quirinale, the wonderful exhibition space created some fifteen years ago out of an eighteenth-century stable block on the Quirinal hill, from where in March it moves to the Grand Palais in Paris. Seventy-five years on, the curators of this new, and much smaller, show have clearly been concerned to distance it from its Fascist forebear. Indeed, in the first essay in the excellent catalogue that accompanies the exhibition, Andrea Giardina takes great care to treat Mussolini’s Augustus with shrewd, analytic dispassion – and so to consign that Mostra to “history”.
The two shows are certainly very different. Where the Mostra Augustea celebrated the art of reconstruction, Augusto celebrates the artistic originality of the Augustan age (31 BC–14 AD), particularly in sculpture; though painting is discussed in the catalogue, there is none on display (no Livia’s Garden Room, for example, nor the exquisite decoration from the Villa della Farnesina). Even so, the show has managed to gather together in one place more of the most significant, original works of art of Augustus’ reign than have ever been assembled before – even in the ancient world itself. This makes it possible, for the first time, directly to compare portraits of the Emperor scattered across Europe (from London to Corinth, Ancona to Athens); and, no less important, to put side by side statues usually housed a short but inconvenient journey apart, the other side of the city of Rome.
Lucy Hughes-Hallett in The Telegraph:
Charlotte Brontë wrote not one but two masterpieces. Most readers know Jane Eyre. Even non-readers feel they know it, because they have seen a film version, or just because it is a part of our common culture. But Villette, Brontë’s last and – to my mind – greatest novel, is less popular, perhaps because it is so uncompromising and so original. It is high time it was recognised as the blazing work it is. Reading it you enter an area of experience – of passion and disappointment and the violent return of the repressed – that has seldom been so lucidly articulated. It is also an astonishing piece of writing, a book in which phantasmagorical set pieces alternate with passages of minute psychological exploration, and in which Brontë’s marvellously flexible prose veers between sardonic wit and stream-of-consciousness, in which the syntax bends and flows and threatens to dissolve completely in the heat of madness, drug-induced hallucination and desperate desire.
...As Virginia Woolf wrote, Brontë was one of those writers whose “overpowering personality” means “they have only to open the door to make themselves felt. There is in them some untamed ferocity perpetually at war with the accepted order of things.” The Brontë myth would have us see Charlotte and her sisters as spinsters, timidly hiding behind male pseudonyms. Anyone who has read Villette knows that Woolf comes much closer to the truth. It is a fierce book, and an irresistibly compelling one.
Christof Koch in Scientific American:
For every inside there is an outside, and for every outside there is an inside; though they are different, they go together.
—Alan Watts, Man, Nature, and the Nature of Man, 1991
Taken literally, panpsychism is the belief that everything is “enminded.” All of it. Whether it is a brain, a tree, a rock or an electron. Everything that is physical also possesses an interior mental aspect. One is objective—accessible to everybody—and the other phenomenal—accessible only to the subject. That is the sense of the quotation by British-born Buddhist scholar Alan Watts with which I began this essay. I will defend a narrowed, more nuanced view: namely that any complex system, as defined below, has the basic attributes of mind and has a minimal amount of consciousness in the sense that it feels like something to be that system. If the system falls apart, consciousness ceases to be; it doesn't feel like anything to be a broken system. And the more complex the system, the larger the repertoire of conscious states it can experience.
My subjective experience (and yours, too, presumably), the Cartesian “I think, therefore I am,” is an undeniable certainty, one strong enough to hold the weight of philosophy. But from whence does this experience come? Materialists invoke something they call emergentism to explain how consciousness can be absent in simple nervous systems and emerge as their complexity increases. Consider the wetness of water, its ability to maintain contact with surfaces. It is a consequence of intermolecular interactions, notably hydrogen bonding among nearby water molecules. One or two molecules of H2O are not wet, but put gazillions together at the right temperature and pressure, and wetness emerges. Or see how the laws of heredity emerge from the molecular properties of DNA, RNA and proteins. By the same process, mind is supposed to arise out of sufficiently complex brains.
Memoirs of a Mad Cook
There's no point kidding myself any longer,
I just can't get the knack of it; I suspect
there's a secret society which meets
in dark cafeterias to pass on the art
from one member to another.
It's so personal preparing food for someone's
insides, what can I possibly know
about someone's insides, how can I presume
to invade your blood?
I'll try, God knows I'll try
but if anyone watches me I'll scream
because maybe I'm handling a tomato wrong,
how can I know if I'm handling a tomato wrong?
something is eating away at me
with splendid teeth
Wistfully I stand in my difficult kitchen
and imagine the fantastic salads and soufflés
that will never be.
Everyone seems to grow thin with me
and their eyes grow black as hunters' eyes
and search my face for sustenance.
All my friends are dying of hunger,
there is some basic dish I cannot offer,
and you my love are almost as lean
as the splendid wolf I must keep always
at my door.
by Gwendolyn MacEwen
from The Armies of the Moon
Toronto: Macmillan, 1972
Thursday, January 02, 2014
Is it a condition of comic genius to be perpetually wrestling with demons? From Canio, the iconic, stiletto-wielding clown of Ruggero Leoncavallo’s 1892 opera, Pagliacci, to modern greats like Richard Pryor, Andy Kaufman, and John Belushi, it would seem so. Even in Chaplin’s day, the depressed and often violent clown was a well-established trope, both offstage and on. Hollywood Pagliacci types included Frank Tinney, the blackface vaudevillian accused of brutally assaulting his mistress; Roscoe “Fatty” Arbuckle, whose brilliant career was undone by the untimely death of Virginia Rappe, a bit-part actress who suffered a fatal trauma in his hotel room; and the suave French comedian Max Linder, who was brought in by Essanay Films to replace Chaplin after the tramp had departed the studio but failed to replicate his predecessor’s success. Suffering from a severe depression that was deepened by service in the Great War, Linder claimed he could practically feel the ability to be funny seeping out from him. In February 1924, he and his young wife, Ninette, a wealthy heiress, made a suicide pact at a hotel in Vienna but failed to consume a sufficient dose of sleeping powders. The following autumn in Paris they were better prepared. Both drank large drafts of barbiturates before injecting morphine into their veins and slitting their wrists. Chaplin dedicated a film to his replacement, declaring himself Linder’s disciple.
ABOUT FOUR-FIFTHS OF THE WAY through this vast and rich omnium-gatherum of epistolary activity by Malcolm Cowley, this almost throwaway line in a letter to Yvor Winters arrives: “I’m weak, deplorably weak, in knowledge of the sixteenth century lyric.” Nobody’s perfect! The remark doesn’t come off as disingenuous; instead, it reflects Cowley’s enduring engagement (he was then sixty-nine) with verse techniques and history as both a critic and a practicing poet himself.
“Poet” ranks just above “translator” at the bottom of the multiple job descriptions usually applied to Malcolm Cowley, one of the most important and influential men of letters (or freelance literary intellectuals, if you prefer) of the twentieth century. Yet his lifelong grappling with poetic matters (he was an acolyte of Amy Lowell’s at Harvard before and after World War I) speaks to the degree, impossible to imagine in our balkanized literary situation, to which literature in its every manifestation was all of a piece to him, something he regarded in its broadest, most inclusive vistas.
Quirin Schiermeier in Nature:
When pondering the best way to study the impact of climate change, researcher Hans Joachim Schellnhuber liked to recall an old Hindu fable. Six men, all blind but thirsty for knowledge, examine an elephant. One fumbles the pachyderm’s sturdy side, while others grasp at its tusk, trunk, knee, ear or tail. In the end, all are completely misled as to the nature of the beast.
The analogy worked. Although many researchers had modelled various aspects of the global-warming elephant, there had been no comprehensive assessment of what warming will really mean for human societies and vital natural resources. But that changed last year when Schellnhuber, director of the Potsdam Institute for Climate Impact Research in Germany, and other leading climate-impact researchers launched the Inter-Sectoral Impact Model Intercomparison Project. This aims to produce a set of harmonized global-impact reports based on the same set of climate data, which will for the first time allow models to be directly compared. Last month it published its initial results in four reports in Proceedings of the National Academy of Sciences1–4. These suggest that even modest climate change might drastically affect the living conditions of billions of people, whether through water scarcity, crop shortages or extremes of weather. The group warns that water is the biggest worry. If the world warms by just 2 °C above the present level, which now seems all but unavoidable by 2100, up to one-fifth of the global population could suffer severe shortages.
An Aviary of Small Birds
My love is an aviary
of small birds
and I must learn
to leave the door ajar . . .
Are you the sparrow
who landed when I sat
at a slate table
Webbs Wonder, Lollo
Rosso, English Cos . . .
Swift and deft
you flit and peck peck
quick as the light that
constitutes your spirit.
Yes, you were briefer
than Neruda’s octobrine.
So much rain that night.
Our room is an ocean
where swallows dive.
The bubble bursts
too soon, too late, too long:
all sorts of microscopia
swim upstream, float in
on summer’s storm.
The tenor of your heart
is true as a tuning fork struck
– and high! My love
is the bird who flies free.
by Karen McCarthy Woolf
from the Rain of Poems Project 2012
Justin E. H. Smith in his own blog:
Around the same time English-language philosophers were debating whether or not you can know what it is like to be a bat (generally deciding that you can not), the Australian poet Les Murray was busy directly transcribing the thought-world of an imagined representative of this order. Here are the final six lines from his 1986 poem, "Bat's Ultrasound":
ah, eyrie-ire; aero hour, eh?
O'er our ur-area (our era aye
ere your raw row) we air our array
err, yaw, row wry—aura our orrery,
our eerie ü our ray, our arrow.
A rare ear, our aery Yahweh.
Murray channels the inner language of other species as well. For instance, pigs, in his 1992 poem, "Pigs":
Us shoved down the soft cement of rivers.
Us snored the earth hollow, filled farrow, grunted.
Never stopped growing. We sloughed, we soughed
and balked no weird till the high ridgebacks was us
with weight-buried hooves. Or bristly, with milk.
While the individual pig refers to the collectivity as 'us', Murray imagines that cattle conceptualize that same first-person plural as 'me'. This from "The Cows on Killing Day" of 1998:
The heifer human smells of needing the bull human
and is angry. All me look nervously at her
as she chases the dog me dream of horning dead: our enemy
of the light loose tongue. Me’d jam him in his squeals.
Me, facing every way, spreading out over feed.
The individual 'me' (to the extent that these can be individuated), the cow that narrates the poem, ends up slaughtered by a blade, and now sees the blood, or perhaps the guts, running out of her as 'me' too:
Looking back, the glistening leaf is still moving.
All of dry old me is crumpled, like the hills of feed,
and a slick me like a huge calf is coming out of me.
Benjamin Bratton at AlterNet:
Let me tell you a story. I was at a presentation that a friend, an astrophysicist, gave to a potential donor. I thought the presentation was lucid and compelling (and I'm a professor of visual arts here at UC San Diego so at the end of the day, I know really nothing about astrophysics). After the talk the sponsor said to him, "you know what, I'm gonna pass because I just don't feel inspired ... you should be more like Malcolm Gladwell."
At this point I kind of lost it. Can you imagine?
Think about it: an actual scientist who produces actual knowledge should be more like a journalist who recycles fake insights! This is beyond popularisation. This is taking something with value and substance and coring it out so that it can be swallowed without chewing. This is not the solution to our most frightening problems – rather this is one of our most frightening problems.
So I ask the question: does TED epitomize a situation where a scientist (or an artist or philosopher or activist or whoever) is told that their work is not worthy of support, because the public doesn't feel good listening to them?
I submit that astrophysics run on the model of American Idol is a recipe for civilizational disaster.
More here. [Thanks to Syed Tasnim Raza.]
From Gravity and Levity:
What do you think are the odds that you will die during the next year? Try to put a number to it — 1 in 100? 1 in 10,000? Whatever it is, it will be twice as large 8 years from now.
This startling fact was first noticed by the British actuary Benjamin Gompertz in 1825 and is now called the “Gompertz Law of human mortality.” Your probability of dying during a given year doubles every 8 years. For me, a 25-year-old American, the probability of dying during the next year is a fairly minuscule 0.03% — about 1 in 3,000. When I’m 33 it will be about 1 in 1,500, when I’m 42 it will be about 1 in 750, and so on. By the time I reach age 100 (and I do plan on it) the probability of living to 101 will only be about 50%. This is seriously fast growth — my mortality rate is increasing exponentially with age.
And if my mortality rate (the probability of dying during the next year, or during the next second, however you want to phrase it) is rising exponentially, that means that the probability of me surviving to a particular age is falling super-exponentially. Below are some statistics for mortality rates in the United States in 2005, as reported by the US Census Bureau (and displayed by Wolfram Alpha):
This data fits the Gompertz law almost perfectly, with death rates doubling every 8 years. The graph on the right also agrees with the Gompertz law, and you can see the precipitous fall in survival rates starting at age 80 or so.
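The arithmetic behind the Gompertz law is simple enough to sketch in a few lines. The snippet below is a minimal illustration, assuming (as the excerpt does) a baseline yearly mortality of about 0.03% at age 25 that doubles every 8 years; the function names and the year-by-year multiplication are my own simplification, not the actuarial formula in its continuous form.

```python
# Gompertz-law sketch: yearly mortality doubles every 8 years.
# Baseline figures are taken from the excerpt (0.03% at age 25);
# real actuarial tables differ in the details.
BASE_AGE = 25
BASE_RATE = 0.0003      # probability of dying within the year at age 25
DOUBLING_YEARS = 8

def mortality_rate(age):
    """Yearly probability of death at a given age under Gompertz growth."""
    return BASE_RATE * 2 ** ((age - BASE_AGE) / DOUBLING_YEARS)

def survival_probability(from_age, to_age):
    """Probability of surviving from from_age to to_age,
    obtained by multiplying each year's survival chance together.
    Because the yearly rate grows exponentially, this product
    falls off super-exponentially with age."""
    p = 1.0
    for age in range(from_age, to_age):
        p *= 1.0 - min(mortality_rate(age), 1.0)
    return p

print(f"rate at 33: {mortality_rate(33):.4%}")
print(f"rate at 42: about 1 in {round(1 / mortality_rate(42))}")
print(f"chance of reaching 100 from 25: {survival_probability(25, 100):.1%}")
```

Running this reproduces the excerpt's doubling pattern (0.06% at 33, roughly 1 in 750 at 42) and shows the precipitous late-life collapse in survival odds the graphs describe.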
Wednesday, January 01, 2014
Dan Colman in Open Culture:
Making resolutions stick is tricky business. But it’s possible, and Stanford psychologist Kelly McGonigal has a few scientifically-proven suggestions for you.
For years, McGonigal has taught a very popular course called The Science of Willpower in Stanford’s Continuing Studies program, where she introduces students to the idea that willpower is not an innate trait. Rather it’s a “complex mind-body response that can be compromised by stress, sleep deprivation and nutrition and that can be strengthened through certain practices.” For those of you who don’t live in the San Francisco Bay Area, you can also find McGonigal’s ideas presented in a recent book, The Willpower Instinct: How Self-Control Works, Why It Matters, and What You Can Do to Get More of It, which just came out in paperback yesterday. Below, we have highlighted 15 of Dr. McGonigal’s strategies for increasing your willpower reserves and making your New Year’s resolution endure.
1. Willpower is like a muscle. The more you work on developing it, the more you can incorporate it into your life. It helps, McGonigal says in this podcast, to start with small feats of willpower before trying to tackle more difficult feats. Ideally, find the smallest change that’s consistent with your larger goal, and start there.
2. Choose a goal or resolution that you really want, not a goal that someone else desires for you, or a goal that you think you should want. Choose a positive goal that truly comes from within and that contributes to something important in life.
3. Willpower is contagious. Find a willpower role model — someone who has accomplished what you want to do. Also try to surround yourself with family members, friends or groups who can support you. Change is often not made alone.
Bruce Robbins reviews Vivek Chibber's Postcolonial Theory and the Specter of Capital in n+1:
Taking Subaltern Studies as symbolic of a wider intellectual failure, he asserts in its place the validity and explanatory power of a renewed and unapologetic Marxism, and with it the Enlightenment universals that it relied on. In the view of thinkers like Chibber, the charge that Marxist theory suffers from “Eurocentrism”—represented by decades of thinking, entire libraries of books, and hundreds of academic departments—is sterile and empty. Drawing a line in the sand naturally makes both sides upset, and the debate over Chibber’s book has been heated. The field was sown when his less guarded, even more polemical thoughts about the bankruptcy of Subaltern Studies came out in an interview with Jacobin: “When Subalternist theorists put up this gigantic wall separating East from West, and when they insist that Western agents are not driven by the same kinds of concerns as Eastern agents, what they’re doing is endorsing the kind of essentialism that colonial authorities used to justify their depredations in the 19th century,” he said. “It’s the same kind of essentialism that American military apologists used when they were bombing Vietnam or when they were going into the Middle East. Nobody on the left can be at ease with these sorts of arguments.” The interview seems at times almost unhinged. A widely read critique by Chris Taylor, an English professor at the University of Chicago, published under the title, “Not Even Marxist,” argued that Chibber mistakenly forced readers to “choose sides” between Subaltern Studies and Marxism; moreover, Chibber’s brand of Marxism, Taylor suggested, was a bad one. This received a riposte at Verso’s blog (“Not Even Marxist?”), which got fought over in turn. The closing session of the Historical Materialism conference at NYU in April 2013 was a debate between Partha Chatterjee and Chibber, and was advertised like it was the return of Ali vs. Frazier.
The heatedness of the debates that flared up around Chibber’s book has to do with the place of honor it gives to the showdown between universalism and culture. Chibber’s impatience with culturalist interpretation—that is, interpretation that doesn’t merely deal with culture but wants to demonstrate how unnecessary and misleading it is to talk about the economic at all—is now widely shared. In an era when purely cultural explanations are no longer as persuasive, “economic determinism” loses its force as a smear. It would be surprising if Chibber’s book had not benefited from this development. Postcolonial Theory and the Specter of Capital comes bearing endorsements from a political philosopher (Joshua Cohen), an economic historian (Robert Brenner), Noam Chomsky, and, well, Žižek.
Hartosh Singh Bal in Caravan:
The party that Indira Gandhi invented anew in 1969 was built around a personality cult, but it sustained itself through her ability to triumph electorally. The old system of organisational loyalty was now replaced by a network of patronage in which people who paid obeisance to the personality cult were rewarded by the benefits that come with a share in political power. There was no longer any question of people being attracted by the party’s vision, because no such thing existed; it is easy enough to define the term Nehruvian, but impossible to give a coherent shape to what Indira espoused. If today what we call the Congress does not have an organisation independent from the Nehru-Gandhi dynasty and its patronage, it is because Indira excised this possibility in 1969.
This model of politics soon began to show its weakness. The Congress was first voted out of power in 1977, after the Emergency. Although Indira returned to power in 1979, by the time Rajiv was defeated in the general election of 1989 it had become clear that the Nehru-Gandhi dynasty no longer had the appeal necessary to repeat the triumph of 1971. After Rajiv’s death in 1991, Narasimha Rao became the Congress president, and the party managed to cobble together a coalition government under him; it was the first time since 1969 that the party had been guided for any meaningful length of time by someone who was not from the Nehru-Gandhi dynasty. An electoral defeat five years later confined the party to the opposition until 2004.
With each successive stint out of power, the party’s ability to retain its supporters dwindled. Even where the Congress could win elections, it was not the “same type of political force it was in the 1960s”, Atul Kohli notes; by the mid 1980s, the Congress system “had almost vanished”. This was a natural corollary of the split in 1969: any network of patronage can survive only if it can assure benefits in the near future. Although the party won in 2004 and 2009, the victories were mostly exercises in coalition building; they did not demonstrate any newfound electoral strength among the Nehru-Gandhi dynasty, and could not reverse the party’s disintegration at the level that matters—in the states, where local patronage is handed out.
Michael Saler in The Immanent Frame:
There are at least two ways that we can understand the meanings of “enchantment” and “disenchantment.” We can define them as stages within a broader historical process, and we can define them as human affects. In terms of historical process, the narrative of Weber and others described the shift from a premodern, “enchanted” world governed by an overarching supernatural order, to the modern “disenchanted” world characterized by scientific naturalism. Scholars advanced different historical periods for the origins of this process, but their accounts of its outcome were similar. A recognizable discourse equating modernity with disenchantment emerged among the late eighteenth century romantics, was given added momentum by nineteenth century cultural pessimists, and apparent scientific legitimacy by twentieth century sociologists, philosophers, and political scientists. The constant iteration that modernity has forsworn enchantment for disenchantment made it a virtual orthodoxy in the West until very recently.
In terms of human affect, since the Middle Ages “enchantment” had two meanings in Western culture: enchantment as “delight” and enchantment as “delusion.” The pleasures of enchantment as delight could be so overpowering that one is placed under a spell—an “enchantment”—and becomes deluded. The remedy was to become disenchanted. But disenchantment, like enchantment, also had positive and negative meanings. A positive meaning of disenchantment is that of emancipation: one is freed from dangerous illusions. A negative meaning of disenchantment is that of disillusion, a hard-bitten refusal of ideals or any form of transcendence.
The problem with the historical discourse was that it became conflated with the affective discourse. It equated the historical shift to a disenchanted world with the affect of disenchantment as disillusion, the end of a sense of wonder. States of enchantment might be delightful, but they were also delusory and regressive, at best suitable for children and other irrational beings, such as women, the working classes, and non-Western peoples. The historical narrative of modernity and enchantment could have positive elements—this was true of Weber’s account—but fundamentally it was one of discontent and loss.
Lego figurines were found to be growing angrier. Researchers reconstructed the face of Richard III, discovered the heart of Richard I to have been embalmed in daisy, mint, and myrtle, and calculated that Double Stuf Oreos contain only 1.86 times as much cream filling. In England, two North Anglians dressed as Oompa Loompas attacked a man outside a kebab house, an appellate court ended Cadbury’s monopoly on the color purple, and Lord Sugar was investigated for racism. An Edinburgh Krispy Kreme sold an average of one doughnut every three seconds in the six months after it opened. “They are ruinous,” said Scottish National Obesity Forum spokesman Tam Fry. Frito-Lay began selling Taco Bell Doritos, which taste like Taco Bell Doritos Locos tacos, which taste like Doritos. Conor P. Fudge was charged for a burglary at Iowa City’s Cold Stone Creamery. Toronto mayor Rob Ford admitted that he had smoked crack cocaine while in “one of my drunken stupors.” Doctors declared cured a Mississippi baby born with HIV. Belgium permitted twins born deaf to commit suicide because they had also become blind. In Spain, the recipient of the world’s first double-leg transplant had his transplanted legs amputated. South Korean police arrested two students for selling diet pills made of human flesh, and hackers in Montana broadcast an emergency alert warning of a zombie uprising.
17 April. Shots of the cabinet and the ex-cabinet at Lady Thatcher’s funeral in St Paul’s just emphasise how consistently cowardly most of them were, the only time they dared to stand up to her when eventually they kicked her out. What also galls is the notion that Tory MPs throw in almost as an afterthought, namely that her lack of a sense of humour was just a minor failing, of no more significance than being colourblind, say, or mildly short-sighted. In fact to have no sense of humour is to be a seriously flawed human being. It’s not a minor shortcoming; it shuts you off from humanity. Mrs Thatcher was a mirthless bully and should have been buried, as once upon a time monarchs used to be, in the depths of the night.
3 May. I am reading Neil MacGregor’s Shakespeare’s Restless World. It’s very good, even overcoming my (A.L. Rowse generated) prejudice against reading about Shakespeare. I hadn’t realised at Richard Griffiths’s funeral in Stratford that Shakespeare’s father had been buried in the churchyard, the whereabouts of the grave now unknown. So when, waiting for the service to start, I went out for a pee under one of the yews in a sheltered corner of the cemetery I may well have been pissing on Shakespeare’s dad’s grave.
Imagine that tomorrow morning you woke up and discovered that the familiar rock pigeon—scientifically known as Columba livia, popularly known as the rat with wings—had disappeared. It was gone not simply from your window ledge but from Piazza San Marco, Trafalgar Square, the Gateway of India arch, and every park, sidewalk, telephone wire, and rooftop in between. Would you grieve for the loss of a familiar creature, or rip out the spikes on your air-conditioner and celebrate? Perhaps your reaction would depend on the cause of the extinction. If the birds had been carried off in a mass avian rapture, or a pigeon-specific flu, you might let them pass without guilt, but if they had been hunted to death by humans you might feel honor-bound to genetically engineer them back to life.
This thought experiment occurred to me while reading “A Feathered River Across the Sky: The Passenger Pigeon’s Flight to Extinction” (Bloomsbury), Joel Greenberg’s study of a bird that really did vanish after near-ubiquity, and that really is the subject of Frankenpigeon dreams of resurrection. Even before the age of bioengineering, Ectopistes migratorius could seem as much science-fiction fable as fact, which is why it is good to have Greenberg’s book, the first major work in sixty years about the most famous extinct species since the dodo.
Emine Saner in The Guardian:
In many ways, says the writer G Willow Wilson, Marvel's new comic-book character is a typical teenager, dealing with the angst of high school, before discovering superhero powers. So far, so Peter Parker. Except 16-year-old Kamala Khan is female, and a Muslim: unusual enough in the world of comics to have caused quite a ripple when it was announced in November. "She's a child of Pakistani immigrants," says Wilson from her home in Seattle, where she is already working on the third issue; the series will start in February. "On the one hand, she grew up in an American city as a fairly typical middle-class American kid, but she's also got the tradition and history of her parents. She faces a lot of the same dilemmas many second-generation kids do."
The idea came from two Marvel editors: Sana Amanat, who had been telling her colleague, Steve Wacker, tales of growing up in a Muslim family. (The idea predates the recent rise to prominence of the similarly named, similarly brave teenage girl, education activist Malala Yousafzai.) Amanat and Wacker approached Wilson – who converted to Islam in college, and whose work includes the comic Cairo and novel Alif the Unseen – to be the writer (with artwork by Adrian Alphona). "My immediate thought was, 'What are we going to get ourselves into?'" she says.
From The Editorial Board of The New York Times:
Sometimes the New Year comes in feeling merely newish, a matter of changing months and not much else. But sometimes the New Year brings with it a powerful sense of regeneration, as if, like certain insects, you were entering a new stage of complete metamorphosis.
...There are no hymns to the New Year, and the only music most of us associate with this holiday is that dirge of the departing year, “Auld Lang Syne.” There is no traditional ceremony either — everyone seems to celebrate the day in a different manner. And perhaps this is a holiday that defies both tradition and ceremony. Does it make sense, after all, to welcome the New Year in the Same Old Way? There is not much ritual in turning to a new page in the calendar. All the ritual lies within us, in the aspiration to live up to our highest hopes. The dead of winter is not a natural season for rebirth. Yet all of nature, dormant now under the cover of cold and snow, is preparing for a re-emergence that always seems spectacular when it eventually comes. Meanwhile, we persist, as much like ourselves on Jan. 1 as we were on Dec. 31. The newness we hope for is something that is ours to construct day by day.
Picture: "The Priest that Preyed" by Sam Weber.
Tuesday, December 31, 2013
Algorithms decide what we are recommended on Amazon, what films we are offered on Netflix. Sometimes, newspapers warn us of their creeping, insidious influence; they are the mysterious sciencey bit of the internet that makes us feel websites are stalking us—the software that looks at the e-mail you receive and tells the Facebook page you look at that, say, Pizza Hut should be the ad it shows you. Some of those newspaper warnings themselves come from algorithms. Crude programs already trawl news pages, summarise the results, and produce their own article, by-lined, in the case of Forbes magazine, "By Narrative Science".
Others produce their own genuine news. On February 1st, the Los Angeles Times website ran an article that began "A shallow magnitude 3.2 earthquake was reported Friday morning." The piece was written at a time when quite possibly every reporter was asleep. But it was grammatical, coherent, and did what any human reporter writing a formulaic article about a small earthquake would do: it went to the US Geological Survey website, put the relevant numbers in a boilerplate article, and hit send. In this case, however, the donkey work was done by an algorithm.
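The steps the passage describes — fetch the official numbers, drop them into a boilerplate template, publish — can be sketched in a few lines. This is an illustrative toy, not the LA Times's actual program: the field names, template wording, and example values are all hypothetical.

```python
# Toy sketch of a Quakebot-style boilerplate generator.
# Template and field names are illustrative, not the real system's.
TEMPLATE = (
    "A shallow magnitude {mag} earthquake was reported {day} {time} "
    "near {place}, according to the US Geological Survey."
)

def write_quake_story(report):
    """Fill the boilerplate with figures from a USGS-style report dict.
    In the real pipeline this dict would be parsed from the USGS feed;
    here it is supplied by hand."""
    return TEMPLATE.format(**report)

story = write_quake_story({
    "mag": 3.2,
    "day": "Friday",
    "time": "morning",
    "place": "Westwood, California",
})
print(story)
```

The point of the sketch is how little "writing" is involved: the grammatical, coherent article the passage admires is a fixed sentence with a handful of slots, which is exactly why such formulaic beats were the first to be automated.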