Thursday, January 29, 2015
Kenan Malik in Pandaemonium:
I published recently a transcript of a radio documentary I had made that explored the question of ‘Who owns culture?’. Perhaps the most fractious of recent debates around this question has been over ‘Kennewick Man’, an ancient skeleton found on the banks of the Columbia River in America’s Washington State. The 9,000-year-old skeleton became the focus for two major controversies: What is race? And who owns history? I tell the story of Kennewick Man in my book Strange Fruit: Why Both Sides are Wrong in the Race Debate. I am publishing here an extract that lays out part of that story, looking at the question of the ownership of culture and history and of the clash between scientific rationality and cultural identity. I will publish a second extract next week that delves into the debate about race posed by Kennewick Man.
Darold Treffert in Scientific American:
I met my first savant 52 years ago and have been intrigued with that remarkable condition ever since. One of the most striking and consistent things in the many savants I have seen is that they clearly know things they never learned.
Leslie Lemke is a musical virtuoso even though he has never had a music lesson in his life. Like “Blind Tom” Wiggins a century before him, his musical genius erupted so early and spontaneously as an infant that it could not possibly have been learned. It came ‘factory installed’. In both cases professional musicians witnessed and confirmed that Lemke and Wiggins somehow, even in the absence of formal training, had innate access to what can be called “the rules” or vast syntax of music.
Alonzo Clemons has never had an art lesson in his life. As an infant, after a head injury, he began to sculpt with whatever was handy, Crisco included, and now is a celebrated sculptor who can mold a perfect specimen of any animal in clay in an hour or less after only a single glance at the animal itself, every muscle and tendon perfectly positioned. He has had no formal training.
To explain the savant, who has innate access to the vast syntax and rules of art, mathematics, music and even language, in the absence of any formal training and in the presence of major disability, “genetic memory,” it seems to me, must exist along with the more commonly recognized cognitive/semantic and procedural/habit memory circuits.
Genetic memory, simply put, is complex abilities and actual sophisticated knowledge inherited along with other more typical and commonly accepted physical and behavioral characteristics.
The second is that I am saturated in digital life and I want to return to the actual world again. I’m a human being before I am a writer; and a writer before I am a blogger, and although it’s been a joy and a privilege to have helped pioneer a genuinely new form of writing, I yearn for other, older forms. I want to read again, slowly, carefully. I want to absorb a difficult book and walk around in my own thoughts with it for a while. I want to have an idea and let it slowly take shape, rather than be instantly blogged. I want to write long essays that can answer more deeply and subtly the many questions that the Dish years have presented to me. I want to write a book.
I want to spend some real time with my parents, while I still have them, with my husband, who is too often a ‘blog-widow’, my sister and brother, my niece and nephews, and rekindle the friendships that I have simply had to let wither because I’m always tied to the blog. And I want to stay healthy. I’ve had increasing health challenges these past few years. They’re not HIV-related; my doctor tells me they’re simply a result of fifteen years of daily, hourly, always-on-deadline stress. These past few weeks were particularly rough – and finally forced me to get real.
More here. And I should say that we here at 3QD send him off into the real world with a special, heartfelt, blogger's salute.
This extraordinary book, a huge dictionary of philosophical terms from many languages, is a translation of Vocabulaire européen des philosophies: Dictionnaire des intraduisibles, originally published in 2004, the brainchild of the French philosopher Barbara Cassin. If the original project was paradoxical, then the present version is doubly so: not just a dictionary of untranslatable words, but a translation of that dictionary. Rather than despair at the self-undermining self-referentiality of the whole idea, the editors rejoice in it. Indeed, moving the word “untranslatable” to the beginning of the English title proudly asserts the paradox even more forcefully than the original French title does, and forms what the English-language editor Emily Apter calls “an organising principle of the entire project”.
In her preface, Apter comments (apparently without irony) that “the extent of our translation task became clear only when we realised that a straightforward conversion of the French edition into English simply would not work”. She is right, of course: translation is almost never a straightforward conversion. This is why it is such a fertile subject for philosophy. Like so much in philosophy, theorizing about translation (and, of course, about the related concept of meaning) lurches between two unappealing extremes.
IT IS OBVIOUS BY NOW that Paul Thomas Anderson isn’t making individual movies so much as building an oeuvre block by block—the sturdiest, most resilient body of work by a big-time American director since Stanley Kubrick died and Martin Scorsese ran out of steam.
Big, ambitious, and American are the operative words. Boogie Nights (1997) and Magnolia (1999) were sprawling ensemble pieces that challenged Scorsese and Robert Altman on their own turf; in their concern with self-invented American Übermenschen and up-front eccentricity, There Will Be Blood (2007) and The Master (2012) engaged Orson Welles. Anderson’s smaller films, Hard Eight (1996) and Punch-Drunk Love (2002), pondered more marginal if equally echt-American types, and his latest movie, Inherent Vice, which stars Joaquin Phoenix as Thomas Pynchon’s hippie private eye Doc Sportello, falls into this category. A panoramic actor fest, it is also an extremely credible adaptation of the closest thing to an easy read by the writer whom some consider America’s greatest living novelist.
Structurally, Inherent Vice is pure School of Chandler, with Doc suckered into the plot by an old girlfriend, Shasta Fay Hepworth (Katherine Waterston), whose problems with her sugar daddy, scumbag developer Mickey Wolfmann (Eric Roberts), illuminate a classically Los Angeles real-estate scam . . . for starters. Behind it all is an “Indo-Chinese” drug cartel, a stand-in for the Vietnam War and ultimately a front for whatever cosmic antiplan you like—Doc Sportello being a sort of acidhead Don Quixote complete with intermittent sidekick, maritime lawyer Sauncho Smilax (Benicio Del Toro).
Just occasionally in Blake’s engravings there are pictures within pictures, and we get a glimpse of the life he thought images might lead in a better world. The most moving of these visions is Plate 20 of Blake’s Illustrations of the Book of Job. Job has survived his doubts and torments, and is telling the story to his daughters – in an earlier watercolour, they hold the instruments of Poetry, Painting and Music. No doubt the young women are taking their father’s narrative to heart, and in due course will rephrase it in terms appropriate to their arts: the lute and lyre are in the margins of the plate, ready to be strummed. But the first form of the story is visual: Job sits in a circular room – or maybe it is 10- or 12-sided – and points towards two frescoed roundels on the walls left and right. Neither is unequivocally an episode from Job’s life – they could be analogous scenes from the story of the Fall – but the square panel over his head must be a version of ‘Then the Lord answered Job out of the Whirlwind.’ (It combines and condenses elements of Blake’s previous engraving of the subject.) As so often in Blake, the balance between positive and negative in the scene as a whole is precarious: Job is central and patriarchal (‘their Father gave them Inheritance among their Brethren’), and there is more than a touch of the baleful exhausted God-the-Father to him, heavy lids, pointing fingers and all. But there cannot be any doubt that the basic form and function of the room, with its echoes of the early 19th-century diorama (it is important that the plate was engraved in 1825), were meant to strike the viewer as wonderful – all-enveloping. Here were images at work.
Sandip Roy in The Telegraph:
“Just come back any time with madam to approve the kitchen design,” the beaming modular kitchen consultant told me. I explained patiently, again, that there was no madam around. I would be approving my own modular kitchen, cabinet colours and all. He smiled indulgently and said, “But we can wait few days if needed for madam.” When it finally dawned on him that there was no madam at all, he was aghast. I don’t know what shocked him more – that a man might approve a kitchen design, or that I lived alone, or that a man who lived alone wanted a kitchen.
When I first moved to the United States as a graduate student I could not wait to live by myself. The idea of a town where no one knew your name was just exhilarating. When I was moving back to India after 20 years in the US, many friends were aghast. How will you manage, they wondered uneasily. Twenty years of San Francisco can change you. How would I adjust to life back in a city without non-GMO Swiss chard, late-night carnitas quesadillas and gay bars? “Do they have gay bars in India?” well-meaning American friends asked me. Kolkata actually had the first Rainbow Pride parade in India back in 1999. But no, there were no gay bars here, though there were several men-only bars, no Leather Weekend street fairs with paddling stations, no same-sex marriages officiated by the city’s mayor. I knew and I understood that certain things I took for granted in a San Francisco lifestyle would just not work in India. Neighbours in San Francisco minded their own business. Neighbours in India minded your business. While the gay movement in the US was focused on marriage equality, in India it had its hands full trying to overturn a Victorian-era anti-sodomy law that had hung around after the British had packed up and left. India had changed dramatically in the last decade when it came to visibility of gay issues in the media but there was still a fog of Don’t Ask Don’t Tell around issues of sexuality.
Imagine a micromotor fueled by stomach acid that can take a bubble-powered ride inside a mouse — and that could one day be a safer, more efficient way to deliver drugs or diagnose tumors in humans. That’s the goal of a team of researchers at the University of California, San Diego. The experiment is the first to show that these micromotors can operate safely in a living animal, said Professors Joseph Wang and Liangfang Zhang of the NanoEngineering Department at the UC San Diego Jacobs School of Engineering. Wang, Zhang and others have experimented with different designs and fuel systems for micromotors that can travel in water, blood and other body fluids in the lab. “But this is the first example of loading and releasing a cargo in vivo,” said Wang. “We thought it was the logical extension of the work we have done, to see if these motors might be able to swim in stomach acid.”
In the experiment, the mice ingested tiny drops of solution containing hundreds of the micromotors, which are 20 micrometers long. The motors become active as soon as they hit the stomach acid and zoom toward the stomach lining at a speed of 60 micrometers per second. They can self-propel like this for up to 10 minutes. This propulsive burst improved how well the cone-shaped motors were able to penetrate and stick in the mucous layer covering the stomach wall, explained Zhang. “It’s the motor that can punch into this viscous layer and stay there, which is an advantage over more passive delivery systems,” he said. The researchers found that nearly four times as many zinc micromotors found their way into the stomach lining compared with platinum-based micromotors, which don’t react with and can’t be fueled by stomach acid. Wang said it may be possible to add navigation capabilities and other functions to the motors, to increase their targeting potential. Now that his team has demonstrated that the motors work in living animals, he noted, similar nanomachines soon may find a variety of applications including drug delivery, diagnostics, nanosurgery and biopsies of hard-to-reach tumors.
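As a rough sanity check on the figures above (my own back-of-the-envelope calculation, not from the study): at 60 micrometers per second, a full ten-minute propulsive burst bounds the motors' travel at a few centimeters, and for a 20-micrometer motor that speed is about three body lengths per second.

```python
# Back-of-the-envelope check of the propulsion figures reported above.
# Assumption: constant speed for the whole burst. The article says
# "up to 10 minutes", so this is an upper bound, not a measurement.

speed_um_per_s = 60    # reported speed, micrometers per second
burst_s = 10 * 60      # maximum propulsion time, seconds
motor_length_um = 20   # reported motor length, micrometers

distance_um = speed_um_per_s * burst_s
print(f"max travel: {distance_um} um = {distance_um / 1000:.0f} mm")
print(f"body lengths per second: {speed_um_per_s / motor_length_um:.0f}")
```

A mouse stomach is on the order of a centimeter across, so even this upper bound is consistent with the motors reaching the stomach lining well within the burst.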
You shot them.
Two beautiful purebred dogs
Siberian Huskies, each
In the head,
They were sniffing around your chickens,
Biting a few,
Killing a few.
And this I understand,
Those birds are easily excitable, and they start
Clucking, and in the way acquaintances soon become intolerable
When they start squawking and screaming,
The dogs would snap up a chicken,
Around the throat,
to tell them,
Someone is coming
He is not a friend.
The bullet, zipping through the air,
Silent now after bursting “Hallelujah!”
From its pre-natal chamber,
Rides the wind. It is done playing
And it pierces a skull, rollicking
in the explosive greeting.
His twin brother,
Born seconds after,
Grabs Esau’s heel and follows his steps.
My sons of snow and survival
They died an ignominious death, with yelps
And lived through a funeral of disgrace
When you hog-tied them and dumped them on my doorstep this morning.
by Elaine Wang
from Cahoodaloodaling, Issue 14
Wednesday, January 28, 2015
Adam Rutherford in The Guardian:
In this lush, epic and hugely enjoyable book, biologist Armand Marie Leroi explores the idea that it was another ancient Greek giant whose shoulders we may all stand upon. In his mid-30s, around 346 BCE, Aristotle exiled himself from Athens to islands in the Aegean, where he spent time thinking and writing about nature, possibly near a lagoon on Lesbos. We primarily know him as a philosopher, but here, Aristotle's biological output was titanic: he dissected dozens of species and compiled the first biology textbook, Historia Animalium. It is this body of work that Leroi argues continues to percolate through scientific thought today.
There's great temptation in analysing historical scholars to suggest that their insights were in some way vatic, or to use that horrid phrase, they were "anticipating" things to come. Leroi does a splendid job of avoiding hagiography of his hero, and never springs that inviting trap. Aristotle's contention that seals are mutated quadrupeds is true in a purely Darwinian sense: they are mammals, evolved from terrestrial four-legged mammals. But Leroi points out that this is not Aristotle's thinking. The idea that mutation from common ancestors was the cause of species is not present in any of Aristotle's work. He was not predicting evolution, nor did he once consider that Darwinian truth.
Katia Moskvitch in Scientific American:
What does graphene mean for the future of computing?
It is certain that silicon will be used for transistors—semiconductor devices that are the building blocks of modern computers—for at least the next five to 10 years. But people are already thinking about alternative materials and technologies to replace silicon when it can no longer deliver for ever-smaller transistors. A graphene transistor is one of the alternatives.
I’m also looking into other one-atom-thick 2-D materials that were obtained soon after graphene and at heterostructures based on those 2-D crystals. Potentially they can provide an alternative to silicon technologies, but here we’re talking about completely new architecture rather than just introducing a new material into the system. It’s hard to predict how it will develop because when you introduce one new material into a process, it’s already quite a complicated step, and if you want to change the whole architecture, it requires years of research. That’s why research should start now if we want to achieve something like that in 10 years’ time.
What do you think computers of the future could look like?
Computers are much more than just a display, interface and software: they are mainly about computing power and microprocessors—also known as the central processing unit [CPU], or the “brain” of a computer. In the future, we’ll probably expand parallel computation, utilizing microprocessors with a larger number of cores, with several CPUs working together on the same chip, enabling the computer to perform many more tasks with much greater overall system performance. At the same time more specialized computers will start to appear because the cost won’t be so prohibitive anymore.
Conor Friedersdorf in The Atlantic:
Last month, an improbable Internet exchange inspired many who noticed it to reconsider what's possible when debating politics online. It began when MIT professor Scott Aaronson published a blog post on a sexual harassment controversy. A predictably heated argument ensued in the comments section. Then, 171 comments into the thread, Aaronson achieved a breakthrough: He posted a reply so personal, vulnerable and powerful that it transformed the character of the conversation. And all sides emerged better able to see one another's humanity.
The comment that begat this small Internet miracle wasn’t perfect. Neither were the responses to it – as ever online, some needless cruelty and lack of charity followed.
But Aaronson and his interlocutors did transform an obscure, not-particularly-edifying debate into a broad, widely read conversation that encompassed more earnest, productive, revelatory perspectives than I'd have thought possible. The conversation has already captivated a corner of the Internet, but deserves wider attention, both as a model of public discourse and a window into the human experience. It began with the most personal thing that the professor had ever publicly shared.
If posthumanism signals the end of a certain way of describing—or, more precisely, orienting—selfhood, then we might ask, as Ralph Waldo Emerson did at the start of his famous essay, “Experience” (which addressed, among other crucial issues, slavery), “Where do we find ourselves?” (266).
To be sure, technology has already expanded ideas about seeing the human as created through evolution. Marvin Minsky argues that robots will be the next evolutionary phase; they will be our “children.” Ray Kurzweil anticipates the ethical issues of posthumanism will be worked out by machines gaining consciousness and then guiding themselves (and, presumably, us) through deeper realms of spiritual experience and insight. 
But, it must be asked, where does all this talk about spiritual transcendentalism leave the crucial subject of our bodies? N. Katherine Hayles cautions that privileging the disembodiment of information is a return to Cartesian dualism that supports the liberal humanist subject: what posthumanism seeks to challenge. Cary Wolfe, moreover, reminds us that we have to take into account how posthumanism is shaped by our relationships with other embodied forms of life constituted by non-human animals.
The art of literary conversation, by whatever name, is certainly not new. Hannah Rosefield opened her review of John Freeman’s How to Read a Novelist to a larger discussion of our cultural obsession with the interview as a way to look behind the authorial mask. Rosefield is dismissive of Freeman’s collection of 55 profiles of novelists, calling them “weirdly artificial…as if the writer is sitting alone in a restaurant or, sometimes, in her glamorous apartment, addressing occasional comments to the atmosphere.” Literary hero worship.
Rosefield isn’t enthralled with interviews as a whole, but her discussion is insightful. Many contemporary writers are known for their aversion to the form — ranging from the prolific and visible Joyce Carol Oates to the prolific and invisible Thomas Pynchon — but she traces the displeasure back to Henry James, who gave his first interview in 1904, nearly 30 years after he published his first novel.
The magazine that has become synonymous with interviews is The Paris Review, which, as Rosefield notes, published a long interview with E.M. Forster in their first issue, Spring 1953. John Rodden, author of Performing the Literary Interview: How Writers Craft Their Public Selves, the first book-length examination of the literary interview genre, thinks George Plimpton “virtually invented” the literary interview as a genre for the “little magazine.”
In any analysis of a public figure, partisan interests will influence one’s opinion, and there isn’t anything particularly productive about pointing out that conservatives tend to forgive in conservative leaders what they don’t in liberals. A more helpful question is this: Why has Pope Francis addressed political issues, such as climate change, inequality, poverty, and overpopulation? Is it evidence of abject partisan interest, or a covert dedication to communism, Marxism, or some other insidious ideology?
Or is it just that we now presume that “politics” belongs outside the Church’s purview—despite the Church’s historical record of considering and intervening in political affairs? To me, this appears to be the distortion at hand.
This is partly because the notion that "politics" can be neatly separated from daily life is a new one. For earlier political theorists, like Aristotle and Augustine, politics was just a natural extension of community life. But over time, a fantasy of “politics” wholly divorced from everyday life and experience has emerged in certain corners of liberal thought, producing with it the expectation that politics is a matter for professional politicians and their colleagues, while those in religious offices should simply avoid addressing politics altogether.
Carley Moore in TNB:
Last summer I turned 42 years old. On the morning of my birthday, my then-boyfriend asked me what I was doing when I was 21, half that age. I said, “Baking quiches, dropping acid, and chasing boys.” I imagined this retort as a tweet—short and to the point. I’d managed to get my life at that time down to 39 characters, and it was mostly accurate.
At 21 years old, I was obsessed with Mollie Katzen’s Moosewood cookbook, The Enchanted Broccoli Forest. I was going to a state school in upstate New York, not far from the home of the Moosewood restaurant in Ithaca, which had always seemed to me a cultural mecca in a vast state of industrial depression and blight. Ithaca was the home of my favorite thrift shop, Zoo Zoos, and a lot of cute hippie musicians I dreamed of fucking. The cookbook was steeped in that same sexy, vintage, hippie musician lore. I imagined myself cooking for one of those musicians. I could be his “old lady” for a recipe or two. Many of my activities then were overlaid with a fantasy plot line, worthy of an episode of Laverne and Shirley or Three’s Company. I was rarely just doing something; I was doing that thing while imagining I was in the TV sitcom version of it. As a child, I’d made it through my sometimes chore of washing the dishes by pretending I was in a Dawn dish soap ad.
Alison Abbott in Nature:
If you have to make a complex decision, will you do a better job if you absorb yourself in, say, a crossword puzzle instead of ruminating about your options? The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink. But within the scientific community, ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA — and find no evidence for it.
...A typical study probing UTA asks subjects to make a complex decision, such as choosing a car or a computer, after either mulling over a list of the object’s attributes or viewing the list quickly and then engaging in a distracting activity such as a word puzzle. However, such studies have drawn different conclusions, with about half of those published so far reporting a UTA effect and the other half finding none. Proponents of the theory claim that the effect is exquisitely sensitive to experimental variations, and often attribute the negative results to the fact that many research groups varied elements of the set-up, such as the choice of puzzle used for the distraction. Critics say that the positive results came from having too few participants in the experiments.

Psychologists Mark Nieuwenstein and Hedderik van Rijn at the University of Groningen in the Netherlands set out with their colleagues to determine which explanation was correct. They asked 399 participants — around ten times more than the typical (median) sample sizes in other studies — to choose between either 4 cars or 4 apartments on the basis of 12 desirable or undesirable features. They incorporated the full list of conditions that UTA proponents had reported as yielding the strongest effect, such as the exact type of puzzle used as a distraction. They found that the distracted group was no more likely than the deliberating group to choose the most desirable item.
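The critics' sample-size point can be made concrete with a standard power calculation. This is my own illustration, not from the article: the effect size (Cohen's d = 0.3) and the group sizes are hypothetical, chosen only to show how sharply power rises with sample size.

```python
# Approximate power of a two-sided, two-sample comparison via the
# normal approximation, for a standardized effect size (Cohen's d).
from math import sqrt
from statistics import NormalDist

def power_two_sample(n_per_group, effect_size, alpha=0.05):
    """Approximate power of detecting `effect_size` with
    `n_per_group` subjects in each of two groups."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z = effect_size * sqrt(n_per_group / 2) - z_alpha
    return NormalDist().cdf(z)

# Hypothetical small-to-moderate effect (d = 0.3):
print(round(power_two_sample(20, 0.3), 2))   # a typical small study
print(round(power_two_sample(200, 0.3), 2))  # roughly the Groningen scale
```

With 20 subjects per group, power is well under 25 percent, so scattered positive and negative findings are exactly what chance would produce; with around 200 per group, a real effect of that size would be detected most of the time, which is what makes the Groningen null result informative.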
Steven Pinker in the Boston Review:
More than two centuries after freedom of speech was enshrined in the First Amendment to the Constitution, that right is very much in the news. Campus speech codes, disinvited commencement speakers, jailed performance artists, exiled leakers, a blogger condemned to a thousand lashes by one of our closest allies, and the massacre of French cartoonists have forced the democratic world to examine the roots of its commitment to free speech.
Is free speech merely a symbolic talisman, like a national flag or motto? Is it just one of many values that we trade off against each other? Was Pope Francis right when he said that “you cannot make fun of the faith of others”? May universities muzzle some students to protect the sensibilities of others? Did the Charlie Hebdo cartoonists “cross a line that separates free speech from toxic talk,” as the dean of a school of journalism recently opined? Or is free speech fundamental — a right which, if not absolute, should be abrogated only in carefully circumscribed cases?
The answer is that free speech is indeed fundamental. It’s important to remind ourselves why, and to have the reasons at our fingertips when that right is called into question.
Tuesday, January 27, 2015
Larry Sultan's photos capture, tenderly, the paradise of Southern California – and the tensions and desires that complicate it
Morgan Meis in The Smart Set:
When I look at photographs by Larry Sultan I smell eucalyptus and lavender. Those are the plants I can name. I also smell the gnarled bushes and brown weeds, the nameless desert flowers that grew in the nooks and crannies of Laurel Canyon, in the places that hadn’t been watered and replanted by homeowners. Running around in the behind-spaces of the Hollywood Hills as a youth, I smelled all those smells, got them on my fingers. It is strange that a photograph, something that cannot be smelled or touched or tasted, could recall those sensations. But those dirt-dusted plant smells are in every frame of Sultan’s photographic series Pictures from Home (1982-92), The Valley (1998-2003), and Homeland (2006-9).
Larry Sultan died of cancer in 2009; the Homeland series constitutes his final testimony. The smelliest of the Homeland pictures is, for me, Creek, Santa Rosa (2007). Santa Rosa is actually up in Northern California, but Sultan shot it to look more or less the same as L.A.’s Valley. The picture shows a half-dried creek behind a housing development. A youngish Latino man crouches down with a bucket. Presumably, he is gathering water. Another man is heading up the hill behind, his bucket already full. In terms of content, the picture is not unlike something you might see in a 17th-century painting. European villagers drawing water from a nearby river. Something painted, maybe, by Claude Lorrain.
But in a picture by Claude Lorrain, the villagers drawing the water are fully integrated into the landscape. With Claude, people belong to place and place belongs to people. The paintings work because everything holds together in a neo-classical pictorial oneness. It doesn’t work that way in a picture by Larry Sultan. The two men in Creek, Santa Rosa are drawing water from the creek because they don’t have access to the taps in the modern homes just behind them.
Allison Gehlhaus in Brain, Child:
It’s been an eye-opening twelve years. A time to examine some preconceived—literally—notions regarding the raising of boys and girls. Especially my own. I had been stunned and hurt by the comments I heard after the birth of our daughters. The nurses at the hospital told me that they hear a lot of women apologize to their husbands after giving birth to girls. Seriously. Right in the labor room. One nurse said, “Don’t they realize that it is the man who determines the sex of the baby?” Another quipped, “So maybe the men should apologize.”
So I shouldn’t have been surprised when visitors would say, “Maybe next time,” with a dismissive wave at our little pink bundles of joy. Or, “How soon are you going to try again?”
My brother-in-law actually said, “Three girls. That’s the pits.”
He’s lucky to be alive.
“Another girl? Is Hank mad at you?” a neighbor asked.
And when I answered, “Yeah, my husband’s furious, he’s kicking me out next week,” she didn’t even flinch.
Read the rest here.
David X. Noval in The Critical Flame:
Claude McKay, born in Jamaica in 1890, is the first modern master of the sonnet form. Yeats of course had turned out a few—one a classic—as had Pound. Cummings wrote plenty of sonnets, but, because of their idiosyncrasies, they are more complications than masterworks. Wilfred Owen, if he had lived, might have rivaled McKay, as he was disposed to the sonnet and masterful in its usage. The only other serious contender, to my mind, would be Edna St. Vincent Millay, and perhaps later, Berryman. In the Selected Poems, first published in 1953, several years after his death, Max Eastman writes (with what McKay’s biographer, Wayne Cooper, generously describes as an “unconscious condescension”):
Claude McKay was most widely known perhaps as a novelist, author of Home to Harlem, a national best-seller in 1928. But he will live in history as the first great lyric genius that his race produced.
Why then has McKay’s work languished in relative obscurity? Eastman writes in 1953 that his “place in the world’s literature is unique and is assured.” Yet in his fifties, after a decade of illness, McKay sought to end his financial difficulties by taking on employment as a riveter—something he was not physically equipped to handle, and which surely contributed to a stroke at age fifty-three. He had turned down a “sizeable” book advance, explaining, “I haven’t been able to concentrate on a plot. It’s quite impossible when one’s mind is distracted. People can’t realize the state of one’s mind under such conditions, and the few I meet make me angry by telling me how happy I look.”
Robert Stone, who died on January 10, was a member of the all-but-vanished tribe of hard-living, two-fisted, wildly ambitious American novelists who grew up in Hemingway’s slipstream. In the 1960s, when Bob was writing his first novel, Hall of Mirrors, the elders were guys like Norman Mailer, William Styron, Saul Bellow, and Nelson Algren. They punched like heavyweights. They swung for the fences. They all stalked that shaggy beast, the Great American Novel.
Or so it seemed to an impressionable younger writer like me; I felt just enough younger to have missed out on something. Perhaps I romanticized the generation of writers ahead of me, but from the safety of the academy—I was on the path of the writer-teacher—they did seem larger-than-life: more adventurous, more daring, more glamorous than the rest of us would ever be.
By the time I met Bob, he had already won the 1975 National Book Award for his second novel, Dog Soldiers, a thriller that linked the disastrous war in Vietnam with the drug culture that was epidemic on the home front. That book is richly populated with the kinds of characters who would become familiar to his readers: the druggies, the drunks, the psychopaths, the world-weary, the desperate, and the deluded, some of them so violent, cruel, or just plain loony that they could strike fear in your heart.