Thursday, April 24, 2014
Our America is a brilliant, difficult book which seeks “to show that there are other US histories than the standard Anglo narrative” by focusing on “Hispanic influence in the country’s past and future”. A United States is revealed that has its origins not in the Pilgrim Fathers or Plymouth Rock, but over a century before that, with Ponce de León and the Fountain of Youth. Felipe Fernández-Armesto’s survey covers “the whole country from 1505 to the present”, an awesome timespan that would daunt most mortals but which the author handles with his customary fluency, humour and unremitting scholarship.
Who are the Hispanics? They have moved centre stage in recent decades because of their impressive and widely feared political clout. Not only are they the fastest-growing minority in the United States, numbering over 50 million overall, representing two-thirds of the population of Miami, nearly half of Los Angeles, and over one-fifth of New York and Chicago: they also fill important positions in government and state administration throughout the country. No political party dare approach the voters without making concessions to Hispanic priorities and taking account of Spanish – the second language of the US – in their publicity.
When you cross a bridge, it is the sweeping view or the rolling horizon that holds your attention, not the structure that makes the air solid beneath your feet. These wonders of engineering that cheat difficult terrain and smooth our passage are apt to be taken for granted. There are a few showmen—Sydney’s soaring Harbour Bridge, the romantic Rialto in Venice and London’s stately Tower Bridge—which are destinations in their own right. But there are many more dogsbodies that span rivers and ravines, stoically fulfilling their purpose.
For this photo essay Garry Simpson has captured bridges that have a quiet beauty. Some caught his eye during road trips between jobs, “not hero bridges, but ones off the beaten track”. We asked him to photograph a few more in England’s industrial north and the Swiss Alps—a land of mountains, valleys and exemplary engineering.
Simpson, who grew up at the other end of England in Bournemouth, was always a “right brain” kid, constantly drawing cartoons, mostly from “The Jungle Book”. His father gave him his first camera when he was 11. “A horrendous cliché” for a photographer, he admits with a grin. But the rest of his story is not so predictable. On leaving school in the 1980s, a creative career wasn’t an option for Simpson, so he joined the Royal Marines for eight years and hardly took a snap.
The other evening, sitting with friends, literate professional writers of my own generation, discussing boundaries between history and fiction, I made a reference to the ambiguities of Thucydides’ invention of historical speechifying. Suddenly, my table-mates looked at me as though I had sprouted a not particularly attractive horn. Thucydides! Where did that come from? A great gap seemed to yawn between what used to be called the Ancients and the Moderns, with the Ancients consigned to the “classics,” which are presumed to be in decline, and the Moderns content to talk about memoir workshops. At this point, I might be expected to bemoan the bemusement of my colleagues. But that bemusement is part of a conversation, not the end of one. There is no decline, and no lack of engagement, as Mary Beard demonstrates triumphantly in her collection of essays and reviews, Confronting the Classics (Liveright, 2013).
“To put this as crisply as I can, the study of Classics is the study of what happens in the gap between antiquity and ourselves. It is not only the dialogue that we have with the culture of the classical world; it is also the dialogue that we have with those who have gone before us who were themselves in dialogue with the classical world. . . . it is we who ventriloquise, who animate what the ancients have to say.”
Geoffrey Ginsburg in Nature:
More and more people are getting their DNA sequenced. But the use of genetic data to inform medical decisions is lagging. More than a decade since the Human Genome Project was declared complete, fewer than 60 genetic variants are deemed worthy for use in clinical care, most for severe conditions in very young children. These genetic variants can guide medical decisions (see ‘Genes that doctors use’). By some estimates, women with certain variants in the BRCA genes have about an 80% chance of developing breast cancer, leading some who carry the mutation to opt for preventive mastectomies. Screening for faulty genes involved in iron transport can alert affected individuals to a need to alter their diets to avoid developing haemochromatosis, a toxic build-up of iron that damages the liver, heart and other organs. Mutations in the EGFR gene can indicate whether lung cancer will respond to expensive drugs with fewer side effects than standard chemotherapy. But five years after EGFR tests were commercialized, only around 6% of appropriate US patients were being genotyped, partly because their physicians were unaware of the tests [2]. Clinical trials have been used to assess whether genomic information yields practical benefits. A study [3] of nearly 2,000 patients with HIV showed that genetic screening for a variant called HLA-B*5701 could help to prevent toxic reactions to the AIDS drug abacavir — a fact that is now written into US treatment guidelines.
...Such 'pharmacogenomic' applications — in which genetic markers are used to fit drugs to patients — are among the most promising areas for collecting evidence during clinical care. In Thailand, about 12% of people have a genetic predisposition to Stevens–Johnson syndrome, in which certain medications trigger a blistering, life-threatening rash. The government has sponsored a programme in which any Thai citizen can be genotyped to predict reactions to problematic drugs such as carbamazepine, commonly used to control seizures. Ramathibodi Hospital in Bangkok provides a health card for patients with risky genetic variants to present to pharmacists, alerting them to provide alternative medications. Whether decreased toxicity merits the use of less-effective drugs is being evaluated.
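In code terms, the health-card scheme described above boils down to a variant-to-drug lookup. Here is a minimal Python sketch of that idea; the HLA-B*5701/abacavir pairing is taken from the excerpt, while the data structure, function name and alert wording are hypothetical illustration, not any real clinical system.

```python
# A minimal sketch of the variant-to-drug lookup implied by the Thai
# health-card programme described above. The HLA-B*5701/abacavir pairing
# comes from the excerpt; everything else (names, wording) is hypothetical.

# Risk variants mapped to the drug they flag.
RISK_VARIANTS = {
    "HLA-B*5701": "abacavir",  # toxic-reaction risk noted in the excerpt
}

def check_prescription(patient_variants, drug):
    """Return an alert if any of the patient's variants flags this drug."""
    for variant in patient_variants:
        if RISK_VARIANTS.get(variant) == drug:
            return f"ALERT: {variant} carrier -- consider an alternative to {drug}"
    return f"No known pharmacogenomic flag for {drug}"

print(check_prescription(["HLA-B*5701"], "abacavir"))  # alert
print(check_prescription([], "abacavir"))              # no flag
```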
Your full force was first raised against me
Let these spear-tipped streams
flow . . . my gullied eyes greening your fields
Let this crop of pain ripen,
........ this harvest from wounds
You and I? Let's
enjoin ourselves in friendship
........ How engaging!
At dusk where the road forks
I ran into you. Before I knew what was happening,
you raped me. Then and there, witness of this cruel intimacy,
drops of virgin blood spread on the gravel of the crossroads
like an unclaimed corpse
At each moment
be it morning or night
coming & going time & again
those stains return to me
my memory of you
From the outset
..... your every thrust
blazed as fire,
..... tore through the skin as thorns do,
pierced as a blade
..... appeared as the night of the dark moon
But these days
..... your every stroke,
a mere touch
..... and as for my self
..... the oven that contains the flame,
the bush that raises up thorns
..... the sheath that holds the blade,
fangs for the cobra's deadly poison
..... darkness of the night that swallows the moon
Like a tigress tamed in the circus,
a female snake soothed by the charmer's tune,
wound, so quickly was I transformed in you
Now you and I
..... have become nail & flesh,
miser and money.
..... footpath and footsole
Tread upon me with all your thieves & robbers
For this is certain: you'll tire, not me!
Let the variegated wishes for life germinating in me
be winnowed by your stormy gusts. Finish it! Destroy!
Wound! Maul and smother me
Lick me with your slathering flames
For I convert your force. I'm hardened to it
Where you store your weaponry of thrust and violation,
I burrow and hide, grazed from all sides by your firing guns
flameburst upon flameburst everywhere in every corner
But it is surely so, violator
..... Violation! tearing your ears, listen
Your armory will be emptied --I will not
your armory will be emptied --I will not
by Banira Giri
from Jeevan Thayamaru (Life: No Place)
publisher: Sanjha Prakashan, Kathmandu, 1978
translation: Wayne Amtzis and Banira Giri, 2000
Wednesday, April 23, 2014
Richard J. Bernstein in Public Seminar:
[A]lready in 1934, Dewey saw the parallels between what was happening in the U.S.S.R. and the growth of fascism in Italy and Germany. “As an unalterable opponent of Fascism in every form, I cannot be a Communist” (LW 9: 93).
What is distinctive and admirable about Dewey in the early 1930s is the combination of a sharp critique of the excesses of American capitalism and Soviet Communism with a passionate commitment to a vision of radical democracy. Dewey practiced what he firmly believed. This became evident when Dewey agreed to chair the Commission of Inquiry into the charges made against Leon Trotsky in the Moscow trials. Popular front liberals tended to downplay the significance of these purges, but not Dewey. Dewey was not only severely attacked for agreeing to chair the Commission; there were even threats on his life. Dewey made it clear he was defending “Trotsky’s right to a public trial, although I have no sympathy with what seems to me abstract ideological fanaticism.” So Dewey, at the age of 78, set aside his work on his Logic and made the arduous trip to Mexico City, where he chaired the hearings in Coyoacán, which consisted of thirteen sessions held between April 10 and 17. Strictly speaking, the inquiry was not a trial. The Commission sought to ascertain the veracity of the charges that had been made against Trotsky and his son in Stalin’s trumped-up Moscow trials. As Dewey stated in the opening session, the Commission “is here in Mexico neither as a court nor as a jury. … Our sole function is to ascertain the truth as far as is humanly possible” (LW 11: 306). The transcript shows just how active Dewey was in carrying out that task. Ironically, for all the criticism of the pragmatist conception of truth, Dewey before, during, and after the inquiry defended the importance of ascertaining the truth.
Nathaniel Frank in Slate:
“This is how a revolution begins,” commences Jo Becker, a Pulitzer Prize-winning New York Times reporter, in her new book, explaining that the gay marriage movement had “languished in obscurity” until 2008, when a young political consultant named Chad Griffin grew impatient and deployed his “unique ability” to leverage his Hollywood connections to “rebrand a cause.” It was a cause, argues Becker, that had to be rescued from established gay advocates who had spent 40 years doing virtually nothing worth mentioning in a major history of the marriage-equality battle. The book, excerpted in Sunday’s New York Times Magazine, focuses on Ted Olson and, to a lesser extent, David Boies, two straight lawyers recruited by Griffin and funded, initially, by Hollywood stars to challenge California’s Proposition 8, the 2008 ballot initiative that revoked gay marriage in that state. Olson and Boies were on opposite sides of the 2000 Supreme Court battle that landed George W. Bush in the White House, and their teaming up to fight for gay marriage was a brilliant coup by Griffin. Olson’s conservative bona fides and eloquence in embracing the cause of gay marriage were enormously valuable in growing support for the cause just as it was reaching a tipping point.
Yet that’s a far cry from suggesting that this small, well-heeled group was responsible for bringing the nation gay marriage, or for a major leap in public approval, something that was in the works long before these players arrived on the scene, and which was jolted forward by widespread national anger against Prop 8, not just the anger of Chad Griffin and Ted Olson.
The actual revolution that led to gay marriage began, of course, not in a spacious San Francisco hotel suite in 2008 but on the streets of New York in 1969, when LGBTQ activists got tired of perpetual abuse and chose to fight a police raid at the Stonewall Inn. This remarkable uprising, which built on earlier efforts that can be traced back to the founding of the first gay rights organization in Chicago in 1924, led to gay marriage lawsuits in the early 1970s that were laughed out of court but were followed by the victorious 1993 Hawaii ruling that launched the gay marriage revolution.
And let’s be clear how we’re using revolution. This revolution began within the LGBTQ movement, which had been split over thoughtful, principled differences about the value and role of marriage in the social structure and, specifically, for the LGBTQ population.
—after Iqbal
River mirrors the glow of dawn
Night silence mirrors night song
Rose mirrors the fame of spring
Bridal cup mirrors the virgin wine
Sun’s glory revealed in the sun
Your passionate speech mirrors my heart
Concealed from the world’s eyes
You revealed the world with your own
Nature protects her secrets so jealously
Never again will there be such knowledge
By Rafiq Kathwari
Jeff Madrick reviews Mariana Mazzucato's The Entrepreneurial State: Debunking Public vs. Private Sector Myths and William H. Janeway's Doing Capitalism in the Innovation Economy: Markets, Speculation and the State in the NYRB:
[T]he respected Northwestern economist Robert Gordon reiterated the conventional view in a talk at the New School, saying that he was “extremely skeptical of government” as a source of innovation. “This is the role of individual entrepreneurs. Government had nothing to do with Bill Gates, Steve Jobs, Zuckerberg.”
Fortunately, a new book, The Entrepreneurial State, by the Sussex University economist Mariana Mazzucato, forcefully documents just how wrong these assertions are. It is one of the most incisive economic books in years. Mazzucato’s research goes well beyond the oft-told story about how the Internet was originally developed at the US Department of Defense. For example, she shows in detail that, while Steve Jobs brilliantly imagined and designed attractive new commercial products, almost all the scientific research on which the iPod, iPhone, and iPad were based was done by government-backed scientists and engineers in Europe and America. The touch-screen technology, specifically, now so common to Apple products, was based on research done at government-funded labs in Europe and the US in the 1960s and 1970s.
Similarly, Gordon called the National Institutes of Health a useful government “backstop” to the apparently far more important work done by pharmaceutical companies. But Mazzucato cites research to show that the NIH was responsible for some 75 percent of the major original breakthroughs known as new molecular entities between 1993 and 2004.
Further, Marcia Angell, former editor of The New England Journal of Medicine, found that new molecular entities that were given priority as possibly leading to significant advances in medical treatment were often if not mostly created by government. As Angell notes in her book The Truth About the Drug Companies (2004), only three of the seven high-priority drugs in 2002 came from pharmaceutical companies: the drug Zelnorm was developed by Novartis to treat irritable bowel syndrome, Gilead Sciences created Hepsera to treat hepatitis B, and Eloxatin was created by Sanofi-Synthélabo to treat colon cancer. No one can doubt the benefits of these drugs, or the expense incurred to develop them, but this is a far cry from the common claim, such as Gordon’s, that it is the private sector that does almost all the important innovation.
Robert Solow reviews Thomas Piketty's Capital in The New Republic:
The key thing about wealth in a capitalist economy is that it reproduces itself and usually earns a positive net return. That is the next thing to be investigated. Piketty develops estimates of the “pure” rate of return (after minor adjustments) in Britain going back to 1770 and in France going back to 1820, but not for the United States. He concludes: “[T]he pure return on capital has oscillated around a central value of 4–5 percent a year, or more generally in an interval from 3–6 percent a year. There has been no pronounced long-term trend either upward or downward.... It is possible, however, that the pure return on capital has decreased slightly over the very long run.” It would be interesting to have comparable figures for the United States.
Now if you multiply the rate of return on capital by the capital-income ratio, you get the share of capital in the national income. For example, if the rate of return is 5 percent a year and the stock of capital is six years worth of national income, income from capital will be 30 percent of national income, and so income from work will be the remaining 70 percent. At last, after all this preparation, we are beginning to talk about inequality, and in two distinct senses. First, we have arrived at the functional distribution of income—the split between income from work and income from wealth. Second, it is always the case that wealth is more highly concentrated among the rich than income from labor (although recent American history looks rather odd in this respect); and this being so, the larger the share of income from wealth, the more unequal the distribution of income among persons is likely to be. It is this inequality across persons that matters most for good or ill in a society.
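Solow's worked example reduces to a one-line formula: the capital share of national income equals the rate of return times the capital-income ratio. A minimal sketch of that arithmetic, using his numbers (the function name is mine, for illustration):

```python
# Solow's arithmetic: capital share = rate of return x capital-income ratio.
# Names are illustrative; the numbers are the ones in the review.

def capital_share(rate_of_return, capital_income_ratio):
    """alpha = r * beta"""
    return rate_of_return * capital_income_ratio

r = 0.05     # pure return on capital: 5% a year
beta = 6.0   # capital stock worth six years of national income

alpha = capital_share(r, beta)
print(f"capital share: {alpha:.0%}, labour share: {1 - alpha:.0%}")
# -> capital share: 30%, labour share: 70%
```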
Heather Berg in Jacobin:
Why, under the banner of concern for “the women at the heart of the debate” (represented by a list of predictable tropes of abject sex workers), is Pollitt asking us to consider whether prostitution encourages men to feel entitled to sex without having to charm an unpaid woman in a bar? Because the women at the heart of this debate aren’t sex workers, but secondary consumers who might have to deal with male partners who are rude, socially awkward, or bad in bed.
Unpaid intimacy is a space of work too, and a Marxist feminist dialogue about how paid and unpaid sexual partners might struggle in solidarity would be wonderful. That would, however, require a radical departure from the “you’re not a worker because I don’t like what you produce” line of argument.
It’s rhetoric we’re all too familiar with. Catharine MacKinnon made the question of which women count painfully clear: “One does not have to notice that pornography models are real women to whom something real is being done … The aesthetic of pornography itself, the way it provides what those who consume it want, is itself the evidence.” Pollitt suggests that Gira Grant spends too much time taking easy shots at the “dead gray mare of 1980s anti-porn feminism.” “Was any cause ever so decisively defeated?” she writes.
But one of the more chilling aspects of that cause — the insistence that workers don’t matter, products are the point — is alive and well at The Nation.
I suggest the reverse: the nature of a product is irrelevant to how we should theorize, legislate, or organize the labor involved in producing it. Workers are not socially accountable for whatever may come from their work. To accept otherwise encourages the over-identification with work that management finds so efficient in getting us to do more for less. It allows capital to extract not only time, but also ethical responsibility from workers.
Faiza Virani in Dawn:
Ayesha Khan, a young, single female reporter in Karachi, despises the elite in Pakistan. That much is clear from the outset of Saba Imtiaz’s debut novel, Karachi, You’re Killing Me!, as the protagonist mocks her boss and editor, who was gifted the newspaper she works for by his industrialist father on his 26th birthday “following a giant tantrum.” References to Agha’s and Okra follow suit, as the narrative settles into an us-versus-them tone while readers are introduced to the novel’s characters and plot. Imtiaz presents a gritty yet humorous narrative that takes the reader through the inner workings of a national newspaper, political rallies, literature festivals and socialites at fashion week. We experience all this through a reporter’s lens as Ayesha jets from pressers to rallies in rickshaws and taxis, all the while working through her plentiful personal issues, topmost among which is finding a suitable man to date in the wasteland that is Karachi.
In describing life as usual, Imtiaz takes on the many serious issues facing journalists in the field today — safety (or the lack thereof), deficient infrastructure and support, and the alarming rate at which journalists are being recruited by political parties to report as required. The story is told from the unique vantage point of Imtiaz’s years spent as a reporter in Karachi for one of the country’s leading newspapers. Imtiaz packs Karachi’s myriad idiosyncrasies and nuances neatly into a narrative that spans everything relevant to and reflective of Karachi’s inherent fabric — from bomb blasts and terrorism reports to the CNG crisis, politicians’ tiresome and endless security details and much more, highlighting what is needed to grasp quickly all that is wrong with Karachi today.
Tony Scully in Nature:
For a condition as prevalent and dangerous as obesity (see page S50), we know surprisingly little about its causes and cures. We have much to learn about how fat tissue stores and burns lipids; there may even be new types of human fat cell yet to be discovered (S52). And although it is clear that the types of microbe living in the gut correlate with body weight, we do not know whether changes in these populations are a cause of weight gain, or a consequence (S61).
The best way to lose weight is to eat less and exercise more. But as a strategy to combat obesity at the population level, this common-sense prescription is proving ineffective over the long term. Tailored treatment programmes that factor in the stresses and temptations of the real world, using insights from behavioural research, are showing some success. Drugs may also form part of the solution (S54). Or perhaps the pharmaceutical option should be a last resort, and society should instead use the power of government regulation to encourage healthier lifestyle options (S57). Of course, obesity does not result from the environment alone — it is one of our most strongly genetically influenced traits. Scores of genes have been implicated, but the evidence suggests that something other than genes accounts for whether someone is likely to become obese (S58). Controlling appetite is not just a matter of will power; much of our dietary behaviour is hardwired. Neuroscientists are using new techniques to map the neural circuits that control when and how much we eat (S64). But these appetite systems, which evolved to ensure we have enough of the right nutrients, are now being subverted by modern food processing (S66).
Monday, April 21, 2014
by Ahmed Humayun
The causes of contemporary militancy in Muslim majority societies are many and complex, but one of the important factors is a virulent ideology that glorifies violence as a means to achieve political and religious ends. This ideology draws upon various historical inspirations—some Islamic and some Western, some local and some global—and can boast intellectuals, activists, and propagandists operating across different Muslim cultures and languages.
My concern here is not in tracing out the intellectual history of militant ideology. Nor am I seeking to precisely determine the importance that can be placed on ideology relative to other factors—a partial list of which might include a particular interpretation of Islamic doctrines about just war, the colonial legacy, the repression and failures of the authoritarian modern state, the consequences of the Shia Islamic revolution in Iran and the corresponding Sunni reaction in Saudi Arabia, Western alliances with authoritarian Muslim states or occupations of Muslim lands, and the systemic tendency of a wide range of states to utilize militant groups as proxies to advance their narrow interests. I am interested instead in exploring some of the consequences of militant ideology for Muslim societies today.
There is a tendency in the West to primarily view the activities of militant Islamist groups from the perspective of the danger that they pose to Western homelands. This is natural as a matter of pragmatic policy and national interest. Cataclysmic events like 9/11 in the United States or 7/7 in Britain have underscored the fact that the element of anti-Westernism in militant ideology is deeply ingrained. And yet it is clear that the greatest danger of militant ideology is posed to Muslims living in Muslim majority societies. This can be seen in the endless, gruesome wave of violence that has yielded enormous death tolls in recent years, mostly civilian, in countries as varied as Afghanistan and Pakistan, Iraq and Yemen, and many other places besides. (In Pakistan alone, militant violence may have claimed as many as forty to fifty thousand lives since September 11th).
As the drawdown of Western military forces from Muslim lands proceeds—Iraq having been vacated, and the withdrawal from Afghanistan imminent—the level of danger seems greater than ever. Militant groups show no signs of disbanding once Western military forces depart (though that departure may weaken the force of some militant arguments, and is therefore a positive development). Instead, militant ideas that emphasize the use of slaughter to advance political and religious change, and that target minorities and Muslims deemed beyond the pale of Islam—that is, the overwhelming majority of them—are on the rise. Militant ideology can boast many more factions today than it did on September 11th, with hundreds of them fueling sustained campaigns of terrorism and insurgency, and a general resistance to state authority, across the Middle East and South Asia.
A Ramachandran. Portrait of Rajkumari. 1998.
Oil on canvas.
by Yohan J. John
No one knows exactly how life began, but a pivotal chapter in the story was the formation of the first single-celled organism -- the common ancestor to every living thing on the planet. I like to think of the birth of life as the creation of the first boundary -- the cell membrane. That first cell membrane enclosed a drop of the primordial soup, creating a separation between inside and outside, and between life and non-life. Through this act of individuation the cell could become a controlled environment: a chemical safe zone for the sensitive molecular machinery needed to maintain integrity and facilitate replication. The game of life consists in large part of perpetuating the difference between inside and outside for as long as possible. Death, then, is the dissolution of difference. But the paradox at the heart of life is that the inside cannot survive without the outside. The cell requires raw materials -- nutrients and energy -- to sustain itself and to reproduce, and these must be sought outside the safe zone, in the wild and unpredictable outside world.
The cell membrane has a dichotomous role. It must preserve the cell’s identity as an entity that is distinct from everything outside it, but it must not be an impenetrable wall. It must be a gateway through which the cell can absorb raw material and eject waste, but it cannot allow the inside to become inundated by the outside. It meets this challenge by being selectively permeable, carefully overseeing the traffic between the inside and the outside. The cell membrane must also be flexible, because it plays a part in both locomotion and consumption. In a single-celled organism, the cell membrane is therefore a primitive sense organ, a transportation system and a digestive system, all rolled into one.
The birth of life was a moment of cleaving: when the first cell membrane enveloped its drop of primordial ooze, it cleaved the inside from the outside, but it also became the conduit through which the inside could cleave to the outside. Like Janus, the two-faced Roman god of beginnings and endings, of doors and passageways, the cell membrane is a sentry looking in two directions simultaneously. Given its role in cellular transaction, transition and transformation, the cell membrane’s function might even be described as a precursor to intelligence.
by Carol A. Westbrook
Fewer than 100,000 people in the entire world have had their genome sequenced. I am now one of them. As I wrote in 3QuarksDaily in December, I went into this with some trepidation--you never know what bad news lurks in your genome! I promised to give a report of my results, and here it is.
To get my genome sequenced, I enrolled in Illumina's "Understand Your Genome" Program. Illumina is one of the few companies licensed by the FDA to perform whole genome sequencing (WGS) for medical diagnosis--other consumer products, such as Ancestry.com, National Geographic's Geno 2.0, and 23andMe, provide only a limited analysis. I sent in a blood sample in November, and in February received a detailed analysis from Illumina's genetic counselors. In March I attended the "Understand Your Genome" conference, where I received an iPad with my WGS uploaded into the "MyGenome" app, training on the use of the app, and a fascinating daylong seminar which explored the interpretation and medical uses of genome sequences. My daughter, a medical student, attended the program with me.
Viewed on the iPad, my genome sequence consists of two similar, but not identical, parallel lines of letters, one from each chromosome. There are only four letters, A, C, G, and T, representing the four DNA nucleotides that are aligned to make the sequence. A human sequence is about 6 billion nucleotides long, with half inherited from one parent and half from the other, plus a few new mutations that arose on their own, probably fewer than 100. Thus, from a family perspective, a person's DNA sequence is 50% identical to that of each parent, child or sibling, 25% identical to grandparents and grandchildren, and so on out to distant relatives. My genome is very similar to every other person's, but it is not identical to anyone's. No one has ever had the same DNA as me, and no one ever will -- it is what makes me uniquely me.
How different am I from everyone else? My genetic analysis showed that I have 3,524,186 individual nucleotide differences from the "average" genome to which it was compared (reference genome hg19, NCBI build 37). This is about 0.05% variation, which is typical for most people. To put this in perspective, if you were to compare my DNA to that of our two most closely related primate species, bonobos and chimpanzees, the differences would be over 4%; when comparing me to Neanderthal man, however, you would find only 0.3% variation. So 0.05% is small enough to make me human, but large enough to make me a unique individual.
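For readers who want to check the percentages, here is the arithmetic under one assumption the piece leaves implicit: that the 3,524,186 variants are counted against the full six-billion-nucleotide diploid sequence. On that basis the variation comes out to roughly 0.06%, in the same ballpark as the "about 0.05%" quoted.

```python
# Fraction of the genome at which the sequence differs from the hg19
# reference, assuming the ~6-billion-nucleotide diploid total is the
# denominator (the article does not state this explicitly).
variants = 3_524_186
genome_length = 6_000_000_000  # diploid human genome, approx.

print(f"{variants / genome_length:.4%}")  # -> 0.0587%
```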
by Brooks Riley
by Tamuira Reid
Duck into the nearest bar, grab a stool, roll up your sleeves. Get down to business. Take a shot. Take another. Take a third. Drink a glass of wine, a glass of beer, a glass of vodka. Rinse. Repeat. You remember how to do this. A pro never forgets.
You should call your sponsor but you won't. You should probably feel guilty but you don't.
Drink with the ones who have nothing to lose because they've lost everything already. Or maybe they never had anything to begin with. Some people are dealt a shit hand in life. You are not one of them. You had it all and fucked it up.
It doesn't matter if you have seven hours or seven months or seven years. IT is always there, waiting. Disguised as a good time. A giant Band-Aid. The best lay of your life. Up the five flights of stairs to your studio in Harlem, or your loft in Soho, or in the family room of your green-shuttered craftsman in Stamford. Right behind you.
The anticipation is over. The "what if" becomes the "what now". You drink and drink and drink until body and mind unravel and you want nothing and feel nothing and coming undone like this is better than air. It's better than life. It's better.
Across town your family is getting ready for the party. Pink balloons hang from streamers stretched across doorways. Bowls of M & M's and potato chips are placed on a table next to the Dora the Explorer sheet cake you ordered, a massive number "5" candle jutting out of its middle.
Remember when she was born. All conehead and piercing scream. How she spread across your chest and fell asleep. How you felt your dark heart open up for a split second, then close again.
Let the man next to you buy another round. Don't stop him when he puts his hand on your thigh. Don't stop him when he leans over and breathes into your neck, face buried in your hair. Remember when your husband used to do this. Remember when he stopped.
You met him at a coffee shop on Bleecker Street five days into your sobriety. Talked about books and shitty local poets and how no one writes anything worth a damn anymore. Six months later you married. You wore a black dress and wrote your own vows and watched as your aging parents held hands and cried, relieved you'd finally found someone who could put up with your shit.
Let the man kiss you now. Hard. Let it remind you of how wild you were back then. How all of that crazy has been replaced by a certain brand of peace others mistake for weakness. But addicts are never truly peaceful. Not down in the soul where it matters.
The jukebox spits out some music and everything in you moves, shifts. The mute button on your life suddenly lifted.
Sunday, April 20, 2014
Peter Brown, Henry Roediger III and Mark McDaniel in Salon:
Here’s a study that may surprise you. A group of eight-year-olds practiced tossing beanbags into buckets in gym class. Half of the kids tossed into a bucket three feet away. The other half mixed it up by tossing into buckets two feet and four feet away. After twelve weeks of this they were all tested on tossing into a three-foot bucket. The kids who did the best by far were those who’d practiced on two- and four-foot buckets but never on three-foot buckets. Why is this? We will come back to the beanbags, but first a little insight into a widely held myth about how we learn.
Most of us believe that learning is better when you go at something with single-minded purpose: the practice-practice-practice that’s supposed to burn a skill into memory. Faith in focused, repetitive practice of one thing at a time until we’ve got it nailed is pervasive among classroom teachers, athletes, corporate trainers, and students. Researchers call this kind of practice “massed,” and our faith rests in large part on the simple fact that when we do it, we can see it making a difference. Nevertheless, despite what our eyes tell us, this faith is misplaced. If learning can be defined as picking up new knowledge or skills and being able to apply them later, then how quickly you pick something up is only part of the story. Is it still there when you need to use it out in the everyday world? While practicing is vital to learning and memory, studies have shown that practice is far more effective when it’s broken into separate periods of training that are spaced out. The rapid gains produced by massed practice are often evident, but the rapid forgetting that follows is not. Practice that’s spaced out, interleaved with other learning, and varied produces better mastery, longer retention, and more versatility.