Wednesday, August 24, 2016
Lynn Sherr in Bill Moyers Blog:
My mother was born in the United States of America without the right to vote. I just stopped to re-read that sentence because it seems so, you know, quaint. Okay, preposterous. By the time she neared voting age in 1920, the 19th Amendment to the Constitution was ratified, prohibiting federal and state governments from denying citizens the right to vote “on account of sex.” For the next seven decades, Mother didn’t miss an election. As a child, I remember watching her dress for the occasion: girdle and stockings, dress and heels, hat and gloves, because like many first-generation Americans who’d endured two World Wars, she considered voting a formal affair, a sacred privilege — and duty — defined by her citizenship. That’s my ritual, too (without the body armor), which is why this Friday, Aug. 26 — the 96th anniversary of the day American women got the vote — I’ll offer my annual thanks to the women and men who made it happen. Their exhausting slog over more than half a century was, as noted by the leader of the final surge, Carrie Chapman Catt, agonizing: “480 campaigns to urge legislatures to submit suffrage amendments to voters; 47 campaigns to induce state constitutional conventions to write woman suffrage into state constitutions; 277 campaigns to persuade state party conventions to include woman suffrage planks; 30 campaigns to urge presidential party conventions to adopt woman suffrage planks in party platforms, and 19 campaigns with 19 successive Congresses.” Not to mention countless insults, inanities and hurled rotten eggs. But the result changed the dynamic, opening the electoral process to more individuals than ever before in American history. And with Hillary Clinton now tying her historic candidacy to the legacy of our foremothers, it’s useful to recall its unique place in our often grudgingly shared democracy.
Early suffrage leaders understood their goal as a natural right of citizenship, right up there with life, liberty and the pursuit of happiness. Elizabeth Cady Stanton, who rewrote the Declaration of Independence along feminist lines for the 1848 Seneca Falls women’s rights convention, bemoaned the “degradation of disfranchisement.” But when men didn’t share on the simple grounds of equality, some women resorted to a higher calling — moral superiority — slyly predicting that female reformers would elevate and cleanse the corrupt political world; that everything from the drunken rowdiness on election day to the character of candidates would be purified. According to Susan B. Anthony, woman suffrage would “compel both political parties to nominate candidates of the highest character. A woman would no more vote for a low-down man than a good man for a degraded woman.”
Amy Maxmen in Nature:
First, there was the pitching and rolling in an old Jeep for eight hours. Next came the river crossing in a slender canoe. When Nathalie Strub Wourgaft finally reached her destination, a clinic in the heart of the Democratic Republic of the Congo, she was exhausted. But the real work, she discovered, had just begun. It was July 2010 and the clinic was soon to launch trials of a treatment for sleeping sickness, a deadly tropical disease. Yet it was woefully unprepared. Refrigerators, computers, generators and fuel would all have to be shipped in. Local health workers would have to be trained to collect data using unfamiliar instruments. And contingency plans would be needed in case armed conflict scattered study participants — a very real possibility in this war-weary region. This was a far cry from Wourgaft's former life as a top executive in the pharmaceutical industry, where the hospitals that she commissioned for trials were pristine, well-resourced and easy to reach. But Wourgaft, now medical director for the innovative Drugs for Neglected Diseases initiative (DNDi), was confident that the clinic could handle the work. She was right. With data from this site and others, the DNDi will next year seek approval for a sleeping-sickness tablet, fexinidazole. It would be a massive improvement on existing treatment options: an arduous regimen of intravenous injections, or a 65-year-old arsenic-based drug that can be deadly.
The DNDi is an unlikely success story in the expensive, challenging field of drug development. In just over a decade, the group has earned approval for six treatments, tackling sleeping sickness, malaria, Chagas' disease and a form of leishmaniasis called kala-azar. And it has put another 26 drugs into development. It has done this with US$290 million — about one-quarter of what a typical pharmaceutical company would spend to develop just one drug. The model for its success is the product development partnership (PDP), a style of non-profit organization that became popular in the early 2000s. PDPs keep costs down through collaboration — with universities, governments and the pharmaceutical industry. And because the diseases they target typically affect the world's poorest people, and so are neglected by for-profit companies, the DNDi and groups like it face little competitive pressure. They also have lower hurdles to prove that their drugs vastly improve lives. Now, policymakers are beginning to wonder whether their methods might work more broadly. “For a long time, people thought about R&D as so complicated that it could only be done by the biggest for-profit firms in the world,” says Suerie Moon, a global-health researcher at the Harvard T.H. Chan School of Public Health in Cambridge, Massachusetts, who studied PDPs and joined the DNDi's board of directors in 2011. “I think we are at a point today where we can begin to take lessons from their experience and begin to apply them to non-neglected diseases,” she says.
Middle of the Way
I wake in the night,
An old ache in the shoulder blades.
I lie amazed under the trees
That creak a little in the dark,
The giant trees of the world.
I lie on earth the way
Flames lie in the woodpile,
Or as an imprint, in sperm or egg, of what is to be.
I love the earth, and always
In its darkness I am a stranger.
Tuesday, August 23, 2016
John Freeman in Literary Hub:
For the past 30 years, Svetlana Alexievich has been writing one long book about the effect of communism and its demise on people in the former Soviet Bloc. Based on interviews, her books conjure a chorus of voices that rise and fall and arrange themselves into symphonic narratives: Here are the voices of Russians scarred by the meltdown of Chernobyl (Voices from Chernobyl), angered by the shame of the Afghan War (Zinky Boys), and now, with Secondhand Time, bewildered by the collapse of communism and the assumption that they should all be capitalists now.
Alexievich was in some ways born into this task. Both of her parents were teachers, and her father once studied journalism himself. At university, Alexievich was exposed to the work of the Belarusian writer Ales Adamovich, who believed the 20th century was so horrific it needed no elaboration.
Unlike Studs Terkel, whose oral histories of American life arrange themselves like transcribed radio interviews, Alexievich’s books are strange creations. They never ask the reader to imagine that their subjects are representative individuals. When she won the Nobel in 2015, Alexievich described them as novels—which is a fair comparison given the meticulous arrangement required to create such clear and evocative pastiche. Whatever they are, her books are as eerie and beautiful as overheard voices on a crowded train car traveling through the night.
Jonathan Weiner in the New York Times:
Reader, as you read these words, trillions of microbes and quadrillions of viruses are multiplying on your face, your hands and down there in the darkness of your gut. With every breath you take, with every move you make, you are sending bacteria into the air at the rate of about 37 million per hour — your invisible aura, your personal microbial cloud. With every gram of food you eat, you swallow about a million microbes more.
According to the latest estimates, about half of your cells are not human — enough to make you wonder what you mean by “you.” Your human cells come from a single fertilized egg with DNA from your mother and father. Microbes began mingling with those human cells even before your first breath, the first kiss from your mother, your first taste of milk. And your human cells could not have built a healthy body without intimate help from all those trillions of immigrant microbes — your other half.
“I am large, I contain multitudes,” Walt Whitman declares in “Leaves of Grass,” in his great poem “Song of Myself.” But what is “self”? According to conventional wisdom, your immune system is supposed to protect you by detecting and rejecting anything in your body that is not “self.” And yet your very immune system is partly built and even partly run by microbes. “Even when we are alone, we are never alone,” Ed Yong writes in his excellent and vivid introduction to our microbiota, or microbiome, the all-enveloping realm of our microbes. “When we eat, so do they. When we travel, they come along. When we die, they consume us.”
Zaheer Kazmi in Prospect:
From religious leaders to former extremists and western governments, a consensus has emerged since 9/11 that stresses the compatibility between Islam and the liberal values of civility, freedom and tolerance, as opposed to terrorist groups such as Islamic State (IS). Yet in many ways Islamist militancy and Islamic liberalism—though seemingly opposed—are two sides of the same reformist coin. They are both engaged in ideological projects for an Islamic revival in a time of western ascendancy. And they are equally plagued with the problems encountered by movements that rest their legitimacy on claims to a unique and timeless authenticity.
Muslim liberals tend to prescribe modern answers to postmodern questions. Their focus on reviving supposedly representative forms of religious authority shows them to be ill at ease with the ways in which Islam has become increasingly atomised in a fragmented world. Their intellectual antecedents are the 19th-century modernist movements such as the al-Nahda or cultural “awakening” in the Arab world and the Aligarh movement in British India. They cling to these modes of reform grounded in synthesising Islam with western notions of progress. Post-9/11 calls from western governments and civil society for Muslims to counter the extremism in their midst have reactivated these agendas.
Four problems in particular blight attempts at Islamic liberal reform—none of which have anything to do with duplicity or conspiracy, as Islamophobes allege.
From the TED website:
Summer, 2016: amid populist revolts, clashing resentments and fear, writer Anand Giridharadas doesn't give a talk but reads a letter. It's from those who have won in this era of change, to those who have, or feel, lost. It confesses to ignoring pain until it became anger. It chides an idealistic yet remote elite for its behind-closed-doors world-saving and airy, self-serving futurism — for at times worrying more about sending people to Mars than helping them on Earth. And it rejects the exclusionary dogmas to which we cling, calling us instead to "dare to commit to the dream of each other."
On a warm spring evening in early May, 1950, Edward Steichen, the director of the Department of Photography at the Museum of Modern Art in New York, opened an exhibition touted as a milestone event: the museum’s first exhibition devoted solely to color photography—titled, authoritatively, “Color Photography.” It featured an extravagant profusion of photographs drawn from science, journalism, and commerce: a microscope picture of an amoeba; an aerial photo taken from a rocket launched over the White Sands missile facility in New Mexico; Life’s images of exploding atomic bombs and the first published color photo of Mars; tear-sheets from the pages of Vogue, Fortune, Ladies’ Home Journal, and other popular magazines. The varied list of photographers included Roman Vishniac, Paul Outerbridge, Eliot Porter, Weegee, Irving Penn, Richard Avedon, Louise Dahl-Wolfe, Horst P. Horst, and Steichen himself.
The abundance of imagery on display served Steichen’s curatorial aim: to probe whether this technology could be a “new” and “creative” medium for the artist, and not just a “means of supplementing or elaborating the recognized attainments of black and white photography.” Chronologically, the show’s earliest works were printed reproductions of Autochromes, the colored-starch glass plates patented by the Lumière Brothers in 1904, marking that invention as the inauguration of modern color photography. But chronology didn’t guide the exhibition: the physical layout was determined by image source (e.g., the military, the magazine) and display needs of various media, particularly the many color transparencies, which had to be lit from behind in darkened rooms.
The name has been
to a stub.
For sixteen years
I have ransacked
looking for a way
to say how it was.
Because we have
no word for light
we live in shadows.
by Demetria Martínez
from Breathing Between the Lines
University of Arizona Press, 1997
Earlier this summer I was on a panel at a literary conference where I happened to say that Rudyard Kipling was a wonderful writer. Immediately, a number of people in the audience began to boo and hiss. Two of my fellow panelists nearly shrieked that Kipling was utterly beyond the pale, being at once racist, misogynist and imperialist. Not entirely surprised by this reaction, but nonetheless flabbergasted by its vehemence, I made a flustered attempt to champion the author of “Plain Tales From the Hills,” “The Jungle Books” and “Kim.” I declared what many believe, that he is the greatest short-story writer in English. This only made things worse. Finally, with some desperation I blurted out: “How much Kipling have you actually read?”
A short silence followed, and, without any answer to my question, the discussion moved on to other, less heated topics. But I felt significantly downcast. So when I got home I sat down and reread “The Jungle Books,” recently reprinted by Penguin because of a new film about Mowgli, the “Man-cub” reared by wolves. I also dipped into a number of biographies and critical works, visited the website of the Kipling Society and tried to clarify my own thoughts about, arguably, the most controversial author in English literature.
Tailleferre struggled to be considered asexual in musical terms, asking to be called simply “a composer,” not a “woman composer.” Even today her granddaughter Elvie de Rudder, still teaching music in a Paris lycée, bristles at those who refer to her as “Germaine.” Nobody calls Milhaud “Darius” or Poulenc “Francis,” she says. “So just call her ‘Tailleferre’.” Easy first-name usage can connote condescension, especially in protocol-sensitive France.
With hindsight, we can conclude that Tailleferre was cheated out of her rightful place in the legacy of Les Six. The other more prominent members – Poulenc, Milhaud and Arthur Honegger – are routinely credited as originators of a modern French School of composing. No less an authority on contemporary music than the late Joseph Machlis maintained in his book Introduction to Contemporary Music that Tailleferre and another member of the group, Louis Durey, “dropped from sight” after a brush with fame in the 1920s. Not true. They continued making an impact of their own choosing and at their own pace.
Tailleferre’s natural modesty didn’t help her career. She undervalued herself in part because of the patriarchal culture of early 20th century Europe. Playing her submissive role to the hilt, she told an interviewer she had no grand pretensions about her oeuvre. “It’s not great music, I know, but it’s gay, light hearted music which is sometimes compared with that of the ‘petits maîtres’ of the 18th century. And that makes me very proud.’’ She added, “I write music because it amuses me.” You can almost hear her tiny voice apologizing for what she has done.
Andreas Kluth in The Economist:
Many Germans have been glued to a television series, “Where We Come From”, that explains Germany’s long, complicated and often tragic history. The “we” in the title, however, is deceptive, for the host and narrator is Sir Christopher Clark, an Australian historian knighted for his services to Anglo-German relations. His academic credentials are excellent. His book on Prussia, “Iron Kingdom”, may be the best on the subject. His tome on the first world war, “The Sleepwalkers”, became a bestseller. But Germany has plenty of its own historians. Why Clark?
The answer starts with the dappled bow tie he wears as he drives around Germany in a red cabriolet VW Beetle: the quintessential Brit (Aussies are close enough) in the quintessential German vehicle. Then there’s the language. Clark speaks grammatically flawless German, but with enough of an English cadence to sound cheeky, witty and incisive. Occasionally he uses humour, which can still be shocking on German public television. Sometimes he even says nice things about the country’s past, which to Germans is truly shocking. He does not seem full of himself. To Germans that is refreshing. German Anglophiles consider such attributes “Anglo-Saxon”. The term is stretchable in this context and includes anybody English-speaking, whether Celtic or Saxon, pale or brown, from down under or beyond the pond. Clark is not an isolated case. The late Gordon Craig, a Scottish-American historian, achieved similar success. So has Timothy Garton Ash, a historian at Oxford and Stanford, who wows Germans with pithy insights delivered in sophisticated German.
George Johnson in The New York Times:
Carcinogens abounded 1.7 million years ago in Early Pleistocene times when a nameless protohuman wandered the South African countryside in what came to be known as the Cradle of Humankind. Then, as now, ultraviolet radiation poured from the sun, and radon seeped from granite in the ground. Viruses like ones circulating today scrambled DNA. And there were the body’s own carcinogens, hormones that switch on at certain times of life, accelerating the multiplication of cells and increasing the likelihood of mutations.
That, rather than some external poison, was probably the cause of a bone tumor diagnosed as an osteosarcoma found fossilized in Swartkrans Cave, a paleoanthropological trove northwest of Johannesburg. A paper in the current South African Journal of Science describes the discovery, concluding that it is the oldest known case of cancer in an early human ancestor. “The expression of malignant osteosarcoma,” the authors wrote, “indicates that whilst the upsurge in malignancy incidence is correlated with modern lifestyles, there is no reason to suspect that primary bone tumours would have been any less frequent in ancient specimens.” Perhaps the main reason there is more cancer today is that people live much longer, leaving more time for dividing cells to accumulate genetic mistakes. Osteosarcoma, however, occurs most frequently in younger people, as their limbs undergo adolescent spurts of growth. That and the fact that bones outlast softer organs make osteosarcoma a natural cancer to look for among early hominins, the zoological tribe that includes humans and their extinct kin.
Monday, August 22, 2016
by Paul North
Natural science is as valuable as its expression. What does this mean? An analogy with the economy will help. When we work, our working has to produce benefits for us beyond just the work itself. Those benefits can be called the expression of the work. Mostly they take the form of money—but not always. Natural science is similar. To be valuable it has to express itself in other forms, just as work does.
Even where natural science gives us a big new truth, like the theory of relativity in physics or the theory of evolution in biology, this truth—which may be compelling and very simple to grasp, or very complex and difficult, requiring some convincing before we believe in it—is not its expression. Truth is one thing, expression another. Natural science has all sorts of expressions. We tend to think of it as a closed club with secret goings-on in hidden chambers. Nevertheless, it periodically sends out bulletins that concern us all. This is part of what I mean by "expression." Natural science—physics, biology, chemistry, both the theoretical and the empirical kinds—sends out messages about life and history to a very large audience.
Natural science has many expressions. To be sure, its earliest and most secretive expression is the technical papers shared among scientists, which journalists misquote and the NSF decodes for the purposes of giving grants. But of course these technical documents are only as good, as useful, as praised and as prized—ultimately—as their capacity for wider expression. They should produce conferences and volumes. They should change the practices of other scientists and they should eventually change the goals of science, by a little or a lot. Such expressions remain within the non-public confines of scientific circuits however. Science has its own internal public, but intellectual sectors like natural science always aim at a greater public. This does not mean that science follows fads and fashions. Quite the contrary: it exists to remake the popular sphere in its own image, just as art does, or entertainment, or the military or other kinds of politics. So what are the greater effects of natural science? What are its widest expressions?
by Muhammad Aurangzeb Ahmad
Science fiction literature is fraught with what-ifs of history, speculating on how the world would have looked if certain events had happened differently: if the Confederates had won the American Civil War, if the Western Roman Empire had not fallen, if Islam had made inroads in the imperial household in China. At best these are speculations we can entertain to shed light on our own world, but imagine if there were a way to gauge how societies react under certain environmental constraints, social structures and stresses. Simulation is often described as the Third Paradigm in science, and the field of social simulation seeks to model social phenomena that cannot otherwise be studied because of practical and ethical constraints. Isaac Asimov envisioned a science of predicting the future with psychohistory in his Foundation series of science fiction novels.
The history of social simulation can be traced back to the idea of cellular automata developed by Stanislaw Ulam and John von Neumann: a cellular automaton is a system of cells that interact with their neighbors according to a set of rules. The most famous example is Conway’s Game of Life, a very simple simulation that generates self-organizing patterns one could not have predicted just by knowing the rules. To illustrate the concept of social simulation, consider Schelling’s model of how racial segregation happens. Consider a two-dimensional grid where each cell represents an individual. The cells are divided into two groups represented by different colors. Initially the cells are randomly seeded across the grid, representing an integrated neighborhood. Each cell, however, has a preference for what percentage of its neighbors should belong to the same group (color). The simulation is run for a large number of steps. At each step, a person (cell) checks whether the fraction of same-group neighbors meets a pre-defined threshold: if it falls short, the person moves (in the original formulation, by a single cell); if it is met, the person stays where it is. Even with such a simple setup we observe that the integrated neighborhood slowly becomes segregated, so that after some iterations the neighborhood is completely segregated. The evolution of the simulation can be observed in Figure 1. The main lesson is that even without overt racism, a mild preference about one’s neighbors can lead to a segregated neighborhood.
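The dynamics just described can be sketched in a few lines of Python. This is a minimal, hypothetical implementation of a common Schelling variant in which an unhappy agent jumps to a random empty cell (rather than shifting by a single cell as in the original formulation); the grid size, 40% threshold, and step count are illustrative choices, not values from the text.

```python
import random

def neighbor_counts(grid, size, r, c):
    """Count (same-group, occupied) cells among the up-to-8 neighbors of (r, c)."""
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            nr, nc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= nr < size and 0 <= nc < size \
                    and grid[nr][nc] is not None:
                total += 1
                same += grid[nr][nc] == grid[r][c]
    return same, total

def schelling_step(grid, size, threshold):
    """One pass: each agent below the threshold moves to a random empty cell."""
    empties = [(r, c) for r in range(size) for c in range(size)
               if grid[r][c] is None]
    for r in range(size):
        for c in range(size):
            if grid[r][c] is None:
                continue
            same, total = neighbor_counts(grid, size, r, c)
            if total and same / total < threshold and empties:
                nr, nc = empties.pop(random.randrange(len(empties)))
                grid[nr][nc], grid[r][c] = grid[r][c], None
                empties.append((r, c))

def segregation(grid, size):
    """Mean fraction of like-colored neighbors: a crude segregation index."""
    scores = []
    for r in range(size):
        for c in range(size):
            if grid[r][c] is not None:
                same, total = neighbor_counts(grid, size, r, c)
                if total:
                    scores.append(same / total)
    return sum(scores) / len(scores)

random.seed(1)
SIZE, THRESHOLD = 20, 0.4              # agents want >= 40% like-colored neighbors
cells = ["A"] * 180 + ["B"] * 180 + [None] * 40
random.shuffle(cells)                  # randomly seeded, integrated neighborhood
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

before = segregation(grid, SIZE)
for _ in range(30):
    schelling_step(grid, SIZE, THRESHOLD)
after = segregation(grid, SIZE)
print(f"mean like-neighbor fraction: {before:.2f} -> {after:.2f}")
```

Running this, the like-neighbor fraction starts near 0.5 (the random, integrated state) and climbs well above it, illustrating the lesson above: a mild individual preference, well short of wanting a same-group majority, is enough to tip the whole grid toward segregation.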
by Michael Liss
“In every country there must be a just and equal balance of powers in the government, an equal distribution of the national forces. Each section and each interest must exercise its due share of influence and control. It is always more or less difficult to preserve their just equipoise, and the larger the country, and the more varied its great interests, the more difficult does the task become, and the greater the shock and disturbance caused by an attempt to adjust it when once disturbed.” —Henry J. Raymond, Editor of the New York Times, January, 1860 (as quoted by Allan Nevins).
“We don’t win anymore. But we are going to start winning again.” —Donald J. Trump, just about any and every day, 2015-16.
Donald Trump is done with keeping quiet. It’s possible you might not have noticed the buttoned-up, reserved Trump (I’ve heard it compared to the Higgs boson), but worry not; it’s no longer relevant, and you won’t be seeing it in the future.
Trump wants to be Trump, and he’s tired of people telling him he needs to appear more substantive, more Presidential. So he shook up his campaign, demoted the controversial Paul Manafort (who subsequently resigned), elevated the pollster Kellyanne Conway to campaign manager, and made Stephen Bannon the campaign’s chief executive. Conway is an operative who previously worked for Ted Cruz and has good contacts with the conservative base. But Bannon is the real prize, and the one who raised eyebrows, and a little fear, even amongst Republicans. Bannon runs the influential and persistently inflammatory conservative outlet Breitbart News, which has recently closely coordinated with Trump’s messaging. And Breitbart takes no prisoners. Wild speculation, innuendo, and hyperbole are its stock in trade, and if you are in its sightline, expect to lose.
Trump has made a decisive choice. He will do what got him the nomination. Back to his fastball: an unscripted (but obviously deliberate) stream-of-consciousness mélange of pugnacity, promises, patriotism, law-and-order, and a firm, unkindly hand towards those who are undesirable because of their origins or political beliefs. He will occasionally throw in a kinder, gentler Donald because he’s retained slash-and-burn types to act as surrogates, but the core Trump message will remain intense and in your face. That’s who he is, a hammer in search of a nail.
This is actually a very smart move, a businessman’s move, and the freak-out from his fellow Republicans misses the mark. Trump isn’t like other politicians. He doesn’t do “pivot.”
I’m in the weeds on my knees pawing dark earth
looking for my squash among prolific opportunist grasses
and broad-leafed virtuosos at finding sustenance
in the garden of a part-time farmer—
finding advantage in his jammed schedule,
in life’s necessary distractions and precious
irrelevancies, his asamprajanya
On knees I sweat under an indifferent sun
to undo the effects of looking the other way
while rooted intruders ensconced themselves
in a life of ease throttling zucchini
under the erratic care of a life-long
junkie of mysteries, dreams and peeks behind scenes,
looking for grails among wild greens
which threaten his squash’s fundamental urge to bear fruit,
who counts angels and grasps at clouds
while many weeds take root
*Asamprajanya (Sanskrit): inattentiveness, non-alertness
by Leanne Ogasawara
The other night, I was dancing with the Dalai Lama. We were in a large auditorium that looked like a high school gym-- and in front of a packed audience sitting in the bleachers, we danced, just the two of us--cheek to cheek. I am not actually such a huge fan of his holiness-- so this all was rather unexpected.
As we were floating and twirling ballroom style out on the dance floor, he pressed me very close, and giggled-- and I started to laugh; and then still in my dream, I thought, "Wow, maybe I died and this is heaven..."
I've long wondered why it is that, right from the very start, people have preferred Dante's Inferno to his Paradiso.
Am I the only one who-- while utterly unable to imagine hell-- often finds myself lost in dreams of paradise?
It's true, I love to fantasize about paradise.
I often imagine it as a Persian garden: there is the intoxicating fragrance of roses, jasmine and gardenias. There is music and gently perfumed spring breezes. And people picnic, unendingly.
by Akim Reinhardt
During your 20s and 30s, when you don't have any children, it is inevitable that people will periodically ask you: "Do you want to have kids?"
It never mattered who asked. Family, friends, or lesser acquaintances, men or women, married or single, parents themselves or not. I always had the same answer.
Yes, just not now.
As I approached my mid-30s, I began to append a caveat: If I didn't have any children by age 40, I probably never would. I didn't want to be an old dad.
But the realization that I'd rather not be a middle-aged graybeard huffing and puffing while I try to keep up with the little rascals opened a door. Whereas I'd previously assumed I wanted kids, just not now, the age-40 expiration date I adopted forced me to question my pat answer and ask myself if I really wanted them at all.
After spending a couple of decades saying Yes, but not now, I finally realized something. There was never a "now" because I never actually wanted them. And I probably never would.
The generations that came of age after World War II made divorce mainstream.
As teens, they were still subject to intense social pressure to marry and have kids, which most of them did. But the Boomers became increasingly resentful of their parents as they matured, or in many cases, at least leery of their elders' mistakes. They and the so-called Silent Generation (Depression and War babies) asked themselves: Must I really spend half-a-century and all of my best years in a bad marriage that I jumped into when I was way too young to know better?
As the 1970s unfolded, more and more of them decided the answer was No.
Katharina Grosse. Rockaway. July 2016.
Presented by MoMA PS 1 at Gateway National Recreational Area, Fort Tilden, NY.
by Dave Maier
Consider how difficult it has been to get Big Tobacco to admit that cigarette smoking is bad for you at all, let alone that it kills many thousands of people every year. In particular, you might remember that time when all the major executives swore under oath at Congressional hearings that cigarettes are perfectly safe. Consider as well that most tobacco profits come from heavy users of tobacco, not smokers of only the occasional cigarette. So the all-important bottom line – public health be damned – can be preserved only by recruiting new heavy smokers as the older (or not so older) ones die off or quit. For Big Tobacco, this means targeting children, who are not only risk-takers by nature, but very often concerned above all to be cool. If cigarettes are risky and cool, then children will become smokers, and many (some studies say 30%) will become hooked, preserving corporate profits for another generation.
Marijuana prohibitionists hold the analogous establishment of Big Marijuana up as a nightmare scenario. If big money is involved – as of course it is – it is quite natural to worry that Big Marijuana will be just as bad as Big Tobacco: fighting warning labels, putting out deceptive and child-friendly advertising (Joe Camel = Joe Cannabis?), fighting class-action lawsuits with expensive lawyers, and so on. Prohibitionists point to the existence of yummy cannabis edibles (THC-infused gummy bears! “Pot Tarts”!) and fanciful marijuana strain names (“Girl Scout Cookies”! “Green Crack”!) as evidence that even the nascent legal cannabis industry has our defenseless children in its sights.
The most vocal proponent of this line is Kevin Sabet of the anti-legalization organization Project SAM [Smart Approaches to Marijuana]. Sabet represents a new development in prohibitionism, consciously distancing himself from old-school drug-warrior tactics in the hope of reaching a more moderate audience. In terms of actual policy recommendations, in fact, Sabet sounds quite a bit like yesterday’s marijuana reform activists. NORML's Roger Roffman, for example, whose book we looked at last time, spent most of his career pushing not for legalization, but for decriminalization, and more generally a reconstrual of marijuana policy not as a matter for law enforcement but instead as a public health issue: not arrest and incarceration, but education and treatment.
by Evert Cilliers aka Adam Ash (original visuals by David Thall)
Here are the main traits that distinguish a psychopath:
1. A lack of empathy
2. A disregard for the rights of others
3. A failure to feel remorse or guilt
4. Grandiose self-worth
5. Pathological lying
6. Glib and superficial charm
7. Cheating, conning and defrauding others for personal gain
8. A tendency to display violent behavior
Remind you of somebody running for president of America?
1. Donald Trump, in the way he responded to the charges of the Khan family, showed a stunning lack of empathy.
2. Donald Trump, in the way he talks about Mexicans and Muslims, shows a stunning disregard for the rights of others.
3. Donald Trump, in the way he jauntily smears John McCain, Mexicans, Muslims, women, and even fellow Republicans, shows no remorse or guilt.
4. Donald Trump, in his stunningly high regard for his own amazingness, has a sense of grandiose self-worth second to none. Nobody in public life has ever exhibited such an amazing degree of narcissism.
5. Donald Trump can lie and then lie about that lie in the same sentence. PolitiFact states that 72% of Trump's public remarks about factual circumstances are false.
6. Donald Trump has an amazing amount of glib and superficial charm.
7. Donald Trump, who all his life has stiffed his business suppliers by not paying them for goods and services rendered unto him, has always been a cheating, defrauding con.
8. Donald Trump has encouraged his followers to commit violence and threatened Hillary Clinton with assassination. He said of the Democratic convention that he felt like hitting many of its speakers. "There was one guy in particular, a very little guy. I was going to hit this guy so hard, his head would spin. He wouldn't know what the hell happened."
All of this is very true, but when did it become apparent that Trump was actually a psychopath?
by Brooks Riley
Strained Analogies Between Recently Released Films and Current Events: Suicide Squad and Why It's Rough to Be a Republican Right Now
by Matt McKenna
Though the target demographic for Suicide Squad can’t yet vote in the United States, it was still thoughtful of director David Ayer to create a PG-13 film that educates children as to the state of the Republican Party. By fashioning the silly misanthrope protagonists in Suicide Squad after Republican candidates, Ayer deftly describes the sad circumstance many Republican politicians and voters are experiencing this election cycle--they dislike the candidates they’re obliged to support.
For those unfamiliar with the Suicide Squad comic book franchise, the protagonists are a group of DC Comics villains who are pressed into service by the United States in order to battle other, even worse villains. These anti-heroes, who share a cinematic universe with Batman, Superman, and other superheroes in similarly boring films, are compelled to fight alongside the “good guys” because the good guys have threatened to detonate an explosive device implanted in each of the villains’ necks. It does seem a bit unfair to call the team the “Suicide Squad” given that if the protagonists don’t go along with the plan, their heads will be blown off. Alas, I suppose a more accurate title like “Hostage Squad” isn’t quite as mellifluous.
If the concept of Suicide Squad seems like a breath of fresh air compared to the noxious wind that accompanies most of the other Marvel and DC comic book movies, prepare to be disappointed. While the protagonists may not be the trite, righteous do-gooders we’ve been forced to endure over the past decade of me-too comic book cash-grabs fashioned in the form of feature films, the plot centers on the same tired tropes as its predecessors. Like the similarly incoherent Ghostbusters film from earlier this summer, Suicide Squad involves an action-figure-ready gaggle of wry underdogs charging into a skyscraper to battle a supernatural being who, before destroying the world, must first conjure a glowing beam of light and shoot it into the sky for two hours. Why does this evil spirit monster need to project a glowing energy beam through the ceiling? I don't know; maybe that detail was covered somewhere within the bountiful dialogue, but even if it was, I can't imagine myself thinking, “Ohhh, okay. Sure, that makes sense.” Anyway, what Suicide Squad lacks in an interesting plot, it certainly makes up for in its uncanny depiction of the current state of the Republican Party.
These songs of mine have to be played. They mustn’t be lost, they have to be out there....They’re Byzantine and their ‘roads’, their tunes are ancient.
To read this book, this as-told-to autobiography of Markos Vamvakaris, is to confront how strange is this thing we call writing, the child of this strange thing in which we live, called civilization. It is not that Markos, as he came to be known, is uncivilized. It is not that. Living at the time and place that he did, Greece during the early and middle twentieth century, he couldn’t avoid it, this civilization.
But he could resist it. And that he did, with wine, women, and song. Hashish too, more than the wine, and the bouzouki, along with the song and more than the women. Civilization didn’t win, neither did Markos. But I wouldn’t call it a draw either. It was a dance.
* * * * *
I knew almost nothing about rebetiko – Greek urban folk music with Asian influence – when I began reading this book, this circle dance between Markos the road warrior, Angeliki Vellou-Keil, scholar and scribe who published the material in Greek in 1972, and Noonie Minogue, who translated and edited this English edition (2015). Yet the story herein set forth, Markos Vamvakaris: The Man and the Bouzouki, that story is a familiar one: poverty, social marginalization, drugs, rubbing shoulders with criminals, womanizing, dedication to craft, and the transformation of a nation’s musical culture. Rebetiko has been likened to the blues, and the stories of major blues musicians have all those elements. It is a story of resistance, survival, and transformation.
Markos Vamvakaris was born in 1905 on the island of Syra in the Cyclades in the South Aegean Sea. That places it at one of the major crossroads of world travel and trade for three millennia, between mainland Greece to the west and Turkey to the east. Its largest city, Ermopouli, was the major Greek port in the second half of the 19th century, and a center for commerce and industry. Many different peoples have lived in and passed through Syra, as they do today in these days of destruction and despair in the Middle East. The dance of snivilization, as James Joyce called it, power and domination, freedom and music, pomp and circumcision, the bouzouki vs. bullets. Markos snubbed the law and the songs won. For a while.