Wednesday, March 29, 2017
Interviews by Ben Beaumont-Thomas in The Guardian:
I was working for Microsoft’s typography team, which had a lot of dealings with people from applications like Publisher, Creative Writer and Encarta. They wanted all kinds of fonts – a lot of them strange and childlike. One program was called Microsoft Bob, which was designed to make computers more accessible to children. I booted it up and out walked this cartoon dog, talking with a speech bubble in Times New Roman. Dogs don’t talk in Times New Roman! Conceptually, it made no sense.
So I had an idea to make a comic-style text and started looking at Watchmen and Dark Knight Returns, graphic novels where the hand lettering was like a typeface. I could have scanned it in and copied the lettering, but that was unethical. Instead, I looked at various letters and tried to mimic them on screen. There were no sketches or studies – it was just me drawing with a mouse, deleting whatever was wrong.
I didn’t have to make straight lines, I didn’t have to make things look right, and that’s what I found fun. I was breaking the typography rules. My boss Robert Norton, whose mother Mary Norton wrote The Borrowers, said the “p” and “q” should mirror each other perfectly. I said: “No, it’s supposed to be wrong!” There were a lot of problems like that at Microsoft, a lot of fights, though not physical ones.
Natalie Wolchover in Quanta:
As he was brushing his teeth on the morning of July 17, 2014, Thomas Royen, a little-known retired German statistician, suddenly lit upon the proof of a famous conjecture at the intersection of geometry, probability theory and statistics that had eluded top experts for decades.
Known as the Gaussian correlation inequality (GCI), the conjecture originated in the 1950s, was posed in its most elegant form in 1972 and has held mathematicians in its thrall ever since. “I know of people who worked on it for 40 years,” said Donald Richards, a statistician at Pennsylvania State University. “I myself worked on it for 30 years.”
Royen hadn’t given the Gaussian correlation inequality much thought before the “raw idea” for how to prove it came to him over the bathroom sink. Formerly an employee of a pharmaceutical company, he had moved in 1985 to a small technical university in Bingen, Germany, in order to have more time to improve the statistical formulas that he and other industry statisticians used to make sense of drug-trial data. In July 2014, still at work on his formulas as a 67-year-old retiree, Royen found that the GCI could be extended into a statement about the statistical distributions he had long specialized in. On the morning of the 17th, he saw how to calculate a key derivative for this extended GCI that unlocked the proof. “The evening of this day, my first draft of the proof was written,” he said.
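For readers unfamiliar with the conjecture, its elegant 1972 form fits on one line: a centered Gaussian measure assigns to the intersection of two symmetric convex sets at least the product of the probabilities it assigns to each.

```latex
% Gaussian correlation inequality (1972 form): for any centered
% Gaussian measure \mu on \mathbb{R}^n and any convex sets
% K, L \subseteq \mathbb{R}^n symmetric about the origin,
\[
  \mu(K \cap L) \;\geq\; \mu(K)\,\mu(L).
\]
```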
Bo Winegard and Ben Winegard in Quillette:
To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States. It also included two chapters that addressed well-known racial differences in IQ scores (chapters 13-14). After a few cautious and thoughtful reviews, the book was excoriated by academics and popular science writers alike. A kind of grotesque mythology grew around it. It was depicted as a tome of racial antipathy; a thinly veiled expression of its authors’ bigotry; an epic scientific fraud, full of slipshod scholarship and outright lies. As hostile reviews piled up, the real Bell Curve, a sober and judiciously argued book, was eclipsed by a fictitious alternative. This fictitious Bell Curve still inspires enmity; and its surviving co-author is still caricatured as a racist, a classist, an elitist, and a white nationalist.
Myths have consequences. At Middlebury College, a crowd of disgruntled students, inspired by the fictitious Bell Curve — it is doubtful that many had bothered to read the actual book — interrupted Charles Murray’s March 2nd speech with chants of “hey, hey, ho, ho, Charles Murray has got to go,” and “racist, sexist, anti-gay, Charles Murray go away!” After Murray and moderator Allison Stanger were moved to a “secret location” to finish their conversation, protesters began to grab at Murray, who was shielded by Stanger. Stanger suffered a concussion and neck injuries that required hospital treatment.
It is easy to dismiss this outburst as an ill-informed spasm of overzealous college students, but their ignorance of The Bell Curve and its author is widely shared among social scientists, journalists, and the intelligentsia more broadly. Even media outlets that later lamented the Middlebury debacle had published – and continue to publish – opinion pieces that promoted the fictitious Bell Curve, a pseudoscientific manifesto of bigotry.
Above Pate Valley
We finished clearing the last
Section of trail by noon,
High on the ridge-side
Two thousand feet above the creek
Reached the pass, went on
Beyond the white pine groves,
Granite shoulders, to a small
Green meadow watered by the snow,
Edged with Aspen—sun
Straight high and blazing
But the air was cool.
Ate a cold fried trout in the
Trembling shadows. I spied
A glitter, and found a flake
Black volcanic glass—obsidian—
By a flower. Hands and knees
Pushing the Bear grass, thousands
Of arrowhead leavings over a
Hundred yards. Not one good
Head, just razor flakes
On a hill snowed all but summer,
A land of fat summer deer,
They came to camp. On their
Own trails. I followed my own
Trail here. Picked up the cold-drill,
Pick, singlejack, and sack
Ten thousand years.
by Gary Snyder
from Riprap and Cold Mountain Poems
Shoemaker & Hoard Publishers.
Erin O'Donnell in Harvard Magazine:
Oncologists know that men are more prone to cancer than women; one in two men will develop some form of the disease in a lifetime, compared with one in three women. But until recently, scientists have been unable to pinpoint why. In the past, they theorized that men were more likely than women to encounter carcinogens through factors such as cigarette smoking and factory work. Yet the ratio of men with cancer to women with cancer remained largely unchanged across time, even as women began to smoke and enter the workforce in greater numbers. Pediatric cancer specialists also noted a similar “male bias to cancer” among babies and very young children with leukemia. “It’s not simply exposures over a lifetime,” explains Andrew Lane, assistant professor of medicine and a researcher at the Dana-Farber Cancer Institute. “It’s something intrinsic in the male and female system.” Now, discoveries by Lane and the Broad Institute of Harvard and MIT reveal that genetic differences between males and females may account for some of the imbalance. A physician-researcher who studies the genetics of leukemia and potential treatments, Lane says that he and others noted that men with certain types of leukemia often possess mutations on genes located on the X chromosome. These mutations damage tumor-suppressor genes, which normally halt the rampant cell division that triggers cancer.
Lane initially reasoned that females, who have two X chromosomes, would be less prone to these cancers because they have two copies of each tumor-suppressor gene. In contrast, men have an X and a Y chromosome—or just one copy of the protective genes, which could be “taken out” by mutation. But the problem with that hypothesis, Lane says, was a “fascinating phenomenon from basic undergraduate biology called X-inactivation.” In a female embryo, he explains, cells randomly inactivate one of the two X chromosomes. “When a female cell divides, it remembers which X chromosome is shut down, and it keeps it shut down for all of its progeny.” If female cells have only one X chromosome working at a time, then they should be just as likely as male cells to experience cancer-causing gene mutations. So Lane and his team dug deeper into existing studies and encountered a little-known and surprising finding: “There are about 800 genes on the X chromosome,” he says, “and for reasons that are still unclear, about 50 genes on that inactive X chromosome stay on.” In a “big Aha! moment,” Lane’s group realized that those gene mutations common in men with leukemia were located on genes that continue to function on women’s inactive chromosome. The researchers dubbed those genes EXITS for “Escape from X-Inactivation Tumor Suppressors.” Women, Lane explains, thus have some relative protection against cells becoming cancerous because they, unlike men, have two copies of these tumor-suppressor genes functioning at all times.
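The logic of the two-copy protection can be put in a purely illustrative toy model (not from the article; the per-copy mutation probability below is a made-up number): with one active copy a single hit knocks out the suppressor, while with two active copies both must be hit independently.

```python
# Illustrative toy model of X-linked tumor-suppressor loss.
# p is a made-up per-copy mutation probability; the only point is
# that requiring two independent hits is quadratically less likely.
def p_knockout(p, active_copies):
    """Probability that every active copy of the gene is mutated."""
    return p ** active_copies

p = 1e-6
male_risk = p_knockout(p, 1)    # one active copy (X plus Y)
female_risk = p_knockout(p, 2)  # two active copies for EXITS genes
```

Under these toy assumptions the female knockout probability is the square of the male one, which is the shape of the asymmetry Lane's group describes.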
Sheherzad Preisler in Tonic:
Growing human tissue is a huge challenge for researchers, even on a small scale. But some ultra-creative scientists hit on a potential solution last week when they flushed out a plant's cells and injected human cells in their place. That was how they got heart cells to beat on a spinach leaf. A major issue in tissue regeneration is creating a vascular system that ensures blood can flow to the tissue and deliver all-important oxygen and nutrients to keep the tissue alive and growing. Current techniques, including 3D printing, innovative as they are, can't yet create the blood vessels and tinier capillaries needed in a circulatory system. But guess what's abundant and already has lots of veins? Plants, that's what. Researchers from Worcester Polytechnic Institute in Massachusetts, Arkansas State University-Jonesboro, and the University of Wisconsin-Madison hope to use plants as "scaffolds" to grow human tissue. For a proof-of-concept experiment, which will be published in the May issue of Biomaterials, WPI biomedical engineering graduate student Joshua Gershlak cleared out spinach leaves' plant cells by flushing a detergent solution through the stem.
…Down the line, researchers may be able to use this technique on multiple spinach leaves to create heart tissue, which could be grafted on to the hearts of people who've had heart attacks. (Parts of survivors' hearts have died from a lack of blood flow and no longer contract properly; other researchers are looking into using stem cells to repair this tissue.) While this is all super cool and exciting, we're many years away from any salad-based heart patches. The team was able to flush the cells out of other plants including parsley, peanut hairy roots, and sweet wormwood, and they think the technique could be adapted to work with other plants that would be a good match to grow certain types of human cells. They wrote:
"The spinach leaf might be better suited for a highly-vascularized tissue, like cardiac tissue, whereas the cylindrical hollow structure of the stem of Impatiens capensis (jewelweed) might better suit an arterial graft. Conversely, the vascular columns of wood might be useful in bone engineering due to their relative strength and geometries."
This is far from the only lab looking to the plant world for body parts: One Canadian researcher is working on making ears out of apples. The phrase "you are what you eat" suddenly takes on a whole new meaning, doesn't it?
Tuesday, March 28, 2017
Muneeza Shamsie reviews Only the Longest Threads by Tasneem Zehra Husain in Newsweek Pakistan:
Her novel is framed and juxtaposed by the growing friendship between Sara Byrne, a theoretical physicist, and Leonardo Santorini, a science journalist. They are both in Geneva on July 4, 2012, among an expectant and excited crowd, to witness a historic event: proof of the Higgs boson’s existence. This elusive subatomic particle so crucial to the understanding of the universe and its building blocks is revealed onscreen in an auditorium and becomes reality when the underground Large Hadron Collider creates such a high-speed collision of protons that it releases energy and short-lived particles, akin to the Big Bang—the birth of the universe.
Sara, heady from the jubilation of the moment, encourages Leo to move beyond the immediacy of journalism to the imaginative realms of fiction. He wants to recreate those moments of intensity and joy which impelled scientists in their search for answers. Sara says, “Theoretical physics is largely a private matter, a life lived out in the mind.” Leo captures this in the six stories he creates. In each, he employs a different narrator. In each, he welds scientific ideas of the era in which the narrator lives with the language, intonations, references, and lifestyle of that time. Husain enhances her narrative by creating an email exchange between them that gives further context to Leo’s stories. He sends all six to her for comment in three installments. He then asks her to write the seventh one, on string theory.
Patricia Traxler in Agni:
Just to give some idea of what killing the NEA will (or more aptly, will not) accomplish, the $146 million budget of the National Endowment for the Arts represents just 0.012% (about one one-hundredth of one percent) of our federal discretionary spending. According to 2012 NEA figures, the annual budget for the arts per capita (in dollars) in Germany was $19.81; in England, $13.54; in Australia, $8.16; in Canada, $5.19, and in the United States just $0.47. Yes, 47 cents annually per capita. For all the arts combined. And the new POTUS feels that’s too much.
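Traxler's budget-share figure is easy to sanity-check with back-of-envelope arithmetic. In the sketch below, the roughly $1.2 trillion federal discretionary total is my own assumed round figure, not a number from the piece:

```python
# Sanity check: what share of federal discretionary spending is the
# NEA budget? The $1.2 trillion discretionary total is an assumed
# round figure; the $146 million NEA budget is from the article.
nea_budget = 146_000_000            # dollars
discretionary = 1_200_000_000_000   # dollars (assumed)
share_pct = nea_budget / discretionary * 100
# share_pct lands near 0.012, matching the quoted 0.012% figure,
# i.e. about one one-hundredth of one percent
```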
It would be impossible to enumerate all the programs that will likely die when the NEA and the NEH are killed, and the many people these cuts will deprive of things like public television programming and National Public Radio; school enrichment programs in the arts; and community programs to encourage music, dance, theater, visual art and literary art, literacy, and the pleasure of reading.
In September 2013, Marko Ahtisaari resigned from his position as the head of product design at Nokia. The Finnish company had just been acquired by Microsoft and Ahtisaari, the son of a former president of Finland, decided it was time to look for his next startup. He joined the MIT Media Lab shortly after, where he was introduced by Joi Ito, the Lab’s director, to Ketki Karanam, a biologist who was studying how music affects the brain. Ahtisaari was naturally interested: he grew up playing the violin and later studied music composition at Columbia University. “I used to be part of the New York scene,” Ahtisaari says. “I left to do product design and to be an entrepreneur. For 15 years I didn’t play much. I have friends who are now playing with Thom Yorke and the Red Hot Chili Peppers.”
Karanam showed Ahtisaari that there was an increasing body of evidence from imaging studies showing what happens to the brain when exposed to music. “It fires very broadly,” Ahtisaari says. “It’s not just the auditory cortex. What happens is essentially similar to when we take psycho-stimulants. In other words, when we take drugs.”
To Ahtisaari, this indicated that music could, at least in principle, complement or even replace the effects that pharmaceuticals had on our neurology. For instance, there were studies that showed that patients with Parkinson’s disease improved their gait when listening to a song with the right beat pattern.
Holmes’s “This Long Pursuit” is itself a complement to two earlier volumes: “Footsteps: Adventures of a Romantic Biographer” (1985) and “Sidetracks: Explorations of a Romantic Biographer” (2000). All three are, essentially, collections of essays, talks, reminiscences and reviews held together by their author’s description of himself as a “romantic biographer.” That phrase carries multiple meanings: While Holmes’s field is, roughly, England in the age of Coleridge, he sometimes writes about romantic figures of other nations and periods (poet Gérard de Nerval, novelist Robert Louis Stevenson) and he himself clearly possesses an adventurous, romantic spirit.
Though few contemporary Christians would likely admit it, many of the American colonies were built upon the idea of redistribution. Those dour Puritans who first populated the territories of New England were not lured by the promise of windfall profits. Nor had they endured months of seasickness and disease for the chance to start a small business. Instead, they were hopeless utopians, runaway apostates of the established church who yearned to embrace a higher manner of being, one founded upon a system of communitarian ethics. John Winthrop, the Puritan governor of the Massachusetts Bay Colony, sketched the tenets of this new society in a sermon called “A Model of Christian Charity,” which he delivered in 1630 while on board a British ship headed across the Atlantic. A gusty ode to American exceptionalism, the homily christened the new continent “The City Upon a Hill,” a metaphor that Ronald Reagan would make a watchword for Republicans some three-hundred-and-fifty years later. But in Winthrop’s eyes what gave the New World its luster were the egalitarian principles of the Protestant gospel, central among them the commitment to redistributing wealth on the basis of individual need. “We must be willing,” Winthrop said, “to abridge ourselves of our superfluities for the sakes of others’ necessities . . . we must bear one another’s burdens.”
It is stupefying to consider how, over the course of four centuries, American Christianity would forsake these humble sentiments for the telegenic hucksterism of preachers like Joel Osteen. This Pentecostal quack with a garish smile doesn’t tout the spiritual benefits of communal interdependence. Nor does he acknowledge the ethical requirements of the Christian social contract. Instead, like so many stewards of the “prosperity gospel,” Osteen thinks individual wealth is a hallmark of Christian virtue and urges his followers to reach inside themselves to unlock their hidden potential.
My husband and I have lived in Bulgaria for six months, lived in this country often confused for other places. “You’ll have to brush up on your French,” said a friend before I left the U.S., believing me bound for Algeria. “Enjoy the northern lights,” said another. Bulgaria is one of the forgotten nations once tucked behind the Iron Curtain, its cities now stocked with crumbling Soviet tenements and silent factories and stray dogs too hungry to bark. In the winter, in Haskovo—the city where I teach English to three hundred hardened teenagers—the air thickens to a gray haze as residents burn brush and scraps of trash to heat their homes. The smoke makes me cough, makes my eyes sting, makes my thoughts turn dark.
Today, though, we have left Haskovo. We have left winter as well. The first spring blossoms are starting to show, forsythia yellowing the countryside. As the road to the Devil’s Throat continues its manic winding route through the Rhodopes, we pass the occasional village of squat red-roofed dwellings, laundry lines strung with colorful underwear like prayer flags. Chickens bustle after bugs. Kids kick soccer balls on smears of new grass.
“21 km,” says a sign.
Even in the presence of spring, I feel nervous. I can’t help imagining the ways we might die on this mountain road, squeezed between cliffs and a squalling river. It’s a bad habit of mine: envisioning worst-case scenarios.
Siddhartha Mukherjee in The New Yorker:
Explanations run shallow and deep. You have a red blister on your finger because you touched a hot iron; you have a red blister on your finger because the burn excited an inflammatory cascade of prostaglandins and cytokines, in a regulated process that we still understand only imperfectly. Knowing why—asking why—is our conduit to every kind of explanation, and explanation, increasingly, is what powers medical advances. Hinton spoke about baseball players and physicists. Diagnosticians, artificial or human, would be the baseball players—proficient but opaque. Medical researchers would be the physicists, as removed from the clinical field as theorists are from the baseball field, but with a desire to know “why.” It’s a convenient division of responsibilities—yet might it represent a loss? “A deep-learning system doesn’t have any explanatory power,” as Hinton put it flatly. A black box cannot investigate cause. Indeed, he said, “the more powerful the deep-learning system becomes, the more opaque it can become. As more features are extracted, the diagnosis becomes increasingly accurate. Why these features were extracted out of millions of other features, however, remains an unanswerable question.” The algorithm can solve a case. It cannot build a case.
Yet in my own field, oncology, I couldn’t help noticing how often advances were made by skilled practitioners who were also curious and penetrating researchers. Indeed, for the past few decades, ambitious doctors have strived to be at once baseball players and physicists: they’ve tried to use diagnostic acumen to understand the pathophysiology of disease. Why does an asymmetrical border of a skin lesion predict a melanoma? Why do some melanomas regress spontaneously, and why do patches of white skin appear in some of these cases? As it happens, this observation, made by diagnosticians in the clinic, was eventually linked to the creation of some of the most potent immunological medicines used clinically today. (The whitening skin, it turned out, was the result of an immune reaction that was also turning against the melanoma.) The chain of discovery can begin in the clinic. If more and more clinical practice were relegated to increasingly opaque learning machines, if the daily, spontaneous intimacy between implicit and explicit forms of knowledge—knowing how, knowing that, knowing why—began to fade, is it possible that we’d get better at doing what we do but less able to reconceive what we ought to be doing, to think outside the algorithmic black box?
Levi Garraway in Nature:
I first realized I'd been bitten by the science bug in the summer of 1987. I was walking home from the laboratory, mulling over an organic chemistry reaction that I had been attempting — and mostly failing — to execute. Suddenly, a notion coalesced in my 19-year-old brain: all human biology and disease must ultimately come down to reactions that either proceed properly or go awry. As I savoured the evening breeze, I knew that I wanted to dedicate my career to understanding these mechanisms and thereby to hasten new treatments. Nearly every scientist remembers moments like these. I am saddened, therefore, by the cynical view that has become increasingly common in both academia and industry: that much biomedical science, even — or perhaps especially — that which appears in 'high-profile' journals, is bogus. I am one of many scientists who have seen their past research subjected to unexpected scrutiny as a result. An attempt to replicate work from my team was among the first described by the Reproducibility Project: Cancer Biology, an initiative that independently repeated experiments from high-impact papers. In this case, as an editorial that surveyed the first replications explained, differences between how control cells behaved in the two sets of experiments made comparisons uninformative [1]. The replicators' carefully conducted experiment showed just how tough it can be to reproduce results.
...We scientists search tenaciously for information about how nature works through reason and experimentation. Who can deny the magnitude of knowledge we have gleaned, its acceleration over time, and its expanding positive impact on society? Of course, some data and models are fragile, and our understanding remains punctuated by false premises. Holding fast to the three Rs ensures that the path — although tortuous and treacherous at times — remains well lit.
Monday, March 27, 2017
by Paul Bloomfield
The decision guaranteeing abortion rights in the United States, found in Roe v. Wade (1973), was based on a right to privacy, which the court found to be primarily protected by the Fourteenth Amendment's "concept of personal liberty and restrictions upon state action" and the Ninth Amendment's "reservation of rights to the people". While it is not discussed at any length, the First Amendment is cited in relation to the freedom of speech, most substantially as a subsidiary foundation for the right to privacy, established by Stanley v. Georgia (1969). Religion played no role in Roe v. Wade, though it has arguably played a direct role in Planned Parenthood v. Casey (1992). There, the majority's decision plainly states, "The destiny of the woman must be shaped to a large extent on her own conception of her spiritual imperatives and her place in society." One might naturally read this as an expression of "religious liberty" and an implication of the non-establishment clause of the First Amendment of the Constitution, stating that "Congress shall make no law respecting an establishment of religion".
Despite this, "religious liberty" has come to the fore most forcefully in recent years as a contrary banner under which some religiously minded people insist that the First Amendment's protection against laws "prohibiting the free exercise" of religion secures the right to refuse various services to homosexuals and to deny homosexual couples the right to marry. The free exercise clause is invoked in the Supreme Court case Burwell v. Hobby Lobby (2014), in a decision finding that corporations need not pay for employees' contraception. It is worth noting that Neil Gorsuch, the current nominee to the Supreme Court, was an author of the appellate decision that was upheld in Burwell. But as important as the "free exercise" clause is, it must be balanced against the "non-establishment" clause, which precedes it in the document as the first clause in the amendment.
Okay, poets, we get it: things are like other things
...... —A. R. & M. G.
Ah, But Math is Like That Too
When poets are so dissed
by engineers and physicists
they really should consider this:
(4+2) is just like 6
and keeping that in mind
81’s like the square of 9
and in case you think these
are a poet's tricks,
√36 is too like 6
(in this, poetry’s like
In fact, when quantities and things align
like is like an equal sign
and, what’s more,
(4×4) is 16’s metafour
by Dwight Furrow
In philosophy the most important development in the last 300 years has been the idea that what can be intelligibly said about reality is constructed out of our subjective responses, suitably constrained by social norms and intersubjective communication. This is the essence of Immanuel Kant's so-called Copernican Revolution in philosophy which converted us from naïve realists who took reality at face value to sophisticated anti-realists constructing reality via the structures of consciousness and language.
Kant's argument is sound but preposterous. One would have thought that reality's stubborn resistance to our ideas and expectations and the fact we are often surprised by this resistance might lead us to take the idea of a real world more seriously. The performative contradiction of claiming all reality is a social construction while traipsing off to the doctor when ill renders truth and knowledge the exclusive purview of scientists who have never shown much inclination toward anti-realism. But once these "naïve" realist thoughts are cast out in favor of Kant's fastidious, critical skepticism, common sense can't find a way back in. And so for 300 years we have been denying what to non-philosophers seems obvious—there is a real world out there with which our senses put us into contact.
In light of this revolution in thought we were, by now, supposed to be basking in the friendly solidarities of intersubjective agreement, a consequence that unfortunately appears to be increasingly remote. This idea that reality is a social construction ebbs and flows outside the philosophy class but in today's "post-truth" society it seems ascendant. Perhaps a new way must be found to anchor truth in something more substantial than contingent, collective agreements.
Asad Raza. Root Sequence. Mother Tongue. Whitney Biennial, 2017.
Installation: 26 young, potted trees, tools, and caretakers.
"In economics, the majority is always wrong."
~ JK Galbraith
One of the unfortunate gifts of the current, star-crossed administration is that there's something for everyone that will get their knickers in a twist. If immigration or climate change isn't your thing, just wait a few days, and some administration official will come out with a statement that lands somewhere in the space between spectacularly ignorant and merely deeply ill-considered. My latest opportunity to double-take arrived a few days ago, when Secretary of the Treasury (and Goldman Sachs alum) Steven Mnuchin opined that the threat of artificial intelligence to employment is "not even on my radar screen".
To be fair, the clip is brief enough that it is difficult to conclude whether or not Mnuchin knows what he is talking about. Too often when we talk about technology we fixate on one aspect of it, and imply (though not always intentionally) that this aspect stands in for the entirety of the technological phenomenon. These days, the favored metonymies are "AI", "robots", and "algorithms". Keeping this in mind while listening to the Mnuchin clip, it's unclear what he actually means when referring to AI, although I suspect he's talking about the holy grail of the field: artificial general intelligence, an AI that is indistinguishable from human intelligence.
If that is the case, then he did a disservice to the question, which was about the impact of AI on employment. Or, if you'll allow me to pluck out the metaphor, the impact of technology on employment, which is much more amorphous. Mnuchin's dodge was to say that, since we won't have human-equivalent AI for the foreseeable future, it's something that's not worth thinking about, at least until it happens. Come to think of it, I've heard this dodge before, mostly from the mouths of climate change skeptics and deniers. In both cases, the purpose is to obfuscate and delay until the truly catastrophic comes to pass, then innocently maintain that "no one could have seen this coming" or some such nonsense.
by Tamuira Reid
The day Luna went mad her mother thought, finally. The signs had been there, hanging around at the dinner table, in the bathroom where she ironed her hair.
It had waited patiently in the corner of a room, under a chair, in the oven with the bread. Now they wouldn't need to wonder when it would all fall apart because it just had.
The day Luna went mad she was wearing pink lipstick. Her legs were waxed and smoothed down with cocoa butter because she was religious about that kind of thing. Never know who you're gonna see, she'd say, sliding a gold hoop through each ear.
It happened slowly and over a period of time. Shop closed. Her mind just closed-up on her. Went out of business.
Luna sang to the plants as she watered them. Would be normal except she thought she heard them sing back. Her mother turned up the radio and hung wet nylons from the fire escape.
It's hard to talk about it, when it's your daughter.
The emptiness in her eyes scared her mother. The empty blackness of her eyes. They held nothing but crazy and she knew that. And somewhere deep inside, her daughter knew what was happening too but she couldn't stop it.
The police said they had found her in the fetal position, on a sidewalk in Times Square. She was licking her arms like a cat. Her clothes sat next to her in a pile, perfectly folded. She wanted to go home if that was okay.
Your hair is perfect. Sit.
Her mother sat her down at the table and did what she did best. Fed her. A hot plate of arroz con pollo, a Malta, tostones with the heat still rising off them.
Something is happening to me, she said and stared out the window. A plastic bag floated by, white and ripped on one side.
by Brooks Riley
by Daniel Ranard
In the twentieth century, two important ideas arose with a nominal similarity: Einstein's theory of relativity on the one hand, and the idea of cultural or moral relativism on the other. It's probably fruitless to draw parallels between concepts that arose at distant ends of the intellectual spectrum—the hard sciences versus the humanistic disciplines—but sometimes you can't help yourself: "relativity" is right there in the name. In 1905, Einstein declared that certain facts about space and time are only true relative to a particular person or reference frame. In subsequent decades, philosophical "relativists" argued that questions of what is moral, what is true, or even what exists can only be answered relative to individuals or groups. Of course, Einstein's theory proved to be right, while the philosophical strand of relativism has evolved into a variety of contentious ideas.
First I will focus on Einstein's relativity, before touching on relativism in philosophy. To me, the story begins with two opposing accounts of what physics is. According to one account, physics provides an objective description of the world itself, like an encyclopedia entry on "The Universe." The encyclopedia tells you what stuff the world is made of and how that stuff behaves. This approach might be called the realist approach: physics presents objective facts about the real world.
Others prefer an "operationalist" account that focuses on the individual. By this account, physics is simply a collection of rules telling the individual what to expect in various circumstances. It's like a personal guidebook for experience: it predicts what you will observe when you follow various experimental procedures, like following a recipe in a laboratory. Unlike the realist's encyclopedia entry, the operationalist's guidebook does not attempt to describe the real world objectively. Instead, it prescribes how your experience should lead you to predict future experiences, using your own observations. Operationalists avoid referring to fundamental aspects of nature, like mass or length. To the operationalist, the length of an object is not some fundamental property – it's just a number you observe when you measure the object with a ruler, and any notion of length must be accompanied by a well-specified procedure for how to measure it.
by Scott F. Aikin and Robert B. Talisse
We all have had moments when we feel that those with whom we disagree not only reject the point we are focused on at the moment, but also reject our values, general beliefs, modes of reasoning, and even our hopes. In such circumstances, productive critical conversation seems impossible. For the most part, in order to be successful, argument must proceed against the background of common ground. Interlocutors must agree on some basic facts about the world, or they must share some source of reasons to which they can appeal, or they must value roughly the same sort of outcome. And so, if two parties disagree about who finished runners-up to Leicester City in their historic BPL win last year, they may agree to consult the league website, and that will resolve the issue. Or if two travelers disagree about which route home is better, one may say, "Yes, your way is shorter, but it runs through the traffic bottleneck at the mall, and that adds at least ten minutes to the journey." And that may resolve the dispute, depending perhaps on whether time is what matters most.
But some disagreements invoke deeper disputes, disputes about what sources are authoritative, what counts as evidence, and what matters. Such disputes quickly become argumentatively strange. And so if someone does not recognize the authority of the soccer league's website about last year's standings, it is unclear how a dispute over last year's runners-up to Leicester City could be resolved. What might one say to a disputant of this kind? Does he trust news sites, television reporting, or Wikipedia entries concerning the BPL? Does he regard the news sites and the league website as reliable sources of information concerning this year's standings or when the games are played? What if our interlocutor in the route-home case doesn't see why the quickest route is preferable to the shortest? Maybe our traveling companion regards our hurry-scurry as a part of a larger social problem, or maybe he wants to enjoy the Zen of a traffic jam. Sometimes a disagreement about one thing lies at the tip of a very large iceberg composed of many other, deeper, disagreements.