Monday, October 27, 2014
by Mathangi Krishnamurthy
Long years ago, when I waddled around in pigtails, I said aloud the magic words that for many years characterized how I felt about the world, my world. "I will settle in America", I said. Neither did I know how heavy "settling" could be, nor was I clued into the power of words. Carelessly, toddler-ly, I threw around that which would one day make my world. We didn't say politically correct things then. As far as we all knew, all of the Americas was North America, and all of North America was the US. My father had just returned from travels to the US, and he had brought back suitcases spilling over with things guaranteed to charm curmudgeonly three-year-olds.
America was then not only an idea but an escape. I was charmed into thinking that going to America indicated not only the newness of a world, but a not-ness of the one I inhabited. No school, no dreary days, no strange scapes of a scary adult world with its inexplicable sorrows and forbidding rules. America was fabulous, with its flowery denims, and video games, and automatic erasers. I was mesmerized by View-Masters, with their otherworldly scuffed gaze onto so-near foreign shores.
These were the eighties. India was a sovereign, socialist, secular, democratic republic with one, and later two, television channels. We all read the national pledge aloud in school, that went something to the effect of "India is my country and all Indians are my brothers and sisters". We all suffered one heckler in every class who would mutter sotto voce "Well who do I marry then?" We received our news from singular sources and imagined our leaders sovereign, if ineffectual. We trusted secularism, even if in its often troubled avatar, tolerance. We muddled through power cuts, and ration cards, and held onto a quiet, steely middle-classness. Benedict Anderson would have pronounced us a truly well-imagined nation; or at least, some of us.
In this world, America's otherness beckoned ever so strongly with its free love (read sex), and rampant spending; with its alter-egoness of individualism and seeming control over the world. But India was allied with the USSR. The mythical Russia communicated to us held only Mathematics books, fairy tales, and War and Peace in stock. I hated math, much preferred the Brothers Grimm, and to date, am at odds with the melancholies of Tolstoy.
Sunday, October 26, 2014
A potent theory has emerged explaining a mysterious statistical law that arises throughout physics and mathematics
Natalie Wolchover in Quanta:
Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.
In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.
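May's construction is easy to reproduce numerically. The sketch below is a minimal illustration under assumed parameters, not May's exact model: it builds a random "community matrix" (off-diagonal entries are chance interactions of a given strength; the diagonal is set to -1 so each species regulates itself) and watches the rightmost eigenvalue cross zero as the interaction strength grows. May's criterion says the system is stable roughly when sigma * sqrt(N * C) < 1, for N species, connectance C, and interaction strength sigma.

```python
# A rough sketch of May's 1972 stability argument: the rightmost eigenvalue
# of a random community matrix crosses zero as interactions strengthen.
import numpy as np

rng = np.random.default_rng(0)

def rightmost_eigenvalue(n, sigma, connectance):
    """Largest real part among eigenvalues of a random community matrix."""
    # Off-diagonal entries: species i and j interact with probability
    # `connectance`, with strength drawn from N(0, sigma^2).
    a = rng.normal(0.0, sigma, size=(n, n))
    mask = rng.random((n, n)) < connectance
    m = np.where(mask, a, 0.0)
    # Diagonal of -1: each species damps its own fluctuations.
    np.fill_diagonal(m, -1.0)
    return np.linalg.eigvals(m).real.max()

n, connectance = 250, 0.5
for sigma in (0.05, 0.09, 0.13):
    lam = rightmost_eigenvalue(n, sigma, connectance)
    verdict = "stable" if lam < 0 else "unstable"
    # May's threshold: sigma * sqrt(n * connectance) = 1
    print(f"sigma={sigma:.2f}  sigma*sqrt(NC)={sigma * np.sqrt(n * connectance):.2f}  {verdict}")
```

Below the threshold (sigma*sqrt(NC) well under 1) the rightmost eigenvalue sits safely negative; well above it, positive. Exactly at the critical point, the fluctuations of that eigenvalue are what the Tracy-Widom distribution describes.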
Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.
The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.
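The shuffled-integer setting is concrete enough to play with: Baik, Deift and Johansson studied the length of the longest increasing subsequence of a random permutation. A hedged sketch using the standard patience-sorting algorithm (parameters chosen here for illustration): the mean length is close to 2*sqrt(n), and the fluctuations around that mean are what follow the Tracy-Widom distribution.

```python
# Longest increasing subsequence of random permutations -- the setting of
# the Baik-Deift-Johansson theorem. Mean length grows like 2*sqrt(n);
# fluctuations around it follow the Tracy-Widom distribution.
import bisect
import random

def lis_length(seq):
    """Length of the longest increasing subsequence, via patience sorting (O(n log n))."""
    # tails[k] = smallest possible tail of an increasing subsequence of length k+1
    tails = []
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

random.seed(42)
n, trials = 10_000, 50
lengths = [lis_length(random.sample(range(n), n)) for _ in range(trials)]
mean = sum(lengths) / trials

print(f"mean LIS length over {trials} permutations of size {n}: {mean:.1f}")
print(f"2*sqrt(n) = {2 * n ** 0.5:.1f}")
```

The observed mean falls a little short of 2*sqrt(n); that deficit, rescaled, is the Tracy-Widom fluctuation.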
“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?”
Pankaj Mishra in the New York Times:
India, V.S. Naipaul declared in 1976, is “a wounded civilization,” whose obvious political and economic dysfunction conceals a deeper intellectual crisis. As evidence, he pointed out some strange symptoms he noticed among upper-caste middle-class Hindus since his first visit to his ancestral country in 1962. These well-born Indians betrayed a craze for “phoren” consumer goods and approval from the West, as well as a self-important paranoia about the “foreign hand.” “Without the foreign chit,” Mr. Naipaul concluded, “Indians can have no confirmation of their own reality.”
Mr. Naipaul was also appalled by the prickly vanity of many Hindus who asserted that their holy scriptures already contained the discoveries and inventions of Western science, and that an India revitalized by its ancient wisdom would soon vanquish the decadent West. He was particularly wary of the “apocalyptic Hindu terms” of such 19th-century religious revivalists as Swami Vivekananda, whose exhortation to nation-build through the ethic of the kshatriya (the warrior caste) has made him the central icon of India’s new Hindu nationalist rulers.
Despite his overgeneralizations, Mr. Naipaul’s mapping of the upper-caste nationalist’s id did create a useful meme of intellectual insecurity, confusion and aggressiveness. And this meme is increasingly recognizable again. Today a new generation of Indian nationalists lurches between victimhood and chauvinism, and with ominous implications.
Julian Assange in Newsweek:
Eric Schmidt is an influential figure, even among the parade of powerful characters with whom I have had to cross paths since I founded WikiLeaks. In mid-May 2011 I was under house arrest in rural Norfolk, England, about three hours’ drive northeast of London. The crackdown against our work was in full swing and every wasted moment seemed like an eternity. It was hard to get my attention.
But when my colleague Joseph Farrell told me the executive chairman of Google wanted to make an appointment with me, I was listening.
In some ways the higher echelons of Google seemed more distant and obscure to me than the halls of Washington. We had been locking horns with senior U.S. officials for years by that point. The mystique had worn off. But the power centers growing up in Silicon Valley were still opaque and I was suddenly conscious of an opportunity to understand and influence what was becoming the most influential company on earth. Schmidt had taken over as CEO of Google in 2001 and built it into an empire.
I was intrigued that the mountain would come to Muhammad. But it was not until well after Schmidt and his companions had been and gone that I came to understand who had really visited me.
Evgeny Morozov in The New Yorker:
In June, 1972, Ángel Parra, Chile’s leading folksinger, wrote a song titled “Litany for a Computer and a Baby About to Be Born.” Computers are like children, he sang, and Chilean bureaucrats must not abandon them. The song was prompted by a visit to Santiago from a British consultant who, with his ample beard and burly physique, reminded Parra of Santa Claus—a Santa bearing a “hidden gift, cybernetics.”
The consultant, Stafford Beer, had been brought in by Chile’s top planners to help guide the country down what Salvador Allende, its democratically elected Marxist leader, was calling “the Chilean road to socialism.” Beer was a leading theorist of cybernetics—a discipline born of midcentury efforts to understand the role of communication in controlling social, biological, and technical systems. Chile’s government had a lot to control: Allende, who took office in November of 1970, had swiftly nationalized the country’s key industries, and he promised “worker participation” in the planning process. Beer’s mission was to deliver a hypermodern information system that would make this possible, and so bring socialism into the computer age. The system he devised had a gleaming, sci-fi name: Project Cybersyn.
Beer was an unlikely savior for socialism. He had served as an executive with United Steel and worked as a development director for the International Publishing Corporation (then one of the largest media companies in the world), and he ran a lucrative consulting practice. He had a lavish life style, complete with a Rolls-Royce and a grand house in Surrey, which was fitted out with a remote-controlled waterfall in the dining room and a glass mosaic with a pattern based on the Fibonacci series. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “CHILE RUN BY COMPUTER,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.
At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country. The prototype op room was built in downtown Santiago, in the interior courtyard of a building occupied by the national telecom company. It was a hexagonal space, thirty-three feet in diameter, accommodating seven white fibreglass swivel chairs with orange cushions and, on the walls, futuristic screens. Tables and paper were banned. Beer was building the future, and it had to look like the future.
That was a challenge: the Chilean government was running low on cash and supplies; the United States, dismayed by Allende’s nationalization campaign, was doing its best to cut Chile off. And so a certain amount of improvisation was necessary.
Chelsea Wald in Slate:
The six horses in a 2002 study were “known weavers.” When stabled alone, they swayed their heads, necks, forequarters, and sometimes their whole bodies from side to side. The behavior is thought to stem from the social frustration brought on by isolation. It can be seen in a small percentage of all stabled horses, and owners hate it—they think it causes fatigue, weight loss, and uneven muscle development, and it looks disturbing. People had tried stopping the weaving by installing metal bars that limit a horse’s movement, but the study found that a different modification to the stable worked surprisingly well: a mirror. “Those horses with the mirror were rarely [observed] weaving,” the researchers reported. A later study even found that the mirror worked just as well as the presence of another horse.
Studies have shown that mirrors can improve the lives of a variety of laboratory, zoo, farm, and companion animals. Isolated cows and sheep have lower stress reactions when mirrors are around. With mirrors, monkeys alone or in groups show a healthy increase in social behaviors such as threats, grimaces, lip-smacking, and teeth chattering, and laboratory rabbits housed alone are also more active. Mirrors in birdcages reduce some birds’ fear. Gordon Gallup invented the test that shows whether an animal recognizes itself in the mirror: He marked primates’ faces and ears with dye and watched whether they used a mirror to investigate the spots. If they did, it revealed that the animals understood that the faces in the mirror were their own. But he thinks that most animals probably think of their reflections as another animal. The calming effect in some cases could come partly from the reflection’s apparent mimicking. “The animal confronting its own reflection in a mirror has complete control over the behavior of the image, and therefore the image is always attentive and ready to reciprocate when the animal is,” he and Stuart Capper wrote in 1970. In other words, the mirror image is sort of like a friend who always does exactly what you want.
Jared Diamond: ‘150,000 years ago, humans wouldn’t figure on a list of the five most interesting species on Earth’
Oliver Burkeman in The Guardian:
Most people would be overjoyed to receive one of the MacArthur Foundation’s annual “genius grants” – around half a million dollars, no strings attached – but when Jared Diamond won his, in 1985, it plunged him into a depression. At 47, he was an accomplished scholar, but in two almost comically obscure niches: the movement of sodium in the gallbladder and the birdlife of New Guinea. “What the MacArthur call said to me was, ‘Jared, people think highly of you, and they expect important things of you, and look what you’ve actually done with your career’,” Diamond says today. It was a painful thought for someone who recalled being told, by an admiring teacher at his Massachusetts school, that one day he would “unify the sciences and humanities”. Clearly, he needed a larger canvas. Even so, few could have predicted how large a canvas he would choose.
In the decades since, Diamond has enjoyed huge success with several “big books” – most famously, 1997’s Guns, Germs and Steel – which ask the most sweeping questions it is possible to ask about human history. For instance: why did one species of primate, unremarkable until 70,000 years ago, come to develop language, art, music, nation states and space travel? Why do some civilisations prosper, while others collapse? Why did westerners conquer the Americas, Africa and Australia, instead of the other way round? Diamond, who describes himself as a biogeographer, answers them in translucent prose that has the effect of making the world seem to click into place, each fact assuming its place in an elegant arc of pan-historical reasoning. Our interview itself provides an example: one white man arriving to interview another, in English, on the imposing main campus of the University of California, Los Angeles, in a landscape bearing little trace of the Native Americans who once thrived here. Why? Because 8,000 years ago – to borrow from Guns, Germs and Steel – the geography of Europe and the Middle East made it easier to farm crops and animals there than elsewhere.
Hazards of Hindsight
For a moment
prudence and reconsideration
Hindsight dry-cleans your speech
Forget caution and correction
don’t render me speechless with your reason –
all I want from you is a quick artless response
that knocks judgement off into history’s oblivion
only then I'll get a pure no, a simple yes from you
not the elusive past, I wasn’t a part of
To make any sense of history
I need an artless response
In its freshness
I can see better
the peanuts enclosed in the sturdy shell
the fresh oil in its ripened seeds.
by Monika Kumar
from Samalochan, 2012
translation by author
Saturday, October 25, 2014
To what extent is humanity the price of immortality, as you could put it? Must the revolutionary artist ignore — even flout — the basic laws of decency that govern our world in order to transform that world? “Perfection of the life, or of the work,” as Yeats had it. “And if it take the second,” he went on, the intellect of man “must refuse a heavenly mansion, raging in the dark.”
It was an ancient question even then, but somehow every other book I’ve been reading of late comes back to it. Walter Isaacson’s unbiddable 2011 biography of Steve Jobs presents his subject as a kind of Lee Kuan Yew of the tech industry, demanding we give up our ideas of democracy and control in exchange for a gorgeously designed new operating system. Innovation doesn’t have to be so dictatorial: Albert Einstein, the subject of Isaacson’s previous biography, is revered in part for his readiness to defer to what he didn’t understand. Yet the more we read about Jobs publicly humiliating colleagues and refusing to acknowledge responsibility for the birth of his first child, the more we see that his genius could seem inextricable from his indifference to social norms.
Assaf Gavron's 2010 novel "Almost Dead" does something I would have thought impossible — it makes satire out of terrorism. The story of a man who becomes an Israeli national hero after surviving three attacks in a single week, the book offers a sharply ironic look at the intersection of image and reality.
This character is no role model; he's a guy in the wrong place at the right time. Gavron, who was born near Jerusalem and lives in Tel Aviv, is suggesting that we are all of us (citizens, nations, even, to some extent, terrorists) making it up as we go along.
A similar sensibility centers "The Hilltop," Gavron's seventh book, although only the second (after "Almost Dead") to appear in the United States. A sprawling novel that revolves around a small settlement in the occupied territories, its focus is less satirical than absurdist, offering a middle vision between the ridiculous and the sublime.
Perhaps of greater importance to Thomas’s poetry, however, was the wider cultural landscape of 1930s Wales, and Thomas’s geographical and familial location within it. Thomas’s parents were the personification of the intellectual and industrial movement from rural to urban that characterised Wales in the early 20th century. Both originated from Welsh-speaking families of agricultural and religious occupation. The young Thomas, a listener “in love with words”, found himself at the centre of a linguistic and cultural maelstrom. The languages of both Welsh and English informed his ear, just as both the streets of Swansea and the fertile fields of the Llanstephan peninsula informed his eye.
In his now famous notebooks, Thomas’s search for a poetic voice can be traced as if following his route on a map. In these notebooks, he passes through a period of derivative free verse before evolving his poems into the grander, more visceral and patterned work the world met just a few years later when he published 18 Poems. Although the book itself was modest, even retiring – no jacket copy and no author portrait, both on Thomas’s request – inside, the poems themselves were the polar opposite. Bold, physical and sonorous, they have been described by some critics as “biomorphic”. Thomas once wrote: “Every idea, intuitive or intellectual, can be imaged and translated in terms of the body, its flesh, skin, blood, sinews, veins, glands, organs, cells and senses.”
Priscilla Gilman in The New York Times:
On Dec. 28, 1817, Benjamin Robert Haydon, then England’s pre-eminent history painter, hosted a dinner party to celebrate his progress on his latest work, “Christ’s Entry Into Jerusalem.” He invited, among others, three men anachronistically pictured in that painting: John Keats, William Wordsworth and the essayist Charles Lamb. In “The Immortal Evening” (the phrase is from Haydon’s letters and diaries), the poet and biographer Stanley Plumly offers an idiosyncratic, heartfelt, at once sinuous and expansive exploration of the dinner, its “aesthetic context and the larger worlds of the individual guests, particularly the three ‘immortal’ writers, Keats, Wordsworth and Lamb.”
Plumly begins the story strikingly, elliptically, in the present tense: “Keats has the most ground to cover.” What the walk to the dinner was like for each of its major participants, the look and feel of Regency London, what kind of food they would have eaten, all come to vivid life in Plumly’s evocative rendering. But if it presents a historical place and moment with immediacy and “present-tense personal intensity” (as Plumly says of Romantic art), “The Immortal Evening” also tackles timeless questions: “How does a living moment in time become ‘immortal’? What are a painting’s terms of immortality?” (Or, for that matter, a poem’s?) Why are some artists remembered and some forgotten?
Neal Hartman in Nautilus:
To those who say that there is no room for genius in modern science because everything has been discovered, Fabiola Gianotti has a sharp reply. “No, not at all,” says the former spokesperson of the ATLAS Experiment, the largest particle detector at the Large Hadron Collider at CERN. “Until the fourth of July, 2012 we had no proof that nature allows for elementary scalar fields. So there is a lot of space for genius.”
She is referring to the discovery of the Higgs boson two years ago—potentially one of the most important advances in physics in the past half century. It is a manifestation of the eponymous field that permeates all of space, and completes the standard model of physics: a sort of baseline description for the existence and behavior of essentially everything there is.
By any standards, it is an epochal, genius achievement.
What is less clear is who, exactly, the genius is. An obvious candidate is Peter Higgs, who postulated the Higgs boson, as a consequence of the Brout-Englert-Higgs mechanism, in 1964. He was awarded the Nobel Prize in 2013 along with Francois Englert (Englert and his deceased colleague Robert Brout arrived at the same result independently). But does this mean that Higgs was a genius? Peter Jenni, one of the founders and the first “spokesperson” of the ATLAS Experiment Collaboration (one of the two experiments at CERN that discovered the Higgs particle), hesitates when I ask him the question.
Farahnaz Ispahani and Nina Shea in The Weekly Standard:
Pakistan’s blasphemy law, which turns 30 this year, has become only more deadly with age. Since blasphemy was made a capital crime under the nation’s secular penal code, the effect has been to suppress moderate influences, pushing “Pakistani society further out on the slippery slope of extremism,” said Mujeeb-ur-Rahman, senior advocate at the Supreme Court of Pakistan, in Washington last week. With its large population and sensitive location, Pakistan is a place where any societal shift in the direction of the Taliban deserves the attention of all concerned about Islamic extremism. Instead, this is one more foreign threat that the Obama administration underestimates.
On October 16, for the first time, an appeals court affirmed a death sentence for blasphemy meted out to a woman. A Christian mother of five, Asia Bibi was arrested in 2009 after fellow field hands complained that, during a dispute, she had insulted the prophet of Islam. No evidence was produced, because to repeat blasphemy is blasphemous. Similarly, anyone who defends an accused blasphemer risks being labeled a blasphemer; two officials who made appeals on Bibi’s behalf—Salman Taseer, governor of Punjab, and Shahbaz Bhatti, federal minister for minorities affairs—were assassinated in 2011. Bibi has one last legal recourse, an appeal to the federal Supreme Court, but now no public official dares speak up for her—or for any other blasphemy defendant.
Accusations of blasphemy are brought disproportionately against Pakistan’s Christians, some 2 percent of the population. Intent is not an element of the crime, and recent years have seen cases brought against illiterate, mentally disabled, and teenage Christians. Each case seems to heighten the sensitivities of the extremists and further fracture society. The flimsiest rumor of a Koran burning can spark hysteria ending in riots against entire Christian communities. Lahore’s St. Joseph Colony was torched last year in such a pogrom.
Moustafa Bayoumi, Kayla Epstein, Alan Yuhas, and Eli Valley in The Guardian:
As this panel’s Orthodox Jewish participant, I’m aware that I’ve been asked to participate for a very specific purpose: to bring the bearing of my religious and cultural upbringing to the question of whether The Death of Klinghoffer is antisemitic.
So let’s just get that out of the way: the answer is no.
I can understand why some would jump to that conclusion, especially if they haven’t seen the opera. Klinghoffer forces Jewish audiences to confront some uncomfortable aspects of Israel’s history, and to relive a tragic chapter in the history of the Israeli-Palestinian conflict. There are some antisemitic lines delivered by one of the hijackers: “America is one big Jew,” he sneers at his cowering captives. But his brutal actions and the shrill, frenzied music that accompanies his words so clearly prove him a villain that it’s ridiculous to say composer John Adams and his librettist Alice Goodman are promoting that view.
Though two opposing sides are given the opportunity, over three hours, to present their narratives, it’s crucial to remember who gets the last word. Klinghoffer ends with a beautiful, heartbreaking aria by his widow Marilyn, who has just learned of the death of her husband. “I wanted to die,” she cries out. The finale lays bare the suffering and anguish that terrorism and antisemitism have wrought.
So why the outrage? Because there’s a lot of ignorance out there drowning out the facts about this opera.
Sitting up with a yawn,
Rolling up the tattered mat,
Tucking up the torn mundu,
Walking along the hedges.
Not for a lark.
The muddy fields grimace,
The cows wag their tails.
Where is that long night –
The one they sang their fervent hymns about,
The one they said spring thunder
Would light up with brilliant flashes
Before the great new dawn arrived?
Hate, anger –
On racing pulses.
They stood leaning against the good old walls,
The graying firebrands.
Out of the dry, cracked, poetry-less soil they had sprung.
Drained by the waters of compassion
They had grown dreams on their bodies.
They now watch
As texts are served on a platter.
by Raghavan Atholi
from Poetry International Web
Over at the Columbia University Press Blog, an interview with Herve This on his new book:
Question: How does note-by-note cooking differ from molecular gastronomy?
Herve This: Molecular gastronomy is a scientific activity, not to be confused with molecular cooking. Indeed, molecular gastronomy, being science, has nothing to do with cooking. In other words, science is not about making dishes. Science looks for the mechanism of phenomena. That’s all. And technology uses the results of science to improve technique. So, note-by-note cooking is a technique.
Another question could be, how is note-by-note cooking different from molecular cooking? And here the answer would be that the definition of molecular cooking is “to cook using modern tools” (such as siphons, liquid nitrogen, etc.). But you still use meat, vegetables, etc. However, with note-by-note cooking, the instruments are not important, and the big revolution is to cook with pure compounds, instead of meat, vegetables, fruits, eggs, etc.
Q: Where does the name Note-by-Note Cooking come from?
HT: In 1999, when I introduced the name “molecular cooking,” I was upset, because it was a bad choice, which had to be made for many complex reasons. Unfortunately, people now confuse molecular gastronomy and molecular cooking. So, for note-by-note cooking, I wanted a name that could appeal to artists, and it’s fair to say that note-by-note cooking is comparable to a term such as electro-acoustic music.
Q: Won’t note-by-note cooking produce artificial forms of food?
HT: Yes, but all food is “artificial”! Do you think that barbecue meat hangs “naturally” on the trees of the wild forest? Or that French fries appear suddenly from potatoes? No, you need a cook to make them. In ordinary language, “natural” means “what was not transformed by human beings”, and “artificial” means that it was transformed: it was the result of human “art”.
Friday, October 24, 2014
Garry Wills in the New York Review of Books:
People are amazed or disgusted, or both, at today’s “power of the media.” The punch is in that plural, “media”—the twenty-four-hour flow of intermingled news and opinion not only from print but also from TV channels, radio stations, Twitter, e-mails, and other electronic “feeds.” This storm of information from many sources may make us underestimate the power of the press in the nineteenth century when it had just one medium—the newspaper. That also came at people from many directions—in multiple editions from multiple papers in every big city, from “extras” hawked constantly in the streets, from telegraphed reprints in other papers, from articles put out as pamphlets.
Every bit of that information was blatantly biased in ways that would make today’s Fox News blush. Editors ran their own candidates—in fact they ran for office themselves, and often continued in their post at the paper while holding office. Politicians, knowing this, cultivated their own party’s papers, both the owners and the editors, shared staff with them, released news to them early or exclusively to keep them loyal, rewarded them with state or federal appointments when they won.
It was a dirty game by later standards, and no one played it better than Abraham Lincoln. He developed new stratagems as he rose from citizen to candidate to officeholder. Without abandoning his old methods, he developed new ones, more effective if no more scrupulous, as he got better himself (and better situated), for controlling what was written about him, his policies, and his adversaries.
Rebecca Morelle at the BBC:
Now, the rest of the dinosaur's body has been unearthed, and researchers say that the creature is even more bizarre than they had thought.
They say it was huge, with a beak, a humped back and giant, hoofed feet.
The study is published in the journal Nature.
Lead researcher Yuong-Nam Lee, from South Korea's Institute of Geoscience and Mineral Resources (Kigam), said: "It turned out to be one of the weirdest dinosaurs, it's weird beyond our imagination."
Brendan Fitzgerald in The Morning News:
A man dies, leaving behind, among other things, a combination lock. Opening it may just prove the existence of the afterlife.
I first learned about Stevenson through his obituary, which ran in the New York Times in 2007. My wife and I have a habit of sharing interesting obituaries with each other. I sent Stevenson’s to her, with a few sentences highlighted: “Tucked away in a file cabinet in the Division of Perceptual Studies is an ordinary combination lock, which Dr. Stevenson bought and locked nearly 40 years ago. He had set the combination himself.” A colleague told the Times that Stevenson hoped to communicate the combination to a friend or loved one, who could open the lock and prove that some part of Stevenson had survived death. Ian Stevenson, who was born in 1918, spent about half his life studying reincarnation and past-life memories. He documented thousands of stories from young men and women who claimed to recall previous lives, whose birthmarks resembled fatal wounds sustained by others. Perhaps Stevenson’s death provided him with the answer he sought for most of his life. For me, it simply rekindled the question I’d struggled with for most of mine: What happens when we die?
...The Combination Lock Test for Survival began the same year Stevenson founded the Division of Perceptual Studies. Stevenson had heard a number of stories from colleagues about men and women opening combination locks set by deceased loved ones. In one instance, a widow opened a locked box, though she claimed her husband was the only person who knew the combination. In another, a widower asked a medium to help him open a lock set by his deceased wife, and the medium recounted the correct combination. Perhaps, after the death of its keeper, a combination might be communicated, its lock unclasped, its mouth opened. Stevenson purchased a Sargent & Greenleaf model combination lock from Brown’s Lock & Safe, a Charlottesville locksmith located a few miles from his home. He set its password then placed the lock in a drawer inside his office. “In setting my own locks, I started not with meaningless random numbers, but with a word or phrase that is extremely meaningful to me,” Stevenson said at the time. “I have no fear whatever of forgetting it on this side of the grave and, if I remember anything on the other side, I shall surely remember it.” Though, on one occasion, Stevenson opened his own lock—in 1975, to check either the lock’s reliability or his own—then closed it again. The Division of Perceptual Studies collected 10 more locks—a duplicate for Stevenson, and nine others. Upon news of the death of a lock’s owner, Stevenson awaited contact from family or friends who might have received some suggestion of a combination, or might have visited a medium for help contacting the dead. When a colleague and fellow combination lock owner died in 1979, Stevenson received more than 40 letters that suggested combinations, and consulted two mediums himself, his colleague’s lock in his pocket. All combinations failed.
Sophia Nguyen in Harvard Magazine:
In the early 2000s, a riptide of business scandals toppled Enron, Arthur Andersen, and WorldCom. In the aftermath, says Straus professor of business administration Max Bazerman, “society turned to professional schools” to ask why their graduates were misbehaving. Behavioral ethics—combining aspects of moral philosophy, cognitive science, psychology, and economics—was born: “a creation of the new millennium.” As a teacher in this field, Bazerman explains, “My job is not about what ethics you follow, but how to bring you up to your own ethical standards.”
...In his book, Bazerman highlights promising directions in behavioral decision research that have the potential to promote more effective and more ethical noticing. One involves “choice architecture,” a term coined by Richard H. Thaler and Walmsley University Professor Cass R. Sunstein in their 2008 book Nudge: Improving Decisions about Health, Wealth, and Happiness. Choice architecture taps knowledge of psychology to identify better ways to present options—and Bazerman asserts that organizations can use it to create systems that increase the likelihood of their staff noticing key data. In a study he conducted with Kennedy School colleagues Alexandra van Geen and professor of public policy Iris Bohnet, supervisors were asked to assess a pool of job candidates. When judging applicants one at a time, they tended to favor men on quantitative tasks and women on verbal tasks. But when judging male and female candidates side-by-side, they relied on performance-related data; gender biases no longer factored into the decision. Changing the structure of the hiring process encouraged people to pay attention to the important information.