Wednesday, May 07, 2014
Dramatised in the 1957 film Ill Met by Moonlight, in which Dirk Bogarde rather improbably played the leading role, Patrick Leigh Fermor's kidnapping of a German general in Crete in the spring of 1944 was one of the most dashing and unconventional episodes of the Second World War. Leigh Fermor published little on the subject during his lifetime - a very brief account is provided in his 2003 collection of essays, Words of Mercury - but Wes Davis's book usefully plugs the gap. First published in America last year, too early to benefit from Artemis Cooper's biography of Leigh Fermor, it draws on previously unpublished papers in the National Archives and the Imperial War Museum, as well as on Antony Beevor's exhaustive history of the German invasion and occupation of Crete, and on the published memoirs of other veterans of the long guerrilla war. The result is an exciting, fast-moving and crisply written adventure story.
Leigh Fermor's mentor and precursor as a Cretan resistance leader was John Pendlebury, a Cambridge-educated archaeologist with a glass eye who had worked on the Minoan excavations at Knossos before the war and liked to wear traditional Cretan clothes, complete with cloak and turban. Pendlebury was captured and executed shortly after the German invasion of the island in May 1941, and Leigh Fermor took his place fighting alongside the andartes ('guerrillas') of the Cretan resistance.
Heidi Ledford in Nature:
Dutch celebrity daredevil Wim Hof has endured lengthy ice-water baths, hiked to the top of Mount Kilimanjaro in shorts and made his mark in Guinness World Records with his ability to withstand cold. Now he has made a mark on science as well. Researchers have used Hof’s methods of mental and physical conditioning to train 12 volunteers to fend off inflammation. The results, published today in the Proceedings of the National Academy of Sciences, suggest that people can learn to modulate their immune responses — a finding that has raised hopes for patients who have chronic inflammatory disorders such as rheumatoid arthritis and inflammatory bowel disease.
In 2010, as a graduate student, Kox was exploring how the nervous system influences immune responses. That's when he first learned that Hof had said that he could regulate not only his own body temperature, but also his immune system. “We thought, ‘Alright, let’s give him a chance’,” says Kox. “But we thought it would be a negative result.” Kox and his adviser, physician and study co-author Peter Pickkers, also at Radboud University Medical Center, invited Hof to their lab to investigate how he would react to their standard inflammation test. It involves exposure to a bacterial toxin, made by Escherichia coli, to induce temporary fever, headache and shivering. To Kox’s surprise, Hof’s response to the toxin was milder than that of most people — he had less severe flu-like symptoms, for example, and lower levels of inflammatory proteins in his blood.
Peter T. Leeson on OUPblog:
In eras bygone, in societies across the globe, governments didn’t exist—or weren’t strong enough to provide effective governance. Without governments to govern them, the members of such societies relied on self-governance.
Self-governance refers to privately supplied institutions of property protection—whether designed by individuals expressly for the purpose, such as the “codes” that pirates forged to govern their crews in the eighteenth-century Caribbean, or developed “spontaneously,” such as the system of customary law and adjudication that emerged to govern commerce between international traders in medieval Europe. Reliance on such institutions, especially in historical societies, is well known. Less widely recognized or understood is historical societies’ reliance on superstition—objectively false beliefs—to facilitate self-governance.
Consider the case of medieval monks. Today monks are known for turning the other cheek and blessing humanity with brotherly love. But for centuries they were known equally for fulminating against their foes and casting calamitous curses at persons who crossed them. These curses were called “maledictions.”
Read the rest here.
Tuesday, May 06, 2014
The birth of Herman Poole Blount on May 22, 1914 was, for him, the least significant of all his births. Blount begat Bhlount and Bhlount begat Ra and Herman begat Sonny and Sonny begat Sun. Sun Ra left Alabama for Chicago and Chicago for Saturn, until he never quite understood how he got to planet Earth in the first place. The name ‘Ra’ — the Egyptian god of the sun — brought him closer to the cosmos. Each rebirth erased the one before it, until Sun Ra’s past became a lost road that trailed off into nothingness. The past was passed, dead. History is his story, he said, it’s not my story. My story, said Sun Ra, is mystery. Sun Ra lived life between ancient time and the future, in something like the eternal now. He told people he had no family and lived on the other side of time. Rebirth might not be the right word for the journey that Sun Ra took. Awakening is more precise, like how the ancient Egyptians were awakened. As Jan Assmann wrote in Death and Salvation in Ancient Egypt, to be a person in ancient Egypt meant to exercise self-control. In powerlessness, unconsciousness or sleep, a person is dissociated from the self. The sleeping person, then, is like a dead person. But the awakened one is a person risen.
A great one is awakened, a great one wakes,
Osiris has raised himself onto his side;
he who hates sleep and loves not weariness,
the god gains power…
Sun Ra believed that the whole of humanity was in need of waking up. He wanted to slough off old ideas and habits, brush off sleepy clothing and shake off drowsy food. Because present time mattered little to Sun Ra, they say he rarely slept.
Graham Priest in Aeon:
Western philosophers have not, on the whole, regarded Buddhist thought with much enthusiasm. As a colleague once said to me: ‘It’s all just mysticism.’ This attitude is due, in part, to ignorance. But it is also due to incomprehension. When Western philosophers look East, they find things they do not understand – not least the fact that the Asian traditions seem to accept, and even endorse, contradictions. Thus we find the great second-century Buddhist philosopher Nagarjuna saying:
The nature of things is to have no nature; it is their non-nature that is their nature. For they have only one nature: no-nature.
An abhorrence of contradiction has been high orthodoxy in the West for more than 2,000 years. Statements such as Nagarjuna’s are therefore wont to produce looks of blank incomprehension, or worse. As Avicenna, the father of Medieval Aristotelianism, declared:
Anyone who denies the law of non-contradiction should be beaten and burned until he admits that to be beaten is not the same as not to be beaten, and to be burned is not the same as not to be burned.
One can hear similar sentiments, expressed with comparable ferocity, in many faculty common rooms today. Yet Western philosophers are slowly learning to outgrow their parochialism. And help is coming from a most unexpected direction: modern mathematical logic, not a field that is renowned for its tolerance of obscurity.
Read the rest here.
Robert McCrum in The Guardian:
Sister Carrie is one of several novels in this series that address the American dream, and it does so in a radical spirit of naturalism that rejected the Victorian emphasis on morality. In some ways it's crude and heavy-handed, blazing with coarse indignation, but in its day it was, creatively speaking, a game-changer. Later, America's first Nobel laureate, Sinclair Lewis, said that Dreiser's powerful first novel "came to housebound and airless America like a great free Western wind, and to our stuffy domesticity gave us the first fresh air since Mark Twain and Whitman".
...The novel opens with Caroline – Sister Carrie – Meeber moving from the country to the city, taking the train to Chicago to realise her hopes for a better, more glamorous future. En route, she meets a travelling salesman, Charles Drouet, who soon releases her from the drudgery of machine-work in the heartless city by making her his mistress. This is the first in a succession of Carrie's fruitless attempts to find happiness. Henceforth, she becomes the victim of increasingly desperate relationships which, combined with a starstruck fascination with the stage, take her to New York and the life of a Broadway chorus girl. The novel ends with Carrie changing her name to Carrie Madenda and becoming a star just as her estranged husband, George Hurstwood, gasses himself in rented lodgings. The closing chapters of the book, in which Hurstwood is ruined and then disgraced, are among the most powerful pages in a novel of merciless momentum, whose unsentimental depiction of big-city life sets it apart. Contemporary readers were baffled, however, and Sister Carrie did not sell well.
Carl Zimmer in The New York Times:
Two teams of scientists published studies on Sunday showing that blood from young mice reverses aging in old mice, rejuvenating their muscles and brains. As ghoulish as the research may sound, experts said that it could lead to treatments for disorders like Alzheimer’s disease and heart disease. “I am extremely excited,” said Rudolph Tanzi, a professor of neurology at Harvard Medical School, who was not involved in the research. “These findings could be a game changer.” The research builds on centuries of speculation that the blood of young people contains substances that might rejuvenate older adults. In the 1950s, Clive M. McCay of Cornell University and his colleagues tested the notion by delivering the blood of young rats into old ones. To do so, they joined rats in pairs by stitching together the skin on their flanks. After this procedure, called parabiosis, blood vessels grew and joined the rats’ circulatory systems. The blood from the young rat flowed into the old one, and vice versa. Later, Dr. McCay and his colleagues performed necropsies and found that the cartilage of the old rats looked more youthful than it would have otherwise. But the scientists could not say how the transformations happened. There was not enough known at the time about how the body rejuvenates itself. It later became clear that stem cells are essential for keeping tissues vital. When tissues are damaged, stem cells move in and produce new cells to replace the dying ones. As people get older, their stem cells gradually falter.
In the early 2000s, scientists realized that stem cells were not dying off in aging tissues. “There were plenty of stem cells there,” recalled Thomas A. Rando, a professor of neurology at Stanford University School of Medicine. “They just don’t get the right signals.” Dr. Rando and his colleagues wondered what signals the old stem cells would receive if they were bathed in young blood. To find out, they revived Dr. McCay’s experiments. The scientists joined old and young mice for five weeks and then examined them. The muscles of the old mice had healed about as quickly as those of the young mice, the scientists reported in 2005. In addition, the old mice had grown new liver cells at a youthful rate.
Gary Becker died on May 3rd. Over at Crooked Timber, Kieran Healy on Michel Foucault's appreciation of Becker (image from Wikimedia commons).
Foucault lectured on Becker and related matters in the late 1970s. One of the things he saw right away was the scope and ambition of Becker’s project, and the conceptual turn—accompanying wider social changes—which would enable economics to become not just a topic of study, like geology or English literature, but rather an “approach to human behavior”. Here is Foucault in March of 1979, for instance:
In practice, economic analysis, from Adam Smith to the beginning of the twentieth century, broadly speaking takes as its object the study of the mechanisms of production, the mechanisms of exchange, and the data of consumption within a given social structure, along with the interconnections between these three mechanisms. Now, for the neo-liberals, economic analysis should not consist in the study of these mechanisms, but in the nature and consequences of what they call substitutable choices … In this they return to, or rather put to work, a definition [from Lionel Robbins] … ‘Economics is the science of human behavior as a relationship between ends and scarce means which have alternative uses’. … Economics is not therefore the analysis of the historical logic of processes [like capital, investment, and production]; it is the analysis of the internal rationality, the strategic programming of individuals’ activity.
Then comes the identification not just of the shift in emphasis but also of the shift in point of view:
This means undertaking the economic analysis of labor. What does bringing labor back into economic analysis mean? It does not mean knowing where labor is situated between, let’s say, capital and production. The problem of bringing labor back into the field of economic analysis … is how the person who works uses the means available to him. … What system of choice and rationality does the activity of work conform to? … So we adopt the point of view of the worker and, for the first time, ensure that the worker is not present in the economic analysis as an object—the object of supply and demand in the form of labor power—but as an active economic subject.
At first glance it seems strange to see Foucault emphasize the “active economic subject” here. A standard—indeed, clichéd—critique of Becker’s approach is that economic agents are calculating robots that bear little resemblance to real human beings and that, furthermore, their disembedded and completely systematic choice-making takes us far away from any sort of first-person point of view of labor in the economy. If we want a proper account of economic action on the ground surely we will have to look elsewhere. Wasn’t Marx supposed to have been doing something like this, for example?
Thomas Piketty with others in The Guardian (Photograph: Toby Melville/PA):
The European Union is experiencing an existential crisis, as the European elections will soon brutally remind us. This mainly involves the eurozone countries, which are mired in a climate of distrust and a debt crisis that is very far from over: unemployment persists and deflation threatens. Nothing could be further from the truth than imagining that the worst is behind us.
This is why we welcome with great interest the proposals made at the end of 2013 by our German friends from the Glienicke group for strengthening the political and fiscal union of the eurozone countries. Alone, our two countries will soon not weigh much in the world economy. If we do not unite in time to bring our model of society into the process of globalisation, then the temptation to retreat into our national borders will eventually prevail and give rise to tensions that will make the difficulties of union pale in comparison. In some ways, the European debate is much more advanced in Germany than in France. As economists, political scientists, journalists and, above all, citizens of France and Europe, we do not accept the sense of resignation that is paralysing our country. Through this manifesto, we would like to contribute to the debate on the democratic future of Europe and take the proposals of the Glienicke group still further.
It is time to recognise that Europe's existing institutions are dysfunctional and need to be rebuilt. The central issue is simple: democracy and the public authorities must be enabled to regain control of and effectively regulate 21st century globalised financial capitalism. A single currency with 18 different public debts on which the markets can freely speculate, and 18 tax and benefit systems in unbridled rivalry with each other, is not working, and will never work. The eurozone countries have chosen to share their monetary sovereignty, and hence to give up the weapon of unilateral devaluation, but without developing new common economic, fiscal and budgetary instruments. This no man's land is the worst of all worlds.
Justin E. H. Smith in the Chronicle of Higher Education:
There is a great die-off under way, one that may justly be compared to the disappearance of dinosaurs at the end of the Cretaceous, or the sudden downfall of so many great mammals at the beginning of the Holocene. But how far can such a comparison really take us in assessing the present moment?
The hard data tell us that what is happening to animals right now is part of the same broad historical process that has swept up humans: We are all being homogenized, subjected to uniform standards, domesticated. A curiosity that might help to drive this home: At present, the total biomass of mammals raised for food vastly exceeds the biomass of all mammalian wildlife on the planet (it also exceeds that of the human species itself). This was certainly not the case 10,000 or so years ago, at the dawn of the age of pastoralism.
It is hard to know where exactly, or even inexactly, to place the boundary between prehistory and history. Indeed, some authors argue that the very idea of prehistory is a sort of artificial buffer zone set up to protect properly human society from the vast expanse of mere nature that preceded us. But if we must set up a boundary, I suggest the moment when human beings began to dominate and control other large mammals for their own, human ends.
Sean Carroll in Preposterous Universe:
There’s no question that quantum fluctuations play a crucial role in modern cosmology, as the recent BICEP2 observations have reminded us. According to inflation, all of the structures we see in the universe, from galaxies up to superclusters and beyond, originated as tiny quantum fluctuations in the very early universe, as did the gravitational waves seen by BICEP2. But quantum fluctuations are a bit of a mixed blessing: in addition to providing an origin for density perturbations and gravitational waves (good!), they are also supposed to give rise to Boltzmann brains (bad) and eternal inflation (good or bad, depending on taste). Nobody would deny that it behooves cosmologists to understand quantum fluctuations as well as they can, especially since our theories involve mysterious aspects of physics operating at absurdly high energies.
Kim Boddy, Jason Pollack and I have been re-examining how quantum fluctuations work in cosmology, and in a new paper we’ve come to a surprising conclusion: cosmologists have been getting it wrong for decades now. In an expanding universe that has nothing in it but vacuum energy, there simply aren’t any quantum fluctuations at all. Our approach shows that the conventional understanding of inflationary perturbations gets the right answer, although the perturbations aren’t due to “fluctuations”; they’re due to an effective measurement of the quantum state of the inflaton field when the universe reheats at the end of inflation. In contrast, less empirically-grounded ideas such as Boltzmann brains and eternal inflation both rely crucially on treating fluctuations as true dynamical events, occurring in real time — and we say that’s just wrong.
All very dramatically at odds with the conventional wisdom, if we’re right. Which means, of course, that there’s always a chance we’re wrong (although we don’t think it’s a big chance).
[S]ome physicists are already beginning to theorize about what might lie beyond quantum computers. You might think that this is a little premature, but I disagree. Think of it this way: From the 1950s through the 1970s, the intellectual ingredients for quantum computing were already in place, yet no one broached the idea. It was as if people were afraid to take the known laws of quantum physics and see what they implied about computation. So, now that we know about quantum computing, it’s natural not to want to repeat that mistake! And in any case, I’ll let you in on a secret: Many of us care about quantum computing less for its (real but modest) applications than because it defies our preconceptions about the ultimate limits of computation. And from that standpoint, it’s hard to avoid asking whether quantum computers are “the end of the line.”
Now, I’m emphatically not asking a philosophical question about whether a computer could be conscious, or “truly know why” it gave the answer it gave, or anything like that. I’m restricting my attention to math problems with definite right answers: e.g., what are the prime factors of a given number? And the question I care about is this: Is there any such problem that couldn’t be solved efficiently by a quantum computer, but could be solved efficiently by some other computer allowed by the laws of physics?
Here I’d better explain that, when computer scientists say “efficiently,” they mean something very specific: that is, that the amount of time and memory required for the computation grows like the size of the task raised to some fixed power, rather than exponentially. For example, if you want to use a classical computer to find out whether an n-digit number is prime or composite—though not what its prime factors are!—the difficulty of the task grows only like n cubed; this is a problem classical computers can handle efficiently. If that’s too technical, feel free to substitute the everyday meaning of the word “efficiently”! Basically, we want to know which problems computers can solve not only in principle, but in practice, in an amount of time that won’t quickly blow up in our faces and become longer than the age of the universe. We don’t care about the exact speed, e.g., whether a computer can do a trillion steps or “merely” a billion steps per second. What we care about is the scaling behavior: How does the number of steps grow as the number to be factored, the molecule to be simulated, or whatever gets bigger and bigger?
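The scaling behavior the passage above describes can be made concrete with a small sketch. The step counts below are illustrative stand-ins for an n-cubed algorithm and a 2-to-the-n algorithm, not measurements of any real program:

```python
# Illustrative comparison of polynomial vs. exponential scaling.
# An n^3 algorithm (like the primality test mentioned above) stays
# tractable as the input grows; a 2^n algorithm blows up quickly.

def poly_steps(n):
    """Hypothetical step count for an n^3 algorithm."""
    return n ** 3

def exp_steps(n):
    """Hypothetical step count for a 2^n algorithm."""
    return 2 ** n

for n in (10, 50, 100):
    print(f"n={n:>3}: ~{poly_steps(n):.3e} steps (n^3) "
          f"vs ~{exp_steps(n):.3e} steps (2^n)")
```

At n = 100 the polynomial algorithm needs about a million steps, while the exponential one needs more steps than there are atoms in a human body; that gap, not raw clock speed, is what "efficiently" is tracking.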
Monday, May 05, 2014
by Gerald Dworkin
Some of my readers may recall from an earlier blog post or Justin Smith's review of my Philosophy: A Commonplace Book that for many years I have been collecting humorous quotes, epigrams, aphorisms, parodies, etc. that have some connection to Philosophy. The connection is sometimes that it is from a philosopher, or specifically about a philosophical topic--particularly ethics. Sometimes it is a joke that I see has a philosophical point behind or around or under it. Perhaps any great joke can be seen to be philosophical in some sense if one squints hard enough at it. But many of the quotes are just interesting and thought-provoking without being humorous.
Since publishing my book I have continued to mine for gems. One of the advantages to publishing an ebook is that it makes second editions easy and I intend to revise one of these days. But in the interim I provide a sampling of my sampling for your Monday morning amusement and edification.
If you can only be good at one thing, be good at lying. ... Because if you're good at lying, you're good at everything.
The difference between genius and stupidity is that genius has its limits.
An aphorism can never be the whole truth; it is either a half-truth or a truth-and-a-half.
The devil is an optimist if he thinks he can make people worse than they are.
Generally speaking, the errors in religion are dangerous; those in philosophy only ridiculous.
by Paul Braterman
That's my friend Ramin Forghani from Iran, standing next to Maryam Namazie, carrying a placard outside the Law Society offices in London. He knows that what he is doing, and what he is about to say, could get him killed.
Imagine that you want to write your will according to sharia law, which in England you are perfectly entitled to do. You can go to your friendly neighbourhood Imam to discuss the matter, ask him to explain what is actual law and what mere custom, talk about the various different interpretations available, and consider how best to apply them to your own family circumstances. This could be quite a long conversation; there are at least six main traditional schools of sharia jurisprudence, to say nothing of modernisers like Musawah who seek to accommodate Islamic practice to present-day principles of equality.
Or you can go to your solicitor, who handles all your ordinary legal business. And if that solicitor follows the guidance issued by the Law Society, he will simply tell you that sons inherit twice as much as daughters, adopted and illegitimate children do not inherit at all, neither do divorced spouses, and marital status is defined according to religious marriage and not according to the law of the land. Tough, by the way, on your orphaned grandkids; in the Law Society's version of sharia law "it is not possible to inherit under Sharia rules through a deceased relative."
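The doubling rule described above works out to simple arithmetic. A deliberately simplified sketch (the function name is my own; it considers only sons and daughters and ignores spouses, parents, and the many differences between schools of jurisprudence):

```python
from fractions import Fraction

def childrens_shares(sons, daughters):
    """Split an estate among children under the rule that a son
    receives twice a daughter's share. Simplified illustration only:
    no other heirs are modelled."""
    units = 2 * sons + daughters  # each son counts as two units
    return Fraction(2, units), Fraction(1, units)

# One son and one daughter: he takes 2/3 of the estate, she 1/3.
son, daughter = childrens_shares(1, 1)
print(son, daughter)
```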
the ground’s given way
I’m tumbling sprawling
space space —my mind
my heart, my heart
is in a parabolic arc
in a plane devoid of gravity and time
I float I float
I’m in a massless boat
the truth of gravity is failing
the sadness of abrupt conclusions gone
I’ve come apart, I’m flailing
up is all around
it’s merged with down
if I weren’t so glad
I’d certainly be wailing
by Jim Culleny
by Akim Reinhardt
Let me begin this essay by making one thing clear: I am opposed to capital punishment.
I agree with pretty much all of the arguments against it. It's clearly not a deterrent. The possibility, much less the reality, that innocent people are sometimes executed is beyond inexcusable. A variety of factors have contributed to capital punishment being disproportionately applied to minorities and the poor in the United States. And I don't believe the state should be in the business of killing its own people, even its most reprehensible members.
And so for all of those reasons, and several others, I oppose capital punishment.
However, I also believe there is an element of moral ambiguity inextricably woven into the issue, and I am not comfortable with the moral absolutism that sometimes accompanies opposition to the death penalty.
While I personally oppose the use of capital punishment, I acknowledge that there is a rational and reasonable moral framework around which some supporters advocate for it. In short, I reject the notion that opponents such as myself can claim some sort of moral monopoly on the issue.
For starters, I think it is perfectly normal for someone to wish death upon a person who has brutally murdered a loved one. Opponents of capital punishment often drift into language of "savagery" when rejecting appeals for capital punishment, and I find this very troubling.
I think it extremely heartless and sanctimonious to label as "savage" or even "immoral" the very understandable desire for revenge by the loved ones of brutal crime victims. To the contrary, those feelings are incredibly normal. Ask any grief counselor.
I know that if someone, say, raped and murdered a member of my family, I would want the rapist-murderer to die. The vast majority of people would. Those who wouldn't are not the norm. Rather, the loved ones living in the aftermath of horrific, murderous crimes, who find it within their hearts to forgive the criminal, or at the very least, not want them dead, are extraordinary and admirable people.
Thus, I reject outright the notion that wishing death upon those who have committed unspeakably immoral acts of murder is itself an immoral sentiment. Rather, I see it as a humane and even sensible one, though I myself do not support the subsequent act of capital punishment.
Beyond the morality of victim survivors' desires, however, I also recognize the morality of a more distanced stance in support of capital punishment, even if I do not support the act itself. This is because I also reject what I consider to be a sentimentalized view of humanity that casts all human life as sacred. Instead, I embrace our mortality and impermanence, I reject our supposed inherent moral superiority to other beings, and I recognize that morality itself is a human construct that no other beings conceive.
by Katharine Blake McFarland
On a trip to New York to visit friends last month—a trip that coincided with the city's first beautiful spring weekend after a grueling, endless winter—I walked four miles uptown to see the Whitney Biennial exhibit. Mostly I found the show to be difficult and pedagogical, but there were a few standout pieces, works I will remember for their ability to open up some previously closed part of the heart. A pencil drawing by Elijah Burgher; a massive series of paintings by Keith Mayerson called My American Dream, which sets iconic images next to the personal moments of the painter's life; a kind of totem by Jimmie Durham called Choose Any Three, made of stacked wood pieces inscribed with names like Malcolm X, Annie Wauneka, and Kafka.
But one of the most unforgettable moments of the exhibit wasn't an installation. It was a conversation I overheard among young girls about an installation.
In a small dark room, a short film played on a loop. The film, Untitled by Jennifer Bornstein, features a group of naked women dancing. In true modern dance form, the women are barefoot, pushing and pulling their bodies across the barren backdrop, dragging and circling, arching and caving in. At one point, two of the dancers seem to be in struggle, gripping each other's bodies like wrestlers; other times, the movements are languid, more peaceful and maybe even sad. The dancers themselves are beautiful—capable bodies, confident movements, their long brown hair falling in front of their faces.
As I stood with my back to the wall, just about to leave, three little girls scurried into the room, full of secrets, followed by a bedraggled-looking father. They couldn't have been more than six or seven years old.
"Eeeewwww" the tallest girl whispered loudly.
"They're JUST NAKED!" gasped another, which prompted a general chorus of audible, enraptured disgust (that kind of disgust, so familiar to childhood, that prohibits the possibility of looking away).
"Girls," whispered the father, "if you don't like it, let's move along." The girls reacted to this suggestion by taking a seat on the front-most bench, closest to the screen, and continued their chorus. The father tried a different approach: "What do you find so gross about it?"
"Their vaginas!" said the tallest girl. At this, the father glanced around the room embarrassedly, caught my eye, and I smiled.
"What about them?" the father asked, turning back to the girls.
"They're hairy!"—and then, after a reflective pause, "They look like men pretending to be women."
As I left the dark room and walked into the bright white hallways of the museum, I immediately thought of Barbie. Her impossible proportions, gravity-defying and devoid of muscle; her smooth, and (of course) hairless, plastic skin.
Strained Analogies Between Recently Released Films and Current Events: Paul Walker's Penultimate Film and Piketty's Capital in the Twenty-First Century
by Matt McKenna
A story was recently imported from France to America, and it has since become a national sensation. It is the story of inequality and the danger of capitalism run amok. It is a prophecy for social upheaval if this inequality isn't handled in a timely manner. It is, by all accounts, an important story. Of course, I'm referring to Paul Walker's penultimate film, Brick Mansions, a parkour action flick in which he co-stars with David Belle and RZA. A film this dense begs for analysis, and fortunately there's already a compendium on the market whose popularity is threatening to rival that of the film itself. This study-guide, written by French economist Thomas Piketty, is called Capital in the Twenty-First Century and is essential reading for any American attempting to explore the economic allusions within Brick Mansions.
Brick Mansions is an American remake of the 2004 French film, District B13. While the remake does Americanize its subject matter, the larger plot elements of the story remain intact: as crime in Detroit increases to horrifyingly high levels, the government erects walls around the city's most dangerous neighborhood, a large block of rundown high-rises known as Brick Mansions. Lino (David Belle) is a resident of Brick Mansions and parkour enthusiast who is interested in killing drug kingpin Tremaine (RZA) for kidnapping his girlfriend. Lino is joined in his quest by Damien Collier (Paul Walker), a naive cop sent into Brick Mansions to deactivate a rogue neutron bomb that found its way into the area. As is to be expected in an action film, our heroes are ceaselessly bombarded by henchmen with terrible aim and a proclivity for standing near the edges of rooftops. As the duo battles their way through the parade of bad guys, Lino's parkour skills prove to be an invaluable resource as he deftly traverses terrain filled with just-out-of-reach ladders, windows, and objects from which he can perform flips and other incredible feats of jumping. Collier is less agile than his counterpart, but a deep-seated rage over the death of his father affords him superhuman tenacity and an exceptionally wry wit.
You'd be forgiven if you read the above description and came to the conclusion that Brick Mansions is nothing but a brainless action movie whose core audience's age tops out at fourteen. In fact, you'd still be forgiven if you watched the movie and came away with the very same conclusion. Because the film is so oblique, it is easy to miss the nuanced social and economic critiques amongst the plethora of kicks and fist-bumps. Thankfully, Thomas Piketty's Capital in the Twenty-First Century, an impressive work in and of itself, decodes Brick Mansions and provides viewers with the opportunity to understand this frequently difficult film.
Sughra Raza. Self Portrait near Itaimbezinho Canyon, Brazil. 2014.
by Grace Boey
What is it like to be a bat? Philosopher Thomas Nagel famously posed this question in 1974. As he noted, the question is one that cannot be answered: no matter how many objective, scientific facts we may discover about a bat’s physiology or neurobiology, we can never access its subjective, personal experience. Phenomenal consciousness - or qualia - is a private, opaque matter. Nagel’s question (and lack of an answer) is one that almost all philosophy freshmen are acquainted with.
But long before I’d heard of Nagel - or the mind-body problem of philosophy - I’d already developed a few ideas of my own about bats. As a child, I’d been enchanted by the tale of Stellaluna, a baby fruit bat who is accidentally separated from her mother. My own mother would often read the book to me at bedtime; I’d fall asleep thinking about brave Stellaluna who befriends a group of baby birds, reluctantly learns to eat worms, and is taught to sleep the wrong way up. When Stellaluna and her mother are finally reunited, both are overjoyed - and the baby bat finally feels like she is someplace she belongs. According to the story, bats are capable of complex emotions, preferences and desires - just like us.
Stellaluna is a wonderful children's tale. But, as a scientific description of bat psychology, the text is clearly lacking. It is questionable whether bats are capable of possessing mental states such as 'bravery', 'love' or 'belonging', or whether they are capable of establishing 'friendships'. The text commits what scientists refer to as 'anthropomorphism' - the act of assigning human-like qualities to non-human animals. Among scientists, anthropomorphism has become something of a dirty word.
It is certainly unwise to assume wholesale that non-human animals have inner mental lives identical to those of humans. Yet it also seems unlikely that non-human animals have no mentality at all. If excessive anthropomorphism is a sin, then so is excessive anthropocentrism. In all likelihood, the truth about non-human animal minds lies somewhere in between. What, then, is the 'correct' way to interpret animal behaviour? This question comes with significant stakes, since how we relate to non-human animals is guided by what we believe about their minds. Unfortunately, the task of animal psychology is rife with methodological and philosophical difficulties. It would certainly be responsible for us to gather as much accurate, relevant scientific data as we can - but as Nagel has pointed out, the best we can do from there is still to guess.
by Leanne Ogasawara
“"If you want to to become a man of letters and perhaps write some Histories one day, you must also lie and invent tales, otherwise your History would become monotonous. But you must act with restraint. The world condemns liars who do nothing but lie, even about the most trivial things, and it rewards poets, who lie only about the greatest things." ― Umberto Eco, Baudolino
It was every Medieval person's greatest aspiration. For, of course, finding Prester John would bring about the most glorious, not to mention grandiose, conclusion to the Crusades. In their rich imaginations, the Medievals believed that this would culminate in the return of Jerusalem from "the Moors" and make way for the Second Coming and the Kingdom of Heaven.
No small undertaking, the search for the Prester was just as mind-bogglingly quixotic as the other European obsessions, like those with El Dorado, Atlantis and the Grail. And, like the search for the Holy Grail, this one had the added imperative and will to power born of religion.
I imagine my favorite Portuguese fidalgo not taking the news well. But maybe Pêro da Covilhã was no real fidalgo anyway: of humble birth, he had climbed this far up the aristocratic ladder in Lisbon on his wit and his skill with languages. Called to court in 1487, he arrived at a room full of Jesuits.
Not the bloody Jesuits, he must have thought. Anything but them.
His despair must have only deepened when he heard what the king had in mind for him.
He was being asked to lead an embassy to Abyssinia.
As he struggled to recall where Abyssinia was even located, one of the council map-makers probably appeared and unfurled a large map of the known world, one with Jerusalem lying smack in the middle. As they laid out the route he was to take, a Jesuit confidant and adviser to the viceroy explained that it was at the court of the King of Abyssinia that they believed the legendary Prester John resided.
Prester John? Not this Catholic nonsense again. Pêro da Covilhã must have struggled to keep his disbelief from showing on his face at what they were asking of him.
by Josh Yarden
Some people like the idea that education is the great leveler of the playing field. They believe, or at least they repeat the slogan, that everyone can attain the American dream if they work hard enough in school. The truth tells a different story: a great education puts you ahead of the game, but that advantage is for a select few, not for everyone. The World Series and the Super Bowl may be played on level fields, but most people, even those who try their very hardest, never have an opportunity to attend the game.
If you want to examine social inequality in America, the easiest place to begin is by taking a look at the socio-economic stratification of our schools. We have several parallel educational systems. Among them are elite private schools funded by foundations and private citizens, well-funded public schools in communities with relatively affluent populations, some high quality magnet public schools that do not offer open access to all students, more schools that are funded below desired levels, and many crowded under-resourced public schools. A more detailed look at the nature of poverty points toward particular issues such as homelessness, absenteeism, illness, the low educational levels of parents and substance abuse, among others. Politicized issues such as vouchers, school choice and test scores create a lot of noise that drowns out some of the most important signals communities are sending about the real issues that impact the quality of American education.
There seems to be an insatiable desire in some corners of American society to discover the silver bullet. We want a hero to ride into town on a white horse, clean up corruption and… Hi-Yo Silver! Away!… then leave us alone. We don't like paying taxes, and we don't like it when public officials spend our money on someone else's issue. But decades of reform initiatives have proven time and again that there are no silver bullets.
by Brooks Riley
by Sue Hubbard
Siena, a mediaeval city of winding streets, dark alleys and red roofs, is one of Italy's jewels. It may now be full of school children and tourists eating ice cream as they wander amongst the stylish shops or stop to have a drink in the Piazza del Campo – which twice yearly is turned into a horse racetrack for that lunatic and partisan stampede, the Palio – but it was in the Middle Ages that Siena reached its zenith. Having been ruled by the Longobards, then the Franks, it passed into the hands of the Prince-Bishops. During the 12th century these were overthrown by Consuls who set up a secular government. It was then that Siena attained the political and economic importance that led to its rivalry with that other gilded Tuscan city, Florence. The 12th century saw the construction of many beautiful buildings – numerous towers, nobles' houses, Romanesque churches – culminating in the famous black and white duomo.
The great age of Sienese art arguably started with Duccio. No contemporary accounts of him, nor any personal documents, have survived, though there are many records about him in municipal archives – changes of address, payments, civil penalties and contracts – that give some idea of the life of the painter. Little is known of his painting career. Many believe he studied under Cimabue, while others think that he may actually have traveled to Constantinople and learned directly from a Byzantine master.
As a young man Duccio probably worked in Assisi, though he spent virtually his entire life in Siena. He is first mentioned in Sienese documents in 1278, in connection with commissions for 12 wooden panels for the covers of the municipal books. In 1285, a lay brotherhood in Florence commissioned him to complete an altarpiece, known now as the Rucellai Madonna, for the church of Santa Maria Novella. By that date he must already have had something of a reputation, which guaranteed the quality of his work.