Tuesday, September 24, 2013
Virginia Hughes in National Geographic:
Two eyes, aligned horizontally, above a nose, above a mouth. These are the basic elements of a face, as your brain knows quite well. Within about 200 milliseconds of seeing a picture, the brain can decide whether it’s a face or some other object. It can detect subtle differences between faces, too — walking around at my family reunion, for example, I see many faces that look similar, and yet I can easily distinguish Sue from Ann from Pam.
Our fascination with faces exists, to some extent, on the day we’re born. Studies of newborn babies have shown that they prefer to look at face-like pictures. A 1999 study showed, for example, that babies prefer a crude drawing of a lightbulb “head” with squares for its eyes and nose compared with the same drawing with the nose above the eyes. “I believe the youngest we tested was seven minutes old,” says Cathy Mondloch, professor of psychology at Brock University in Ontario, who worked on that study. “So it’s there right from the get-go.”
These innate predilections for faces change and intensify over the first year of life (and after that, too) as we encounter more and more faces and learn to rely on the emotional and social information they convey. Scientists have studied this process by looking mostly at babies’ abilities as they age. But how, exactly, our brains develop facial expertise — that is, how it is encoded in neurons and circuits — is in large part a mystery.
Two new studies tried to get at this brain biology with the help of a rare group of participants: children who were born with dense cataracts in their eyes, preventing them from receiving early visual input, and who then, years later, underwent corrective surgery.
Marco Roth in n + 1:
From the thumbnail headshot accompanying his essay in the Times, “the drone philosopher,” as I’ve begun to think of him, appears to be in his late twenties, or a boyish 30. In an oddly confessional-style first paragraph, he recalls what it was like to watch the second Iraq War from his college dorm television. He has clean-shaven Ken-doll looks and a prominent squarish jaw, recalling the former Republican vice-presidential candidate and representative from Wisconsin’s First Congressional District, Paul Ryan. I doubt the drone philosopher would be flattered by the comparison. The tone of his article makes him out to be a thoughtful liberal, more interested in weighing complexities than in easy solutions, simultaneously attracted by and wary of power, not unlike the commander in chief he hopes will one day read his papers.
I can make out a bit of wide-striped collegiate tie, a white collar, and the padded shoulders of a suit jacket in the photograph. I know I’m being unfair, but I don’t trust his looks. Since Republicans have become so successful at branding themselves the party of white men, I now suspect that any white guy in a suit may harbor right-wing nationalist tendencies, much as the CIA’s rules governing drone strikes have determined that groups of “military age” men in certain regions of Pakistan and Yemen may be profiled as terrorists. Even more unkindly, I catch myself thinking the drone philosopher’s portrait looks like it was taken for the high school debate club he surely belonged to. Maybe that was where — for competitions in dim auditoria from state to regional to national level, prepping in a series of carbon-copy cheap motels, four to a room — he first learned to be rewarded for making audacious arguments. It was like a job or a sport. Maybe there was one particularly formative debate, “Resolved: The United States was right to use the atomic bomb against Japanese civilians.” He would have parsed this proposition into a value, such as “Right Action” or “Justice,” made up of a checklist of criteria: saving American lives, the primary but not sole duty of the deciders; weighing potential lives lost or saved on both sides when contrasted with the alternative policy of full-scale invasion of Japan.
Afshin Molavi in Foreign Policy:
Today, four bottles of Johnnie Walker are consumed every second, with some 120 million bottles sold annually in 200 countries. Five of Johnnie Walker's top seven global markets are in the emerging world: Brazil, Mexico, Thailand, China, and a region the company calls "Global Travel Asia and Middle East."
From a small town in the Scottish Lowlands, the Striding Man has come a long way -- and he's still walking.
Ask anyone who travels in emerging markets or developing economies, and chances are they've been offered Johnnie Walker. These are just some of the places I've seen it poured: at a Beijing gathering of techies, a four-day wedding in Jaipur, countless bars in Dubai, a Nile cruise in Egypt, the home of an Arab diplomat in Bangkok, private homes in Tehran, a middle-class Istanbul house, and diplomatic parties in Riyadh.
Journalists who spent time in Baghdad during the Iraq war marveled at the easy availability of Johnnie Walker Black Label, even when food staples were scarce. The late writer Christopher Hitchens -- who fondly referred to the drink as "Mr. Walker's amber restorative" -- accurately noted that Black Label was "the favorite drink of the Iraqi Baath Party." In Saddam Hussein's era, a smuggler could make a good living taking crates across the border for thirsty Iranians.
The journey of contemporary Ukrainian literature starts with violent vomiting on a Moscow side street. Otto von F, the hero of Yuri Andrukhovych's novel The Moscoviad (1993; English translation, 2008), has been on a binge (beer, champagne, vodka, meths, madeira) during the final days of the Soviet Empire, boozing his way through a psychotic city of monarchist madmen, terrorists and thieves, ranting at the great symbols of Russian might: "This is the city of a thousand-and-one torture chambers. A tall advance bastion of the East, in anticipation of conquering the West. The city of Bolshevik Imperial architecture with the high-rise ghosts of people's commissariats. It only knows how to devour."
Otto is a Ukrainian student poet, resident in the dorm of the gigantic, jagged, neo-gothic tooth of Moscow State University, sensing "the slow bursting of the Empire's seams, with countries and peoples crawling apart, each of them now acquiring independent relevance". If Ukrainian writers once wrote about the great capital with a mixture of fear and awe, Andrukhovych's hero drunkenly gobs and waves two fingers at it: his very name, Otto von F, announces his European superiority to the barbarous Russian East.
To speak is to know that language is amoral—equally congenial to truth and falsehood, clarity and circumlocution. And therein lies the impetus not only for everyday mendacity but also for artful systems of linguistic subterfuge. As Daniel Heller-Roazen observes, human beings seem to have an innate impulse to “break and scatter” language, to alter their native idioms in order to conceal, bewilder, and dissimulate. In his fascinating Dark Tongues—which might be construed as either a highly episodic history or a collection of case studies ranging across eras and cultures—Heller-Roazen investigates this tendency, paying particular attention to those instances when secret language becomes intertwined, if not interchangeable, with poetry.
He commences with an engrossing discussion of cant, introducing us to a gang of Burgundian bandits known as the Coquillars. In 1455, a prosecutor created a glossary of the Coquillars’ “refined” argot: “Jour is torture. . . . When one of them says, ‘Estoffe!’ it means that he is asking for his booty.” Asking for one’s booty by yelling “Stuff!” hardly seems refined, but to the prosecutor, the descriptor connoted sinister craftiness.
What do I mean by “modernity,” in the special sense I discovered through the Calcutta I knew as a child? Not electric lights, telephones, cars, certainly, though it might encompass these—we had plenty of those in Bombay. I’ll keep it brief: by “modernity” I have in mind something that was never new. True modernity was born with the aura of inherited decay and life. My first impressions of Calcutta from the mid-sixties are of a Chowringhee whose advertisements shone through the smog; and of my uncle’s house in Pratapaditya Road in Bhowanipore, which, with its slatted windows, seemed to have stood in that place forever. It was built, in fact, roughly forty years before my first becoming conscious of it. Similarly, the city itself—which is by no means old by the standards of Rome, Patna, Agra, or even London—is, actually, fairly new, its origins traceable to three hundred and twenty years back in time, the groundwork for the Calcutta we now know probably laid no more than two centuries ago. Yet if you look at paintings and photographs, and see old films of the city, you notice that these walls and buildings were never new—that Calcutta was born to look more or less as I saw it as a child.
More here.
Thessaly La Force in Vice:
Marilynne Robinson was my fourth and final workshop instructor at the Iowa Writers’ Workshop. She is an intimidating intellectual presence—she once told us that to improve characterization, we should read Descartes. When I asked her to sign my copy of Gilead, she admitted she had recently become fascinated by ancient cuneiform script. But she is also generous and quick to laugh—when she offered to have us to her house for dinner, and I asked if we ought to bring food, she replied, “Or perhaps I will make some loaves and fishes appear!” Then she burst into giggles. After receiving my MFA this May, I left Iowa believing that there’s no good way to be taught how to write, to tell a story. But there is also no denying that Marilynne has made me a better writer. Her demands are deceptively simple: to be true to human consciousness and to honor the complexities of the mind and its memory. Marilynne has said in other interviews that she doesn’t read much contemporary fiction because it would take too much of her time, but I suspect it’s also because she spends a fair amount of her mental resources on her students. Our interview was held on one of the last days of the spring semester. The final traces of the bitter winter had disappeared, and light filled the classroom, which now felt empty with just the two of us. My two years at Iowa were over, and I selfishly wanted to stretch the interview for as long as possible.
VICE: You recently told the class you had discovered the ending to your new novel—or so you hoped. How does that happen for you? How do you know?
Marilynne Robinson: A lot of the experience of the novel—after the beginning—is being in the novel. You set yourself with a complex problem. If it’s a good problem or one that really engages you, then your mind works on it all the time. A novel by its nature is new. The great struggle, conscious or unconscious, is to make sure that it is new. That it actually has raised issues that deserve to be dealt with in their own terms. They’re not terms that you have seen elsewhere. It’s sort of like composing music. There are options that open and options that disappear, depending on how you develop the guidelines. You think about it over time. And then something will appear, something that is the most elegant response to the question that you’ve asked yourself. And it can absorb the most in terms of the complexities that you’ve created.
Sean M. Carroll in Nature:
Thomas Pynchon's novels have several recurring themes: paranoia and conspiracy, pastiches of high and low culture, synchronicity and coincidence, shadowy networks lurking around every corner, and the impact of science and technology. With the coming of the Internet age and the surveillance society that sprang up in the wake of 11 September 2001, it seems as though reality has finally caught up with his vision. In his latest work, Bleeding Edge, Pynchon takes full advantage of this convergence. The first question asked of a new Pynchon book is: is this one of the sprawling, spiralling, time-tripping monsters with innumerable characters and a plot that is tricky to bring into focus, like Gravity's Rainbow or Against the Day; or is it one of the fun detective stories with a well-defined protagonist, like The Crying of Lot 49 or Inherent Vice? Bleeding Edge is definitely in the latter category. There is a colourful cast of memorable personalities, and high jinks often ensue, but the tale is told linearly, from the point of view of an acknowledged main character, with something approximating an explicit goal. The year is 2001. The dot-com bubble has just burst and Silicon Alley, New York's version of Silicon Valley, is in disarray. The Internet revolution is just beginning to gather steam. And, of course, the imminent 11 September attacks loom over every page.
The novel begins simply, in the mundane beauty of an ordinary morning. Maxine Tarnow is walking her kids to school in Manhattan on the first day of spring, stopping to admire the sunlight shining through a pear tree's blossom. The lapsed-licence fraud investigator is about to be drawn into a sinister web of intrigue. An old acquaintance asks her to investigate the suspiciously successful dot-com for which he is filming a corporate documentary. Poking around brings Tarnow into contact with shady hackers, gregarious Italian–American venture capitalists, Mossad agents, bloggers, petty fraudsters who are in over their heads, trophy wives, a private investigator whose primary tool is his sense of smell, a pair of disarmingly likeable Russian gangsters with a fondness for hip hop, and a mysterious government operative. Some will be exiled, some will run away and some will carry on; not all will survive.
One, two, three—is the ring of chairs around our daily
table the right size? Is it time to stretch open
or do right by? Is the operative term underfoot,
undercapitalized, or under duress?
Every room seems to have a ceiling mirror
and here we are: dressed up, dressed down,
hand to mouth, a spray of lonesome hair, a tuft
of camaraderie, a swag of hope, crown of thorns.
Are we headed for Lake Dry Dock or a wide green
barge on the Nile? If I had to choose,
what would I wish inside me from this month's love?
A stray fragrance, ravaged memory, safe
echo? Or a swoon of repeating cells,
an undertow of more? I'm not sure I can
look up from my plate. This morning's yolk
is glowing around a jot, a tiny knob of the possible
and my lap is yellow with longing.
by Ellen Doré Watson
from We Live in Bodies
Alice James Books, 1997
Monday, September 23, 2013
by Paul Braterman
Alvin Plantinga claims that minds produced by undirected evolution could not even be trusted to interpret day-to-day experience. From this he infers that undirected evolution is false, and belief in it self-contradictory. Darwin doubts our capacity to think sensibly about whether or not there is a God, while Plantinga regards the fact that we can think about reality at all as proof of His existence. In Part II of this essay, I will discuss Plantinga's views in more detail, and show that they arise, not so much from anything unusual in his epistemology, as from a profound misunderstanding of the workings of evolution.
Darwin's correspondence includes extensive discussion of religious matters, but it could be argued that what he says there is tempered to his audience. However, his private Autobiography includes a short but revealing chapter on religious belief, and that is what I mainly draw on here. The family regarded this as so contentious that it was not made public in full until 1958, and I see no reason to regard it as anything less than a full and open account. In less than four thousand words, he traces his progress from rigid orthodoxy to a principled rejection of all dogmatic positions. In the process, he lays out with admirable brevity the standard arguments against religion, using language so clear and striking that one hears echoes of it today, even, perhaps unwittingly, in the arguments used by his opponents.
Darwin initially contemplated becoming a clergyman. He tells us that he "did not then in the least doubt the strict and literal truth of every word in the Bible", and was much impressed by Paley's argument from the perfection of individual organisms to the existence of an intelligent creator. He was still quite orthodox while on the Beagle, but in the two years after his return he reconsidered his position, and gradually came to reject orthodox religion for many reasons. Old Testament history is manifestly false (he cites the Tower of Babel, and the rainbow as a sign given to Noah), and describes its God as having the feelings of "a revengeful tyrant." As for the New Testament, the beauty of its morality may be due to selective interpretation. The New Testament miracles (and here I think he includes the Virgin Birth and the Resurrection) beggar belief in a more scientific age, and the Gospels describing them are mutually contradictory, and written long after the events they claim to describe. For a while, he hoped that new archaeological discoveries would confirm the Gospel story, but gradually he moved towards total rejection on moral, as well as historical and logical, grounds.
One fascinating thing about logic is that it is common property. We all reason from what we already believe to new conclusions. Sometimes logic helps us to make up our minds; by thinking through the implications of an idea, we can weigh its merits against others. To think at all is to employ logic; as Aristotle noted, even to question logic is to deploy it. Logic is inescapable.
Logic's inescapability explains its grip on us. Exposing a logical error is always a winning argumentative strategy. And that's because no matter how deeply people might otherwise disagree about other important matters, we all embrace the strictures of logic and the standards of good reasoning that they supply. Oddly, the universality of logic also explains why logic is so often misapplied. As Charles Peirce observed, few actually study logic because everyone thinks himself an expert. Consequently we all strive to be rational; yet there is a lot of poor reasoning around.
A complicating feature is that our powers of logic are frequently exercised within interpersonal contexts of disagreement with others. In these contexts, extra-rational factors -- social standing, good manners, pressures to conform, and so on -- can infiltrate our logical activities and lead us astray. And yet it is undeniable that reasoning is a collective endeavor. In order to reason well, we must reason with others. But reasoning with others forces us to confront disagreement. Accordingly, the study of logic leads us to the study of argumentation, the processes of interpersonal reasoning within contexts of disagreement.
There are at least two tracks upon which argumentation theory travels. One is explored in our forthcoming book Why We Argue (And How We Should): A Guide to Political Disagreement (Routledge 2013). An alternative is suggested by the pseudonymous author ("Protagoras") of the first installment in a series at The Guardian about "How to Argue." (Interestingly, Protagoras's opening column is titled "Why We Argue -- And How To Do It Properly" -- further evidence that the subject matter is common property.) The historical Protagoras held that "man is the measure of all things," and we suspect that this is the inspiration for the current Protagoras's proposal that we argue because we find ourselves needing to convince others to agree with us. Note that the need here is practical; disagreement obstructs plans for action. Successful argument, then, removes or dissolves such obstacles by convincing others to share one's view. Protagoras hence associates proper argument with the "art of rhetoric," the skill of bringing others in line with one's own thoughts. It should be mentioned that this art is not as manipulative as it might appear, for the aim is not simply to compel agreement, but to actually convince others of one's view. As it turns out, the artful rhetorician must take careful account of the reasons and commitments of his or her audience; the rhetorician must attempt not only to reason with the audience, but to reason from the audience's own premises to the rhetorician's preferred conclusion.
by Elatia Harris
Zev Robinson: It was a long process of discovery. The last place I thought I’d end up, after living in several large cities including New York and London, was a Spanish village of fewer than 800 people, where my wife is from, and where my father-in-law works and harvests his vineyards.
When we lived in London, I remember looking at a bottle of wine from this region in a supermarket, and thinking how few people understood all that went into its making. After we moved here, I was taking a walk through the vineyards one day, and got the idea of making a short film about how the grape gets from the vines here to bottles in the UK.
EH: Are you a wine connoisseur -- in a big way?
ZR: I knew nothing about wine at the beginning of all this, but am always interested in processes, the history that brings an object into being.
at one ment
—Yom Kippur, 2013
To find the means to a mend
To try a new take to forsake a mistake
To unfold the past and straighten its bend
To un-muddy a pool and make it clear
To lift the flat rock of our self and let the sun do its work
To yank the inside out and give it some air
To build a whole man of the parts of a jerk
To morph a long fall into a hairpin turn
To lance a boil and do what's best for us
To kindle the badly done and watch it burn
To unbury the past and make its corpse a Lazarus
To sew a split in one cloth now two
To impossibly do what the humble do
To end a grudge to make us whole again:
by Jim Culleny
by Gerald Dworkin
In this country 58% of male infants are operated upon shortly after birth. A part of the body is cut off and the operation usually does not use an anaesthetic. There are three relevant features which prompt ethical reflection. The infants cannot consent to the operation. There is no convincing evidence that the operation promotes the health of the infant. The operation is usually motivated by cultural reasons--usually of a religious nature. The operation, of course, is circumcision.
Very recently the operation has come in for legal scrutiny by courts and legislatures in Germany. In 2010 a young Tunisian immigrant brought her four-year-old son to Cologne University Hospital. He was suffering from a postoperative hemorrhage after a circumcision had been performed by a surgeon two days before. Surgeons stopped the bleeding. The mother, who appeared to be in shock, mentioned a circumcision but was confused about who performed it and whether it was her decision or her husband's. The staff called the police, who took testimony from the ER personnel. This led to a trial charging the surgeon with criminal bodily harm.
The District Court judge acquitted on the grounds that there was no evidence of malpractice and that circumcision was protected by parental consent and the parents' religious freedom. A Court of Appeals overruled the District Court, arguing that parental rights are limited by the best interests of the child and by the child's right to bodily integrity. Nevertheless, the court accepted the acquittal of the surgeon on the grounds that he had good reason to believe that what he was doing was legal.
The decision stirred up an enormous controversy in the media and the public. Clearly, a German court's decision to make a Jewish religious obligation illegal was considered outrageous. Chancellor Merkel petitioned the Bundestag to take immediate action, and in an emergency session it voted to draft legislation that would ensure the legality of circumcision.
(For more -- much more -- on the history and the nitty-gritty legal details see H. Pekarek, "Germany's Circumcision Indecision -- Anti-Semitism or Legalism?")
I am interested in the legal issues only as a special case of what the limits of the criminal law ought to be. The issue I want to discuss is whether the state has the right to limit circumcision and, if so, whether it ought to do so. Note that these are distinct issues. Not everything that one might think is within the legitimate scope of state interference ought to be legislated. One might think that many lies are both wrong and harmful, and so something that a state has the right to consider making illegal, without thinking that it would be a good idea to make all lying a criminal offense. This might be because one thought that it would be impractical to do so, or that it would be destructive of many relationships to do so, or that it would over-burden the court system, or that the effects on personal relationships would be worse than leaving people free to lie. But all of these reasons for not legislating are consistent with believing that the state would not be over-stepping its legitimate powers were it to make lying a crime.
Only Mars Will Save Us Now: Space Exploration and Terrestrial Sustainability as Competing Environmental Strategies
by Liam Heneghan
More than at any other conference I have attended, participants in the annual Mars Society meeting, which was held this year in Boulder, Colorado (August 2013) — their 16th such meeting, my first — like to nod their agreement. In contrast, attendees at the meetings I more regularly frequent, concerning the ecological fate of the planet, signal their comprehension with aghast motionlessness. When Robert Zubrin, director of the (currently Earth-bound) Mars Society, announced in Boulder this summer that Mars is our future, the audience nodded. Rather, I should say, we nodded.
Not only is a manned mission to Mars technically feasible with existing, or almost-existing, technology, Zubrin insists, but it is desirable for us to go to Mars sooner rather than later. He was reasserting an argument that he has been making for some time. In The Case for Mars — The Plan to Settle the Red Planet and Why We Must (1996) he set out his blueprint for Mars Direct, a plan for manned missions to Mars that would pave the way for colonization and would be both cost-effective and possible with current technology.
Why should we go to Mars? There are economic arguments in favor of doing so, Zubrin claims. Certain elements that are hyper-available on Mars, such as the deuterium used in nuclear reactors, could be profitably used on Earth. Additionally, rare metals -- platinum, gold, and silver -- can be recovered from Mars and returned to Earth. The economic arguments are important to the case for Mars, but central to Zubrin’s argument is what the exploration of Mars says about us as a species. We should go because we can; it’s who we are. According to Zubrin, “virtually every element of significant interest to industry is known to exist on the Red Planet”. Of all the planets in our solar system, Mars has by far the greatest potential for self-sufficiency. The resources on Mars will cater both to the initial colonists and to the subsequent expansion of a civilization on the Red Planet; subsurface accumulations of water, for example, can provide supplies to explorers. Moreover, the colonization of Mars “reaffirms the pioneering character of our society.” Drawing parallels to Roald Amundsen’s successful traverse of the wilderness of Canada's Northwest Passage in 1903, an expedition that adopted a “live off the land” strategy, Zubrin appeals to a pioneering grit and esprit in forging his plans for Mars. Summarizing his reasons for colonizing Mars, Zubrin wrote, “For our generation and many to follow Mars is the New World”. Considering that, as of September 9, 2013, more than 200,000 people had applied for a one-way settlement mission to Mars at the Mars One website, it would seem that Zubrin’s assessment is confirmed.
Claudio Bravo. Circe. 1986.
Oil on canvas.
by Leanne Ogasawara
I almost hate to bring it up, but does anyone recall the media spectacle when China hosted the Olympic Games in 2008? At that time, I had not consumed American media in about 15 years, so imagine my shock when I came home to California in the summer of 2008 and turned on the TV during the opening ceremony...
I could not believe my ears.
For me, perhaps the most memorable "story" being trotted around by "intellectuals" was the supposed similarity between the Beijing opening ceremony and that of the 1936 Games staged by Nazi Germany. This was repeatedly stated -- but never argued -- in much the same way as Bush's "Axis of Evil" comparison. And one wondered whether liberals would have dared to "go there" without backing up their claims had they been talking about France or Austria, for example. Or Russia? It was pretty diminishing -- if not patronizing -- to the Chinese.
Zhang Yimou and Leni Riefenstahl? Really?
I thought the worst days were behind us after that. A few days ago, however, a friend posted on Facebook the following open letter about Syria and the "left-wing" response:
In the letter, the author describes the discussion on the left in the following way:
It is, rather, an ideologically driven habit of twisting facts so that they conveniently fit into a pre-constructed narrative about 'those people' and how they do things. It is, in other words, Orientalism.
That American journalists traffic in narratives-as-consumer-products is somehow understandable given the economic realities, but I have been really taken aback by the equally troublesome response we are seeing from the left-wing elite. As Shiar suggests, to frame non-intervention as some kind of morally elevated action (i.e., as "anti-war") is problematic -- for not only does it make no mention of what Syrians want or think, but the suffering we are seeing has been absolutely staggering.
There are some very compelling practical arguments for non-action, don't get me wrong. But this feels lost in what is a tidal wave of liberal insouciance -- or twitter-talk -- whether basing opinions on erroneous comparisons to Iraq (really?) or on anti-war slogans and (to quote Shiar again) "obsessing about big politics from a statist perspective: regime change, foreign intervention, regional war, Israel, Iran, blah blah blah."
Obama, without any support on the left or the right, has shown a surprising degree of honest resolve.
Historians have long eschewed the term "Dark Ages." Few of them still use it, and many of them shiver when they encounter it in popular culture. Scholars rightly point out that the term, popularly understood as connoting a time of death, ignorance, stasis, and low quality of life, is prejudiced and misleading.
And so my apologies to them as I drag this troublesome phrase to center stage yet again, offering a new variation on its meaning.
In this essay I am taking the liberty of modifying the term "Dark Ages" and applying it to a modern as well as a historical context. I use it to refer to a general culture of fundamentalism permeating societies, old and new. By "Dark Age" I mean to describe any large-scale effort to dim human understanding by submerging it under a blanket of fundamentalist dogma. And far from Europe of 1,500 years ago, my main purpose is to talk about far more recent matters around the world.
Life is, of course, a multi-faceted affair. The complex relationships among individuals and between individuals and societies produce a host of economic, cultural, political, and social manifestations. But one of the defining characteristics of the European Dark Ages, as I am now using the term, was the degree to which those multi-faceted aspects of the world were flattened by religious theology and dogma. As the Catholic Church grew in power and spread across Europe from roughly 500 to 1500, it was able, at least to some degree, to sublimate political, cultural, social, and economic understanding and action under its dogmatic authority. In many realms of life far beyond religion, forms of knowledge and action were subject to theological sanction.
Those who take pride in Western civilization, or even those like myself who don't necessarily, but who simply acknowledge its various achievements alongside its various shortcomings, recognize a series of factors that led to those achievements. Some of those factors, such as colonialism, are horrific. Some, like the growth of secular thought, are more admirable.
Not that secular thought in and of itself is intrinsically laudable; maybe it is, though I don't think so. But rather, that the rise of secular thought enabled Europe, over the course of centuries, to throw off its own self-imposed yoke of religious absolutism. And that freeing itself in this way was one of the factors spurring Europe's many impressive achievements over the last half-millennium.
Most denizens of what was once known as the Christian world, including various colonial offshoots such as the United States and Australia, now accept and even take for granted a multi-faceted conception of life and human interaction. For most of them, including many of the religious ones, it is a given that moving away from a world view flattened by religion, at the very least, facilitated the development of things like science and the modern explosion of wealth. Of course the move from a medieval to a modern mindset also unleashed a variety of problems; but on balance, relatively few Westerners would willingly return to any version of medieval Christian theocracy.
by Jalees Rehman
The United States Census Bureau recently released the results from its 2012 survey of income, poverty and health insurance in the United States. One of the most disheartening results is the high prevalence of poverty in the United States.
The term "poverty" is of course a relative term. The poverty thresholds in the United States depend on the size of a household and are adjusted each year. Currently, poverty for a single person household is defined as an average monthly income of $995 or less– taking into account all forms of earnings including unemployment compensation, workers' compensation, Social Security, veterans' payments, survivor benefits, pension or retirement income, interest, dividends, alimony, child support as well as other sources. A four-person family consisting of two adults and two children is considered to live in poverty if they have to live on an average monthly income of $1,940 or less. This is still a far cry from the global definition of poverty used by the World Bank, which describes fellow humans who have to survive on an income of less than $1.25 per day (or $38 per month). But the US, a country which considers itself as being among the wealthiest in the world, has to face the fact that 15 percent of its population - 46.5 million people – live in a state of poverty!
We worry about the faltering economies of Greece, Cyprus, Spain and Portugal, but the US Census reminds us that the number of poverty-stricken people in the US is roughly equal to that of the total population of Spain, and more than twice the size of the combined populations of Greece, Portugal and Cyprus.
Sunday, September 22, 2013
Kenan Malik in Pandaemonium:
‘Cultural values that oppress and diminish women have no place in our society’, wrote the journalist Alison Pearson last week. I agree. The values embodied in the burqa and the niqab, the belief that women should be hidden from view for reasons of modesty or religious belief, should be trashed wherever they appear. But such values can be challenged, and new ones crafted, not top down through state prohibitions, as Pearson and others suggest, but only bottom up through social engagement. That is why, from the other side of the debate, Tariq Modood’s insistence that people should be ‘required’ to show respect towards different cultural mores, and that public arrangements be adapted to accommodate them, is also so problematic; it is an approach that eviscerates both civil society and the idea of freedom. The corollary to the right to wear the burqa is the right, indeed in my eyes the obligation, to challenge the practice of wearing it.
It is not just in the controversy over the burqa, but much more broadly in our discussions about culture and values, that the obsession with the state, and with bans and prohibitions, and the failure to nourish civil society, or even to grasp its importance, damages social life. If we want to get beyond the veil, in the sense both of moving the debate on, and of ridding the world of such medievalism, we need to think less about state proscriptions, and more about the cultivation and the transformation of civil society.
From Necessary and Proportionate:
As technologies that facilitate State surveillance of communications advance, States are failing to ensure that laws and regulations related to communications surveillance adhere to international human rights and adequately protect the rights to privacy and freedom of expression. This document attempts to explain how international human rights law applies in the current digital environment, particularly in light of the increase in and changes to communications surveillance technologies and techniques. These principles can provide civil society groups, industry, States and others with a framework to evaluate whether current or proposed surveillance laws and practices are consistent with human rights.
These principles are the outcome of a global consultation with civil society groups, industry and international experts in communications surveillance law, policy and technology.
Privacy is a fundamental human right, and is central to the maintenance of democratic societies. It is essential to human dignity and it reinforces other rights, such as freedom of expression and information, and freedom of association, and is recognised under international human rights law. Activities that restrict the right to privacy, including communications surveillance, can only be justified when they are prescribed by law, they are necessary to achieve a legitimate aim, and are proportionate to the aim pursued.
Before public adoption of the Internet, well-established legal principles and logistical burdens inherent in monitoring communications created limits to State communications surveillance. In recent decades, those logistical barriers to surveillance have decreased and the application of legal principles in new technological contexts has become unclear.
Forget the dire predictions of a looming shortfall of scientists, technologists, engineers, and mathematicians.
Robert N. Charette in IEEE Spectrum:
You must have seen the warning a thousand times: Too few young people study scientific or technical subjects, businesses can’t find enough workers in those fields, and the country’s competitive edge is threatened.
It pretty much doesn’t matter what country you’re talking about—the United States is facing this crisis, as is Japan, the United Kingdom, Australia, China, Brazil, South Africa, Singapore, India… the list goes on. In many of these countries, the predicted shortfall of STEM (short for science, technology, engineering, and mathematics) workers is supposed to number in the hundreds of thousands or even the millions. A 2012 report by President Obama’s Council of Advisors on Science and Technology, for instance, stated that over the next decade, 1 million additional STEM graduates will be needed. In the U.K., the Royal Academy of Engineering reported last year that the nation will have to graduate 100 000 STEM majors every year until 2020 just to stay even with demand. Germany, meanwhile, is said to have a shortage of about 210 000 workers in what’s known there as the MINT disciplines—mathematics, computer science, natural sciences, and technology.
The situation is so dismal that governments everywhere are now pouring billions of dollars each year into myriad efforts designed to boost the ranks of STEM workers.
More here.
The Goat
El Dorado Village. Trinidad
I don't want to kill the animal.
I don't want to kill the goat.
I don't want to bring the machete
of subjects and predicates down
on Bobby's wedding for his daughter.
By hack saw, cleaver, and knife,
I don't want to render
the body and spirit of Boyo
into edible bits,
no matter how delicious.
I want the goat whole.
There is nothing to prove to the goat
as Shaffina and her sister watch
in black hijabs from the house.
He doesn't need to be led by a rope
and relieved of his life
in a little spurting fountain,
trussed up by a hind leg
in the face of his own cage
beneath the flimsy galvanized
in service to what blank red Vatican
he knows not: the poem.