Saturday, August 22, 2015
“Notes on the Death of Culture: Essays on Spectacle and Society” is a new nonfiction diatribe by Mario Vargas Llosa, or (should I say) by the Spanish-language Peruvian novelist, lapsed Catholic, last living public face of the Latin American “boom” and 2010 Nobel laureate in literature Mario Vargas Llosa, the author of over two dozen previous books. The subject of this one is “our” lack: of common culture, or common context, common sets of referents and allusions, and a common understanding of who or what that pronoun “our” might refer to anymore, now that even papers of record have capitulated to individually curated channels and algorithmicized feeds. “Notes” begins with a survey of the literature of cultural decline, focusing on Eliot’s “Notes Towards the Definition of Culture,” before degenerating into a series of squibs — on Islam, the Internet, the pre-eminence of sex over eroticism and the spread of the yellow press — most of which began as columns in the Spanish newspaper El País. All of which is to say that Vargas Llosa’s cranky, hasty manifesto is made of the very stuff it criticizes: journalism.
Vargas Llosa’s opening essay reduces its Eliotic ur-text to its crassest points, but my own version here must be crasser: After all, I have six browser tabs open, and my phone has been beeping all day. Eliot defines culture as existing in, and through, three different spheres: that of the individual, the group or class, and the entire rest of society.
You occasionally think living in Pakistan is an advantage. Since so much is obviously unsayable, you have developed a heightened sensitivity to the ways in which power operates on speech, not just there but everywhere. It is like living in a desiccated nook on the cliff wall of some dry, desert valley. Looking out from your nook you can see the forces of erosion at work. Erosion reshapes everything. One day soon, though hopefully not very soon, your nook, too, will be gone.
You see from your nook that humanity is afflicted by a great mass murderer about whom we are encouraged not to speak. The name of that murderer is Death. Death comes for everyone. Sometimes Death will pick out a newborn still wet from her aquatic life in her mother’s womb. Sometimes Death will pick out a man with the muscles of a superhero, pick him out in repose, perhaps, or in his moment of maximum exertion, when his thighs and shoulders are trembling and he feels most alive. Sometimes Death will pick singly. Sometimes Death will pick by the planeload. Sometimes Death picks the young, sometimes the old, and sometimes Death has an appetite for the in-between.
Before Modiano won the Nobel Prize, this most singular writer, noted for his elliptical plots and regretful tone of voice, had barely caused a ripple in the English-speaking world. Only eight of his 30 novels had been translated into English and most of those had fallen out of print. But since the award, publishers in Britain and the US have been falling over themselves to have their own Modiano moment.
Last year, Yale University Press rushed into print Suspended Sentences, a standalone book comprising a trio of newly translated novellas — Afterimage (1993), Suspended Sentences (1988) and Flowers of Ruin (1991). This month sees the UK publication of Bloomsbury’s Occupation Trilogy, a retrospective grouping devised by his Spanish publisher that constitutes translations of Modiano’s first three novels, originally published in France between 1968 and 1972. And in September, MacLehose Press will publish the first English-language translations of Pedigree and his most recent novel So You Don’t Get Lost in the Neighbourhood, which came out in France last year; they are to be published in the US by Yale and Houghton Mifflin Harcourt. In January, MacLehose will also bring out new translations of Modiano’s 2007 novel In the Café of Lost Youth and The Black Notebook (2012).
The following remarks were delivered at The New Criterion’s gala on April 29, 2015 honoring Charles Murray with the third Edmund Burke Award for Service to Culture and Society.
We are living in a political system that has tied itself in knots. “Cleaning house” in Washington will do nothing to untie those knots. When it comes to an explanation of why government under both Democrats and Republicans has become so pathetically ineffectual across the board, even at simple tasks, a powerful underlying explanation is that American government suffers from an advanced case of institutional sclerosis.
Mancur Olson argued that there’s only one way to recover from advanced institutional sclerosis: be utterly defeated in a world war. He compares the postwar experiences of Germany and Japan with those of Britain and France to make his case. Germany and Japan had to start from scratch. That’s precisely why they were able to grow so much more quickly after the war than Britain and France, which won the war and were thereby encumbered by the survival of their prewar institutions—and their prewar sclerosis. How did the United States government avoid institutional sclerosis through almost two centuries of its existence? The answer is simple: the founders set up a system that by its nature prevents institutional sclerosis from getting out of hand. The enumerated powers restricted the number of favors within the power of government to sell. Sclerosis is impossible if no amount of lobbying can give Congress the power to satisfy the desires of the special interests.
And that brings me to my second reason for arguing that we cannot roll back the reach of government through the political process: the constitutional revolution that occurred from 1937 through 1943.
Colin Marshall in Open Culture:
One often hears lamented the lack of well-spoken public intellectuals in America today. Very often, the lamenters look back to James Baldwin, who in the 1950s and 1960s wrote such powerful race-, class-, and sex-examining books as Go Tell It on the Mountain, Giovanni’s Room, and The Fire Next Time, as one of the greatest figures in the field. Though Baldwin expatriated himself to France for much of his life, he seems never to have let the state of his homeland drift far from his mind, and his opinions on it continued to put a charge into the grand American debate.
Upon one return from Paris in 1957, Baldwin found himself wrapped up in the controversy around the Civil Rights Act and the related movements across the south. He wrote several high-profile essays on the subject, even ending up himself the subject of a 1963 Time magazine cover story on his views. That same year, he went on a lecture tour on race in America which put him in close contact with a variety of student movements and other protests, whose efficacy he and Malcolm X debated in the broadcast above.
Fareed Zakaria in The New York Times:
The world has been horrified but also puzzled by the rise of ISIS. How does one comprehend its brutality and success? What is its likely path? In March 2015, The Atlantic offered an answer, in an analysis by Graeme Wood that quickly became the most widely read essay in the magazine’s 158-year history. Titled “What ISIS Really Wants,” it focused on the ideology that animates the group. Understand its ideas, Wood suggested, and you will understand the phenomenon and how to fight it. Many other, more polemical explanations of jihadi terrorism today — from Bill Maher to Sam Harris — also shine a spotlight on the ideas behind the mayhem.
Most intellectuals think ideas matter. In one of his most famous and oft-quoted lines, John Maynard Keynes declared, “Practical men who believe themselves to be quite exempt from any intellectual influence are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.” Scott L. Montgomery and Daniel Chirot concur, arguing that ideas “do not merely matter; they matter immensely, as they have been the source for decisions and actions that have structured the modern world.” In “The Shape of the New: Four Big Ideas and How They Made the Modern World,” Montgomery and Chirot make the case for the importance of four powerful ideas, rooted in the European Enlightenment, that have created the world as we know it. “Invading armies can be resisted,” they quote Victor Hugo. “Invading ideas cannot be.”
When grief comes to you as a purple gorilla
you must count yourself lucky.
You must offer her what’s left
of your dinner, the book you were trying to finish
you must put aside
and make her a place to sit at the foot of your bed,
her eyes moving from the clock
to the television and back again.
I am not afraid. She has been here before
and now I can recognize her gait
as she approaches the house.
Some nights, when I know she’s coming,
I unlock the door, lie down on my back,
and count her steps
from the street to the porch.
Tonight she brings a pencil and a ream of paper,
tells me to write down
everyone I have ever known
and we separate them between the living and the dead
so she can pick each name at random.
I play her favorite Willie Nelson album
because she misses Texas
but I don’t ask why.
She hums a little,
the way my brother does when he gardens.
We sit for an hour
while she tells me how unreasonable I’ve been,
crying in the check-out line,
refusing to eat, refusing to shower,
all the smoking and all the drinking.
Eventually she puts one of her heavy
purple arms around me, leans
her head against mine,
and all of a sudden things are feeling romantic.
So I tell her,
things are feeling romantic.
She pulls another name, this time
from the dead
and turns to me in that way that parents do
so you feel embarrassed or ashamed of something.
Romantic? She says,
reading the name out loud, slowly
so I am aware of each syllable
wrapping around the bones like new muscle,
the sound of that person’s body
and how reckless it is,
how careless that his name is in one pile and not the other.
by Matthew Dickman
from American Poetry Review, 2008
Friday, August 21, 2015
Sarah Crown in The Guardian:
In 1527, a fleet of five ships set sail from Spain for the New World, on a mission to settle the recently discovered land of La Florida. After making landfall on the Gulf coast, near where the city of St Petersburg stands today, the expedition’s leader, Pánfilo de Narváez, headed into the country’s unmapped interior in search of the gold he had convinced himself would be found there. Within days his men became hopelessly lost; soon after they began to die, from starvation, disease, drowning and the depredations of local tribes. In the end, of an original contingent of 300, just four survived: three Spanish gentlemen – Álvar Núñez Cabeza de Vaca, Alonso del Castillo and Andrés Dorantes – and Estebanico, a Moorish slave.
History, it’s said, is written by the victors. While the Narváez expedition was a catastrophe of almost absurd proportions, its name used for years afterwards as a byword for disaster, these four men (who were eventually picked up in northern Mexico by a group of Spanish slavers, “strangely dressed and in the company of Indians”), were, if not victors, at least survivors. Together, they’d lived through the worst the continent could throw at them and even, ultimately, carved out a niche for themselves as healers among the indigenous Americans. Their reappearance was a triumph bordering on the miraculous, and Cabeza de Vaca’s tale of their adventures, which he published on his return to Spain, was justly celebrated. But his story also revealed that, even among survivors, some are more equal than others. While the three Castilians were given joint billing, the slave who had been with them every step of the way for the eight long years of their exile was confined to a single line of biography. “The fourth [of us],” says Cabeza de Vaca, “is Estebanico, an Arab Negro from Azemmour.” And that’s it.
When she came across Cabeza de Vaca’s chronicle nearly 500 years after it was written, Laila Lalami was first puzzled, then fascinated by the omission.
Liz Kruesi in Quanta:
Dark matter — the unseen 80 percent of the universe’s mass — doesn’t emit, absorb or reflect light. Astronomers know it exists only because it interacts with our slice of the ordinary universe through gravity. Hence the hunt for this missing mass has focused on so-called WIMPs — Weakly Interacting Massive Particles — which interact with each other as infrequently as they interact with normal matter.
Physicists have reasons to look for alternatives to WIMPs. For two decades, astronomers have found less dark matter at the centers of galaxies than WIMP models suggest they should. The discrepancy is even worse at the cores of the universe’s tiny dwarf galaxies, which have few ordinary stars but lots of dark matter.
About four years ago, James Bullock, a professor of physics and astronomy at the University of California, Irvine, began to wonder whether the standard view of dark matter was failing important empirical tests. “This was the point where I really started thinking hard about alternatives,” he said.
Bullock thinks that dark matter might instead be complex, something that interacts with itself strongly in the way that ordinary matter interacts with itself to form intricate structures like atoms and atomic elements. Such a self-interacting dark matter, Bullock suspects, could exist in a “dark sector,” somewhat parallel to our own light sector, but detectable only through the way it affects gravity.
He and his colleagues have created numerical simulations that predict what the universe would look like if dark matter feels strong interactions. They expected to see the model fail. Instead, they found that it was consistent with what astronomers observe.
More here. [Thanks to Sean Carroll.]
Shoaib Daniyal in Scroll:
Organised by the far-right group Hindu Samhati, the procession was a commemoration of the Great Calcutta Killings, the terrible communal riot that began exactly 69 years ago on August 16, 1946. In particular, it was feting the role of a certain Gopal Chandra Mukherjee in it. Large billboards mounted on vans proclaimed Mukherjee to be “Kolkatar Rakhakarta” (Kolkata’s protector) and prefixed the title “Hindu bir” (Hindu braveheart) before his name.
It was also connecting 1946 to 2015: people carried banners which called for an end to the “torture” of Hindus in Bengal, warned politicians to stop “appeasing” certain groups in the “greed for votes” and called for an end to “Jihadi riots”. A van carried a lurid billboard asking why Kolkata’s intellectuals were silent about the everyday killing of bloggers in Bangladesh.
On a truck, flanked by hectic activity, a man on a public address system drilled everyone about how the march would be conducted: regular slogans, march in line and be peaceful. The Mamata Banerjee government also seemed interested in the last bit: there was heavy police bandobast for the event, with scores of policemen milling around, in case things went out of hand.
Rail: One thing I noticed in your criticism of the ’70s and early ’80s: you made arguments against something that Robert Hughes had written, or some other prominent review. Before long that falls out of the writing. Was that because there were less people saying things you wanted to fight with, or—
Schjeldahl: That was ambition and antagonism. It was partly a sense of embattled vulnerability, which faded. I’m no longer the insecure kid that just ran into the room. Also I think it had to do with a trend in editorial judgment. It’s like magazines don’t like you reminding people of their competitors. I wish there was more reciprocal, name-citing argument—not name-calling, please. Critics being pissy about other critics is pathetic—as if anyone cares about our tender egos. At that time, I was antagonized by my elders, as I know I now antagonize young writers who want their turns at bat. It’s natural. I remember when Harold Rosenberg died, I felt a pang of guilt. I must have harbored a dark wish that he would.
Rail: You wanted him out of the way so that you didn’t have to deal with him?
Schjeldahl: I wanted to go toward the light and he was blocking it. But of course the big nemesis of us all was Clement Greenberg, and I’m reading him again—he’s great. An asshole on many levels and after the mid-’50s he ceased to be right about much of anything, but nobody in American history has been a more acute critic, who held himself to standards of evidence and logic that make everybody else seem like dilettantes. He had the strength and the weakness of his model, T. S. Eliot—a genius for analysis and a tic of overreaching, as the Voice of Culture. Greenberg’s Art and Culture has a hilarious title—there’s a tremendous lot about art but hardly a cogent word about culture in that entire book.
On the 12th of February, 1804, Immanuel Kant lay on his deathbed. “His eye was rigid, and his face and lips became discoloured by a cadaverous pallor.” A few days following his death, his head was shaved, and “a plaster cast was taken, not a mask merely, but a cast of the whole head, designed to enrich the craniological collection of Dr. Gall,” a local physician. The corpse of Kant was made up and dressed appropriately, and, according to some accounts, throngs of visitors came day and night. “Everybody was anxious to avail himself of the last opportunity he would have for entitling himself to say, ‘I too have seen Kant.’” Their impressions seemed to be at once reverent and grotesque. “Great was the astonishment of all people at the meagreness of Kant’s appearance; and it was universally agreed that a corpse so wasted and fleshless had never been beheld.” Accompanied by the church bells of Königsberg, Kant’s corpse was carried from his home by torchlight, to a candle-lit cathedral, whose Gothic arches and spires were perhaps reminiscent of the philosopher’s elaborate, vaulted books.
In his book A Short History of Decay, E.M. Cioran once wrote: “I turned away from philosophy when it became impossible to discover in Kant any human weakness, any authentic accent of melancholy, in Kant and in all the philosophers.” Indeed, for many, the name of Immanuel Kant has become synonymous with a certain type of elaborate, grand, system-building philosophy that characterizes works such as The Critique of Pure Reason, first published in 1781.
The Wrights’ first aircraft, really a large kite, was made of bamboo and paper and had two wings, one over the other, with struts and crisscross wires connecting them. A system of control cords enabled its flight to be directed from the ground. Although they ended with a crash, the tests were successful, the brothers felt, and the following summer they built a full-sized glider with an eighteen-foot wingspan meant to be flown as a kite and, if that went well, to carry a man. Like any kite, this very large kite-glider needed wind to rise on, and Wilbur had written to Octave Chanute, an eminent engineer and a leading authority on aviation and gliders, asking for advice—they were looking for a location with good weather and reliable wind where they could conduct tests. Chanute suggested the coast of South Carolina or Georgia where there was also sand for soft landings. Poring through Weather Bureau records they became focused on a wide strip of land in the Outer Banks of North Carolina occupied only by fishermen, called Kitty Hawk. The winds there, they were informed, were reliably steady at ten to twenty miles an hour.
Kitty Hawk was isolated and accessible only by boat. It was seven hundred miles from Dayton, most of it by train. Wilbur went first. It was September and still extremely hot. It took him four days to find a boatman who agreed to take him across Albemarle Sound and they ran into a storm. The voyage was only forty miles but it took them two days. Kitty Hawk, Wilbur saw, was comprised of not much more than a lonely stretch a mile wide and five miles long with a single small hill. There were some houses but almost no vegetation. To the east lay the open Atlantic.
[Thanks to David Schneider.]
Tom Slater in Spiked:
Why Grow Up?, the latest book by American philosopher and essayist Susan Neiman, begins with a slyly subversive statement: ‘Being grown up is itself an ideal.’ In Britain today, this couldn’t seem further from the truth. Today, we’re told, is the worst time to be reaching adulthood. With economic strife, rising house prices, tuition fees and widespread youth unemployment weighing on Generation Y’s pasty back, coming of age merely means coming to the realisation that debt, destitution and living with mum and dad into your thirties is your inevitable inheritance. And that’s hardly an adulthood worth having. The question this book seeks to answer is why growing up seems such a grim prospect today. From the off, Neiman dispenses with the sort of neuroscientific apologism that we’ve become accustomed to in recent years. Within the current, fatalistic climate, adulthood has been defined down. The Science now says that adolescence stretches into your mid-twenties. But, as Neiman observes in her introduction, there’s nothing scientific about growing up. The lines between childhood, adolescence and adulthood are mutable, and have changed over time. Less than a century ago, childhood, as a time of pampered play and dependence, lasted barely a few years for the vast majority of the population. And when most young people were out of school and married by the end of their teens, adolescence – the rebellious grace period between Tonka trucks and 2.4 children – didn’t even exist.
Instead, Neiman presents adulthood as a process of coming to terms with the circumstances you find yourself in and then committing to changing them – reconciling the ‘is’ and the ‘ought’. She situates this in the history of Enlightenment thought, in which the doomy realism of Hume clashed with the rugged idealism of Rousseau. ‘It would take Kant’, Neiman writes, ‘to appreciate the fact that we must take both seriously – if we are ever to arrive at an adulthood we need not merely acquiesce in but actively claim as [our] own’. Kant’s concept of ‘the Unconditioned’, a point at which the world makes perfect sense, is central here. In order to develop into intellectual and moral maturity we must never lose sight of the idea of perfectible society – even as we come to recognise that the world is far from perfect. This rests, Neiman argues, on a refusal to rest in teeny cynicism, to be like Thrasymachus – the indignant yoof of Plato’s Republic who rejects Socrates’ concept of justice as a prop for the powerful. ‘He is convinced that he’s seen through everything. It takes a grown up to know that this doesn’t mean he’s seen it’, she writes.
Chorus of Cells
even being very old,
(or perhaps because of it),
I like to make my bed.
In fact, the starting of each day
is the biggest thing I ever do.
I smooth away the dreams disclosed by tangled sheets,
I smack the dented pillow’s revelations to oblivion,
I finish with the pattern of the spread exactly centered.
The night is won.
And now the day can open.
All this I like to do,
mastering the making of my bed
with hands that trust beginnings,
All this I need to do,
directed by the silent message
of the luxury of my breathing.
And every night,
I like to fold the covers back,
and get in bed,
and live the dark, wise poetry of the night’s dreaming,
dreading the extent of its probabilities,
but surrendering to the truth it knows and I do not;
even though its technicolor cruelties,
or the music of its myths,
feels like someone else’s experience,
I know that I could no more cease
to want to make my bed each morning,
and fold the covers back at night,
than I could cease
to want to put one foot before the other.
Being very old and so because of it,
all this I am compelled to do,
day after day,
night after night,
directed by the silent message
of the constancy of my breathing,
that bears the news that I am alive.
by Peggy Freydberg
from Poems from the Pond
publisher: Hybrid Nation, 2015
Two of the most enticing ideas in cell biology have recently converged to create a paradigm shift of epic proportions. The first is that not only is it possible for mitochondria to emigrate from their host cell, they are in fact exchanged among cells much more regularly than has ever been imagined. The second is that while happenstance mutations are clearly associated with different aspects of a litany of cancers, the canonical force consistently driving tumor initiation, progression, and metastasis is now broadly understood to be the metabolic fickleness of their mitochondria. Mike Berridge is one of a handful of researchers firmly planted at the intersection of these two now ineluctable conclusions. As an author on a recent review in Cancer Research on the horizontal transfer of mitochondrial DNA (mtDNA), he adds much needed flesh to the first order simplification that cancer is merely a mitochondrial respiratory insufficiency. Most poignantly, in noting that the hidden force driving tumor-formation forward can more generally be understood to be the reacquisition of once lost mitochondrial function, new therapeutic opportunities immediately present themselves.
Of particular note, Berridge found that the apparent need and ability of mitochondria-free primary tumor lines to re-assemble functional respirasomes, the supercomplexes responsible for respiration, differed according to cancer type. For example, breast cancer cells were found to have a unique 'threshold' level of respiration that was different from melanoma cells. Novel anticancer agents could in theory be designed to target specific components in more respiration-dependent cancer cells while leaving other cell types unscathed.
Thursday, August 20, 2015
Olivier Roy in Eurozine:
We Europeans live in secular societies and not in pre- or post-secular societies. Secularization has prevailed globally, even in Muslim countries. Of course, that does not mean that people have become irreligious. A society can consist of a majority of believers and still be secular, as in the United States.
In order to explain this assertion, which might sound paradoxical when the world is being shaken by the rise of the "Islamic State", it will be necessary to discuss the changing nature of the link between culture and religion, and particularly the "de-culturation" of religion.
There are many different ways to define secularization. As a social phenomenon, it is not an abstract process; it is always the secularization of a given religion, whose nature changes as secularization unfolds. Common definitions of secularization include three elements.
The first is the separation of state and religion, of politics and confession, without necessarily entailing a secularization of society. The United States is a good example: although there is a strong separation of church and state, levels of religiosity among the population are still high. The First Amendment of the American Constitution stresses both secularity and religious freedom. The second element in definitions of secularization is the decline in the influence of religious institutions in societies. Activities such as healthcare and education are now managed by the state or the private sector. In Europe, the churches have clearly withdrawn from the "management of society".
The third element in definitions of secularization is what Max Weber called Entzauberung – the disenchantment of the world. This does not mean that people become atheists, but that they care less about religion. Religion no longer plays a major role in our everyday lives, even if we still consider ourselves part of a religious community. In this sense, secularization corresponds to the marginalization of religion in society, rather than its exclusion.
Lee Drutman in Vox:
As the punditry attempts to make sense of the continued popularity of Donald Trump, the prevailing establishment narrative has been simple: He's an anti-establishment buffoon; he's channeling an angry mood; his moment will pass. But as Ezra Klein argued on Monday, this narrative may be wrong. What if Trump actually represents a sizable electorate that Beltway elites have marginalized?
The data on this is pretty clear. Put simply: While most elite-funded and elite-supported Republicans want to increase immigration and decrease Social Security, a significant number of voters (across both parties) want precisely the opposite — to increase Social Security and decrease immigration. So when Trump speaks out both against immigration and against fellow Republicans who want to cut Social Security, he's speaking out for a lot of people.
By my count of National Election Studies (NES) data, 24 percent of the US population holds this position (increase Social Security, decrease immigration). If we add in the folks who want to maintain (not cut) Social Security and decrease immigration, we are now at 40 percent of the total electorate, which I'll call "populist." No wonder folks are flocking to Trump — and to Bernie Sanders, who holds similar positions, though with more emphasis on the expanding Social Security part and less aggression on immigration.
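Drutman's bucketing amounts to a simple tally over two survey questions. The sketch below illustrates the classification rule only; the sample records are invented placeholders, not NES data, and only the definition of "populist" (maintain or increase Social Security, decrease immigration) comes from the excerpt above.

```python
# A minimal sketch of the article's "populist" bucket. The respondent
# records here are invented for illustration; they are not NES figures.

def is_populist(social_security: str, immigration: str) -> bool:
    """A respondent counts as populist if they want to maintain or
    increase Social Security AND decrease immigration."""
    return social_security in ("increase", "maintain") and immigration == "decrease"

# Hypothetical sample of four respondents.
respondents = [
    {"social_security": "increase", "immigration": "decrease"},  # core populist
    {"social_security": "maintain", "immigration": "decrease"},  # broader populist
    {"social_security": "decrease", "immigration": "increase"},  # elite position
    {"social_security": "increase", "immigration": "increase"},
]

populist_share = sum(
    is_populist(r["social_security"], r["immigration"]) for r in respondents
) / len(respondents)

print(f"populist share: {populist_share:.0%}")  # 2 of the 4 invented records qualify
```

Applied to real NES responses, the same tally is what yields the 24 percent (increase only) and 40 percent (increase or maintain) figures cited above.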
Over at Resurgent Dictatorship:
The Asian Infrastructure Investment Bank. The BRICS Bank. The Shanghai Cooperation Organization. What do these organizations have in common? For starters, China is a major player in each of them. And in its own way, each of them indicates how China—and other authoritarian governments, including Russia, Saudi Arabia, and Venezuela—have tired of playing by the rules of existing international institutions.
A recent panel discussion organized by the International Forum for Democratic Studies with a group of leading experts assessed how authoritarian regimes are creating new illiberal norms and institutions as part of their efforts to reshape global governance toward their own preferences. The speakers described how illiberal regimes in Eurasia, the Middle East and North Africa, and Latin America are attempting to reforge global institutional frameworks by prioritizing state sovereignty, security, and mutual non-interference over democratic accountability, government transparency, and respect for human rights.
Alexander Cooley—who analyzes the emergence of authoritarian counternorms in his July 2015 Journal of Democracy article (further discussed here on the blog)—warned that autocrats have become surprisingly adept at neutralizing and subverting the institutions that have traditionally upheld democratic norms. Cooley argued that, by introducing antidemocratic norms into regional rules-based bodies, creating alternative institutions, and cracking down on NGOs, authoritarian regimes are challenging scholarly assumptions that regional integration would contribute to the proliferation of democratic norms. Instead, illiberal regimes have discovered that these tactics can be used with particular effect at the regional level as a buffer against international criticism and to silence local voices who once played a key role in bringing human rights violations to the attention of regional organizations.
Fiona MacDonald in Science Alert:
Scientists at the Massachusetts Institute of Technology (MIT) in the US have designed a 6.6-metre-wide fusion reactor that they say could provide electricity to around 100,000 people. Even better, it could be up and running within 10 years, according to their calculations.
For decades, scientists have been trying to find a way to harness nuclear fusion - the reaction that powers stars - because of its ability to produce almost-unlimited energy supplies using little more than seawater, and without emitting greenhouse gases. But despite many promising designs, finding a way to contain and commercialise the reaction on Earth has proven far more challenging than imagined. In fact, it's a long-running joke among scientists that practical nuclear fusion power plants are just 30 years away - and always will be.
But not only does the new MIT design promise to be cheaper and smaller than current reactors, it also provides hope that commercial nuclear fusion reactors could become a reality in our lifetime, with the team explaining that devices of similar size and complexity have taken just five years to build.
"Fusion energy is certain to be the most important source of electricity on Earth in the 22nd century, but we need it much sooner than that to avoid catastrophic global warming," David Kingham, a UK-based nuclear fusion expert who wasn't involved in the research, told David L. Chandler from the MIT news office. "This paper shows a good way to make quicker progress."
To explain it very simply, nuclear fusion relies on fusing hydrogen atoms together at super-high temperatures to release enormous amounts of energy. This is different to the nuclear fission used in nuclear power plants, which is where scientists split atoms to generate electricity - a process that's less stable and also produces large amounts of nuclear waste.
So why aren't we already using nuclear fusion to generate ridiculous amounts of clean energy?
We are, it seems, pre-determined to love the taste of all things sweet. Evolutionary biologists argue that survival once depended on our ability to quickly take in high amounts of nutritional energy, a major source of such energy being found in carbohydrates, which include sugar. As frugivores, we generally prefer our fruit as ripe as possible, its edibility signalled by its sweetness. While sweetness signals calories, bitterness, in contrast, may indicate the presence of toxins. It appears that our predilection for sweetness is, like the incest taboo, a cross-cultural phenomenon, ubiquitous and, in all likelihood, innate: the facial expressions of new-borns, for example, display unambiguous pleasure when sugar is placed on their tongues. We appear, moreover, to have raided beehives for millennia: there is evidence in Mesolithic cave paintings that feeding on honey has long been part of our primate nature. We share our love of sweetness with most other mammals, the sole exception being felines.
Psychoanalysts would mobilize a different model to explain our affection for candies, cakes and chocolates, pointing to the sweetness of mother’s milk, and to the fact that, colic notwithstanding, this earliest of our encounters with nourishment tends to be firmly aligned with comfort and pleasure. Another core function of the consumption of sweets is thus also to provide solace, by transporting us back into the domain of the oral stage where the sensory responses of the mouth and taste buds reigned supreme. As Proust has shown, madeleines and their equivalents can also be the vehicles of memory, taking us back to childhood.
It is impossible, at this space of time, to record all that he said, but his voice, his gesture, his appearance and some of his very words, are indelibly printed on my memory. Looking back, I think now as I thought then, that his greatness lay in his simplicity, that direct simplicity only possessed by the truly great. And this simplicity shone out now in two special ways – in his quietness and dignity, I might even say beauty, in that noisy, ugly room, and in his direct sincerity of speech with me, who was, after all, an unknown stranger. And I was a woman. Do not mistake me; this is no self-deprecation! The point is, and to me it is vital, that I am acutely aware that there are many men with alleged claims to greatness, sex equality creeds, and intimate friendships with women, who, nevertheless, cannot, in their inner being, accept women as fellow humans, and are therefore, in my eyes, completely damned. Some, of course, are better than their creed: what Yeats’s creed was, whether he ever formulated one, I do not know. I do know that he accepted me now as one with himself. Obviously, I am not speaking of personal achievement but of human existence. From the sex point of view, or from any other, as I saw him, there was no trace of patronage in him. Fame had left him unspoilt.
Amid all the excitement about object-oriented philosophy, no one has paused to work out how talk about these new terms for relation is supposed to improve radically on the concept of “relation” in the history of philosophy. The problem is that the original sins of “relation” are not rendered entirely clear in Harman’s and his followers’ writing, apart from glib remarks about poststructuralist relationality, systems theory, and human observation. There’s really no need to overturn the concept of relation in the cursory manner of the object-oriented ontologists, because there’s already plenty in the history of philosophy since Aristotle to instruct us that relation is not always human or correlational, reciprocal, or even fixed or permanent, or anything more than a “moment” of relating that’s always vanishing by dint of becoming and decay. That’s why philosophers in the late Middle Ages commonly distinguished between relationes reales, relations among all entities apart from human perception, and relationes rationis, those relations we’ve reasoned out in our inspection of the world. Kant, for his part, knew that relation is not only aesthetic (what Aristotle derided as the “said-of” of relation; i.e., the view that relation is what we make of it). Rather, he understood that the problem of relation is exactly the same as the problem of the thing-in-itself: there are relations in the noumenal world, but we cannot think them directly because we have access only to phenomenal relations, the imperfect representations of noumenal relations. The human version of relation, in other words, isn’t the same as noumenal relation, and isn’t the only kind of relation. This idea is all over Kant’s lectures on metaphysics, which none of the object-oriented ontologists seem to know.