Monday, January 31, 2011

It Wasn't a Question We Had Asked

Robert Butler:

It was a beautiful moment in 1887 when a veterinary surgeon in Northern Ireland invented a new kind of tyre, to smooth out the bumps when his son was on his tricycle. Within ten years John Dunlop’s pneumatic tyre—inflated canvas tubes, bonded with liquid rubber—had become so successful that the American civil-rights leader Susan B. Anthony could claim, “The bicycle has done more for the emancipation of women than anything else in the world.”

The invention didn't free everyone. The raw material for the pneumatic tubes came from the rubber vines of the Congo, and the demand for rubber only deepened the damage that had been inflicted in that region by the demand for ivory. A century later, the pattern hadn't entirely changed: the coltan required for mobile phones comes from the same area, where war has claimed millions of lives over the past 12 years.

...And now “Heart of Darkness” is a graphic novel by Catherine Anyango, with the ivory domino pieces—lightly touched on by Conrad in the opening pages—looming in the foreground of the opening drawings. The theme of traceability, where things come from and the journey that they take, is vividly dramatised. It’s a story for today. One day in June last year, Jeff Swartz, CEO of the leisurewear company Timberland, woke up to find the first of 65,000 angry e-mails in his inbox. These were responses to a Greenpeace campaign that said Brazilian cattle farmers were clear-cutting forests for cattle and the leather from the cattle was going into Timberland shoes. Swartz has written up his experience for the Harvard Business Review. His first action, he says, was to admit that he didn't know where the leather came from. It wasn’t a question he had asked.

As “Heart of Darkness” has moved from one medium to another, it has made a good claim to be the single most influential hundred pages of the 20th century. If you consider its central theme—how one half of the world consumes resources at the expense of the other half—it’s easy to see its relevance becoming even greater. Only the resources will no longer be ivory for piano keys, or rubber for bicycle tyres.


The United States has discovered nearly $1 trillion in untapped mineral deposits in Afghanistan, far beyond any previously known reserves and enough to fundamentally alter the Afghan economy and perhaps the Afghan war itself, according to senior American government officials.

The previously unknown deposits — including huge veins of iron, copper, cobalt, gold and critical industrial metals like lithium — are so big and include so many minerals that are essential to modern industry that Afghanistan could eventually be transformed into one of the most important mining centers in the world, the United States officials believe.

An internal Pentagon memo, for example, states that Afghanistan could become the "Saudi Arabia of lithium," a key raw material in the manufacture of batteries for laptops and BlackBerrys.

Before cotton, slavery had been in decline in the United States, but now there was a great need for labor because picking cotton remained extremely labor-intensive. At the time of Whitney's invention slavery existed in just six states; by the outbreak of the Civil War it was legal in fifteen. Worse, the northern slave states like Virginia and Maryland, where cotton couldn't be successfully grown, turned to exporting slaves to their southern neighbors, thus breaking up families and intensifying the suffering for tens of thousands. Between 1793 and the outbreak of the Civil War, over eight hundred thousand slaves were shipped south.

At the same time, the booming cotton mills of England needed huge numbers of workers - more than population increase alone could easily provide - so increasingly they turned to child labor. Children were malleable, worked cheap, and were generally quicker at darting among machinery and dealing with snags, breakages and the like. Even the most enlightened mill owners used children freely. They couldn't afford not to.

So Whitney's gin not only helped make many people rich on both sides of the Atlantic but also reinvigorated slavery, turned child labor into a necessity, and paved the way for the American Civil War. Perhaps at no time in history has someone with a simple, well-meaning invention generated more general prosperity, personal disappointment, and inadvertent suffering than Eli Whitney with his gin. That is quite a lot of consequence for a simple rotating drum.


A friend of mine is heading back to Germany to spend a couple weeks visiting family. She surprised me by asking if there were anything I wanted her to bring back for me. Put on the spot, all I could think to ask for was any sort of Nietzsche-related memorabilia. The tackier, the better. I'm envisioning perhaps a pair of faux-marble bust bookends, or maybe a giant fake mustache. I already have a bunch of t-shirts, stickers and, yes, the beanie baby you see pictured. Shame? No, I don't have any of that, sorry.

Anyway, should I get something really awful (read: good), I'll post a picture of it.

Saturday, January 29, 2011

Low and Lonely, High as Kites; When Do You Think We'll Get It Right?

Morgan Meis:

Maybe the discontent that comes to us in winter is in the realization that every moment of joy is but a brief resting point on the greater journey toward oblivion. Or maybe that's too grand. Maybe the discontent in winter is the discontent about the fleeting quality of the present. Nothing holds still for very long, after all. Nothing that feels good stays good for very long. Even in Los Angeles, the perfect sunny day turns to night; a feeling of contentment is replaced by anxiety somewhere along the line.

If there is a greater contentment to be found, then, it is in the contentment of discontent. It is in the willingness, maybe, to have your winters and to have them in their dreariness and decay, neither surrendering completely to that discontent nor pretending to solve it. The cycle of the seasons is, after all, utterly pointless. It just goes round and round. I do not think any meaning can be found in stepping outside that cycle to explain its purpose from afar. Winter can't be made glorious, can't be transformed into endless summer.

Kant's Joke

Kant wanted to prove, in a way that would dumbfound the whole world, that the whole world was right: that was the secret joke of this soul. He wrote against the scholars in favor of popular prejudice, but for scholars and not for the people.

- Nietzsche

Complex logic and leaden, plodding prose do indeed combine to make Kant a grueling chore to read. But if I may quote from Philipp Blom's book one last time, he again manages to succinctly summarize a difficult philosopher:

In this model of history, Immanuel Kant fulfilled a similar function for the eighteenth century as René Descartes had for the seventeenth: His grand metaphysical investigation left open a door through which God could be introduced back into philosophy. Kant argued that our senses determine how the world appears to us, and that we may never be able to perceive things as they really are, the "things in themselves." But instead of accepting that we cannot know anything beyond our perception and that it makes no sense to talk of things we cannot know, he conjectured a purely essential, spiritual reality that is inaccessible to human understanding, a reality in which we might imagine a deity beyond the grasp of the senses. One can read Kant safely without compromising one's religious beliefs, which can always be safely tucked away among the "things in themselves."

Friday, January 28, 2011

Neti, Neti

Thanks to Shanna for passing along a link to an interesting blog, and even taking the time to ferret out several posts of interest from the archives! I'd say this pretty much guarantees her the coveted "Minion of the Month" award for January! But we'll have time for award ceremonies later. Cracking knuckles, getting down to bidness. This was one from last summer:

A letter to my fellow skeptics

What if, Dr Dawkins (or any fellow skeptic), you had an experience of a particular kind that you can’t really explain?

You catch a fleeting glimpse of, well, you can’t really say. Words fail you, but you try anyway: “I’ve witnessed, uh, something… some distinct experience in which I’ve seen the perfection and limitless beauty of the world. I believe, no… I know that I was seeing with utter clarity, and that it was not the way I normally see the world. I saw clearly that I was one with the universe, that unconditional love is its basis, and that I am an eternal being and not a hapless, mortal creature.” Already you sound nuts, even to yourself.

You search for the terms to explain the quality of it, and cringe when the only word you can summon is “divine.” Every stab you take at conveying your experience only amounts to a disappointing, evasive-sounding cliché:

• There is something more to us
• Everything is in its right place, we just can't see that
• There is a higher intelligence behind all this

Whatever it is, you have a very strong sense that its cultivation is immeasurably valuable, not just as a means of achieving peace and ease in your life, but for others to do the same. It is, clearly, exactly what humanity needs in order to overcome — no, transcend — its current palette of troubles. For this reason you feel it is important for others to have this experience too.

Well, I could quibble with some of the specifics, but in general, sure, I have that experience quite frequently, especially via music, and to a lesser extent, writing or art in general. What I don't understand is why this kind of experience of harmonious... je ne sais quoi is so commonly held out as incommensurate with atheism and scientific materialism. Stephen Asma is currently involved in a back-and-forth exchange that demonstrates how neither of those things requires that one jettison an appreciation of the role of emotions and mystery in human experience, for example. And someone like Steven Pinker, as hardboiled as they come, says this in The Blank Slate:

But greedy reductionism is far from the majority view, and it is easy to show why it is wrong. As the philosopher Hilary Putnam has pointed out, even the simple fact that a square peg won't fit into a round hole can't be explained in terms of molecules and atoms but only at a higher level of analysis involving rigidity (regardless of what makes the peg rigid) and geometry. And if anyone really thought that sociology or literature or history could be replaced by biology, why stop there? Biology could in turn be ground up into chemistry, and chemistry into physics, leaving one struggling to explain the causes of World War I in terms of electrons and quarks. Even if World War I consisted of nothing but a very, very large number of quarks in a very, very complicated pattern of motion, no insight is gained by describing it that way.

There's a bit of a "God of the gaps" mindset here that bothers me -- just because science has to currently admit not knowing something, or just because we don't have the vocabulary to express ourselves coherently enough for science to even begin addressing the nature of our experiences, it doesn't imply a supernatural origin or character to them. For the most part, the reason we can't explain these experiences is because there's no way to get an outside perspective from which to analyze them. The nature of language and conceptual thought is such that we can only really talk about things by contrasting them with different things. To what can you compare those moments of bliss that accompany this, uh, god's-eye view? It's sort of like not being able to see a tiny star when looking directly at it; you have to look just to the side of it to catch a glimpse of its dim light. Hence poetry and music, which often trace a circle around the experience without ever trying to pin it down. But lacking the ability to put something immediately into words, or to reduce it to its component parts like a piece of machinery spread out on the table, does not necessarily entail that we shrug our shoulders and start talking about God instead. None of these experiences, however awe-inspiring and ineffable, do anything to offset the fact that our personalities do not survive our body's death, or that there is nothing like monotheism's personal God overseeing the whole shebang. Did you get that? Let me say it again, because it's important: None of these experiences, however awe-inspiring and ineffable, do anything to offset the fact that our personalities do not survive our body's death, or that there is nothing like monotheism's personal, loving God overseeing the whole shebang.

Even someone like Nietzsche, who famously said that mystical explanations, far from being deep as commonly thought, were "not even shallow", was familiar with this sort of ecstatic communion with the spirit of life, the ground of all being, the which than which there is no whicher. The difference is, he was well aware that this is an understanding that transcends our limited, partial notions of love, peace, purpose, morality and all that good stuff. The harmony that pervades life itself includes all the notes we would call dissonant. The brief glimpses we have of "a cosmic place for everything and everything in its cosmic place" include all the things we would call discordant. If you're seeing only "positive" manifestations of this insight, I daresay you're projecting a bit. All the mindless destruction and suffering that has permeated life as we know it since time immemorial has been just as necessary to the whole, which is why Nietzsche initially called his idea of the eternal recurrence "the most abysmal thought". If you want to merge in rapturous union with the sum total of all that is, was, and ever will be, you have to accept all of it. No picking and choosing, no highlighting the parts you approve of and ignoring the rest.

We can't exist on that level of understanding, any more than we can have a permanent drug high, a neverending orgasm, or an up without a down. Our day-to-day existence requires that we settle for imperfection and partiality. We can only visit there occasionally, and whereof we cannot speak, we must pass over in silence.

Thursday, January 27, 2011

Them Vultures Robbin' Everything, Leave Nothin' but Chains

That five-sided fist-a-gon
That rotten sore on the face of Mother Earth gets bigger
The trigger's cold, empty your purse


In defense circles, “cutting” the Pentagon budget has once again become a topic of conversation. Americans should not confuse that talk with reality. Any cuts exacted will at most reduce the rate of growth. The essential facts remain: U.S. military outlays today equal that of every other nation on the planet combined, a situation without precedent in modern history.

The Pentagon presently spends more in constant dollars than it did at any time during the Cold War -- this despite the absence of anything remotely approximating what national security experts like to call a “peer competitor.” Evil Empire? It exists only in the fevered imaginations of those who quiver at the prospect of China adding a rust-bucket Russian aircraft carrier to its fleet or who take seriously the ravings of radical Islamists promising from deep inside their caves to unite the Umma in a new caliphate.

...The duopoly of American politics no longer allows for a principled anti-interventionist position. Both parties are war parties. They differ mainly in the rationale they devise to argue for interventionism. The Republicans tout liberty; the Democrats emphasize human rights. The results tend to be the same: a penchant for activism that sustains a never-ending demand for high levels of military outlays.

Orlov also insisted on mentioning the overemphasis on military expenditures and the abandoned industrial areas it leaves behind, describing them as mis-investments made by both empires.

"To this day, the former Soviet Union is littered with abandoned or semi-abandoned industrial sites just as the United States is," he noted.

By Orlov's estimation, the US military will never voluntarily quit being the world's largest oil consumer and largest polluter. Moreover, any plans suggested to the military to end its oil dependency in 30 years are an act of "desperation," he said.

"They have set their hair on fire and are running around in circles. That's their drill right now," Orlov quipped.

And yet, the progressive blogosphere thinks the best use of its time and energy is trying to make it easier for more people to be part of this institution. This is why I barely bother reading political blogs anymore.

No Complaints, but I Wish I Had More Time for My Brain

Snow. We were promised an inch and a half, and we ended up getting something close to ten. (The ladies in my life have precisely the opposite complaint, come to think of it.)

We've been lucky so far, this being the first significant storm of the season. But after the trauma of last year, I don't give a fuck. I swear by Odin's icicle-festooned scrotum, I don't want to see another snowflake for ten years, period, full stop. You know how Caligula ordered his soldiers to attack the water in the English Channel? Well, that's going to be me; stark raving mad, hacking at snowdrifts with a sword. Or maybe I'll just run screaming off into the woods, where I'll become a wendigo and prey on the local hillbillies.

Anyway, I have to hit the road in this mess in a couple hours, so the rest of my day is probably shot. But here's a story from a few weeks ago to entertain you in the meantime. And if Shanna shows up jeering about how this would barely be considered a flurry up in the frozen tundra of Canuckistan, throw something at her.

Wednesday, January 26, 2011

Some People Just Aren't the Type for Marriage and Family


But the notion that the most irresistible man in the world has marital cold feet and zero paternal instincts doesn't sit right with everybody. How can an anguished populace make sense of this crazy news? Theories about what's really going on here abound. First, of course, faster than you can say, "He must be gay," Internet commenters have been chiming in that Clooney, who has dated Italian model Elisabetta Canalis for two years, must be gay. "I know agents in Hollywood who, in the strictest confidence insists that Clooney is gay," confided one CNN poster, while another on E! quickly surmised: "He is as gay as Rock Hudson. He's fooling no one."

Others can't make sense of his quitter attitude, like the commenter on Us who said, "I don't get his negative view of marriage. He has also failed at some movies yet he tries again." And then there are those who simply refuse to believe the man. At The Stir, writer Amy Boshnack opined, "We've heard so many other celebs (and regular folk) say they were never going to marry again, just to end up eating their words," and optimistically observing that "If [Howard] Stern can marry again, I think anyone can!" And E! blogger Bruna Nessif pep talked, "Oh, chin up Georgey! Practice makes perfect," while musing, "Can't help but wonder how Elisabetta is taking this news." Here's a kooky idea. Maybe he just doesn't want to get married. Maybe he doesn't want to have kids. In fact, that's the theory I'm going with.

I've been pretty fortunate in that my mom has only dropped a couple hints about grandchildren and continuing the family line, and even that hasn't happened for quite a while. My parents wondered if I were gay as a teenager simply because I was a loner (granted, the long-haired, androgynous rocker look probably helped encourage such speculation). I don't know what they think now that I've been single for several years, but at least they don't bother me about it.

I just think it's funny how personally people take it when you refuse to identify as fish or fowl for them. You aren't interested in dating or getting married? What are you, gay? A serial killer? High maintenance? Damaged goods? A pervert with an embarrassing secret? TELL ME WHAT YOU ARE, GODDAMNIT! I'M LOSING SLEEP HERE! WHY DON'T YOU WANT TO BE LIKE ME? WHAT'S WRONG WITH ME?

And that, of course, is what I think it's really about. The spectacle of someone consciously resisting the gravitational pull of custom and habit stands as an implicit rebuke to the thoughtless way so many people stumble into the biggest decisions of their lives. Being forced to consider someone who has made different, contrary, and difficult choices and still attained contentedness is enraging to those who have sacrificed so much in deference to authority and conventional wisdom for the sake of an eventual happiness that may never even arrive. It ruins the illusion of security offered by a herd mentality.

Sunday, January 23, 2011


Adam Haslett:

Geoff Kloske, the head of Riverhead Books, publisher of George Saunders and Aleksandar Hemon, thinks current stylistic variety makes it impossible to claim we are in either a minimalist or maximalist period. “More, I fear, there is a flaccidity and casualness of style that has come from writing habits born out of e-mail and social media.” A kind of death of the sentence by collective neglect. Kloske is right that the incessant dribble of mini-messaging has made most people’s daily use of written language brutally factual in character, more private ad copy than prose. I’m old enough to have written letters to friends when I was younger, which took time and a bit of thought. Like most people, I don’t do that any more, and e-mail hasn’t replaced the habit. The writing of complete sentences for aural pleasure as well as news is going the way of the playing of musical instruments – it’s becoming a speciality rather than a means most people have to a little amateur, unselfconscious enjoyment. This isn’t the end of the world for literature. In a sense, it only intensifies its role as the repository of our linguistic imagination. But it’s a pity none the less; there’s a difference between pure spectatorship and semi-participatory appreciation. The latter is much warmer. It creates more room for fellow feeling and a bit less for the glare of celebrity and the correlative abjection of envy and fandom.

But things change. Will we still have Henry Jamesian prose in an age where popular writing has condensed from concision to near-hieroglyphics? And, is it really social media/technology that’s wrecked how we communicate or has it perhaps rather broadened the audience for text-based communication in a time when no one makes time for long thoughts? Culture: Facebook; Chicken; Egg.

Data will come. For now, we can know that schoolchildren will still read Strunk and White while texting under the table. They will still read Proust while going home to play Call of Duty: Black Ops. The snows of yesteryear are, axiomatically, historic, but we will see the creation of new works of literature. The next generation of great writers will learn how to write in the same way they always have: by reading books that move them.

I don't fear that good writing, beautiful writing, will disappear, of course; there are enough people out there willing to become modern-day Irish monks, dutifully preserving the know-how even as the overwhelming majority of our culture devolves to communicating entirely in emoticons and lolspeak. I just lament how much of the writing I encounter on a regular basis is flaccid, casual, brutally factual.

I know that words are abstractions that can never come close to capturing all of experience, but as I think back to how much I've grown to utterly adore writing, I'm sometimes astonished at how much more substantial I feel as a person in the years that I've been systematically trying to write my thoughts down, not just on the blog, but in good correspondence. I always feel some impatience and dissatisfaction with everything I write, still wanting to try again to express things just so, but I can admit that I feel so much better for the effort. I'm not surprised by this at all, even though I'm one of the odd ones who actually greatly prefer typing to writing. Reading and writing facilitate thinking, and when people never make time for the first two, well...

Saturday, January 22, 2011

The Room Behind the Shop

Sarah Bakewell, reviewing James Miller's Examined Lives (another book in my "to read" stack):

Yet his entire book conveys a sense that the genuinely philosophical examination of a life can still lead us somewhere radically different from other kinds of reflection. At the end of his chapter on Descartes, Miller cites the 20th-century phenomenologist Edmund Husserl, whom he identifies as the major exception to the rule that places most post-Cartesian thinkers on one side or the other of the personal/impersonal divide. Apropos of Descartes, Husserl wrote, "Anyone who seriously intends to become a philosopher must 'once in his life' withdraw into himself and attempt, within himself, to overthrow and build anew all the sciences that, up to then, he has been accepting."

It is an extraordinary thing to do: a project that remains "quite personal," as Husserl admitted, yet that reaches in to seize the whole world and redesign it from the very foundation. Perhaps this is what still distinguishes the philosophical life: that "once in a lifetime" convulsion, in which one reinvents reality around oneself. It is a project doomed to fail, and compromises will always be made. But what, in life, could be more interesting?

This reminded me of a passage from one of Nietzsche's letters that I liked:

Now I am engaged in shaking off what does not belong to me, be it people, friend or foe, habits, conveniences, books. I shall live in solitude for years until, as a philosopher of life, fully matured and finished, I allow myself to (and then, perhaps, must) go again amongst men.

A little less directly, it also called to mind something from Bakewell's book about Montaigne:

We should have wife, children, goods, and above all health, if we can; but we must not bind ourselves to them so strongly that our happiness depends on them. We must reserve a back shop all our own, entirely free, in which to establish our real liberty and our principal retreat and solitude. Here our ordinary conversation must be between us and ourselves, and so private that no outside association or communication can find a place; here we must talk and laugh without wife, without children, without possessions, without retinue and servants, so that, when the time comes to lose them, it will be nothing new to us to do without them.

The phrase about the "back shop" or "room behind the shop" as it is sometimes translated - the arrière boutique - appears again and again in books about Montaigne, but it is rarely kept within its context. He is not writing about a selfish, introverted withdrawal from family life so much as about the need to protect yourself from the pain that would come if you lost that family. Montaigne sought detachment and retreat so that he could not be too badly hurt, but in doing so he also discovered that having such a retreat helped him establish his "real liberty," the space he needed to think and look inward.

Whether it's having some sort of private space available for periodic retreats, or having one particular, intense period of withdrawal and reflection, I like the general theme here. I often feel a strong urge to retreat into a cocoon, only reemerging when I've shaken off everything that doesn't belong to me, all the habits and ties that formed when I was younger and more impressionable. I want to subject my entire life to a severe audit, consciously choosing as much as I can while letting go of all the aimless orbits I more or less unthinkingly happened into.

The Dust of Exploded Beliefs May Make a Fine Sunset (Slight Return)

Let’s make the village a concept and do this again. Rosalie is an agnostic. She has heard the argument that no one can prove a negative and believes it. Then she wanders into a café and reads a big book on doubt as she waits out the rain. Now she knows that the term agnosticism was invented only a hundred years ago by Thomas Henry Huxley, and has no intellectual pedigree to speak of. Huxley made it up having read about Skepticism, which is a philosophically robust proposition that asks how we can know anything at all, given the limitations of our minds and our tiny, animal perspective. Skepticism is thousands of years old and has been brilliantly explored in every age. Agnosticism is the logic of Skepticism applied to only one question, the question of whether one particular people’s imagined idea of the supernatural actually exists.

To be sensible, either you are a Skeptic about all things, which allows you to be a profoundly interesting thinker but does not allow you to claim to know anything about the world; or you are a rationalist, which means you gather evidence, try to minimize your cultural bias, and make conclusions. If you make your decisions by rationalism, you can certainly say that an idea is not to be considered as at all valid if it has no evidence to argue for it being true. In Skepticism I have to allow that possibly all of life is happening in the dream of a cosmic elephant; in rationalism, I do not. It is philosophical nonsense to take Skepticism and apply it to one belief. In rationalism, it is possible to rule on the validity of a conjecture that has no evidence.

Huxley made up agnosticism because he wanted to leave room for people to be atheists but still keep a line of hope. But that is massively wrong-headed. When people confront the truth they get used to it and see that it is not so bad. So there is no afterlife. Big deal. Life is enough. When you are dead, you are so dead that nothing should matter to you about it. When you are alive, you’re alive. Every moment is so huge, there is so much of it, and we take in so little of it. You want more life at the end? You’re hardly using the life you have now. None of us are. We already have more than we can handle. Your job is to try to know the present and the past, to expand into the now, in part by knowing what was.

Some people get on a plane to change where they are but you can transform your surroundings as well as your inner world with just a touch of new knowing.

It may be fated that my love for Jennifer Michael Hecht remain unrequited, but that's okay. As long as she keeps offering up words like these to me, I can pretend she loves me too. Sigh...

Seriously, I wish more people could write about history and ideas in such an invigorating way. Doubt: A History and The Happiness Myth are two of the most fun books I've ever read. Philosophical stories of history as told by a caffeinated poet.

Friday, January 21, 2011

I Have No Need of That Hypothesis

Spinoza was a great man and a wonderfully subversive thinker, but religious/philosophical arguments framed in mathematical terminology don't exactly make for gripping reading. So I'm just going to quote extensively from Philipp Blom's book here, as it contains a great summary of his thinking:

Choosing a mathematically inspired way of constructing his argument, with numbered definitions, axioms and propositions, Spinoza argued that only a single substance could exist, a substance infinitely modified to create the world in all its variety. In his Ethics, he states that "God, or substance, consisting of infinite attributes, of which each expresses eternal and infinite essentiality, necessarily exists," coming to the conclusion that "besides God, no substance can be granted or conceived." So far, so good for the theologian, who could appreciate that a series of strict definitions and logical conclusions had led Spinoza to proving the necessity of God's existence.

Spinoza followed Descartes and the Scholastic tradition in defining God as necessarily existing, but he contradicted Cartesian dualism by saying that nothing could exist outside or independently of God, so, by implication, there could not be two realms of the world, mind and matter, but only one single substance of divine origin, infinitely moderated into material and mental phenomena. This not only reversed Descartes' rescue of theology, but also paved the way for the eighteenth-century materialists, who would argue that the mind is a mere function of the body, not an independent entity.

But Spinoza's logical rigor had even more troubling implications for theology, particularly, and notoriously, in chapter VI of his Tractatus, in which he wrote about miracles. God, Spinoza held, exists by necessity, is substance, and is perfect. His will is the law of the universe, and his perfection means that the world as such is perfect, since no imperfection can come out of perfection without rendering the originator imperfect. It is therefore impossible that God could will anything to be less than perfect without perverting the very nature of his being. If humans do not perceive the world as perfect—if they point to death, putrefaction and suffering as evident imperfections—it is only because these things are not understandable to them as they are to God. Things may seem imperfect from a person's limited perspective, but they are necessarily perfect in the eyes of God, who created them as expressions of his own perfection.

From this seemingly abstract and pious analysis, Spinoza drew a scandalous conclusion: Stories of divine miracles, he wrote, must be born out of ignorance or even conscious deception but cannot possibly be used to infer the existence of God. God's perfection means that the laws of nature must themselves be perfect, too, and it would be impossible for God to intervene in the course of natural events, because any alteration of the course of the universe would mean introducing imperfection, which is incompatible with God's own perfection. The laws of nature—that is, God's laws—cannot be contravened, even by God himself.

Here the circle of Spinoza's virtuoso argument closes. He makes out God to be a being so perfect, so moral, and so necessary that this being could not possibly intervene in the course of nature—indeed, can no longer act at all. Miracles are nothing but misunderstood natural occurrences, he argues. The laws of the universe are synonymous with God's will and intelligence; God himself becomes a metaphor for necessity, for natural laws.

This is a crucial point. Many interpreters have made Spinoza out to be a pantheist, who sees God's will, love and Providence in every leaf of grass and every dewdrop, but this is a fundamental misunderstanding. As his contemporary followers and enemies immediately understood, Spinoza's God is nothing but a particular way of referring to the laws governing the physical world, the only world. One might praise the absolute perfection of this God, but it is soon revealed that no entity, no Creator, no loving or angry Father, no God who saves and punishes is designated by that name. There is only impersonal necessity in a material world; there is no one left to pray to. Spinoza had done nothing less than praise God out of existence.

Thursday, January 20, 2011

The 800lb Gorilla in the Garden of Eden

Stefany Anne Golberg:

Are we looking at a future of edible balconies and backyard chickens and rooftop beekeepers? Most city livers (and we are now a majority) have felt to some degree or other that a life without occasional access to nature feels empty — or, not empty enough. We make our cities bigger and bigger, and still can’t fully shake the feeling that the things people build, the things that most remind us of our humanness, also rob us of an essential part of our humanity. We have come to think this absence can only be filled by being in an environment that has nothing to do with us, that is bigger than we are. An environment we can’t control, that allows us to relinquish control when we are inside it. A lack of access to the natural world, that world we fought so very hard to protect ourselves from, has always left us a little colder inside.

...Maybe the city is not such an obvious choice for agriculture. But then again, why not? Agriculture is, after all, culture. We can cultivate fountains in the desert; why not grow tomatoes on the windowsill? Urban gardeners tell us that we don’t need to leave the city to have a relationship with nature, nor do we have to leave nature alone in order to appreciate it. Whether or not urban growing truly brings us closer to nature, I cannot say. Perhaps, though, in turning farming into an aesthetic venture, urban growing will tweak the way we currently think about agriculture.

Certainly, many urban gardeners are interested in the environmental (i.e. moral) consequences of city growing. The eco-ethical dream of those like Folke Günther is that urban gardening could move beyond aesthetic concerns and really help feed the world’s urban poor. For now, though, the movement outside my window is not subsistence farming. No one in Brooklyn is going to starve without urban gardens. Even so, urban gardeners are earnest in their agricultural pursuits. I think most commercial farmers would be pretty surprised to see how much children in Prospect Park have learned about irrigation techniques. What’s surely exciting is that urban gardeners have us imagining cities as we've never seen them, that move beyond public parks and designated green zones: rooftop apple-picking, gardens in school cafeterias, skyscrapers that emerge out of forests. The modern city as the new Hanging Gardens of Babylon. Gardens — and still Babylon, too.

I have no problem with the aesthetics of green roofs and vertical growing walls, of course. But again, the "environmental (moral) consequences of city growing", just like those of eating meat, are not going to be offset by urban gardens and farmer's markets as long as there are seven billion people and counting on Earth. Human beings cannot keep being fruitful and multiplying indefinitely while relying on cosmetic tweaks to magically produce sustainability.


Jenna Woginrich:

I would come back to meat eating, and I would do it because of my love for animals.

Every meal you eat that supports a sustainable farm changes the agricultural world. I cannot possibly stress this enough. Your fork is your ballot, and when you vote to eat a steak or leg of lamb purchased from a small farmer you are showing the industrial system you are actively opting out. You are showing them you are willing to sacrifice more of your paycheck to dine with dignity. As people are made more aware of this beautiful option, farmers are coming out in droves to meet the demand. Farmers markets have been on a rapid rise in the US thanks to consumer demand for cleaner meat, up 16% in the last year alone.

It's a hard reality for a vegetarian to swallow, but my veggie burgers did not rattle the industry cages at all. I was simply avoiding the battlefield, stepping aside as a pacifist. There is nobility in the vegetarian choice, but it isn't changing the system fast enough. In a world where meat consumption is soaring, the plausible 25% of the world's inhabitants who have a mostly vegetarian diet aren't making a dent in the rate us humans are eating animals. In theory, a plant-based diet avoids consuming animals but it certainly isn't getting cows out of feedlots. However, steak-eating consumers choosing to eat sustainably raised meat are. They chose to purchase a product raised on pasture when they could have spent less money on an animal treated like a screwdriver.

I'm sorry my vegetarian friends, but it's time to come back to the table. You can remain in the rabbit hole and keep eating your salad, but the only way out for good is to eat the rabbit.

Who, exactly, is she talking to here? Most dedicated vegetarians I know consider it a moral act to refrain from killing and eating other animals; they just extend the circle of compassion, as the saying goes, to include pigs, cows and poultry in addition to the dogs, cats and toddlers most of us would never dream of slaughtering. The meaning is in the act itself, not the hypothetical result. Woginrich's justification for meat-eating would seem a bit too "ends justify the means" for them, allowing a little slaughter now in the vague hope of ending up somewhere down the line with less overall. So who, then, are these squishy vegetarians who are fine with eating meat as long as they can believe that the animal lived a flourishing life of eudaimonic well-being before willingly placing its throat against the butcher's knife, yet apparently don't have access to an organic grocery store or farmer's market? Are they more or less interchangeable with those who have the disposable income and progressive inclinations to choose to pay more for food out of environmental and ethical concerns? Aren't they already buying free-range meat? Do all of them put together come anywhere close to the number of people who buy their steaks and ground beef at Walmart? Shouldn't she be proselytizing to them instead?

The problem is not that a minority of hardcore vegetarians are selfishly refusing to support local family farms; the problem is that - to take the US as a lone example - a nation of 300 million people accustomed to eating cheap, convenient meat three times a day cannot be adequately supplied by such farms. This seems like the same mindset that imagines we can simply make a like-for-like exchange of oil, coal and natural gas for solar, wind, and hydroelectric power without ever having to ease up on the accelerator or turn off the lights.

This Cocaine Makes Me Feel Like I'm on this Song

Ever had goosebumps or felt euphoric chills when listening to a piece of music? If so, your brain is reacting to the music in the same way as it would to some delicious food or a psychoactive drug such as cocaine, according to scientists.

The experience of pleasure is mediated in all these situations by the release of the brain's reward chemical, dopamine, according to results of experiments carried out by a team led by Valorie Salimpoor of McGill University in Montreal, Canada, which are published today in Nature Neuroscience.

Music seems to tap into the circuitry in the brain that has evolved to drive human motivation – any time we do something our brains want us to do again, dopamine is released into these circuits. "Now we're showing that this ancient reward system that's involved in biologically adaptive behaviours is being tapped into by a cognitive reward," said Salimpoor.

She said music provided an intellectual reward, because the listener has to follow the sequence of notes to appreciate it. "A single tone won't be pleasurable in isolation. However, a series of single tones arranged in time can become some of the most pleasurable experiences that humans have ever reported. That's amazing because it suggests that somehow our cerebral cortex is following these tones over time and there must be a component of build-up, anticipation, expectation."

I'm not surprised at all, because I'm sure my music listening habits would make some people wonder about the possibility of addiction. But it is interesting what they note about intellectual pleasures being able to tap into a system designed to encourage more prosaic pursuits like eating and sex. I was just telling Shanna the other day that I have to be careful when I engage in writing or thinking about stimulating topics because they can have a physically energizing effect on me as potent as a cappuccino. Perhaps I'm using music and thoughts to accomplish something like what Steven Pinker talked about:

Now, if the intellectual faculties could identify the pleasure-giving patterns, purify them, and concentrate them, the brain could stimulate itself without the messiness of electrodes or drugs. It could give itself intense artificial doses of the sights and sounds and smells that ordinarily are given off by healthful environments.

Speaking of whom, I see I wasn't the only one to see this story and recall his characterization of music as "auditory cheesecake".

Wednesday, January 19, 2011

Who Is More Godless Than I, That I May Delight in His Instruction?

Philipp Blom:

Even today, the public discussion about moral and political issues is no longer framed in an explicitly religious context, but the change in terminology only conceals the all-pervasive influence of the unexamined theological ideas underlying it. Our vocabulary has changed, of course: We no longer speak about the soul but about the psyche; we have exchanged original sin for inherited, psychological guilt. But the cultural soil on which these ideas flourish has remained the same, and all too often our worldview is inherently religious without our even realizing it.

...Christianity is the religion of the suffering God. Christ was made flesh and had to die, to be tortured to death, thus allowing God the Creator to forgive humanity for its wickedness. Holbach and Diderot wrote all there is to be written about the perversity of this argument, but even the most irreligious of Westerners still believe in the positive, transformative value of suffering. We have all internalized the Romantic stereotype of the solitary, suffering genius (a figure almost singlehandedly invented by Rousseau in his Confessions). We love stories in which people triumph over adversity, in which they are almost crushed by wickedness or misfortune, only to emerge again, to be resurrected. This kind of story is found in many cultures, but not in all. The ancient Greeks attached no moral value to suffering. After journeying around the Mediterranean for twenty years and surviving many dangers, Homer's Odysseus is older—but not a wiser or better man.

I'm liking this book a lot already. The desire to root out "unexamined theological ideas" underlying supposedly non-religious principles, at least for examination if not eradication, has been one of my own personal little missions. Souls, afterlives, moral significance permeating the universe, a teleological progression to existence, apocalyptic/utopian thinking, and so many other manifestations of an essentially religious frame of mind are very common even among people who complacently pride themselves on not taking direct orders from preachers and holy books.

Tuesday, January 18, 2011

Whoever Fights Monsters Should See to It That in the Process He Does Not Become a Monster


Ideas have consequences, contorted ideas all the more so. Unfortunately, Nietzsche’s rhetorical proclivities for metaphorical enigma and hyperbole have been exploited as potential sources of inspiration for egregious acts of terror, most recently in Arizona. Ideas, and Nietzsche’s specifically, must be situated in both their proper intellectual and historical context to be properly understood.

It would be outrageously insensitive to suggest that a more nuanced interpretation of Nietzsche would have prevented Jared Loughner from pulling the trigger on January 8. Clearly, as we have learned from recent revelations of Loughner’s writings and ideas, he was, as Zane Gutierrez attested to, mentally deranged. To incriminate Nietzsche, though, would be anachronistic and based on a specious understanding of his philosophy. In a world of media communications defined by sound bites and a concomitant lack of context, historians have a responsibility to the public to provide the necessary context, the full picture, the long view, whatever one chooses to call it, to promote a more informed and productive public discourse.

Like I said the other day, I'm cheered by the fact that everything I've seen written so far has resisted the still-too-common tendency to trot out the same old myths about Nietzsche being some sort of social Darwinist who glorified physical violence over others as a means of proving one's mastery. But something that still makes me laugh about that is the fact that the first time I ever recall seeing Nietzsche's name was in a Dean Koontz novel that I read when I was maybe fourteen. I don't remember the name of it (and I'm too lazy to look it up), but the plot was based on a pair of friends who bond over, among other things, a misinterpretation of Nietzsche and become a sort of serial-killing tag team in order to prove their superhumanity. The detective who cracks the case, speaking for Koontz, laments how often Nietzsche gets misunderstood by such types. If a pulp novelist in the mid-80s knew that much...

Saturday, January 15, 2011

Welcome to a New Kind of Tension All Across the Idiot Nation

This is, of course, only the nerdiest of the many implausibilities depicted in the film. But the movie created enough buzz that NASA started getting mail about it. Yeomans told The Australian: "The agency is getting so many questions from people terrified that the world is going to end in 2012 that we have had to put up a special website to challenge the myths. We have never had to do this before." NASA has a special section on its website, "2012: A Reality Check", and has even created its own video about it. In the video, Yeomans laboriously debunks the "2012 myth," and explains that his only plan for December 21, 2012 is to "lay in an extra supply of egg nog."

As absurd as the movie is, public fear about the "2012 phenomenon" is even more so. The film's trailers, which ended by encouraging audiences to search the web ("Find out the truth: Google search 2012"), must bear part of the blame - but so must the hordes of overheated Googlers who hurtled into NASA's inbox. Perhaps they're the cataclysm we ought to be worrying about.

I just thought I'd share one example of how there truly is nothing new under the sun, since I was just reading about it last night in Sarah Bakewell's excellent book about Montaigne:

Signs of the imminence of this apocalypse were plentiful. A series of famines, ruined harvests, and freezing winters in the 1570s and 1580s indicated that God Himself was withdrawing His warmth from the earth. Smallpox, typhus and whooping cough swept through the country, as well as the worst disease of all: the plague. All four Horsemen of the Apocalypse seemed to have been unleashed: pestilence, war, famine and death. A werewolf roamed the country, conjoined twins were born in Paris, and a new star - a nova - exploded in the sky. Even those not given to religious extremism had a feeling that everything was speeding towards some indefinable end.

Freezing winters? Whooping cough? Conjoined twins, hell; how about a two-headed calf? And in place of a new star, would you accept a lunar eclipse on the winter solstice? And doesn't a werewolf sound kind of tame compared to a Montauk Monster and chupacabras? (Update: never mind, we've got a wolfman too!)

If you're nice to me, I might let you share my heavily fortified bunker.

And He Who Is Not a Bird Should Not Make His Nest Above the Abyss

What is nihilism?

It's the feeling that nothing in the world matters any more than anything else. Nietzsche's analysis was that people once found meaning in their belief in the Judeo-Christian God, but that in the post-medieval world belief wasn't sufficient anymore to give people the sense that things really mattered. The basic philosophical issue underlying the book, then, is: how are you supposed to live your life in order to make it possible that things matter again?

Is nihilism an intellectual problem, or an emotional one?

Some people really suffer from the feeling that nothing seems to matter any more than anything else. David Foster Wallace called it a 'stomach-level sadness.' I think that's a pretty good description of it.

I don't know; isn't the inability to find enjoyment in anything better described simply as "depression"? Nihilism does seem to me like a primarily intellectual problem, where people know what they love and enjoy but suffer from not being able to justify it. Our Greek and Christian intellectual heritage teaches us that contingent things are worthless, that only timelessness and permanence are worth grasping for, and since nothing but abstract ideas can ever come close to meeting such criteria, anyone who looks for them in the everyday world is going to have to reckon eventually with how to fully embrace things despite their contingency, or else continue to suffer. If I may quote that passage from Steven Ozment again:

The belief that momentary feelings of unity or visions of perfection can survive permanently into everyday life this side of eternity is the ante-room of nihilism and fascism. Such beliefs give rise to ahistorical fantasies, which can never materialize beyond the notion. To the extent that they are relentlessly pursued, they progressively crush the moments of solace that precious moments of grace can in fact convey. Historically such fantasies have spawned generations of cynics, misanthropes and failed revolutionaries who, having glimpsed resolution, cannot forgive the grinding years of imperfect life that still must be lived.

I know when I was young, dumb, and full of existentialism, grappling with the idea of meaning in a godless universe, the feeling of despair I'd get was similar to the vertigo I'd feel when my fear of heights was triggered -- even when I knew perfectly well I was safe from any danger of falling, even when I was only looking up at a tall object rather than being high off the ground myself, I couldn't reconcile that awareness with the optical illusion that had me so terrified. Nihilism, to me, is a similar sort of illusion. Even when your feet are firmly planted on the ground of everyday experience, secure in the meaning to be found there, you're tormented by an irrational fear of falling anyway.

...I'm adding this section in several hours after having originally posted this, having just seen this article. It's depressing to see that people like Jared Loughner still read Nietzsche and take away the lesson that they need to violently impose their will on everyone else, but at least decades of scholarship have made it so that an elegant explanation of why he's so ridiculously wrong to do so presents itself immediately afterward. Thank you, Matt Feeney, I couldn't agree more:

Nietzsche, oddly, has suffered a similar fate. Because of his assault on religion and rationalist metaphysics, and because of the hints of anarchy in his assorted visions of the future (e.g., "the transvaluation of all values"), he's taken as the West's über-nihilist. But he saw himself as the scourge of European nihilism, and possibly also its remedy. Nietzsche saw nihilism as a disease, which grows from, in Alexander Nehamas' words, "the assumption that if some single standard is not good for everyone and all time, then no standard is good for anyone at any time." It presents itself as mindless hedonism and flaccid spirit, but also as fanaticism.

So does that make Nietzsche and Jared Lee Loughner philosophical brethren after all, joined in the same fanatical fight against nihilism? In a word, no, and Loughner's pathological fixation on the meaning of words is the giveaway. One way of looking at Nietzsche's project is that he set out to teach himself and his readers to love the world in its imperfection and multiplicity, for itself. This is behind his assaults on religion, liberal idealism, and utilitarian systems of social organization. He saw these as different ways of effacing or annihilating the world as it is. It is behind his infamous doctrine of the Eternal Recurrence—in which he embraces the "most abysmal thought," that the given world, and not the idealizing stories we tell of it, is all there is, and he will affirm this reality even if it recurs eternally.

Jared Loughner's despair that everything is unreal and words have no meaning amounts to hatred of the world (a mania of moralism and narcissism) for its failure to resemble the words we apply to it. Faced with a choice between real people and some stupid abstraction about words, themselves mere abstractions, Loughner killed the people to defend the abstraction. This, then, really is a kind of nihilism, only not the kind that people think Nietzsche was guilty of. It's the kind of nihilism that Nietzsche was trying to warn us about, and help us overcome.

Friday, January 14, 2011

Death, the Great Equalizer

So I click on the Huffington Post - which probably means I have no right to complain in any event; caveat lector and all that - and I'm greeted with a gigantic headline: Military Panel Backs Women in Combat (on a post tagged "women's rights").

"Women in the military are exposed to the same kind of dangers that combat service exposes a soldier to, but the difference is that the women are not getting combat pay, and they're not getting combat-related opportunities for promotion," NOW President Terry O'Neill said in an interview with The Huffington Post. "So it's only fair to recognize that women belong, as much as men do, in combat units."

In 2005, the Washington Post interviewed dozens of U.S. soldiers serving in Iraq -- men and women of various ranks -- about the exclusion of women from combat, and they "voiced frustration over restrictions on women mandated in Washington that they say make no sense in the war they are fighting. All said the policy should be changed to allow, at a minimum, mixed-sex support units to be assigned to combat battalions. Many favored a far more radical step: letting qualified women join the infantry."

I don't have much to say that I haven't already said about the same topic as it relates to gays, atheists, the big-boned, the developmentally challenged, the sight-deprived, the assisted-living community, the recently paroled, the Amish, the homeless, the children not tall enough to ride this roller coaster, bestiality enthusiasts and everyone else our equal-rights fetishists would dearly love to dress up in combat gear for the glory of America. Just this: I submit to you that no one belongs in a combat unit. That the Iraq war itself is what made no sense. And that the most radical step of all would be for people to reject the imperial mindset that leads them to look at a world-straddling, resource-devouring, death-dealing colossus and see, first and foremost, a great opportunity for civil rights and career advancement.


I'm still getting back into fighting form here, so let me start with something simple. I offer you a modest proposal in response to almost everything I've been reading about free speech and hate speech (bad speech, great speech) in the wake of Tucson:

Thursday, January 13, 2011

Employees Must Wash Hands Before Returning to Work

I'm laid low with a bad case of food poisoning (Chinese takeout). My goals for today are modest: spend most of my time dozing in the comforting embrace of painkillers, and maybe have a bowl of soup. Regular programming will resume when I return to the land of the living.

From Tuesday afternoon to Wednesday evening, I lost eight pounds simply by my body violently ejecting everything that wasn't nailed down inside of me. Maybe if I can catch the flu later this season as well, I'll have my bikini body all ready to go for warm weather!

What is it with hospital waiting rooms keeping the TV tuned to Fox News? Nothing quite like sitting there shivering, aching and moaning like a junkie going through withdrawal as you watch G. Gordon Liddy tell you to invest all your money in gold and the news anchors debate just how much of a Communist Jared Loughner is.

Tuesday, January 11, 2011


To revisit an earlier point: Few things inject unintended comedy into an otherwise serious post like the odd practice of refusing to type out FUCK or SHIT in their entirety. Unless you're concerned that a four-year-old might be reading your blog, why bother replacing letters with punctuation marks? We've all known these words since we were six years old, haven't we? (Although, one instance I saw of replacing the U with a V gave FVCK an interesting Roman flavor. Still, when discussing topics like violence and anger, you might want to carefully consider whether you're trying to make your reader laugh.)

The thoughts and feelings symbolized by the word are what matter, and if those are offensive to you, then I suppose you shouldn't allow yourself to even think about them. The word itself possesses no inherent magic power. Spelling it out or saying it aloud will not conjure demons, I promise. We know what you mean. You know what you mean. Just fucking spell it out already. Like Louis C.K. said, you're putting the word in my head, making me say it to myself. You should be the one to say what you mean! Take responsibility for it!

Monday, January 10, 2011

Use Your Fist and Not Your Mouth

I had already resigned myself to reading plenty of achingly stupid shit about the Giffords shooting, but it's going to be hard to top this from Jack Shafer:

Any call to cool "inflammatory" speech is a call to police all speech, and I can't think of anybody in government, politics, business, or the press that I would trust with that power. As Jonathan Rauch wrote brilliantly in Harper's in 1995, "The vocabulary of hate is potentially as rich as your dictionary, and all you do by banning language used by cretins is to let them decide what the rest of us may say." Rauch added, "Trap the racists and anti-Semites, and you lay a trap for me too. Hunt for them with eradication in your mind, and you have brought dissent itself within your sights."

Our spirited political discourse, complete with name-calling, vilification—and, yes, violent imagery—is a good thing. Better that angry people unload their fury in public than let it fester and turn septic in private. The wicked direction the American debate often takes is not a sign of danger but of freedom. And I'll punch out the lights of anybody who tries to take it away from me.

Well, at least he won't mind too much if I call him a demented, retarded pigfucker, I suppose.

Look. If you find anyone calling for legal restrictions on public speech, then by all means, let's hear your best Patrick Henry impersonation. But there is nothing threatening to the First Amendment when someone uses their free speech to attempt to persuade other people to voluntarily modify their own, or even shame them for not doing so. Sarah Palin, Glenn Beck or anyone else in that clown car are perfectly free to use guns and revolution as the one-size-fits-all metaphors for talking about political issues, and the rest of us are free to tell them to grow the goddamn fuck up and stop being such hysterical fucking demagogues. The system seems to be working just fine, so what's the problem? If Palin really has the courage of her convictions, if she really believes her own bullshit, surely she won't allow one dead nine year-old girl and some public scolding to stop her, right?

Ah, but that's just the thing, isn't it? She doesn't believe it. None of them do. I see it in the teabaggers I know personally, who rant and rave about how mundane policy battles are the razor's edge between freedom and tyranny, working themselves up into a rabid state over our impending slavery... before going back to watching Monday night football and hanging out at the bar with the rest of the boys. There seems to be a little disconnect here between rhetoric and action, is what I'm saying. I mean, if I honestly believed that a four percent difference in tax rates for people who make five times as much money as me was the difference between the U.S.A. and Nazi Russia, I think I'd be spending my time in clandestine meetings with the rest of the revolutionary cadre, fomenting insurrections, that sort of thing.

No, it all reminds me of nothing so much as a surly teenager who responds to every request to help out with chores by hollering about having never asked to be born. Of course, if you offered to help tie the noose for the little shithead, you'd likely find a sudden change in attitude, but barring that, they have a reasonable expectation that if they just act like the most shrill, obstreperous assholes in response to everything they don't like, you'll finally get weary of listening to it and just let them have their way. It's no different with the teabaggers. Their stunted adolescent response is just to holler "Wolverines!" and go polish their hunting rifles.

We don't have to waste time in fruitless arguments over the precise nature to which implied violence in rhetoric legitimizes, encourages or causes violent action in response; we can just force them to directly own up to it. Is this what you want, then? Is this what politics needs to become? No? You don't actually intend to violently overthrow the government every time you lose an election? Then shut the fuck up. You too, Shafer, you fucking moron.

Sunday, January 09, 2011

When I Use a Word, It Means Just What I Choose It to Mean, Neither More nor Less

Hello, what's this? A link to a post on "techno-spirituality"? Well, that sounds potentially interesting -- does it have something to do with online cults? Kurzweil-style babbling about "spiritual machines" and immortality through science? Modern-day whirling dervishes who dance themselves into mystical ecstasy to the accompaniment of programmed beats and synthesizers? Aw, it's just a guide to taking five minutes out of the day to practice breathing.

Spirituality -- that ability to stay calm, focused and compassionate in a constant sea of change -- is therefore more important than ever.

I don't know about you, but I just call that "maturity". Taking deep, controlled breaths to deal with stress only seems to require common sense. It just struck me funny that the term is so debased that it apparently can refer to everything from vague metaphysical beliefs and the achievement of a mystical union with the Ground of All Being to the simple act of not coming completely unglued under pressure at work.

Saturday, January 08, 2011

When in Rome


Am I the only one who thinks it's extremely odd that we are sending "Homeland Security" agents to Afghanistan? Don't we have a military that's tasked with these sorts of chores? And if it's just a "loan" of certain specialists, why is Janet Napolitano making the announcement instead of the proper foreign service or military spokesperson? Afghanistan isn't in her portfolio -- at least I didn't think it was. I thought we were going to keep the new Homeland Security forces here in the ... homeland.

That's all rhetorical, of course, since it's been obvious for decades that many of our allegedly "domestic" agencies like the DEA and the ATF are really paramilitary organizations which are deployed all over the world. But it looks as though we aren't even going to pretend anymore that there's a separation between the two. And that means that we have created yet another sacred police/military budget item that will be nearly impossible to scale back.

What has changed since the collapse of Jim Crow has less to do with the basic structure of our society than with the language we use to justify severe inequality. In the era of colorblindness, it is no longer socially permissible to use race, explicitly, as justification for discrimination, exclusion, or social contempt. Rather, we use our criminal-justice system to associate criminality with people of color and then engage in the prejudiced practices we supposedly left behind. Today, it is legal to discriminate against ex-offenders in ways it was once legal to discriminate against African Americans. Once you're labeled a felon, depending on the state you're in, the old forms of discrimination -- employment discrimination, housing discrimination, denial of the right to vote, and exclusion from jury service -- are suddenly legal. As a criminal, you have scarcely more rights and arguably less respect than a black man living in Alabama at the height of Jim Crow. We have not ended racial caste in America; we have merely redesigned it.

...What caused the unprecedented explosion in our prison population? It turns out that the activists who posted the sign on the telephone pole were right: The "war on drugs" is the single greatest contributor to mass incarceration in the United States. Drug convictions accounted for about two-thirds of the increase in the federal prison system and more than half of the increase in the state prison system between 1985 and 2000 -- the period of the U.S. penal system's most dramatic expansion.

This kind of stuff is why I largely have no interest in political issues, at least not the sort that occupy most political bloggers. The ever-expanding police/military/surveillance state outweighs almost everything else in importance and in danger to the values we supposedly hold, yet unquestioned support for it is bipartisan, as is the inexorable upward transfer of wealth. Most progressive bloggers feebly note that supposed deficit hawks on the right have no interest in reining in military spending, but none of them are going to waste their time tilting at such windmills either, not when there's a chance to pose for a picture with the president, or important progress to be made in making sure gays, atheists and fat people all get an equal opportunity to participate in our foreign conquests and occupations. And as Glenn Greenwald observes, even though Democratic politicians just as much as their Republican counterparts are going to keep prolonging the same "wars" on drugs and terror, progressives will meekly drop even the pretense of rebelliousness when election time rolls around.

Feh. If you need me, I'll be doing some reading.

Who Is It That Can Tell Me Who I Am?

Wertheim pointed out that cyberspace had become a new kind of place, where alternate (or at least carefully curated or burnished) identities could be forged, new forms of collectivity and connection explored, all outside the familiar boundaries of the physical world, like the body and geography. It’s not such a long journey to follow those assertions to the “view that man is defined not by the atoms of his body but by an information code,” as Wertheim wrote. “This is the belief that our essence lies not in our matter but in a pattern of data.” She called this idea the “cybersoul,” a “posited immortal self, this thing that can supposedly live on in the digital domain after our bodies die.”

...Wertheim, it should be noted, saw the cybersoul notion as both flawed and troubling, and I would agree. Life’s essence reduced to captured data is an uninspiring, and unconvincing, resolution to the centuries-old question of where, in mind and in body, the self resides. At least other imagined versions of immortality (from the Christian heaven to the Hindu wheel of life) suggested a reconciliation, or at least a connection, with the manner in which a physical life is lived; the cybersoul’s theoretically eternal and perfect persistence ignores this concept. Most of all, though, fantasizing about living forever — in heaven or in a preserved pattern of data — strikes me as just another way of avoiding any honest confrontation with the fact of death.

Well, that depends. I certainly agree that fantasies of eternal life are just that, but at the same time, we can see for ourselves how words spoken and written centuries ago still permeate and deeply affect our lives and thoughts today. Where do some of my favorite authors end and I begin? Not that most of what's being published on Blogspot or Wordpress will still be profoundly moving people decades from now, but still, an author can take a little comfort from imagining their words forging connections that outlast the deaths of any of the particular individuals so affected, and sparking activity after long periods of lying dormant.

The "self", as many thinkers and neurologists can tell you, is an elaborate, ongoing fiction. It resides in a multiplicity of perspectives, fixed nowhere in particular. Like I said, the thoughts and words in my head are no more "mine" than the atoms in my body or the sustenance I take in, yet they all come together in a unique combination for a brief point in time. And for someone like me who largely identifies "who I am" with "what I think", I strongly disagree that it's somehow "uninspiring" to consider such patterns of data as equally integral to one's sense of identity as the humdrum details of daily life. Why is the self who types these words any less valid than the one who just washed clothes and is getting ready to walk the dogs? The self that we present to others online isn't a complete picture, but then again, where do we ever find one?

And with Strange Aeons Even Death May Die

And once again
You'll pretend to know that
That there's an end
That there's an end to this begin
It will help you sleep at night
It will make it seem that right is always right

- Smashing Pumpkins

The idea of dead scientists engaging in an experiment in eugenics is incredible enough. Yet the most striking feature in this episode – only fully revealed more than 100 years after the scripts began to appear – is the power that is ascribed to science itself. While spiritualism evolved into a popular religion, complete with a heavenly "Summerland" where the dead lived free from care and sorrow, the intellectual elite of psychical researchers thought of their quest as a rigorously scientific inquiry. But if these Victorian seekers turned to science, it was to look for an exit from the world that science had revealed. Darwinism had disclosed a purposeless universe without human meaning; but purpose and meaning could be restored, if only science could show that the human mind carried on evolving after the death of the body. All of these seekers had abandoned any belief in traditional religion. Still, the human need for a meaning in life that religion once satisfied could not be denied, and fuelled the faith that scientific investigation would show that the human story continues after death. In effect, science was used against science, and became a channel for belief in magic.

Much of what the psychical researchers viewed as science we would now call pseudo science. But the boundaries of scientific knowledge are smudged and shifting, and seem clear only in hindsight. There is no pristine science untouched by the vagaries of faith. The psychical researchers used science not only to deal with private anguish but also to bolster their weakening belief in progress. Especially after the catastrophe of the first world war, the gradual improvement that most people expected would continue indefinitely appeared to be faltering. What had been achieved in the past seemed to be falling away. If the scripts were to be believed, however, there was no cause for anxiety or despair. The world might be sliding into anarchy, but progress continued on the other side.

Many of the psychical researchers believed they were doing no more than show that evolution continues in a post-mortem world. Like many others, then and now, they confused two wholly different things. Progress assumes some goal or direction. But evolution has neither of these attributes, and if natural selection continued in another world it would feature the same random death and wasted lives we find here below.

...The fantasies that possessed the psychical researchers and the god-builders still have us in their grip today. Freezing our bodies or uploading our minds into a supercomputer will not deliver us from ourselves. Wars and revolutions will disturb our frozen remains, while death will stalk us in cyberspace – also a realm of mortal conflict. Science enlarges what humans can do. It cannot reprieve them from being what they are.

It looks like a fascinating book. Though I'm beginning to fear that I might need an afterlife just to read all the books I'm accumulating. If only science could help me do away with eating and sleeping...

But yes, the belief in some sort of immortal soul and afterlife, and the belief in a teleological essence to life itself both seem to me to be more interesting issues for skeptical attention than that of God's existence or lack thereof. As Schopenhauer said, given a stark choice between belief in personal immortality and belief in God, most people wouldn't hesitate to become atheists. And while some branches of Buddhism have managed to reconcile belief in a continuation of personal consciousness beyond death with a denial of any meaningful deity, Buddhists in general may be the only people who wouldn't suffer a crisis of identity if forced to relinquish an attachment to the idea of progress, the numbing distraction of being in constant motion toward a goal.

Friday, January 07, 2011

Bach Almost Persuades Me to be a Christian

Speaking of Arthur, it was his birthday the other day, so I sent him a link to this interview with the author of a book on Bach's cello suites. He used to DJ a classical music program on a local independent radio station, and he's forgotten more about classical music than you and I will ever know, so it's always interesting to hear what he has to say:

Yes, Bach is the only artist who makes a convincing case, to me, for the existence of God, or for the mind of God, which would have to have the qualities of a Bach fugue, infinitely complex and at the same time incredibly clear and perspicuous. Someone in a book on the Well-Tempered Clavier made the point that you never hear the same Bach fugue twice; there are too many paths through the music. On the other hand, any path will do. Unlike the contrapuntal forays of his contemporaries, Buxtehude or Pachelbel, for example, Bach's fugues never seem merely to noodle. There is always a crystal clear theme or variation (subject or episode, in fugue-speak) guiding the ear through the complexity. And yet the complexity and inventiveness is such that the ear can't grasp it all at once in any single hearing, and so, far from experiencing satiety, the ear hungers for a re-hearing. Thus the infinite and the finite, the abstract and the concrete, the whole and the detail, exist in harmonious but dynamic simultaneity in his music, as in a perfect aesthetic world, a Heaven of the musical imagination.

I Know No Verse More Gnomic Than Thine

Chris Clarke:

The headline – whether page title or link text – should tell people exactly what’s in the article, they say, so that people will know what’s in your article before they even read it. There’s certainly justification for this approach. In tech writing, for instance, it just makes sense. If your article explains how to repair your mobile phone in the field, it’s clearly better to entitle it something like “Fixing Your Mobile Phone On The Road” rather than “When ET Can’t Phone Home” or “The SIM Sins” or “Lost Verizon.”

Pfft. Me, I would have gone with "No-Bar Blues". Get it? You get it, right? See, it's like "12-Bar Blues", but, you know, when someone doesn't have any phone service, you'll sometimes hear them say they don't have any "bars", because, uh...ahem. You know, "bars", the indication of signal, uh, strength. Um. Heh heh.

Anyway, I just had a good laugh imagining someone like Jakob Nielsen palming his face over my post titles, which, in case you've ever wondered, are usually snippets of song lyrics or poems, groan-inducing puns, or exotic words with a poetic ring to them that somehow relate to the theme of the post. Clarity doesn't factor into my decision-making at all. (The title of this one is a direct quote of something my friend Arthur said to me about a poem I showed him.)

But not everyone who posts writing on the web is doing so for the benefit of web visitors. Some of us are more interested in readers. Eyeballs are one thing. Eyeballs hooked up to a functioning cerebrum is an altogether different, better thing. It’s a simple concept that seems surprisingly hard for some experts to grasp: there is more than one kind of writing on the web. There are news alerts. There are How To shorts. There is poetry. There are aimless diary entries. There are screeds. There are plays, short stories, rants, recipes, verbal fusillades meant to inflame, prose meant to enlighten, verse meant to perplex. Try to make rules about structure and tempo and tone for any writing that appears on paper, and you’ll be laughed out of the one remaining independent bookstore in your county. The Web isn’t a genre. It’s a medium. The Web is paper, only faster and with a higher carbon footprint. Tech pundits and journalism pundits seem slowest to grasp this general point, for some reason, but the vast majority of writing on the web is neither tech writing nor journalism. It’s essay, memoir, epistolary writing – literature. Not all of it’s good literature, mind. But literature.

Yes, yes, yes. This is exactly what I love about the Internet, especially the blogosphere, and it's what underlies my loathing of the increasing tendency of people to coagulate around the lowest common literary denominator of social networking sites. I have friends who know about this blog yet never stop by to read, never answer emails, but repeatedly ask me why I'm not on Facebook or Twitter. Because fuck you if you're not willing to make any effort, that's why. I'm only interested in people who love words as much as I do.

Wednesday, January 05, 2011

We Are Darker Angels, Black Lightning in Our Heads

Tell me about The New Black.

Darian Leader is a British psychoanalyst who in a great way undermines today’s ideas about depression. He starts with the premise that we live in a society of hyped optimism, where depression appears as a danger that goes against optimism – it’s something for people who gave up the fight for success or whatever. Today we use the terms depression and stress too much – they dominate psychiatric and self-help discourse.

They’re debased terms; you might be ‘depressed’ if you miss the bus.

Absolutely. Or just the common boredom of children can be described as depression. But what Darian does is to return to the difference between melancholy and mourning, and he makes a great distinction between them. It’s very good to return to these different roots of depression, and to stay with them – not as traumatic things, but as something pretty normal which has been forgotten.

So we should re-codify depression?

Not perceive it as a unified term, but to see it as various different things, which is why he is using the old terms of melancholy, mourning and loss.

Darian is also critical of the pharmaceutical industry: depression appears as something universal that can be quickly dealt with using pretty much universal types of drugs. But, as he points out, this denies the fact that the symptom is connected to some cause beyond the depression. He shows that in depression everyone has a different logic and a different individual story, which can be linked to loss – of another human being, of identity, of a job, of health or love. It can also be linked to being stuck in circulating around some lack.

I'd be interested to check this book out. I actually spent a few months over a decade ago talking to a therapist, since a couple others were urging me to do it (and offering to pay for it). An American Indian guy, slightly New Agey, but also with books by Kant on his shelf. He was fun to talk to, but he didn't really tell me anything profound. I just enjoyed having an hour to bullshit about philosophy with someone with no distractions. It occurred to me that if only we could be guaranteed an hour of someone's undivided attention once or twice a week, a lot of our psychological problems might disappear on their own. Maybe we need to start waving cash around in front of our friends and family to help them focus.

But anyway, I laughed at the above point about depression as a danger that goes against optimism. That, more than anything, has led people to wonder about my own mental health: "You don't want a higher-paying job? You don't have any long-term plans for your life? You don't mind being single for so long? Are you sure you're okay?"

Tuesday, January 04, 2011

They're Coming to Take Me Away, Ha-Haaa!

Are you allergic to Twitter? Do you befriend people outside your target demographic? Then you may be suffering from an undiagnosed personal branding disorder.

• Schizoid Branding Disorder: You have been overheard proclaiming that Twitter is for blowhards with ADHD. You profess a love of "nature" and "reading books" instead. You refuse to chat with people online, and your cell phone service charges extra for texting. You call Blackberries "Crackberries" and claim that you're not even a little bit curious about the iPad. When someone asks you a polite question, such as "What's your current distribution capacity?" you merely roll your eyes and shrug, then wander off without answering.

...The next time you find yourself disparaging iPhone apps or raving about the restorative effects of growing organic milkweed in handmade windowboxes, it's imperative that you seek professional help immediately. Remember, there's no shame in admitting that you're indifferent to your own multi-platform marketing initiatives, as long as you can see clearly that it's not normal. The sooner you can admit that you're sick, the sooner you can address your ineffectual sales tactics and build a more resilient, dynamic personal brand that will resonate with a wide range of potential customers, now through the end of the fiscal year.

Monday, January 03, 2011

Consider the Sillies

I found that Jacoby essay via Jenny White:

I propose instead that we do what we can to prepare for that long life, and then we annihilate time.

I'm not the first to think of this, although I'd gladly snatch credit from the Zen masters. Live in the moment: A sentiment overripe from use, but how many of us have actually tried it when wagging fingers insist that as responsible people we should worry about the future as well as prepare for it. The timeline is unforgiving, as we are expected to atone for our pasts as well. Adulthood is an extended act of penance (thrice-weekly gym and no cookies), and old age is your comeuppance for all those unrepented acts of wantonness. Time is society's dream of your life. It doesn't have to be your life.

I agree. I think. I guess I just want to make sure that when we invoke Zen wisdom and say "Live in the moment", we understand that this means grasping that our ideas of the future and the past are both projections of this moment as well. We never actually exist in either the future or the past. When the future gets here, however you define it, it will be experienced as "now", just as the past was. So "living in the moment" doesn't mean, pace the Grass Roots, to just live on impulse and refuse to think about anything but satisfying immediate desires, nor does it mean, as certain deranged cult leaders would have you believe, that God will provide if you just have faith; unless, of course, you share said cult leader's belief that the world is due to end in a fiery apocalypse any minute now, in which case, all right, then.

We can't help but live in this moment. The hard part is simply realizing that.

Getting High On My Mortality

I would rather share the fate of my maternal forebears — old old age with an intact mind in a ravaged body — than the fate of my other grandmother.

Like Susan Jacoby, I actually hope I don't live to such an old age. And I share her conviction that diseases like Alzheimer's can allow one to live too long to live well. But as I've mentioned before, I developed rheumatoid arthritis when I was in my late twenties. And even though I'm fine now thanks to medication (which could, of course, destroy my liver or cause cancer, but I'll cross that bridge if or when I get there), I spent three and a half years in unrelenting physical pain as doctors puzzled over what was wrong with me, since my blood tests kept coming back showing a negative rheumatoid factor. Thus, I can report with confidence that even with the overall good health of a twenty-something to alleviate at least some of the decrepitude, it's not easy to live that way either. The intact mind tends to be a bit too preoccupied with the ravaged body to spend much time enjoying the fruits of contemplation. So it's more like "six of one, half-dozen of the other" if you ask me; a life as a physical or mental ruin with no reasonable hope of improvement is a life worth ending on its own terms.

I don't say that to be morbid. In fact, I think contemplating one's own mortality, while initially terrifying, of course, leads one to an appreciation of life that doesn't require the sorts of fantasies Jacoby mentions, of people determined to live longer than a century, a century and a half. Those people, to me, don't understand that life's finite limits are actually what give it meaning in the first place. As Steve Hagen said, we want it because it dies. And let's be real: if people thought they could solidly expect another fifty or so years of life, what would they do with that time? Travel, learn new languages, create art? Or spend that much more time eating junk food on the sofa while watching TV? The problem isn't that we don't have enough time, it's that we take the time we do have for granted. That's not going to change by adding on another few decades.