Monday, October 31, 2011
Sunday, October 30, 2011
Kids, mortgages, careers — somehow all the crap we’d once dismissed as the bleak concerns of geezers had caught up with us. Well, not with me. I might not have had enough disposable income to fund many trips like this, but I hadn’t committed myself to kids or mortgages, either. I was single, self-employed, only rented apartments, and lived off taco cart tacos and ham sandwiches. Such concessions guaranteed that my time was mine to fill. I could stay up until 2 a.m. reading if I wanted to, or hike in the mountains on a whim, with no fear of abandoning or disappointing anyone else. I had no one to account for but myself. I was also at that age where you started to wonder if the life you’d fashioned in youth had lost its charm.... Counter to the usual progression of things, the older I got, the deeper my musical appreciations grew. I got into blues in my late 20s, got into late-’50s/early-’60s hard bop jazz soon after. I went through a classical music period in which I listened to Handel’s Water Music and Bach’s Brandenburg Concertos so frequently that a friend asked if I was going to start wearing white shirts with ruffles. Recently I’d seen 1950s jazz legends perform in New York, and octogenarian bluesmen tear through fierce sets in nondescript bars, men who used canes and had suffered strokes and still smoked. Although my tastes kept expanding, music had always been one of my central preoccupations.... Unlike regular life, live music was never dull or predictable. It also elevated my existence without committing me to the sort of job required to finance the 18-year-plus task of parenting. I felt self-absorbed thinking this, even immature.
So many of my friends who had kids constantly extolled parenting’s virtues: “You can’t imagine how much joy kids bring you,” they’d say, “that you could love another human being so deeply.” Baby’s first steps, baby’s first day of school, the quiet moments at home alone when they looked up at you and said, I love you, Dad — “It’s so rewarding. I would throw myself in front of a car for that kid.” I believed them, yet these were the same people who admitted: “I’m always tired.” “I’m buried in chores.” “I have no time to myself.” “I worry I’m doing it all wrong; sometimes I can’t breathe.” “I’m stuck at my job at least until she starts middle school.” “I wish I could just jump in the car and drive to the beach and talk to no one.” At night they drank too much wine to cope, or smoked occasional cigarettes even though they’d officially quit.... Through the thin veil of our public deceptions, I could see the truth: like me, he was trying to disguise the consumptive intensity of his musical attachments, trying to look like less of a freak and avoid being typecast as the old guy who refused to “grow up.” And despite the differences in our clothing and the gray of his hair, there was no denying what he was: not only a kindred spirit, but precisely the person I might one day be if I kept living the way I was living.

I still couldn’t tell if that was a bad thing or not.... Maybe it wasn’t parenting that bothered me so much as the mundane. Too much of life was just so earthly. If you broke down the activities that composed our daily existence, it didn’t amount to much: Which size garbage bag should I get? What’s the difference between spearmint and wintermint? Did the cashier actually give me my 10 percent discount? Always scrub the counter so food particles don’t stick. I needed something transcendent to counteract the blandness, even if it only lasted a few minutes. Which was the problem: It only lasted a few minutes.
Then it was back to Is fluoride healthier than fluoride-free? Back to this.
It's kinda strange; I've done all the "responsible adult" things that could be reasonably expected of me -- held a steady job, had long-term relationships, owned a house, even raised a stepkid -- but I've always felt in the world of my peers rather than of it, largely because music and books have always remained the centerpiece of my life. I prefer my music studio-packaged instead of live, but yeah, I get what he's saying. I hardly ever talk about music anymore, because most people I know are stuck on repeat with the music of their late teens and early twenties if they even care about it at all. One friend, in the course of venting a bit about midlife crisis issues, asked how I stayed sane. Music, I said. Keeping music as my focus was what kept my entire sense of self from ossifying; it was the invigorating current that kept me from feeling like a stagnant lake. It's all the transcendence I need.
And while I'm sure they do love their kids, I have to smirk a bit at how utterly boring most of them become as parents, boring in the sense of having no interesting thoughts about anything anymore. I don't know how I managed to be born deaf to the siren song of the genetic imperative to pass on my DNA, but I sure am grateful for it.
The Chronicle on Emma Goldman:
Although Mikhail Bakunin, that fiercest of Russian anarchists, was one of her heroes, his famous definition of the revolutionary as a man who "has no interests of his own, no feelings, no habits, no longings, not even a name, only a single interest, a single thought, a single passion—the revolution" was as abhorrent to Goldman as corporate capitalism. If revolutionaries gave up sex and art while they were making the revolution, she said, they would become devoid of joy. Without joy, human beings cease being human. Should the men and women who subscribed to Bakunin's credo prevail, the world would be even more heartless after the revolution than it had been before.

The conviction that revolution and the life of the senses dare not be mutually exclusive made Goldman eloquent in defense of causes—sexual freedom, birth control, marriage reform—that a majority of her fellow anarchists derided as trivializing the cause. Comrades repeatedly took her to task for, as many of them said, interpreting anarchism as a movement for individual self-expression rather than a revolution of the collective.
Luxemburg's letters from prison are, in fact, so resolutely cheerful and gentle that they can become cloying. There is a solemn whimsy in her devotion to animals, for instance, that puts the contemporary reader helplessly in mind of Disney cartoons: "Recently I sang the Countess' aria from Figaro, about six [titmice] were perched there on a bush in front of the window and listened without moving all the way to the end; it was a very funny sight to see." Yet coming from "Red Rosa," this kind of thing struck the first readers of her letters with the force of a revelation. Here was a revolutionary who loved flowers and birds, and Hugo Wolf's Lieder, and the poems of Goethe. This Luxemburg offers the strongest possible contrast with Lenin, who famously said, "I can't listen to music too often . . . it makes you want to say stupid nice things, and stroke the heads of people who could create such beauty while living in this vile hell. And now you must not stroke anyone's head: you might get your hand bitten off. You have to strike them on the head, without any mercy."
I just thought it was interesting to happen across two articles drawing that same contrast between two different pairs of male and female revolutionaries (and interesting to imagine how we today would react to someone from, say, OWS voicing similar sentiments to those of Bakunin and Lenin). Of course, even the more "sensible" ones sound hopelessly utopian now, though I must say that Goldman's On Anarchism, which I got years ago at a library book sale for a dollar, struck me as much more relevant than many of the writings of her contemporaries.
How might punctuation now evolve? The dystopian view is that it will vanish. I find this conceivable, though not likely. But we can see harbingers of such change: editorial austerity with commas, the newsroom preference for the period over all other marks, and the taste for visual crispness.... One manifestation of this is the advance of the dash. It imitates the jagged urgency of conversation, in which we change direction sharply and with punch. Dashes became common only in the 18th century. Their appeal is visual, their shape dramatic. That's what a modern, talky style of writing seems to demand.

By contrast, use of the semicolon is dwindling. Although colons were common as early as the 14th century, the semicolon was rare in English books before the 17th century. It has always been regarded as a useful hybrid—a separator that's also a connector—but it's a trinket beloved of people who want to show that they went to the right school.

More surprising is the eclipse of the hyphen. Traditionally, it has been used to link two halves of a compound noun and has suggested that a new coinage is on probation. But now the noun is split (fig leaf, hobby horse) or rendered without a hyphen (crybaby, bumblebee). It may be that the hyphen's last outpost will be in emoticons, where it plays a leading role.
Hmm. I am indeed a fan of the dash; I see it as a linguistic pause button, allowing a brief aside, saving the lengthier ones for enclosure within parentheses. I also would rather dehyphenate most words; something about the hyphen always strikes me as annoyingly retro, like when people used to write "to-day". But as you can see from the previous two sentences, I find the semicolon very useful and fail to understand why it should have come to be synonymous with snobbery and esoteric rules.
An English professor friend of mine occasionally passes along egregious examples of the kind of writing her students submit; it's fun to read them out loud, verbatim, in keeping with the lack of punctuation. I usually manage to stay sanguine about the "kids these days", figuring that the glass has always been half-empty when it comes to basic literacy. The easiest way to become a good speller and good writer is to read, read, and read some more, and most people don't like to do it. So it goes. But I have to admit that I'm currently ready to issue death sentences to all those people who refuse to ever capitalize anything, even when submitting information for an official form. The Internet has become the recessive gene pool of Cormac McCarthy and E.E. Cummings, it seems.
Thursday, October 27, 2011
Tony Harrison: When are you gonna start thinking outside the box?
Saboo: The box is there for a reason; to keep ball-men like you inside it.
As a rule, we're always supposed to applaud the collapse of the record industry. We are supposed to feel good about the democratization of music and the limitless palette upon which artists can now operate. But that collapse is why Lulu exists. If we still lived in the radio prison of 1992, do you think Metallica would purposefully release an album that no one wants? No way. Cliff Burnstein from Q Prime Management would listen to their various ideas, stroke his white beard, and deliver the following 45-second pep talk: "OK, great. Love these concepts. Your allusion to Basquiat's middle period was very apt, Lars. Incisive! But here's our situation. If you guys spend two months writing superfast Diamond Head songs about nuclear winter and shape-shifting, we can earn $752 million in 18 months, plus merchandizing. That's option A. The alternative is that you can make a ponderous, quasi-ironic art record about 'the lexicon of hate' that will outrage the Village Voice and mildly impress Laurie Anderson. Your call." ... For much of my life, I lived under the myth that record labels were inherently evil. I was ceaselessly reminded that corporate forces stopped artists from doing what they truly desired; they pushed musicians toward predictable four-minute radio singles and frowned upon innovation, and they avariciously tried to turn art into a soulless commodity that MTV could sell to the lowest common denominator. And that did happen, sometimes. But some artists need that, or they end up making albums like this.
Ahhahaha. I haven't paid attention to Metallica for fifteen years, so I don't really care what they've done this time in their quest for utter shamelessness. But as far as the general principle goes, I blame the Romantics. I blame them for this incoherent muddling of creativity with chaos, the belief that uninhibited expression is a close relative or necessary precondition of genius, all of which manifests itself in everything from insufferably pretentious portrait-of-the-artist tossers to the knee-jerk opposition to ADHD drugs for children. Sometimes the tension between the untrammeled vision of the artist and the crass demands of the public or the market is what combusts into greatness.
Monday, October 24, 2011
I speak, of course, of Mary Elizabeth Williams's head.
But has a guy who once made a bit out of peeing himself at the MTV Awards truly had “an impact on American society in ways similar to the distinguished 19th century novelist and essayist best known as Mark Twain”? Is the guy who rocked a jazz flute solo in “Anchorman: The Legend of Ron Burgundy” really “a fearless observer of society” with an “uncompromising perspective of social injustice and personal folly”? You betcha.
Yes, she's talking about Will Ferrell. A very silly woman, to borrow Jane Austen's phrase. I wish I could bring this to the attention of my old 11th-grade English teacher (who was not shy about sharing his opinion that Twain was the greatest American writer ever) just to watch him sputter in apoplectic indignation.
Well, we can go on for a while about whether Steve Jobs was a "smart" man or a "stupid" man for spending so much time dicking around with alternative medicine in the first year after his cancer diagnosis, how people we would generally describe as "smart" can still do "stupid" things in particular, or whether it even makes sense to use broad adjectives like "smart" and "stupid" in an all-or-nothing manner when talking about the sum total of a person's identity, but surely we can all agree that this is a stupid essay, all told. I would excerpt parts of it, but it's so wild and scattershot that you really have to just read it all.
CliffsNotes version: if you've never been faced with a life-threatening illness you have NO RIGHT to judge what anyone else does when faced with a hard choice about treatment and anyway you're probably a Republican who hates poor people and wants to outlaw abortion too and Jobs was a Zen Buddhist which is a hell of a lot better than being a Bible-thumper, yeah I know I was just ranting about being judgmental but pfft that was like so many paragraphs ago so whatevs, anyway Zen is what made him great just like rasta made Bob Marley great so we have to accept the good with the bad, Q to tha E to tha muthafuckin' D - non-sequitur? what non-sequitur? - and if you still say Jobs acted stupidly in postponing treatment, well then you're a CATHOLIC too, and you think God has dominion over your life, dont'cha? Huh? Huh? Dont'cha? Catholicsezwhat HA GOTCHA, NOW, LIKE I WAS SAYING, just cuz Jobs himself came to feel that he had made a stupid mistake doesn't mean anything cuz he was large he contained multitudes and now let's take this baby offroad cuz I'm going to rant about Facebook and private parts and itchy noses and how just cuz Jobs was one of the most famous people in the world and a cult hero to unbelieveably annoying fanboys and fangirls and the subject of a new biography that everyone is wanking about DOESN'T MEAN that we have any right to invade his postmortem privacy by having an opinion on how he handled his illness and GAAAH here we go now I'm really gonna blast off Sarah Palin OWS pot-smoking witchfinder looking under the bed for hippies to punch Big Pharma ack glurg huff puff wheeze gaaaahhhhhh...
Damn, dude. Maybe try some, uh, kava or something. I hear that helps mellow you out.
Chomsky, a politically progressive linguist, should know better than to dismiss new forms of language-production that he does not understand as “shallow.” This argument, whether voiced by him or others, risks reducing those who primarily communicate in this way as an “other,” one who is less fully human and capable. This was Foucault’s point: Any claim to knowledge is always a claim to power. We might ask Chomsky today, when digital communications are disqualified as less deep, who benefits?
Oh, get the fuck off of it. None of us have any problem dismissing gossip as a more shallow form of conversation than intellectual discussions; no one who isn't still entranced by their shiny new techie toys should have any problem making the distinction here. It's telling that Jurgenson rambles through his essay, making ominous implications about the cognitive imperialism of first-world intellectuals, without ever addressing content and nuance, two of the most important criteria we use for judging depth in communication. Yes, yes, yes, it's possible to make good points in 140 characters or less. Yes, yes, yes, it's a great thing that more people, especially poorer people, are able to access the web through smartphones. None of which addresses, let alone changes, the fact that many important ideas need to be developed at length to be worthwhile, an unlikelihood if not an impossibility in the hyperactive, rapid-fire stimulation atmosphere of Twitter.
Saturday, October 22, 2011
Christianity may have forged a distinct ethical tradition, but its key ideas, like those of most religions, were borrowed from the cultures out of which it developed. Early Christianity was a fusion of Ancient Greek thought and Judaism. Few of what are often thought of as uniquely Christian ideas are in fact so. Take, for instance, the Sermon on the Mount, perhaps the most influential of all Christian ethical discourses. The moral landscape that Jesus sketched out in the sermon was already familiar. The Golden Rule – ‘do unto others as you would have others do unto you’ – has a long history, an idea hinted at in Babylonian and Egyptian religious codes, before fully flowering in Greek and Judaic writing (having independently already appeared in Confucianism too). The insistence on virtue as a good in itself, the resolve to turn the other cheek, the call to treat strangers as brothers, the claim that correct belief is at least as important as virtuous action – all were important themes in the Greek Stoic tradition.... If the story of the Renaissance and the Scientific Revolution has been rewritten in the interests of creating a mythical ‘Christian Europe’, so too has the story of the relationship between reason and faith in the Enlightenment. What are now often called ‘Western values’ – democracy, equality, toleration, freedom of speech, etc – are the products largely of the Enlightenment and of the post-Enlightenment world. Such values are, of course, not ‘Western’ in any essential sense but are universal; they are Western only through an accident of geography and history.... To challenge the myths and misconceptions about the Christian tradition is not to deny the distinctive character of that tradition (or traditions), nor its importance in incubating what we now call ‘Western’ thought. But the Christian tradition, and Christian Europe, is far more a chimera than a pure-bred beast.
The history of Christianity, its relationship to other ethical traditions, and the relationship between Christian values and those of modern, liberal, secular society is far more complex than the trite ‘Western civilization is collapsing’ arguments acknowledge. The irony is that the defenders of Christendom are riffing on the same politics of identity as Islamists, multiculturalists and many of the other ists that such defenders so loathe.
We can't seem to help it; we're born storytellers. But I still laugh at how pervasive it is, this need to believe in some sort of golden age when things were so much clearer, better, happier or healthier. Myths of purity and simplicity are evergreen themes, but humans have always been conflicted and confused, and it's only the increasing distance of time that allows us to think that history was ever so neat, tidy and linear.
Friday, October 21, 2011
Rick Perry is literally trying to steal Herman Cain's thunder
Whoa. Herman Cain literally has thunder? Well, you know, traditionally, the thunder gods were the head of the pantheon, so I'm surprised Cain isn't the nominee already. But more to the point: we all know what happened to Prometheus for stealing enough of a spark from Zeus's lightning to bring fire to humankind. Given what we know already of Perry's, ah, problematic racial issues, is it wise of him to be provoking a thunder-having black man? What would the punishment be for a Texas Republican? Would he be chained to that infamous rock for eternity and forced to read every scientific paper ever published on climate change while being sodomized by an endless line of gay atheists? Or, given his fondness for capital punishment, perhaps Cain would electrocute him with lightning bolts several times a day before bringing him back to life? Or will he be covered in cheese and tomato sauce (plus two toppings) and left to the wild animals?
Or - maybe I should have asked this to begin with - is it even possible to literally steal something metaphorical?
Wednesday, October 19, 2011
Religion provides answers to such questions as “How shall I live?” and “What is the meaning of the universe?” that science has no capacity to answer. But because answers to such questions are incapable of empirical testing by scientific methodology, how can we evaluate the answers that various religions give? As I have said above, the truth of religious beliefs can be seen in the lives of people who live by those truths. And if we see remarkable individuals in other traditions than our own we can accept that they have some kind of truth even if it is not completely the same as ours.
I don't feel like batting this one around like a cat mercilessly tormenting a small rodent. Let's just get right down to elucidating what he's saying here.
How do we know if religious beliefs are worthy? By looking at the actions of people who claim to be inspired by them.
Whoa, hey, hold up, you say. Most of us have seen religiously inspired actions that run the full gamut of human psychology, from selfless generosity to utter depravity. How can we tell which of these wildly varying types truly represents the "truth" of a religion?
By only accepting as evidence the actions that we describe as remarkable (leaving aside that "remarkable" is a pretty value-neutral word from where I sit).
But, you may interrupt once again (awfully rude of you, you know), take Christianity. It's been widely accepted in biblical scholarship since Albert Schweitzer that Jesus was just another apocalyptic cultist whose central concern was what he believed to be the imminent end of the world and the day of judgement to follow, a self-styled Jewish prophet who didn't seem all that concerned with the worth of Gentiles. Why do we favor the insipid platitudes of the Sermon on the Mount as examples of Christianity's "truth" rather than the numerous other examples where judgement, condemnation, and retribution are shown to be integral to its worldview?
Because they comport with the values that we have come to cherish by way of rational secularism.
So... religion provides us with "truths" that science can't, and the way we know this is by judging those supposed truths in the light of rational consideration from as objective a perspective as we can attain, which, not coincidentally, happens to be a key component of a scientific outlook? Isn't this just an incredibly convoluted way of projecting your own capacity for making critical distinctions onto an arbitrarily-chosen external authority to avoid having to accept full responsibility for your own life? Isn't this an example of, pun intended, bad faith, ba-dum-bum-tssh?
Tuesday, October 18, 2011
We have come to interpret the very opposite of what Mill meant, that tolerance means that we should be respectful of all opinions. When we say our society has got to be tolerant, it becomes a relative thing – almost therapeutic mush that we are not allowed to offend anybody. This kind of tolerance leads to a situation where we refuse to challenge or test out arguments in the public sphere. That is what is so ironic about the contemporary understanding of tolerance. It effectively says you must bite your lip and tolerate all views. This was not at all what Mill meant.

Furedi takes on how tolerance has become degraded. He says tolerance has to be robust, interventionist and judgmental. You should tolerate all views, but that does not mean silently sitting by and agreeing with them.
I remember once reading a perspective criticizing tolerance from the left, because it wasn't accepting enough. To that author, "tolerating" something still implied some sort of respectful disagreement or - gasp! - disdain for it. Only all-encompassing love and complete acceptance will do, comrade! You will be assimilated! (I don't recall what the plan was for dealing with dissenters.)
But anyway. I was in a local bookstore the other day, and noticed this book on display by the register. "Hey!" I said. "I know that woman!" Her two sons were school friends of mine; I'd been out to their farm many times. Very nice woman, very nice family. I have not a cross word to say about any of them.
The book, now... uh, it's a book about her, um, spiritual relationship with a cow. As I flipped through the first several pages, I realized that she was not using her cow as a literary device, a symbol of some sort of homespun, earthy wisdom. She's not being figurative or taking poetic license here. She really claims that her cow communicated with her through some sort of telepathy, that she "heard" the cow's thoughts in her own head as clearly as if she had spoken them. It's sorta like Conversations with God, but as if God had teats and stood around looking dopey while chewing his cud.
Now, as I said, this is a perfectly sweet woman, and I always appreciated her kindness to animals, which she taught well to her kids. But there's just no comfortable way to get around the fact that thoughts do not take the form of complete, linear sentences, scrolling across a screen in the brain like a news ticker, and they do not magically transfer through the ether in that state from one brain to another. Leaving aside, of course, that from what we know, cows do not have anything like the kind of cerebral activity that we do.
And so when this sort of thing comes up in personal conversation, it just hangs there like a rank odor that you realize, with a growing horror, is emanating from your acquaintance, and you shift uncomfortably, inching toward the exit, all the while hoping for a brief moment of self-awareness to dawn and allow them to excuse themselves gracefully.
I've had this experience more times than I care to count. And it may surprise you to know that I invariably err on the side of politeness. Actually, one of the main reasons I enjoy writing pseudonymously on the Internet is that most of the people I know are, quite frankly, slightly nuts like that, and it would pain me to have to be the one to take a rational hammer to their metaphysical china. This is where I can speak the truth as I see it, letting the abstract ideas speak for themselves, independent of the personality behind them. Maybe not the people I know, but perhaps some stranger with similar beliefs will read what I write and be jarred into a different perspective. So I bite my lip in personal life, despite agreeing with Fox that mushy tolerance is not a value worth championing. Thus does sentimentality make hypocrites of us all.
But, as Scott Atran has pointed out, the whole point of religious faith is that it is contrary to what we know by material means—that is why it's faith. In other words, it is because it's impossible for the sun to stand still that people value their belief that God made the sun stand still for Joshua.

As Atran puts it, "religious thought is insensitive to the kind of simple-minded disconfirmation through demonstrations of incoherence that [Sam] Harris and others propose." So when you come along and say science shows the Sun can't stand still, you're failing utterly to speak to the reason people believe in the miracle.
Of course, if you point out that the reason people believe in miracles is either ignorance or a narcissistic need to believe in magic, that the universe itself is favorably disposed toward them and regularly intervenes on their personal behalf for the most petty reasons, and you suggest that this is not merely a false security but an unhelpful mindset that they might do well to leave behind, you're accused of being rude and condescending. You're not allowed to make a case against religious belief with science, you're not allowed to make it with logic and rationality, and you're not allowed to make it with psychology. It's almost like certain apologists want atheists to just shut up and go away altogether. Oh, wait...
Monday, October 17, 2011
But Occupy Wall Street's most defining characteristics—its decentralized nature and its intensive process of participatory, consensus-based decision-making—are rooted in other precincts of academe and activism: in the scholarship of anarchism and, specifically, in an ethnography of central Madagascar.... The defining aspect of Occupy Wall Street, its emphasis on direct action and leaderless, consensus-based decision-making, is most clearly embodied by its General Assembly, in which participants in the protest make group decisions both large and small, like adopting principles of solidarity and deciding how best to stay warm at night.

This intensive and egalitarian process is important both procedurally and substantively, Mr. Graeber says. "One of the things that revolutionaries have learned over the course of the 20th century is that the idea of the ends justifying the means is deeply problematic," he says. "You can't create a just society through violence, or freedom through a tight revolutionary cadre. You can't establish a big state and hope it will go away. The means and ends have to be the same."

When 2,000 people make a decision jointly, it is an example of direct action, or direct democracy, Mr. Graeber says. "It makes you feel different to go to a meeting where your opinions are really respected." Or, as an editorial in the protest's house publication, Occupied Wall Street Journal, put it, "This occupation is first about participation." ... The idea that intellectual ferment is coming from the streets rather than academe is evidence that anarchism is witnessing something of a resurgence of interest among both activists and academics, says Nathan J. Jun, assistant professor of philosophy at Midwestern State University, in Texas, and author of the forthcoming Anarchism and Political Modernity.

While some students in the movement might be passingly familiar with anarchist studies, Mr. Jun says, they have probably not read much of the scholarship.
It is much more likely that anarchism itself has had the greater influence on Occupy Wall Street because, he says, many activists there "regard anarchy as an ideal to be realized."
Oh, dear. An ideal to be realized? Okay, there are those who would smile and shrug at the collapse of the nation-state, and at least they're a little more intellectually consistent and clear on the concept. But if you think a modern industrial state is going to be run along the same lines as a hippie commune or an Amish community, or that life in a decentralized, consensus-based community of a few hundred, maybe a few thousand people is necessarily going to be any less oppressive to individualist sensibilities, well...
More than anything else, I think about Isaiah Berlin's concept of value pluralism in cases like this: maybe, just maybe, some of the values you consider "good" are inherently in conflict with one another and cannot all converge on the same point in time. Maybe you can't have equality, peace, contentment, justice, individualism, consumer goods, etc. all at the same time. Maybe, despite the fact that a bomb in Haymarket Square, Nestor Makhno leading the Anarchist Black Army during the Russian Civil War, Leon Czolgosz assassinating William McKinley, and Alexander Berkman shooting Henry Clay Frick (add your own favorite example here) have all failed to light the way to utopia, and have often even been counterproductive to the revolutionary cause, the 1% aren't going to loosen their deathgrip on their purse strings until they get some genuine fear struck into them. Maybe the idea that all problems can be resolved rationally by sitting down peacefully and talking them out is an ideological delusion. I'm just saying.
Scott F. Aikin and Robert B. Talisse:
Consider an analogy. In freethinker and atheist circles, a version of the Xenophanes correlation is often invoked to capture the contingency of religious belief. The following is exemplary. Freethinker: If you were born in the United States of America, you are most likely to become a Christian. If you were born in Saudi Arabia, you are likely to become a Muslim. If you had been born in Norway in the viking ages, you would have believed in Thor and Odin. If you were born in Athens around 500 BC, you’d worship Zeus and Athena. The correlation here is, roughly, that the surrounding cultural milieu determines how one conceives of the divine. We may call this the sociological theory of religion. As the dominant religion of the culture varies with time and geography, the conceptions of the divine held by individuals will also vary. So far, this is only a descriptive point, but it is often deployed as a criticism of religion. The presumption seems to be that one’s conception of the divine should not be determined by simple contingencies. And so the more one’s theology is the product of time and chance, the less confident one should be that it is correct. The determining factors are sociological and historical, not rational. However, once we make this observation about our images of the divine, we can subject the whole of our theology to the same criticism. It’s not just conceiving god with a long beard that’s in trouble; conceiving of god as rational, loving, and good may be projections as well. Interestingly, the question of existence never arises for Xenophanes. In fact, in other fragments, Xenophanes offers positive conceptions of the greatest of the gods. But once we see the critical trajectory of Xenophanes’ challenges, we are compelled to ask the question: Isn’t god’s existence, too, a projection, the product of mere contingency?
Obviously, the analogy is only meant to cast doubt on the monotheistic conception of one true God for all people at all times. But pace their concluding question, this sentence is what I found most interesting: "It’s not just conceiving god with a long beard that’s in trouble; conceiving of god as rational, loving, and good may be projections as well." David Hume and other philosophical heavyweights have questioned whether there's any reason to suppose that God must necessarily be the ne plus ultra of benevolence and rational intelligence; there's nothing new about that.
The reasoning underlying the analogy, though, is based on the assumption that rationality is the best way to know the objective truth about things, to whatever extent we can. It's the only way to make sure you're not allowing your emotional reactions, limited experience and confirmation bias to color your perceptions. Is that true? You often hear religious apologists argue that there are other "truths" that science doesn't have the vocabulary to address, but they seem to feel that those truths are still capable of being rationally understood. Do they ever offer an alternative way of knowing that doesn't depend on rationality?
(I kid, of course. Even epistemological anarchy has been done before. Nothing new under the sun, alas!)
Many "why" questions are really "how" questions in disguise. For instance, if you ask: "Why does water boil at 100C?" what you are really asking is: "What are the processes that explain it has this boiling point?" – which is a question of how. Critically, however, scientific "why" questions do not imply any agency – deliberate action – and hence no intention. We can ask why the dinosaurs died out, why smoking causes cancer and so on without implying any intentions. In the theistic context, however, "why" is usually what I call "agency-why": it's an explanation involving causation with intention. So not only do the hows and whys get mixed up, religion can end up smuggling in a non-scientific agency-why where it doesn't belong. This means that if someone asks why things are as they are, what their meaning and purpose is, and puts God in the answer, they are almost inevitably going to make an at least implicit claim about the how: God has set things up in some way, or intervened in some way, to make sure that purpose is achieved or meaning realised. The neat division between scientific "how" and religious "why" questions therefore turns out to be unsustainable....But for the moment, we can say that any religious belief that involves an activist, really-existing God and claims that religion has something to say about why things happen, must also be encroaching on questions of how they happen, too. And if that's true, the easy peace which many claim should exist between science and religion just isn't possible.
It's not unidirectional, though. Perhaps politeness forbids him from suggesting that science and rationality can, in the course of delineating the hows, do away with the whys by process of elimination, leaving them nowhere to rest but in incoherence. The more we learn about how things work, the more we come to see the futility of trying to make the universe fit the Procrustean bed of purpose and meaning on a human scale. I speak to you from personal experience here, my friends: when you allow your desperate need for objective meaning rooted in abstract universal truth to atrophy, wither and fall away, life is still worth living.
"Fuck!" he exclaimed from the backseat. "I can't get a signal because we're out here in the middle of nowhere."
"What are the odds," I said to her beside me, "that people always find themselves smack dab in the middle of nowhere? How come it's never, 'I'm on the outskirts of nowhere'? 'I'm passing through the suburbs of nowhere'? 'I'm about one-quarter of the way to Nowhere Central, where the horizon and the landmarks are just starting to lose their distinctiveness'?"
"Or 'on the edge of nowhere'?" she replied.
"Well, if you were on the edge of nowhere, you'd actually be somewhere," he chimed in.
"Good point," she responded.
"Maybe 'nowhere' isn't actually a place," I speculated. "Maybe it's more like an amorphous fog that follows you around, so that when it settles, you're perfectly in the middle of it."
"Or maybe, as much of an affront to common sense as it seems," I continued, "we really are in the middle of nowhere, right here in rural Virginia. This is it, the fabled epicenter."
"The literal epicenter. Equidistant," she said, nodding along.
"Yes, literally. It extends way out into the Atlantic Ocean on one side, and out into... I dunno, the Midwest on the other. But we happen to be perfectly centered at this moment, right here."
We paused a moment to reflect on the momentous significance of all this, on our humbling privilege.
"We are such nerds."
"Yeah, I know."
Hey, remember way back in mist-shrouded ancient history when a bunch of Egyptians got on Facebook and Twitter and totally had a bitchin' revolution that brought down an evil dictator? And how narcissistic, shit-for-brains American bloggers used that as fuel for their masturbatory obsession with social media? Good times, good times:
Around 7:30 PM, I received text and Twitter messages that an announcer for State TV had on air called for Egyptians to go down and “defend the soldiers who protected the Egyptian revolution” against “armed Copts” who had opened fire and were killing soldiers. Looking around me, I could see that many of those gathered in the tight area around the TV building seemed to have responded to the call. Rough-looking men were arriving in groups; people said they were neighborhood thugs. They held bludgeons, wooden planks, knives, and even swords, and walked boldly into the chaos of burning cars, flying bullets, and glass. “We’ll kill any Christian we get our hands on,” one of them shouted. Someone tweeted that he was in the middle of what looked like a militia, “men with clubs and antique pistols.” Nearby, a young girl was harassed, and a mob assaulted a young Coptic couple, beating them and ripping their clothes. One of the perpetrators emerged from the gang with blood on his hands. “Christian blood!” he boasted. (The couple survived—rushed away by ambulance to be treated for wounds and possible fractures.)...What exactly happened at Maspero on October 9, when, amid a great deal of confusion, a peaceful protest turned into something of a massacre, has become a question of enormous implications for Egypt’s military and government, as well as for its increasingly divided population. Although many are critical even of the idea of “Coptic protests,” saying their premise is sectarian and that the Copts should instead be out defending a “united Egypt” in Tahrir, the events at Maspero have pitted those who believe State TV and support the army, against those who don’t. In the former camp are a working class majority who are convinced the Copts are to blame. 
Although all political factions, including the Islamists, have condemned the violence, and the Islamist political bloc has been swift to publicly claim it embraces Copts and Muslims alike, insisting it will work for a democratic Egypt, at a grassroots level there are also growing indications that Muslim conservatism is spreading nation-wide. In the aftermath of the violence, I was reminded by friends that many young Egyptians are taught that “Christians are going to hell.”...Whatever is ultimately revealed about what happened at Maspero, many in the Coptic Christian community, which accounts for some 10 percent of Egypt’s 82 million population, regard their position in Egyptian society as increasingly tenuous. The original cause for the Maspero protest, the burning of the El-Marinab Church in Aswan on September 30, was the fifth such assault on a church since the fall of Mubarak, and the sixth in twelve months. Hours before the Aswan church was set on fire, a preacher at a nearby mosque used his midday sermon to incite further anger against the town’s Copts. And in Alexandria the week before the Maspero violence, I heard a Salafi preacher blame the ills of the world on Christians, Zionists, and women.
Jeez. Somebody should just press a "Dislike" button or something. And, uh, has anyone decided what color scheme we're supposed to use on our blog in response to this?
Sunday, October 16, 2011
I’m waiting in line for a cappuccino. It’s gonna be a good one: short, intense, the foamed milk emulsified with the syrupy shot. I glance up from my phone and look around at the cafe. It is, for lack of a better adjective, a hipster joint. There are the artfully branded items for sale (T-shirts, espresso cups, etc.) and a long list of single varietal beans. Hot water is being poured out of sleek Japanese kettles; the baristas are wearing fedoras. And then I look at the other people in line. I notice their costumes: the slim dark jeans, flannel shirts, scuffed boots, designy glasses, mussed hair. Everyone is staring down at the gadget in their hands. They all look like me. I look like them. This is the definition of self-loathing. I mean no disrespect. I’m a sucker for single-varietals. I own one of the Japanese kettles. Right now, I’m wearing that branded T-shirt from my coffee place. What interests me, however, is the irony of the situation. Here we all are, seeking uniqueness, looking for those things that neatly express the idiosyncrasy of our peculiar personalities. And yet, our uniqueness (at least as consumers) is mostly a sham. Somehow, we all end up in the same place, chasing the same trends while drinking the same drink while staring at the same app on the same phone.
Isaiah Berlin identified the period following the death of Aristotle as an important one in the development of Western concepts of individuality, when the philosophical schools "ceased to conceive of individuals as intelligible only in the context of social life...and suddenly spoke of men purely in terms of inner experience and individual salvation." He specifically credited the ancient sophist Antiphon, along with Diogenes, with reacting against the social life of the polis with an ideal of independence, in which freedom was the path to happiness. The individual rather than the group became the natural unit, ethics were the ethics of the individual, and privacy became a new value.
Peter Watson said that "In all cases, then, we have, centering on the sixth century BC, but extending 150 years either side, a turning away from a pantheon of many traditional 'little' gods, and a great turning inward, the emphasis put on man himself, his own psychology, his moral sense or conscience, his intuition and his individuality." Colin Morris called the "discovery of the individual" one of the most important cultural developments between the years 1050 and 1200, one which seemed to correlate with a fundamental change in Christianity. And John Benton claimed that an increase in self-esteem during this period, along with a larger verbal and visual vocabulary for concepts of selfhood, led to the age of discovery and the Renaissance.
So our conceptions of individuality itself have been a long work in progress, to understate it mildly. But I would imagine that the sort of individuality that Lehrer is talking about, one that primarily identifies the self with expression through consumer products, is much more recent. Authors like Thomas Frank, Joseph Heath, and Andrew Potter have argued that the urge to differentiate oneself through owning the newest and the rarest commodities is actually the driving force behind rapacious consumerism, rather than the usual conception of consumers as a mindless herd of people all wanting to be just like everyone else. The emphasis on external signifiers (of which, after all, there can only be so many variations), as opposed to internal experience, is what produces this silly kind of hipster angst.
Friday, October 14, 2011
See, I don’t want to be part of a yoga world of happy talk about unending potential and perfect happiness. I don’t have much time for the kind of self-impressed platitudes that give yoga a bad name. Like so many of the secular, health-oriented, somewhat prideful members of my clan, I do yoga to quiet my brain, not to fill it with nonsense. And yet nonsense abounds. Last month, I dropped in on a class at another studio. As class began, the teacher offered her thoughts about the goodness of the world and its benevolence toward us. “If you just reach out with your intention,” she said sagely, “the universe will rise to meet you half-way.” I almost walked out. The earthquake in Japan had happened the day before....The point is that the practice of attentiveness—the fundamental practice that yoga cultivates—should lead us to contemplate the full reality of our life, which includes its inevitable end. As the yogi Richard Freeman puts it, “Yoga is a rehearsal for death.” That is the universe rising up to meet you. For me, this discussion was a rare moment when I had some inclination of what “yoga spirituality” might mean, particularly for someone who doesn’t actually believe in spirituality. In this version, there is no promise of health or happiness. There is only our embrace of reality, in both its quiet joys and its suffering. We recognize ourselves as part of the universe, and we accept that universe’s fundamental indifference to us. Then we see what flows from that. I suspect that this embrace of death, and life, doesn’t arise from an act of will or from reading the right books. Maybe, though, it comes from the act of placing one’s feet in exactly the right alignment, and paying attention.
I started doing yoga when I was twelve years old. I took a copy of a 28-day program by Richard Hittleman that my mom had on the shelf, and every day for the next month, I would get up at 5:30 and go through the routine, dutifully lying still after each pose and "being aware of the changes going on in my body." Mainly, I just spent that time wondering what exactly I was supposed to be noticing, worried that the Karma Police were going to nab me for not doing it right.
I've kept at it off and on lo these many years since, mixing traditional yoga poses with regular calisthenics and strength exercises. I've never been to a class for it, mainly for the reasons McAlister mentions; I just like the attentive act of feeling my muscles stretch and following my breath. No thought of enlightenment or physical gain, just the enjoyment of holding still and being quiet. In fact, I'm going to get up and go through some exercises right now.
Thursday, October 13, 2011
Perhaps the true sign of success is the ability to free oneself from work to engage in the active process of what Aristotle called idleness, the effortful investigation of life to 'know oneself.' What if we were to cease 'work' when it no longer was for necessity; perhaps it would increase the ability of others to secure work toward a similar vein and allow for an increasing collective to investigate those age-old questions of 'who am I' and 'what is the meaning of life'. Perhaps the post-50 sign of true success is living the Idle life.
Age-old questions? Uh, excuse me, excuse me, but we were just informed in no uncertain terms by a melodramatic Philosophy 101 student that no one is asking those questions.
But seriously, though -- post-50? Sheeit, I'm already living it pre-40, baby. I just - literally, just - paid off the little bit of credit card debt I had. The only financial obligation left is a mortgage, and I could easily foresee a day when even that becomes too onerous of a burden for my liking. And I'm going to be spending the majority of the next few days haunting library sales. As part of my job.
I do have a spare storage shed if anyone else wants to come join the slacker collective. Just chip in some for groceries, is all.
In case you were looking for some good writing on the darker side of the blogosphere:
It’s now almost 10 years that I have been living life on the needle. During that time I’ve shot up in parks, cars, toilets and on buses. In addition to my arms, legs, stomach and chest, I’ve also injected in my fingers, toes, palms and forehead. I’ve hit nerves, arteries, joints and bone, and have suffered every imaginable lump, bump and swelling. I’ve poisoned myself 4 times with ‘dirty heroin’, had abscesses the size of golf balls and I’ve OD’d twice. On my entire body I have only one visible vein left. In my determination to self-medicate I’ve lost family, friends, lovers, two Cockatoos and a dog. My bank is in the red and so after 34 years I have less than nothing. As I write this it is 24 hours since my last injection and that seems a long time. Previous to that it was 72 hours and previous to that 7 days. The longest I’ve ever been heroin or needle free is 5 months. But I do have some qualities and I use them to convince the few people left around me that I’m changing... that I’ve finally seen the light. And as I sit there with my perforated escape plan laid out, I busk and dance my way around all the awkward questions. At one point I even promise to stop smoking and cut down on the chocolate. It’s then I realize I’ve gone too far, that I’ve said too much. The place kind of deflates with disappointment and without even looking up I know what they’re all thinking, “He’s not getting better... he’s getting worse!” And I can’t blame them for that... I’m thinking exactly the same myself.
Wednesday, October 12, 2011
John Gray on the progressive faith of Steven Pinker:
Some of the impulses we inherit from our evolutionary past may incline us to conflict, but others— “the better angels of our nature,” as Abraham Lincoln called them—incline us to peaceful cooperation. In order to show that conflicts between the two will in future increasingly be settled in favour of peace, Pinker needs to be able to identify some very powerful trends. He does his best, but the changes to which he points—the spread of democracy and the increase of wealth, for example—are more problematic than he realises. The formation of democratic nation-states was one of the principal drivers of violence of the last century, involving ethnic cleansing in inter-war Europe, post-colonial states and the post-communist Balkans. Steadily-growing prosperity may act as a kind of tranquilliser, but there is no reason to think the increase of wealth can go on indefinitely—and when it falters violence will surely return. In quite different ways, attacks on minorities and immigrants by neo-fascists in Europe, the popular demonstrations against austerity in Greece and the English riots of the past summer show the disruptive and dangerous impact of sudden economic slowdown on social peace. All the trends that supposedly lie behind the Long Peace are contingent and reversible....Pinker’s attempt to ground the hope of peace in science is profoundly instructive, for it testifies to our enduring need for faith. We don’t need science to tell us that humans are violent animals. History and contemporary experience provide more than sufficient evidence. For liberal humanists, the role of science is, in effect, to explain away this evidence. They look to science to show that, over the long run, violence will decline—hence the panoply of statistics and graphs and the resolute avoidance of inconvenient facts. 
The result is no more credible than the efforts of Marxists to show the scientific necessity of socialism, or free-market economists to demonstrate the permanence of what was until quite recently hailed as the Long Boom. The Long Peace is another such delusion, and just as ephemeral.
Six months before he was murdered in his study in Mexico City, Leon Trotsky wrote: "I shall die a proletarian revolutionist, a Marxist, a dialectical materialist, and, consequently, an irreconcilable atheist. My faith in the communist future of mankind is no less ardent, indeed it is firmer today, than it was in the days of my youth." There is something tragicomic in this confession of faith. Dialectical materialism, though it claimed to be based in science, was never more than superstitious gibberish. When he invoked the supposed science to bolster his failing political hopes, Trotsky was engaging in a type of magical thinking, using words as charms to ward off the terrors of history. At the same time - and this is the irresistibly comical element of Trotsky's career - he never ceased to regard himself as anything other than an uncompromising rationalist....Hitchens's account of the origins of neoconservatism has obvious parallels with his own political trajectory. He has always made it clear that, for him, the decision to invade Iraq was justified as the beginning of a revolutionary war. It is this continuing ideological mindset that accounts for many of the misjudgements he has made over the past decade. For Hitchens, that the Iraq war proved to be a disaster does not show the enterprise to have been a mistake - any more than the disastrous history of the former Soviet Union shows that the Bolshevik revolution (for which Hitchens continues to nurse a decidedly soft spot) was a mistake. In both cases, the human costs count for very little in the final analysis. What matters is the world-transforming revolutionary impulse that animated both experiments....There are some who represent Hitchens as a contrarian or provocateur, without convictions. They are wrong. What sort of provocateur would write that "Bin Ladenism" is more dangerous than German Wilhelmine imperialism, the Nazi-Fascist axis and international communism?
Such a patently absurd claim could only be made by one who deeply believes it to be true. Leave aside the grotesque disproportion in lumping the Kaiser's Germany in with mid-20th-century totalitarianism. What is wholly fantastical is putting Osama Bin Laden's gang in the same category as Nazi Germany and the Soviet Union - two extremely powerful states with vast industrial and military resources, the first coming close to conquering all of Europe, the second annexing Europe's eastern half and the Baltic states while imposing itself throughout central Asia. In passing over these undeniable facts, Hitchens is not playing the role of intellectual gadfly. He is showing himself to be a believer who - like Trotsky - blanks out reality when it fails to accord with his faith. That Hitchens has the mind of a believer has not been sufficiently appreciated.
Tuesday, October 11, 2011
Statistics are indeed showing that more men are struggling now than in the past, which is a result of vast economic forces, as well as social ones (Christina Hoff Sommers wrote very presciently about "The War Against Boys" in 2000). And this is serious, and needs to be paid attention to....A darker aspect is that this new power balance/imbalance means men are having to grapple with feelings of inferiority that they're not quite accustomed to, and this can be hard on couples, particularly in a world that almost presumes women will have inferiority complexes.
My ex-girlfriend took an auto mechanics course in high school and knew a fair amount about troubleshooting and making minor repairs. My current girlfriend is far more knowledgeable about all sorts of traditionally masculine skills than I am; I hardly ever used anything more than the most basic tools until my brief stint as a satellite technician. She showed me how to patch a minor leak in the roof, and she made and executed all the plans for building shelves in the garage while I just followed her instructions. I've never felt the slightest bit threatened by having to defer to female superiority in matters like that. In fact, it's downright amusing to go into a Home Depot and watch her do all the talking to the employee who comes over to help while I just stand there smiling. If I were going to learn a new skill, culinary classes would be at the top of my list.
To a certain degree, I do like purposely tweaking people's conceptions of how a heterosexual male is supposed to think and act, but I've just never identified with the sort of rooster pride that concerns itself with being accepted as one of the boys. It's never occurred to me to see it as anything less than a plus if my partner has skills that complement mine, and I feel sorry for any poor bastard who can't appreciate his wife making more money than him.
All paid jobs absorb and degrade the mind. - Aristotle
I understand why people find this inspiring or profound. But this is absolutely terrible advice from a practical standpoint. Most of us are never going to get paid to do what we love. Work is something we do to support ourselves. Ideally, we don't hate it. That's the best most of us will ever do with our employment – we can consider ourselves fortunate if we don't actively loathe it. But love it? Steve Jobs got paid to do what he apparently loved, and good on him. He is not like most of us, though. He had a lot of talent and he happened to love something that was beyond lucrative.
Hear, hear. As Max Weber famously noted, the old Calvinist notion of a "calling" has long been intertwined with our ideas about work. Personally, I lament all the wasted time and unnecessary anxiety I felt as an adolescent, struggling to figure out why I couldn't think of a career choice that appeared to me in a flash of divine illumination. Many of my peers seemed to have their lives all planned out by the end of high school; what was wrong with me? Was I going to be a - gasp! - failure?
But now, one of the things that ruffles my feathers about all these platitudes and exhortations is the unspoken and likely unconscious elitism behind them. I would add to what Ed said that many of the jobs that are absolutely necessary for society to run at all are inherently boring, tiring, and nowhere near being nourishing and life-affirming. There's an implicit understanding in speeches like the address Jobs delivered at Stanford that we are the special ones; we have the education and privilege to customize our lives to our exacting preferences. Trash collecting? Maintenance work? Don't those people have an app to take care of that sort of thing?
Monday, October 10, 2011
When people tell me they are spiritual, first I think of healing crystals and astral charts, a lock of white hair tied to the end of a stick, drum circles and dreamcatchers, the cosmic juice between us all, man, synchronicity as a sign of some kind of, like, churning force! Shaking off the stardust, I turn to thinking that the Spiritual Person probably has cobbled together a set of private beliefs they don’t really feel like explaining. After one of my best friends almost died in a car accident, he custom designed a personal program based on the Beatitudes, Buddhism, and Emerson. It verges on genius, and it is a spirituality. Religion is a form you sign; spirituality is ideas. But if we each get to decide what spirituality means, what the freak is spirituality? See how terrible I am at this? Spirituality is one of those annoyingly flexible words like freedom, a blankness that invites our self-centered definition to scribble itself all over the big dry erase board of its name. In Andre Comte-Sponville’s excellent morsel of meditation, The Little Book of Atheist Spirituality, he writes that “we are finite beings that open onto infinity.” That’s better. Let me request that you suppress the word spirituality for two seconds, and instead invite you to open onto the ethereal atmosphere between us, weather, vibes, the forever stuff, our flickering understanding of what connects us, and what connects us all to eternity.
Shanna was once talking to me about linguistics; specifically, about the distinction between denotation, the dictionary definition of a word, and connotation, the implications and social uses of a word. My objection to the word "spiritual" is that it fails on both levels.
Ho, there! Now, it should - hopefully - go without saying that I am all in favor of the turn away from institutional forms of religion toward an inward form of seeking. I hereby proclaim it to be a Good Thing that more people have the education and leisure to gain awareness of other beliefs and ideas from around the world and incorporate them into their lives after at least some rational analysis, independent of the stifling weight of family and community tradition.
But on a denotative level, the word "spirit", in this context, is almost always understood to be contrasted with matter. I don't believe words have magical power, of course, but they do form the conceptual framework we organize our thoughts within, and as such, I think it's important to think carefully about the terms we use and what they imply. If you want to rebel against "organized religion" and all of its unfortunate legacy, it seems to me that rejecting the old Platonic dualism would be a good place to start.
And on a connotative level, the word "spiritual" doesn't signify anything in particular. It's vague to the point of utter meaninglessness, as Welch says, and yet, it still commands a certain reflexive respect simply by virtue of not being organized religion, which no cool kid would want to admit belonging to. It somehow manages to still have indie credibility despite having topped the charts and become a household name. But I still say that when you actually press SNR people on what it is they really believe, you invariably find that they are using the same old templates as the stultifying religion they supposedly left behind. They confuse their evasive, lazy lack of clarity for a sign that they've transcended all classification.
There's nothing esoteric or otherworldly about the experience or mentality. You don't need to invoke eternity, ethereality, forever, transcendence, or other loaded terms. All that's needed is to live attentively. Language, as beautiful and useful as it is, tends to have a deadening effect on our ability to do that. We mistake the word for the experience, the menu for the meal, and focus on the finger instead of the moon.
Speaking of Alan Lightman:
Dennett says that I am concerned that Dawkins is “too darned clear, too brutally frank when he articulates his case” against religion. It is not Dawkins’s clarity that concerns me. It is his condescension towards believers and his labeling of this large group of people as non-thinkers....Dennett reminds me that Richard Dawkins is deeply appreciative of the art, music, and poetry that religion has engendered, but it is just that Dawkins believes that religion, on balance, has accomplished more harm than good. I would find it difficult to attempt such a tally. Whatever the results of such a balance, does that mean we, like Dawkins, should throw out religion wholesale, take a condescending attitude toward people of religious beliefs, label people of religious beliefs as non-thinkers impervious to scientific evidence? No. It means that we should continue to oppose those practices of religion that do damage, we should continue to oppose irrational thinking on issues that require rational thinking and evidence. But, at the same time, I would argue that we should allow our existence to encompass some things that we cannot explain by rational argument and proof.
If you read Dennett's earlier essay, you'll notice that in the second sentence, he says:
...Alan Lightman joins a long line of atheist apologists who feel compelled to respond negatively to Richard Dawkins’ campaign but find it hard to put forward a crisp, fact-based objection.
So Lightman responds by once again complaining that Dawkins is condescending and rude without ever providing a specific example, such as, say, a quotation where Dawkins attacks believers themselves, as opposed to faith or belief in the abstract, in such personal, denigrating terms. Brilliant.
I like that last sentence too. "We should allow our existence to encompass some things that we cannot explain by rational argument and proof." Again, who fails to do this, aside from people with burlap bags for heads and straw sticking out from their collars and shirtsleeves? Who demands rationality and proof in art, poetry and music? Ironically enough, it's people like Lightman who seem inclined toward a strict dichotomy rather than a continuum when it comes to questions of faith; as he says earlier, he thinks that if the existence of an intelligent creator or an objective meaning of life can't be disproved by science, they are just as valid as any other belief and are required to be taken seriously. But lumping faith in a personal God in with faith in what we understand of scientific laws is disingenuous in the extreme; there are massive degrees of difference between them when it comes to evidence and likelihood.
Sittin' here in the wee hours, just contemplating the usual middle-of-the-night issues: the nature of self, the nonexistence of God, other lighthearted fare. Daniel Dennett keeps popping up in my readings tonight:
Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not." Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."
And here he responds to Alan Lightman's faith-based twaddle from last week:
Is there any rational grounding for a belief in a miracle-making interventionist God? (No.) Do we need God to account for the brilliant design of living things? (No.) Do we need God to somehow underwrite or ground our confidence that our ethical convictions are not just parochial prejudices? (No.) There is nothing gloriously, ineffably, tantalizingly imponderable about these questions, carefully crafted and vetted by philosophers and scientists over the centuries. Lightman fails to consider the possibility, moreover, that the reason many theological questions continue to evade the bright light of rational inquiry is that they have been ingeniously crafted by theologians to do just that. As the traditional concepts of God, heaven and hell crumble in the collision with science, the theologians invent new, more “sophisticated” concepts to take their place. They are improvements only in the sense that they are more immune to falsification by any imaginable discovery....Like many atheists, Dawkins — as Lightman surely knows — is deeply appreciative of all the glorious art and music and poetry that religion has engendered. But still he trots out the canonical list of glories to float the implication that we atheists are a philistine lot. Shame on him. That is what I call faith-fibbing. Not so much a bald-faced lie as a carefully indirect misrepresentation. He can’t actually claim that Dawkins doesn’t weigh the many contributions of religion to the arts against the damage it has wrought. Dawkins does just that, and arrives at a judgment with which Lightman apparently disagrees: All things considered, religion’s blessings are outweighed by the harm they do. The problem is that Lightman doesn’t tackle that difficult issue; he prefers to sing the praises of faith without holding it to account.